In March 2004, Google allegedly introduced the Sandbox Effect in an effort to prevent spam websites from achieving a high ranking soon after launch. This filter is now a thing of the past, as Google's algorithm incorporates domain authority metrics instead.
The Google Sandbox filter acted like a probationary period for a website, typically lasting up to 12 months. During this time, affected websites performed badly in Google's organic search results for competitive terms. While a site was in the Sandbox, its trust metrics were assessed, and these played a considerable part in determining its future ranking after release from the Sandbox.
One theory concerning the Google Sandbox is that it effectively extended the normal backlink ageing process, meaning that acquired backlinks offered reduced benefit for an extended period until Google had established the trustworthiness of the new website.
Many SEO experts now dispute the continued existence of the Sandbox, preferring the theory that a new website has poor initial ranking purely due to a lack of trusted inbound links.
The effect of the Sandbox filter was to reduce the ranking of some new web domains, making it very difficult for them to compete in Google's search engine results pages (SERPs), even for slightly competitive keywords, for periods of up to 12 months.
Whilst a website was affected by the Google Sandbox effect, it could still rank for non-competitive terms, so long tail keyword selection was recommended when SEO copywriting for new websites. Importantly, acquiring more "trusted backlinks" from established hub sites could help a new website escape the effects of the Sandbox filter earlier.
There is strong evidence that the Google Sandbox filter was triggered by unnatural link building activities and website over-optimisation soon after launch or first Google indexing. Interestingly, the Sandbox seemed to affect only English language websites, and did not seem to affect .gov or .edu domains.
Excessive reciprocal link building and acquiring too many inbound links at the same time in an attempt to artificially raise the ranking of newly launched sites was the most common cause of triggering the Sandbox filter. Since new web domains often start with few links, it was easy for Google to spot irregularities in backlink acquisition rates and repetitive anchor text.
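The irregularity described above can be illustrated with a simple sketch. The figures and the threshold below are purely hypothetical; this just shows how a sudden jump in monthly backlink counts stands out against a new domain's trailing average.

```python
# Hypothetical illustration: flag months where new-backlink counts spike
# far above the trailing average - the kind of irregularity the article
# suggests Google could easily spot on a new domain with few links.

def spike_months(monthly_new_links, factor=3.0):
    """Return indices of months whose new-link count exceeds `factor`
    times the average of all preceding months."""
    flagged = []
    for i in range(1, len(monthly_new_links)):
        trailing_avg = sum(monthly_new_links[:i]) / i
        if trailing_avg > 0 and monthly_new_links[i] > factor * trailing_avg:
            flagged.append(i)
    return flagged

# A new site acquiring 5-10 links a month, then suddenly 120:
print(spike_months([5, 8, 6, 9, 120, 10]))  # [4]
```

The `factor` threshold is an arbitrary choice for the sketch; the point is only that a burst of simultaneous link acquisition is trivially detectable against a short, quiet link history.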
To try to avoid the Google Sandbox altogether, new websites needed to avoid doing too much reciprocal link building and instead concentrate on building one way links from high PageRank web directories, article submission sites and, most importantly for Google ranking, authority hub sites of a similar theme. It was also important to avoid site-wide links pointing to your site, as these are often used by spammers and can have a negative ranking effect, since many site-wide links are bought links.
Websites which were heavily affected by the Google Sandbox tended to demonstrate a large differential between their ranking for an allinanchor:<keyword> or allintext:<keyword> search and their ranking for the same keyword term in Google's regular organic SERPs. So, to test for the Sandbox effect, choose an appropriate keyword and compare the allinanchor ranking with the regular Google organic search results.
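The test above boils down to comparing two rank positions. The allinanchor: and allintext: operators are real Google query operators, but the positions had to be read off the results pages by hand; the rank numbers and the differential threshold in this sketch are illustrative only.

```python
# Minimal sketch of the Sandbox test described above: compare a site's
# position for allinanchor:<keyword> against its regular organic position.
# Rank values here are made up for illustration.

def sandbox_differential(allinanchor_rank, organic_rank):
    """A large positive differential (organic rank far worse than the
    allinanchor rank) was read as a sign a filter was suppressing the site."""
    return organic_rank - allinanchor_rank

# e.g. ranked 8th for allinanchor:widgets but 350th organically:
diff = sandbox_differential(8, 350)
print(diff)  # 342
print("possible Sandbox effect" if diff > 100 else "no large differential")
```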
Websites affected by the Google Sandbox effect could still rank well in Yahoo! and MSN, as these search engines gave benefit for new backlinks much sooner. They were also much less picky about link quality and relevance than Google, so pretty much any inbound link would help a site's Yahoo! and MSN ranking.
In contrast, many links were discounted by Google from the outset due to reciprocation, lack of trust in the link source or lack of thematic relevance. With all of this taken into consideration, it was common for a Google Sandboxed website to rank between 300 and 500 for a certain keyword whilst ranking in the top 10 in Yahoo! and MSN for the same term!
During the sandboxed period, Google scrutinised link acquisition trends and looked at a number of website quality indicators to help establish trust. The following attributes were assessed and were known to influence site ranking:
The rate at which your website acquires links from other sites and domains is assessed. Google likes to see natural backlink acquisition from relevant sources, with little obvious evidence of artificial attempts at SEO.
The quantity of links from other sites on the World Wide Web - known as the "link popularity" of a site - can have a major bearing on its ranking.
The number of low quality links (including reciprocal links) versus high quality one way links from authority sites and reputable directories like DMOZ.org is established. One way links from trusted sources within the same industry sector will really help a new website to improve its ranking.
Outbound linking from your website to bad neighbourhoods or the use of detectable link farms or link schemes can lead to reduced website ranking or even a search engine penalty.
Significant numbers of reciprocal links with poor quality websites, or with websites of a dissimilar, irrelevant theme, can cause ranking problems and trigger Google penalties. This type of linking should be used in moderation.
Backlink anchor text for all website pages is assessed by Google. Adding too many backlinks with the same anchor text can hurt Google ranking and trigger the Sandbox filter, particularly where webmaster collusion is obvious, such as where the links are all reciprocated. We recommend that you check backlink anchor text for variance using the excellent new SEO tool at http://www.webuildpages.com/neat-o/. SEOmoz have a similar tool available to their premium members.
Google checks for website spam, keyword stuffing and excessive keyword repetition. All of these can adversely affect ranking performance.
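The anchor-text variance check in the list above can be approximated at home. This is a sketch, not any particular tool's method: it assumes you have exported a plain list of backlink anchor texts and simply measures how concentrated they are on a single phrase.

```python
# Sketch of an anchor-text variance check, assuming a list of anchor
# texts exported from a backlink tool. Heavy concentration on one phrase
# is the repetitive-anchor-text pattern the article warns about.
from collections import Counter

def anchor_concentration(anchors):
    """Fraction of backlinks using the single most common anchor text.
    Values near 1.0 suggest an unnatural, repetitive link profile."""
    counts = Counter(a.strip().lower() for a in anchors)
    return max(counts.values()) / len(anchors)

# Eight identical money-keyword anchors out of ten backlinks:
anchors = ["cheap widgets"] * 8 + ["Acme Ltd", "www.acme.example"]
print(round(anchor_concentration(anchors), 2))  # 0.8
```

A natural link profile tends to mix brand names, bare URLs and varied phrases, so its concentration figure is far lower than a campaign built on one repeated keyword.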
The more Google trust a website acquired, the faster it escaped the Sandbox filter and saw improved keyword ranking. Websites were typically released from the Sandbox en masse once Google was satisfied that they reasonably adhered to its quality guidelines and had built up sufficient trust.
There is some evidence that the Sandbox period could be extended for websites which adopted poor SEO techniques and badly managed link building campaigns soon after launch. An over-reliance on building large numbers of reciprocal links during the first 12 months after website launch was equally ill advised, often leading to a massive drop in Google ranking.
Escaping the Google Sandbox early was possible, and there is reasonable catalogued evidence that early release from the Google Sandbox filter could occur if sufficient quality one way links were acquired from authority websites - hub sites which are leaders in their particular field.
This goes some way to explaining why websites relating (for example) to new movie releases never suffered the effects of the Google Sandbox filter: they acquired many quality links from the outset and did not need to adopt damaging artificial SEO practices to boost their ranking!
Rather than focussing too much on SEO for a new website, we strongly recommend that your initial efforts concentrate on building up your website as a unique information source with informative articles and topical news. This will really help your website to acquire a natural spread of one way links pointing to an important cross section of your internal pages as well as your homepage, helping to release you from the Google Sandbox.
Use all aspects of search engine optimisation in moderation whilst a new website establishes itself. Keyword-stuffed titles, meta tags and excessive, repetitive keyword usage in website content should be avoided, as these can actively reduce the Google ranking of new domains by triggering spam filters.
With a new website, the threshold for spam filtering is thought to be somewhat lower. KSL Consulting have noticed that web pages with natural language text and inherently low keyword density can outperform keyword-stuffed pages, at least in part because they attract more natural links.
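Keyword density, as discussed above, is straightforward to estimate. The tokenisation in this sketch is deliberately naive (lowercase words only) and the example page is invented; it simply shows the occurrences-over-total-words ratio the term usually refers to.

```python
# Rough keyword-density calculation, one way to sanity-check a page
# against the keyword-stuffing concern above. Tokenisation here is
# deliberately naive; real content analysers are more sophisticated.
import re

def keyword_density(text, keyword):
    """Occurrences of `keyword` as a fraction of total words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

page = "Widgets for sale. Our widgets are the best widgets in town."
print(round(keyword_density(page, "widgets"), 2))  # 0.27
```

A density that high (roughly one word in four) reads as stuffed; natural copy about the same topic typically sits far lower without any deliberate counting.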
Since a Google Sandboxed website had poor ranking against competitive keywords, it was necessary to optimise for less popular long tail keywords whilst in the Sandbox. For advice on choosing the best long tail keyword phrases using free SEO tools, read our long tail keyword advice page.
For more SEO advice for new web domains during the Google Sandbox period, contact KSL Consulting for a no-obligation discussion by clicking here.