Google Sandboxing
-
I have a new site on a new domain that ranked well for the first week or so after it was indexed, then it totally dropped off the SERPs.
My question is: does Google sandboxing affect new sites on new domains that don't have any incoming links? The site dropped off before I began link building, and from what I've read, unnatural link building is often the cause. Can you still be sandboxed without any link building?
If this is the case, are there things I can do to get out of the sandbox?
Thanks folks,
Jason
-
I would just give it a little time, and see what happens when you get a few links.
-
Thanks so much for the quick response, EGOL. I'm quite sure this isn't a duplicate content issue; it is a website for a dental practice, and all of the content is fresh and unique to them.
We're in the planning stage of the link building process; this is tops on our list.
I was surprised to see the site drop completely out of the SERPs even for searches such as the practice name. When it does rank, it is pages deep, and it is an internal page that is listed (as opposed to the index page).
In Yahoo and Bing the index page ranks within the top 5 results for the company name. In Google the index page ranked well for the first couple of weeks; now it doesn't rank at all except for really long-tail searches. I've created similar sites with strong results for similarly competitive keywords, but this is the first time I have experienced this, and the only real difference is that this site has a brand new domain with no existing site before it.
-
First... new sites often go in and out of the SERPs for a few days to a couple of weeks. Hang in there; you might be fine.
Second... if you have very few links, and especially if they are weak links, you might need more power to stay in the SERPs.
Finally... is anyone republishing your feed or scraping your content? Search for a unique sentence or two from your site between quotation marks. A few years ago a powerful news site was republishing my feed and BAM... my blog fell out of the SERPs like a ton of bricks. It was almost completely gone, but my content on that news site was up in the SERPs. I wrote to them and asked them to remove my feed, and they were good guys about it. A couple of weeks later I was back in the SERPs at full strength. I honestly believe that what people call "the sandbox" is often a duplicate content problem caused by feed republishing or scraper sites.
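The exact-phrase check above is easy to script. A minimal sketch (the sentence and the query-URL format are purely illustrative, not any particular tool's API): wrap a unique sentence from your page in quotation marks and URL-encode it into a search URL you can open and eyeball for scrapers:

```python
from urllib.parse import quote_plus

def exact_match_query_url(sentence):
    """Build a search URL that asks for the sentence verbatim.

    Wrapping the sentence in quotation marks requests an exact-phrase
    match, so any other domain appearing in the results is likely
    republishing or scraping your content.
    """
    return "https://www.google.com/search?q=" + quote_plus('"%s"' % sentence)

# Made-up sentence from a hypothetical dental-practice site:
print(exact_match_query_url("Our hygienists use ultrasonic scalers for gentler cleanings"))
```

Open the printed URL in a browser and check whether any result other than your own domain carries the phrase.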
Related Questions
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't gotten a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I am not sure how to get all our pages indexed if it's going to operate like this... would love some help, thanks!
Technical SEO | RyanTheMoz
-
Google Published Date - Does Google Lie?
Here's the scenario. I create a page called "ABC" and it gets published and found by Google, let's say on the 13th of April. On the 15th (or 14th) I decide to update the URL, page title, and content (and redirect the old URL to the new URL as well). Will Google still show this page as being published on the 13th? Or would it update the publish date according to the new URL? Greg
Technical SEO | AndreVanKets
-
Pages removed from Google index?
Hi All, I had around 2,300 pages in the Google index until a week ago. The index dropped a load of them and left me with 152 submitted, 152 indexed. I have just re-submitted my sitemap and will wait to see what happens. Any idea why it has done this? I have seen a drop in my rankings since. Thanks
Technical SEO | TomLondon
-
Homepage disappeared from Google SERP
I redirected my domain using this code in .htaccess:

RewriteCond %{HTTP_HOST} ^xxxx.com
RewriteRule (.*) http://www.xxxx.com/$1 [R=301,L]

<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /([^/]*/)*index\.(html?|php)(\?[^\ ]*)?\ HTTP/
RewriteRule ^(([^/]*/)*)index\.(html?|php)$ http://www.xxxx.com/$1 [R=301,L]
</IfModule>

A day after I did it, I got an error in GWMT, "Google can't find your site's robots.txt", and my homepage disappeared from the result pages. When I try to open the Google cache of the homepage I get a 404 error. I generated a new robots.txt and uploaded it; now the error doesn't show, but my homepage is still not in the SERPs. It's been 3 days. What should I do? Thanks in advance.
Technical SEO | digitalkiddie
-
Ranking on google.com.au but not google.com
Hi there, we (www.refundfx.com.au) rank on google.com.au for some keywords that we target, but we do not rank at all on google.com. Is that because we only use a .com.au domain and not a .com domain? We are an Australian company, but our customers come from all over the world, so we don't want to miss out on the google.com searches. Any help in this regard is appreciated. Thanks.
Technical SEO | RefundFX
-
Does Google Bot accept Cookies
I am working with a per-page results refinement that stores a cookie on the user's computer and then keeps that same per-page setting as the user goes around the site. I was wondering whether that is true for Googlebot or Bingbot as well. Will they keep the cookie, or will they not be able to accept it? I want to know because I don't want different URLs created if they can keep the cookie. Thanks!
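For what it's worth, a common way to avoid cookie-dependent URL variants (a hedged sketch with made-up names, not something prescribed in this thread) is to resolve the per-page preference server-side from the cookie with a safe default, so crawlers that request pages without cookies simply see the default pagination at the same URL:

```python
DEFAULT_PER_PAGE = 20
ALLOWED_PER_PAGE = (10, 20, 50)

def resolve_per_page(cookies):
    """Read the visitor's per-page preference from a cookie dict.

    Crawlers such as Googlebot typically request pages without cookies,
    so they fall through to the default: every bot sees the same URL
    with the same pagination, and no extra URL variants are created.
    """
    raw = cookies.get("per_page")
    try:
        value = int(raw)
    except (TypeError, ValueError):
        return DEFAULT_PER_PAGE
    return value if value in ALLOWED_PER_PAGE else DEFAULT_PER_PAGE

print(resolve_per_page({}))                 # crawler: no cookie, gets default
print(resolve_per_page({"per_page": "50"})) # returning visitor with cookie
```

Validating against an allow-list also guards against junk cookie values being echoed into the page.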
Technical SEO | Gordian
-
How to get Google to index another page
Hi, I will try to make my question clear, although it is a bit complex. For my site the most important keyword is "insurance", or at least the Danish variation of it. My problem is that Google isn't indexing my front page for this keyword; it is indexing a subpage instead - www.mydomain.dk/insurance rather than www.mydomain.dk. My link building will point to subpages and to my main domain, but I won't be able to get that many links to www.mydomain.dk/insurance. So I'm interested in making my front page the main page for the keyword "insurance", but without just throwing away the traffic I'm getting from the subpage at the moment. Are there any solutions for this? Thanks in advance.
Technical SEO | Petersen11
-
Sandbox Cached
Hello, I worked on my site using a sandbox, and that sandbox site got cached! So now all my pages on my regular site have several links from this IP address. I know this isn't that big of a deal, but I would like to get it straightened out. I've had a disallow in the robots.txt for several weeks. How should I go about getting this IP address out of the Google index? Thanks, Tyler
Technical SEO | tylerfraser