Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com and we rank in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw that the site was blocked, so I went to Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file | Downloaded | Status
http://www.posnation.com/robots.txt | 1 hour ago | 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I checked your robots.txt file with http://tool.motoricerca.info/robots-checker.phtml and it flagged the Allow line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
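For what it's worth, you can check how a parser treats an "Allow: /" rule with Python's standard `urllib.robotparser`. This is just a sketch; the rules below are a minimal stand-in, not the actual posnation.com file:

```python
# Sketch: check how a robots.txt parser treats an "Allow: /" rule.
# The rule set below is a hypothetical stand-in for the live file.
from urllib.robotparser import RobotFileParser

rules = """\
User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# "Allow: /" under "User-agent: *" permits every URL for every crawler,
# so the flagged line should be harmless even if it is redundant.
print(parser.can_fetch("Googlebot", "http://www.posnation.com/"))
print(parser.can_fetch("Googlebot", "http://www.posnation.com/robots.txt"))
```

Both checks come back as allowed, which is why I don't think the flagged line is actually hurting you.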
-
Hey Nick, thanks for your response. I did the first part, but on resubmit of the sitemap.xml it won't take, due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml
and robots.txt is not blocking it. Any ideas on what to do next?
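One likely explanation: Google may still be working from its cached copy of the robots.txt that contained the accidental block, which would explain the "URL restricted by robots.txt" error even though the current file allows the sitemap. A sketch contrasting the two states, using Python's standard `urllib.robotparser` with hypothetical rule sets rather than the live file:

```python
# Sketch: contrast the accidental block ("Disallow: /") with the
# corrected file. Both rule sets are hypothetical; the point is that a
# cached copy of the old file would still restrict the sitemap URL.
from urllib.robotparser import RobotFileParser

def allows(robots_txt: str, url: str) -> bool:
    parser = RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

blocked   = "User-agent: *\nDisallow: /"  # state during the server move
corrected = "User-agent: *\nAllow: /"     # state after the fix

sitemap = "http://posnation.com/sitemap.xml"
print(allows(blocked, sitemap))    # the old cached copy restricts it
print(allows(corrected, sitemap))  # the current file does not
```

If that's what is happening, the error should clear on its own once Google re-fetches the corrected robots.txt.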
-
No, you're OK. It used to be that if your site went down for even a few hours and the spiders came around, you could get deindexed. Now I guess they understand that stuff happens, and thankfully you have a pretty long grace period before you get deindexed.
Good suggestions by Nick. You can also increase the Googlebot crawl rate on your site in GWMT to get Google to come around again quicker.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods of time occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to feel like you have done what you can, resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as Googlebot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.
Related Questions
-
Your search - site:domain.com - did not match any documents.
I've recently started work on a new client's website and done some preliminary on-page optimisation; there is still plenty of work to be done and issues to resolve. They are ranking OK on Bing, but they are not getting any ranking on Google at all (except paid). I tried the site:domain.com search and it comes up with no results, so this confirms that something is going on with their Google search rank! Can anyone shed light on what can cause this or why this would happen? My next step is to look at their Webmaster Tools (I haven't had access yet), but if anyone has any tips to resolve this or where to look, that would be great! Thanks!
Algorithm Updates | ElevateCreativeAU
How do I figure out what's wrong with my site?
I'm fairly new to SEO and can't pinpoint what's wrong with my site... I feel so lost. I am working on revamping www.RiverValleyGroup.com and can't figure out why it isn't ranking for keywords such as 'Louisville homes' and 'homes for sale in Louisville KY'. Any suggestions? I write a new blog post every day, so I feel there's no shortage of fresh content. I'm signed up with Moz Analytics and Google Analytics.
Algorithm Updates | gohawks7790
New site or subdomain
What are the pros and cons of launching a new product site as opposed to placing it under a subdomain of the company site? Will the new site be placed in the Google sandbox? The main goal is to provide credibility for the product by placing it under the company site, which has been live for over 10 years. It is not a consumer product; it's more for dealers. So people would be pushed to the site or find it through the brochure.
Algorithm Updates | bakergraphix_yahoo.com
Optimized site-wide internal links in footer - a problem?
Hello all - I am looking at a website with 8 heavily keyword-optimized site-wide links in the footer. Yes, there are only 8, but it looks a bit spammy and I'm tempted to remove them. I imagine there's some possibility of a Google penalty too? What would your advice be? Thanks, Luke
Algorithm Updates | McTaggart
Increased 404 and Blocked URL Notifications in Webmaster Tools
In the last 45 days, I have been receiving an increasing number of 404 alerts in Google Webmaster Tools. When I audit the notifications, they are not "new" broken links; these are all links that have been pointing to non-existent pages for years, and for some reason Google is only now notifying me about them. This has also coincided with about a 30% drop in organic traffic from late April to early May. The site is www.petersons.com; it has been around for a while and attracts a fair amount of natural links, so in the 2 years I've managed the campaign I've done very little link building. I'm in the process of setting up redirects for these URLs, but why is Google now notifying me of years-old broken links, and could that be one of the reasons for my drop in traffic? My second issue is that I am being notified that I am blocking over 8,000 URLs in my robots.txt file when I am not. Here is a link to a screenshot: http://i.imgur.com/ncoERgV.jpg
Algorithm Updates | CUnet
Microsites for Local Search / Location Based sites?
Referring to the SEOMoz webinar on Local Search presented by Nifty Marketing (http://www.seomoz.org/webinars/be-where-local-is-going). I have a question my client asked us regarding why we broke out their locations into microsites instead of just using subfolders. Here are the details: the client has one main website in real estate. They have 5 branches; each branch covers about a 50-mile radius and a specialized niche in its area. When we created the main site we incorporated the full list of listings on it; we then created a microsite for each branch, which has a page of listings (same as the main site) but includes a canonical link back to the main site. The reason we created a microsite for each branch is that the searches for each branch are very specific to their location, and we felt that having only a subfolder would take away from the relevancy of the site and its location. Now the location sites rank on the first page for their very competitive, location-based searches. The client, as we encourage, has had recommendations from others saying this is hurting them, not helping them. My question is this: how can this hurt them when the microsites include a home page specific to the location, a contact page optimized with location-specific information (maps, text, directions, NAP, call to action, etc.), a page of area information about communities/events/etc., a page of the location's agents, and of course real estate listings (with a canonical back to the main site)? Am I misunderstanding? I understood that if the main site could support the separation of a section into a microsite, this would help local search. Local search is the bread and butter of this client's conversions. AND if you tell me we should go back to having subfolders for each location, won't that seriously hurt our already excellent rankings? The client sees significant visitors from their placement of the location URLs. THANKS!
Algorithm Updates | gXeSEO
Is it better to build a large site that covers many verticals or many sites dedicated to each vertical
Just wondering, from an SEO perspective, is it better to build one large site that covers many verticals or to build out many sites, one for each vertical?
Algorithm Updates | tlhseo
Site Usage Statistics and organic ranking
I'm not sure if anyone has tested this properly, but I'm beginning to suspect that Google is using site usage statistics as a site quality guide and ultimately as a ranking variable. This is what I've seen so far on one of my sites (site A):
Week 1: bounce rate 83.88%, avg. time on site 0:00:57, pages/visit 1.28. No changes made to the site apart from the usual link building.
Week 2: traffic drops by 30%, keywords generating traffic drop by 39%. Bounce rate 87.25%, avg. time on site 0:00:43, pages/visit 1.21. I replaced all affiliate links on my homepage with links to internal pages where the bulk of the content is, and did a reconsideration request.
Week 3: traffic goes up by 30%, keywords generating traffic go up by 65%. Bounce rate 30.41%, avg. time on site 0:03:02, pages/visit 3.74.
This is not the most scientific test, but surely Google must be using these variables as a ranking factor? Anyone seen something along these lines or have thoughts on it?
Algorithm Updates | clickangel