Accidentally blocked our site for an evening?
-
Yesterday at about 5pm I switched our site to a new server and accidentally blocked our site from Google for the evening. Our domain is posnation.com and we rank in the top 3 for almost all POS-related keywords. When I got in this morning I realized the mistake, went to Google Webmaster Tools, and saw the site was blocked, so I went to Fetch as Googlebot and corrected it. Now the message says:
Check to see that your robots.txt is working as expected. (Any changes you make to the robots.txt content below will not be saved.)
robots.txt file: http://www.posnation.com/robots.txt
Downloaded: 1 hour ago
Status: 200 (Success)
When you go to Google and type "pos systems" we are still #2, so I assume all is still OK. My question is: will this potentially hurt our rankings, should I be worried, and is there anything else I can do?
-
If you have any sort of caching installed, you could try refreshing it and resubmitting the sitemap.
I checked your robots.txt file with http://tool.motoricerca.info/robots-checker.phtml and it flagged the Allow line. I don't think that would cause a problem, but you could try removing the "Allow: /" line and see if that helps.
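If you want a quick offline sanity check of that Allow line, Python's built-in robotparser will tell you whether a given user-agent can fetch a URL. A minimal sketch, using a hypothetical reconstruction of the file (the live copy may differ):

```python
from urllib import robotparser

# Hypothetical reconstruction of the robots.txt in question;
# the live file may differ.
rules = """User-agent: *
Allow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# A bare "Allow: /" is redundant (everything is allowed by default)
# but harmless -- the homepage is still crawlable to Googlebot:
print(parser.can_fetch("Googlebot", "http://www.posnation.com/"))  # True
```

So removing the line shouldn't change what crawlers are allowed to do; it just quiets the checker's warning.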
-
Hey Nick, thanks for your response. I did the first part, but when I resubmit the sitemap.xml it won't take, due to this error:
URL restricted by robots.txt
But my sitemap file is here: http://posnation.com/sitemap.xml
and robots.txt is not blocking it. Any ideas on what to do next?
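That error is likely coming from Google's cached copy of your robots.txt rather than the live file; Google can keep serving a cached copy for up to about a day. A sketch of the mismatch, using a hypothetical reconstruction of what the stale copy probably looked like during the outage:

```python
from urllib import robotparser

# Hypothetical reconstruction of Google's stale cached copy from the
# outage -- the live file no longer looks like this.
stale_rules = """User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(stale_rules.splitlines())

# Against the stale rules the sitemap URL is blocked, which would
# produce exactly the "URL restricted by robots.txt" error:
print(parser.can_fetch("Googlebot", "http://posnation.com/sitemap.xml"))  # False
```

If that's what's happening, the error should clear on its own once Google re-fetches the live robots.txt.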
-
No, you're OK. It used to be that if your site went down for even a few hours and the spiders came around, you could get deindexed. Now they seem to understand that stuff happens, and thankfully you have a pretty long grace period before you get deindexed.
Good suggestions by Nick. You can also increase the Googlebot crawl rate on your site in GWMT to get Google to come around again quicker.
-
If it was just blocked overnight you should be OK. Sites do go down for extended periods of time occasionally, and I would assume Google won't de-index based on a relatively short outage.
To be safe, or at least to feel like you have done what you can, resubmit your XML sitemap in Webmaster Tools. Also go to the "Fetch as Googlebot" section and fetch your home page. Once it is fetched, click the submit link and tell it to submit the page and all linked pages. You are probably OK without doing that, but it couldn't hurt to resubmit.
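You can also verify from your end that the page is reachable under Googlebot's user-agent, which is roughly what Fetch as Googlebot checks. A rough sketch with the standard library (the fetch itself is commented out since it needs network access; the UA string is the one Google documents for Googlebot):

```python
import urllib.request

# The user-agent string Google documents for Googlebot:
GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

# Build a request that identifies itself as Googlebot:
req = urllib.request.Request("http://www.posnation.com/",
                             headers={"User-Agent": GOOGLEBOT_UA})
print(req.get_header("User-agent"))

# Actually performing the fetch requires network access:
# with urllib.request.urlopen(req, timeout=10) as resp:
#     assert resp.status == 200  # the page is reachable to the crawler
```

A 200 on that fetch (plus a clean robots.txt) means there's nothing on your end still blocking the crawler.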