Recovering from a Programmer's Error
-
Hey Everybody!
Last year one of my bigger sites hit a snafu. I was getting 300k+ hits a day from Google, and then a developer released an update with a robots.txt file that basically blocked Google from the entire site.
We didn't notice the bug until a few days later, but by then it was already too late. My Google traffic dropped to 30k a day, and I've had the hardest time coming back ever since.
As a matter of fact, hundreds of sites that were aggregating my content started outranking me for my own terms.
For over a year I've been working on rebuilding what I lost, and everything seemed to be coming together. I was back at 100k+ hits a day.
Until today... my developers repeated the exact same error as last year. They blocked Google from crawling my site for over five days, and now I'm down to 10k search hits a day.
My question: Has anyone encountered this problem before, and what did you do to come back?
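For anyone who wants to catch this kind of thing quickly: Python's standard-library urllib.robotparser can tell you whether Googlebot is blocked by a given robots.txt. A minimal sketch; the rules below are illustrative (a blanket block), not taken from the poster's actual file, and in practice you would fetch the live https://yoursite.com/robots.txt instead:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; in a real check you would fetch
# https://yoursite.com/robots.txt and pass its lines in.
rules = """\
User-agent: *
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# A blanket "Disallow: /" under "User-agent: *" blocks Googlebot everywhere.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))  # False
```

Running this against the live file after every deploy would have flagged the block the same day instead of days later.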
-
My Friend,
I was having exactly the same problem, and in the end my solution was to set the "Read-Only" attribute on this file at the operating-system level.
Hope it helps,
Claudio
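On Linux that looks something like the following. This is a sketch on a scratch copy in the current directory; in production you would point at the robots.txt in your actual web root, and note that a deploy script running as root (or one that resets permissions itself) can still get past a plain chmod:

```shell
# Work on a scratch copy here; point at your real robots.txt in production.
printf 'User-agent: *\nAllow: /\n' > robots.txt

# Strip all write bits so a routine deploy can't overwrite the file.
chmod 444 robots.txt
ls -l robots.txt    # shows -r--r--r--

# On ext2/3/4 filesystems you can go further and make it immutable
# (requires root, and must be undone with `chattr -i` before any edit):
#   chattr +i robots.txt
```

The immutable attribute is the stronger guard, since even root cannot modify the file until the flag is cleared.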
-
I don't have an answer for your traffic, but I've had similar experiences with developers. I ended up using Code Monitor from Pole Position at https://polepositionweb.com/roi/codemonitor/index.php. I had it watch the contents of the robots.txt file on the live site and all dev sites. Once a day it checks the file and emails me if anything changed, so the maximum lag before I'm notified that the devs have done something again is 24 hours.
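If you'd rather roll your own, the same idea fits in a few lines of Python run from a daily cron job. A sketch only: the fetch and the email alert are left as comments, and the state-file location is an assumption:

```python
import hashlib
from pathlib import Path

def robots_changed(content: str, state_file: Path) -> bool:
    """Record a hash of the robots.txt content and report whether it
    differs from the copy seen on the previous run."""
    digest = hashlib.sha256(content.encode("utf-8")).hexdigest()
    previous = state_file.read_text().strip() if state_file.exists() else None
    state_file.write_text(digest)
    # First run has nothing to compare against, so it never alerts.
    return previous is not None and previous != digest

# In the real cron job you would fetch the live file first, e.g.:
#   content = urllib.request.urlopen("https://yoursite.com/robots.txt").read().decode()
# and send yourself an email whenever robots_changed(content, ...) returns True.
```

Run daily you get the same 24-hour worst-case lag as the hosted monitor; run hourly from cron, the window shrinks accordingly.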
Related Questions
-
How can I recover from a Japanese keyword hack after a WordPress clean-up?
I'm facing a big issue on my two blogs, legitloaded and asirimp3. Months after I patched and cleaned up my WordPress install and plugins, I still see the hacked pages in Google search, even though I have asked for them to be deleted; the same characters and hacked material are still showing in Google. I don't know what to do, and I'm confused. I tried asking Google to update my pages, but I can't find the URLs in Search Console when checking on Google with: site:legitloaded.com tzj6XvF
Intermediate & Advanced SEO | frankbanny
-
WordPress tags error in Moz
Hi, We are getting an enormous number of missing-meta-description warnings, but only on our tag archives. When we post, we fill in a description and use the Yoast plugin (getting green lights). Now we're finding that each tag creates a separate URL with a missing description, even though the post itself has a full description. What is the best thing to do? 1. Block spiders from crawling tags? 2. Stop using tags? 3. What do you suggest? Thank you
Intermediate & Advanced SEO | WalterHalicki
-
New site causes massive drop in ranking; old site restored, how long to recover?
Hello, We launched an updated version of our site, mainly design changes and some functionality. Three days after the launch we vanished from the rankings; previous page-one results were now out of the top 100. We identified some of the issues with the new site and chose to restore the old, well-ranking site. My question is: how long might it take for the rankings to come back, if at all? The drop happened on the third day and the site was restored on the third day; we are now on day six. Using GWT, we have used Fetch as Google and resubmitted the sitemap. Any help would be gladly received. Thanks, James
Intermediate & Advanced SEO | JamesBryant
-
404 errors
Hi, we have plenty of 404 errors. We only deal with those of the highest priority (the ones with high page authority). We also have a lot of errors like this: http://www.weddingrings.com/www.yoy-search.com. Does it make sense to redirect those to the home page, or to leave them as 404 errors?
Intermediate & Advanced SEO | alexkatalkin
-
Duplicate errors from WordPress login redirects
I've got some duplicate issues showing up in Moz Analytics, caused by a Q&A plugin on a WordPress website that prompts the user to log in. There are a number of links like the one shown below, which lead to the login page: www.website.com/wp-login.php?redirect_to=http%3A%2F%2Fwww.website.com%question%2.... What's the best way to deal with this? -- Extra info: this only shows up in Moz Analytics; Google Webmaster Tools reports no duplicates. I'm guessing this may be down to the 'redirect_to' parameter being effective in grouping the URLs for Googlebot. Currently wp-login and the consequent redirects are 'noindex, follow'; I cannot see where this is generated in wp-login.php in order to change it to nofollow (if that would solve it).
Intermediate & Advanced SEO | GregDixson
-
How important is it to fix Server Errors?
I know it is important to fix server errors, but we are trying to figure out how important: after our last build we have 19,646 of them, and since Google only gives us 1,000 at a time, the fastest way to tell them we have fixed them all is to use the API, which will take time. We are trying to decide whether it is more important to fix all these errors right now, or to focus on other issues and fix these errors when we have time; they are mostly AJAX errors. Could this hurt our rankings? Any thoughts would be great!
Intermediate & Advanced SEO | DoRM
-
Duplicate Content Error because of passed through variables
Hi everyone... When getting our weekly crawl of our site from SEOMoz, we are getting errors for duplicate content. We generate pages dynamically based on variables we carry through the URLs, like:
http://www.example123.com/fun/life/1084.php
http://www.example123.com/fun/life/1084.php?top=true
i.e., ?top=true is the variable being passed through. We are a large site (approx. 7,000 pages), so obviously we are getting many of these duplicate-content errors in the SEOMoz report. Question: are the search engines also penalizing for duplicate content based on variables being passed through? Thanks!
Intermediate & Advanced SEO | CTSupp
-
Crawl errors in GWT!
I have been seeing a large number of access-denied and not-found crawl errors. I have since fixed the issues causing these errors; however, I am still seeing them in Webmaster Tools. At first I thought the data was outdated, but the data is tracked on a daily basis! Does anyone have experience with this? Does GWT really re-crawl all those pages/links every day to see if the errors still exist? Thanks in advance for any help/advice.
Intermediate & Advanced SEO | inhouseseo