Recovering an old disavow file?
-
Hi guys,
We had an SEO agency submit a disavow request on one of our sites a while back. They have no trace of the disavow .txt file or of the links they disavowed.
Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find which links were disavowed?
Cheers.
-
Have you performed any more disavow processes since? If you have not, simply log in to your Search Console and head to the disavow section.
https://www.google.com/webmasters/tools/disavow-links
If you have not added any more, simply click the profile you are working with, and it should open up a box with a link to your most recently added file. See my linked image, which shows the pop-up. You can then download the text file that you/they added.
Hope that helps.
If it is an older file, I would suggest talking to Google about seeing previous versions of the .txt file, that is, if they hold onto them.
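For reference, the disavow file is just a plain-text list in Google's documented format, so once recovered it should look something like this (the domains and URL below are placeholders, not from this thread):

```
# Links disavowed after the agency's cleanup (comment lines start with #)
domain:spammy-directory.example
domain:low-quality-links.example
http://another-site.example/forum/thread?id=123
```

Whatever you download from the disavow tool is the version Google currently has on record, so it is the authoritative copy even if the agency's own records are gone.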
Related Questions
-
Any Tips for Reviving Old Websites?
Hi, I have a series of websites that have been offline for seven years. Do you guys have any tips that might help restore them to their former SERPs glory? Nothing about the sites themselves has changed since they went offline: same domains, same content, only a different server. What has changed is the SERPs landscape. I've noticed that competitive terms these sites used to rank on the first page for now return far more results. I have also noticed that some terms return what look like thesaurus-style, similar-language results from traditionally more authoritative websites instead of the exact phrase searched for. This concerns me because I could see a less relevant page outranking me just because it is on a .gov domain with similar vocabulary, even though it is not what people searching for the term are most likely looking for. The sites have also lost numerous backlinks but still have some really good ones.
Intermediate & Advanced SEO | CopBlaster.com
Found a cache of old domain names, should I link or 301 redirect?
We have found a cache of about 10 URLs, some of which are ranking above our main URL in Google SERPs. What is the best course of action here?
a. Redirect all to the homepage?
b. Link all domains to the homepage?
c. Link all domains to select pages on the main site, being careful not to spam anchor text?
d. 301 redirect all to the main site?
Is there any disadvantage to your recommendation? Is there likely to be a penalty incurred? I feel like we'll get the strongest increase in rankings by following option c, but it feels like option d may be safer. Thanks in advance for your help!
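If the 301 route (option a or d) is chosen, the redirect is usually set at the web server for each old domain. A minimal sketch, assuming Apache with mod_rewrite enabled and placeholder domain names:

```
# .htaccess on the old domain - old-domain.example and www.example.com are placeholders
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.example$ [NC]
RewriteRule ^(.*)$ https://www.example.com/ [R=301,L]
# Use https://www.example.com/$1 as the target instead if equivalent pages exist on the main site
```

The real decision is the target: everything to the homepage, or old paths mapped to equivalent pages where they exist; the rule itself is the easy part.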
Intermediate & Advanced SEO | moconn
72KB CSS code directly in the page header (not in external CSS file). Done for faster "above the fold" loading. Any problem with this?
To optimize for Google's PageSpeed recommendations, our developer has moved the 72KB of CSS code directly into the page header (not in an external CSS file). This way the above-the-fold loading time was reduced. But could this affect indexing of the page or have any other negative side effects on rankings? I made a quick test and the Google cache seems to have our full pages cached, but might it somehow hurt our rankings, or cause Google to index fewer of our pages? (We already have some problems with Google ignoring about 30% of the pages in our sitemap.)
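For context, a widely used alternative to inlining the full 72KB is to inline only the critical above-the-fold rules and load the rest of the stylesheet without blocking rendering. A minimal sketch, with placeholder selectors and file names:

```
<head>
  <!-- Inline only the critical above-the-fold rules (placeholder styles) -->
  <style>
    header, .hero { margin: 0; font-family: sans-serif; }
  </style>
  <!-- Load the full stylesheet asynchronously so it does not block rendering -->
  <link rel="preload" href="/css/main.css" as="style"
        onload="this.onload=null;this.rel='stylesheet'">
  <noscript><link rel="stylesheet" href="/css/main.css"></noscript>
</head>
```

This keeps the page weight per HTML response down while preserving the fast first paint the developer was after.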
Intermediate & Advanced SEO | lcourse
301 redirects in WordPress vs. making old posts that you no longer want your audience to see private.
I'm working in WordPress at the moment, changing the content of a page on my website. The page has a lot of educational information and each section is unique. I had to go through and edit each section in Google Docs, and now I'm posting all the new pages and making the old pages private in WordPress. Is this a good idea? I'm worried Google will still crawl my private education pages and think these are duplicates, since the new pages somewhat resemble the old. Also, should I be 301 redirecting all the old education pages to the corresponding new ones even if they are private in WordPress? I understand that the 301 redirect should only be used if you want the old page to go to a new one. What I don't understand is whether this will still be relevant or work if I've already made the old page private in WordPress. Thank you!
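A per-URL 301 from each old page to its new counterpart still works even if the old post is private, because a server-level redirect answers before WordPress serves any content. Many people add these with a redirect plugin; at the server level it is one rule per URL. A minimal sketch, assuming Apache and placeholder slugs:

```
# .htaccess - the old and new slugs below are placeholders
Redirect 301 /old-education-page/ https://www.example.com/new-education-page/
Redirect 301 /old-lesson-two/ https://www.example.com/new-lesson-two/
```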
Intermediate & Advanced SEO | SapphireCo
Moving Entire Domain to New Site with New File Extensions
I have been looking for a while for a good and clear step-by-step guide for moving a site from an old domain to a new one... so I guess a good discussion here could help many webmasters have a smooth transition. So in your opinion, besides the obvious, what are the most important steps you must take? Here is what I do:
1. 301 the old site to the new one and TEST.
2. Check internal links - double-check for 404s.
3. Update your social profiles with the new URL.
4. Let GWT and BWT know of the change and request a crawl.
5. Contact as many webmasters as you possibly can to point their links to your new domain.
What's missing? What have you found helpful and/or effective?
Intermediate & Advanced SEO | dhidalgo1
Could you use a robots.txt file to disallow a duplicate content page from being crawled?
A website has duplicate content pages to make it easier for users to find the information from a couple of spots in the site navigation. The site owner would like to keep it this way without hurting SEO. I've thought of using the robots.txt file to disallow search engines from crawling one of the pages. Would you think this is a workable/acceptable solution?
Intermediate & Advanced SEO | gregelwell
Large volume of ning files in subdomain - hurting or helping?
I have a client that has 600 pages in their root domain and a subdomain that contains 7,500 pages of un-SEO-able Ning pages, plus another 650 pages from Sched.com that are also contributing to a large volume of errors. My question is: should I create a new domain for the Ning content, or am I better off with the volume of pages, even if they have loads of errors? Thanks!
Intermediate & Advanced SEO | robertdonnell
How to Disallow Specific Folders and Sub Folders for Crawling?
Today, I checked the indexing of my website in Google and found a very interesting result. You can check that result by following this link: Google Search Result. I am aware of the robots.txt file and could disallow the images folder to solve this issue, but that may block my images from appearing in Google Image Search. So, how can I fix this issue?
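One way to read the question: robots.txt can disallow specific folders and subfolders individually, so the images folder does not have to be blocked at all. A minimal sketch, with placeholder folder names (the real paths would come from the indexing result linked above):

```
# robots.txt - folder names below are placeholders
User-agent: *
Disallow: /private-folder/
Disallow: /duplicate-section/old-subfolder/
# No Disallow for /images/, so images stay crawlable for Google Image Search
```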
Intermediate & Advanced SEO | CommercePundit