Recovering an old disavow file?
-
Hi guys,
We had an SEO agency submit a disavow request on one of our sites a while back. They have no record of the disavow .txt file or of the links they disavowed.
Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find out which links were disavowed?
Cheers.
-
Have you performed any more disavow submissions since? If you have not, simply log in to your Search Console and head to the disavow section.
https://www.google.com/webmasters/tools/disavow-links
If you have not added any more, simply click the profile you are working with, and it should open up a box with a link to your most recently added file. See my linked image, which shows the pop-up. You can then download the text file that you/they added.
Hope that helps.
If it is an older file, I would suggest talking to Google about seeing previous versions of the .txt file - that is, if they hold onto them.
Related Questions
-
I still see the old page in the index
Hello, I set up a redirect and still see my old page in Google's index after 3 weeks. My new page is there as well. Is it normal that the old page hasn't been dropped from the index yet? Thank you,
Intermediate & Advanced SEO | seoanalytics
-
Old Website Build Affecting SEO
So this is a bit of a strange one. My latest website was built on a different domain, then transferred over (as opposed to being built on a subdomain). I was told that the domain my site was built on wasn't indexed by Google, but looking at Google Search Console I can see that the old domain is showing up as the most-linked-to domain of my current site - meaning it was indexed. The old domain (and all of its pages) does have a 301 redirect to the new website's home page (as opposed to the individual pages), but could this be causing me a problem with SEO? Additionally, my website has a sister site (UK and US versions), and both link to each other in the footer (which appears on every page). Could this be pulling my SEO efforts down if it is a do-follow link?
Intermediate & Advanced SEO | moon-boots
-
Disallow duplicate URLs?
Hi community, thanks for answering my question. I have a problem with a website. My website is: http://example.examples.com/brand/brand1 (good URL), but I have two filters to show something, and these generate two more URLs: http://example.examples.com/brand/brand1?show=true (if we apply one filter) and http://example.examples.com/brand/brand1?show=false (if we apply the other filter). My question is: should I add a disallow rule for these filters to robots.txt, like this? Disallow: /*?show=*
Intermediate & Advanced SEO | thekiller99
-
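As a quick sanity check on the question above, here is a minimal sketch of how Google-style wildcard matching treats that rule, assuming the documented semantics where * matches any run of characters (the function name and the use of a regex translation are illustrative, not how Googlebot is actually implemented):

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    # Translate the robots.txt path pattern into a regex:
    # '*' becomes '.*', every other character is matched literally,
    # and the match is anchored at the start of the path.
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in pattern)
    return re.match(regex, path) is not None

for path in ("/brand/brand1",
             "/brand/brand1?show=true",
             "/brand/brand1?show=false"):
    blocked = robots_pattern_matches("/*?show=*", path)
    print(path, "blocked" if blocked else "allowed")
```

Under those assumptions, the rule would block both filtered variants while leaving the clean brand URL crawlable, which is the behaviour the poster is after.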
Website still not recovered from Panda #20 (Sep 2012 update)
Hi everyone, My website was hit by Panda around the 27th of September 2012 (Panda #20 or the EMD update); since then, it no longer appears in Google search results for a particular keyword [wallpapers], resulting in a massive, sudden traffic drop (-90%) (see the screenshot below). Despite my best efforts auditing my links, identifying unnatural backlinks, disavowing bad links, enhancing my website content, and improving user experience (I even ended up with a completely revamped website: new design, new structure and new content), I haven't seen any improvement! Can you please take a look and advise me? I am ready to give up; I am in deep despair. What are my competitors doing better than me? Competitor #1 Competitor #2. Thank you in advance - I appreciate your time. My website: http://goo.gl/maaxaz
Intermediate & Advanced SEO | Spinodza
-
Is our robots.txt file correct?
Could you please review our robots.txt file and let me know if this is correct. www.faithology.com/robots.txt Thank you!
Intermediate & Advanced SEO | BMPIRE
-
If I disallow an unfriendly URL via robots.txt, will its friendly counterpart still be indexed?
Our not-so-lovely CMS loves to render pages regardless of the URL structure, just as long as the page name itself is correct. For example, it will render the following as the same page:
example.com/123.html
example.com/dumb/123.html
example.com/really/dumb/duplicative/URL/123.html
To help combat this, we are creating mod rewrites with friendly URLs, so all of the above would simply render as example.com/123. I understand robots.txt respects the wildcard (*), so I was considering adding this to our robots.txt:
Disallow: */123.html
If I move forward, will this block all of the potential permutations of the directories preceding 123.html yet not block our friendly example.com/123? Oh, and yes, we do use the canonical tag religiously - we're just mucking with the robots.txt as an added safety net.
Intermediate & Advanced SEO | mrwestern
-
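On the mod-rewrite side of the question above, one way the permutations could be collapsed to a single friendly URL is a redirect rule along these lines - a hypothetical .htaccess sketch, not the poster's actual configuration, and it assumes page names are purely numeric as in the examples:

```apache
# Hypothetical .htaccess sketch: send any /.../123.html permutation
# to the friendly /123 with a 301, so only one form can be indexed.
RewriteEngine On
RewriteRule ^(?:.*/)?(\d+)\.html$ /$1 [R=301,L]
```

With a rule like this in place, the robots.txt wildcard becomes the safety net the poster describes rather than the primary fix.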
Is there a way to keep sitemap.xml files from getting indexed?
Wow, I should know the answer to this question. Sitemap.xml files have to be accessible to the bots for crawling, so they can't be disallowed in robots.txt, and you can't block the folder at the server level. So how can you allow the bots to crawl these XML files but keep them from showing up in Google's index when doing a site: search, or is that even possible? Hmmm
Intermediate & Advanced SEO | irvingw
-
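One approach commonly suggested for the sitemap question above is to leave the file fully crawlable but send a noindex signal in the HTTP response header, which robots.txt cannot do. A hypothetical Apache sketch (it assumes mod_headers is enabled and that the sitemap filenames start with "sitemap"):

```apache
# Hypothetical Apache sketch: sitemaps stay crawlable, but the
# X-Robots-Tag header asks engines not to show them in results.
<FilesMatch "^sitemap.*\.xml$">
  Header set X-Robots-Tag "noindex"
</FilesMatch>
```

Because the header travels with the response rather than blocking the fetch, bots can still read the sitemap while it is kept out of site: results.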
Launching a new site with old, new and updated content: What’s best practice?
Hi all, We are launching a new site soon and I’d like your opinion on best practice related to its content. We will be retaining some pages and content (although the URLs might change a bit, as I intend to replace underscores with hyphens and remove .asp from some extensions in order to standardise a currently uneven URL structure). I will also be adding a lot of new pages with new content, along with amending some pages and their content (and amending URLs again if need be), and a few pages are going to be done away with altogether. Any advice from those who’ve done the same in the past as to how best to proceed? Does the URL rewriting sound OK to do in conjunction with adding and amending content? Cheers, Dave
Intermediate & Advanced SEO | Martin_S
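The URL standardisation described in the question above (underscores to hyphens, dropping .asp) can be sketched as a small helper for building the 301 redirect map before launch - the function name, example paths, and overall approach are illustrative assumptions, not the poster's actual setup:

```python
import re

def normalize_path(path: str) -> str:
    """Standardise a legacy URL path: drop a trailing .asp
    extension and replace underscores with hyphens."""
    path = re.sub(r"\.asp$", "", path, flags=re.IGNORECASE)
    return path.replace("_", "-")

# Build a redirect map, skipping paths that are already clean:
old_paths = ["/our_services.asp", "/about_us.asp", "/contact"]
redirects = {old: normalize_path(old) for old in old_paths
             if normalize_path(old) != old}
print(redirects)
```

Generating the full map up front makes it easy to verify every retained page gets exactly one redirect when the new structure goes live.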