Recovering an old disavow file?
-
Hi guys,
We had an SEO agency do a disavow request on one of our sites a while back. They have no trace of the disavow .txt file or the links they disavowed.
Does anyone know if there is a way to recover this file in Google Webmaster Tools, or any way to find out which links were disavowed?
Cheers.
-
Have you performed any more disavow processes since? If you have not, simply log in to Search Console and head to the disavow section:
https://www.google.com/webmasters/tools/disavow-links
If you have not added any more, simply click the profile you are working with, and it should open a box with a link to your most recently added file. See my linked image, which shows the pop-up. You can then download the text file that you/they added.
Hope that helps.
If it is an older file, I would suggest talking to Google about seeing previous versions of the .txt file, that is, if they hold onto them.
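For anyone trying to reconstruct the file from scratch, it is worth knowing that a disavow file is just plain text in Google's format: one URL or domain per line, with lines starting with # treated as comments. A recovered file will look something like this (the domains below are placeholders, not from the original upload):

# Links disavowed by the agency
domain:spammy-directory.example
domain:link-farm.example
http://low-quality-blog.example/page-linking-to-us.html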
Related Questions
-
Google Answer Box For Old Forum Discussions
Thinking about taking some pages that are one-question discussion forum pages and attempting to gain an answer box in Google SERPs. Just like here on Moz, each page has one question and often many, many answers. This would mean making an unordered list with the question and the components of the answer in a box, maybe with a JPG. The info in the box would be kind of a summary of key points in answering the question. I'm looking for suggestions on:
a) Is this a good idea?
b) What might help achieve the intended search result?
c) Does it matter that the box answers are summarized from a variety of folks' forum posts, so not just one author?
d) Might it drive up bounce rate on the actual page?
e) Other considerations?
These are pages that have been around for a while and aren't currently active discussions. Thanks!
Intermediate & Advanced SEO | 945010
-
Should you delete old blog posts for SEO purposes?
Hey all, When I run crawl diagnostics I get around 500 medium-priority issues. The majority of these (95%) come from issues with blog pages (duplicate titles, missing meta descriptions, etc.). Many of these pages are posts listing contest winners and/or generic announcements (like "we'll be out of the office tomorrow"). I have gone through and started to fix these, but as I was doing so I had the thought: what is the point of updating pages that are completely worthless to new members (like a page listing winners in 2011, in which case I just slap a date into the title)? My question is: should I just bite the bullet and fix all of these, or should I delete the ones that are no longer relevant? Thanks in advance, Roman
Intermediate & Advanced SEO | Dynata_panel_marketing1
-
Redirects in .htaccess file
I am using the Yoast SEO plugin for WordPress. I have made several link changes and placed approximately 80-100 301 redirects into the .htaccess file. Is this good practice? Do they have to be there forever? Isn't it slowing down the site? Or should this be done in a different way? Thank you very much, Iris
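For reference, individual 301 rules in .htaccess are usually written like the sketch below (the paths and domain are placeholders, not the site's real URLs). A list of 80-100 simple rules is rarely a noticeable performance problem, although redirects generally need to stay in place for as long as links to the old URLs exist.

# One-to-one redirects handled by mod_alias
Redirect 301 /old-services-page/ https://www.example.com/new-services-page/
Redirect 301 /old-contact/ https://www.example.com/contact/

# Equivalent pattern-based rule using mod_rewrite (if RewriteEngine is already on)
RewriteRule ^old-blog/(.*)$ https://www.example.com/blog/$1 [R=301,L]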
Intermediate & Advanced SEO | Rebeca10
-
Issue with Robots.txt file blocking meta description
Hi, Can you please tell me why the following error is showing up in the SERPs for a website that was just re-launched 7 days ago with new pages (301 redirects are built in)? "A description for this result is not available because of this site's robots.txt – learn more." Once we noticed it yesterday, we made some changes to the file and reduced the number of items in the disallow list. Here is the current robots.txt file:

# XML Sitemap & Google News Feeds version 4.2 - http://status301.net/wordpress-plugins/xml-sitemap-feed/
Sitemap: http://www.website.com/sitemap.xml
Sitemap: http://www.website.com/sitemap-news.xml
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/

Other notes... the site was developed in WordPress and uses the following plugins: WooCommerce, All-in-One SEO Pack, Google Analytics for WordPress, and XML Sitemap & Google News Feeds. Currently, in the SERPs, it keeps jumping back and forth between showing the meta description for the www domain and showing the error message (above). Originally, WP Super Cache was installed and has since been deactivated, removed from wp-config.php and deleted permanently. One other thing to note: we noticed yesterday that there was an old XML sitemap still on file, which we have since removed, and we resubmitted a new one via WMT. Also, the old pages are still showing up in the SERPs. Could it just be that this will take time to review the new sitemap and re-index the new site? If so, what kind of timeframes are you seeing these days for the new pages to show up in SERPs? Days, weeks? Thanks, Erin
Intermediate & Advanced SEO | HiddenPeak0
-
Disallow my store in robots.txt?
Should I disallow my store directory in robots.txt? Here is the URL: https://www.stdtime.com/store/
Here are my reasons for suggesting this:
SEOmoz finds crawl "errors" in there that I don't care about
I don't think I care if the search engines index those pages
I only have one product, and it is not an impulse buy
My product has a 60-day sales cycle, so price is less important than features
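If blocking it turns out to be the right call, the rule itself is tiny; a minimal sketch using the /store/ path from the question (keep in mind Disallow only stops crawling, so pages that are already indexed or linked from elsewhere can still appear in results without a snippet):

User-agent: *
Disallow: /store/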
Intermediate & Advanced SEO | raywhite0
-
Hit by Penguin: can I move the content from the old site to a new domain and start again with the same content, which is high quality?
I need some advice please. My website got the "unnatural links detected" message and was hit by Penguin... hard. Can I move the content from the current domain to a new domain and start again, or does the content need to be redone as well? I will obviously turn off the old domain once it's moved. The other option is to try to identify the bad links and change my anchor profile, which is a hit-and-miss task in my opinion. Would it not be easier just to identify the good links pointing to the old domain and get those changed to point to the new domain with better anchors? Thanks, Warren
Intermediate & Advanced SEO | warren0071
-
301 redirects from old to new pages with a lot of changes
Hello all, We are going to restyle and change CMS, so all the URLs will change. We are also updating content, adding much more content to the old pages, trying to be more user and SEO friendly. My doubt is about doing 301 redirects from old to new pages when the content has changed a lot. Will the redirect maintain the ranking of the page, or will crawlers think it is a totally different page? For example: one new page will differ from the old one in URL, title, headers, meta description, content text and images. Should I keep the old content, do the CMS change with the 301 redirects, and change the content later (which means a lot of work), or do it all at once? Thanks in advance, Tomas
Intermediate & Advanced SEO | tomas.guemes0
-
Using 2 wildcards in the robots.txt file
I have a URL string which I don't want to be indexed. It includes the characters _Q1 in the middle of the string. So in the robots.txt, can I use 2 wildcards in the string to take out all of the URLs with that in it? So something like /_Q1. Will that pick up and block every URL with those characters in the string? Also, this is not directly off the root, but in a secondary directory, so .com/.../_Q1. So do I have to format the robots.txt as //_Q1* since it will be in the second folder, or will just using /_Q1 pick up everything no matter what folder it is in? Thanks.
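For what it is worth, Googlebot treats * in a robots.txt rule as "any sequence of characters" and applies rules as prefix matches, so a single wildcard ahead of the pattern normally covers every folder depth. A minimal sketch using the _Q1 string from the question:

User-agent: *
Disallow: /*_Q1

Because the rule already matches anything that follows, a trailing * (e.g. /*_Q1*) is redundant, and the same rule works whether the string sits directly off the root or in a deeper directory. Note also that robots.txt controls crawling rather than indexing, so URLs that are already indexed or linked externally can still appear in results.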
Intermediate & Advanced SEO | seo1234560