Get a list of robots.txt-blocked URLs and tell Google to crawl and index them.
-
Some of my key pages got blocked by my robots.txt file. I have made the required changes to robots.txt, but how can I get the list of blocked URLs?
My Webmaster Tools page under Health > Blocked URLs shows only a count, not the blocked URLs themselves. My first question is: where can I fetch these blocked URLs, and how can I get them back into the search results?
One other interesting point: the blocked pages are still showing up in searches. The title appears fine, but the description shows "blocked by robots.txt".
I need urgent recommendations, as I do not want to see any further drop in my traffic.
-
Regarding "changing the lastmod of those pages to today":
How can I make these changes?
The latest news is that I resubmitted the sitemap, and there were no warnings this time.
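For reference, the lastmod and priority values being discussed live in the sitemap XML itself; editing them means editing the relevant `<url>` entries. A placeholder example (URL and date are hypothetical, with the date standing in for "today"):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Only the previously blocked pages get today's lastmod and priority 1.0 -->
  <url>
    <loc>https://www.example.com/previously-blocked-page/</loc>
    <lastmod>2013-01-15</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```

Entries for pages that were never blocked would keep their existing lastmod and priority values.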
-
I imagine that since you had a robots.txt error, you probably ended up closing a whole directory to bots that you wanted indexed. You can easily spot the directory and resubmit a sitemap to Google, changing the lastmod of those pages to today and the priority to 1, but only for those pages.
If you still receive warnings, it may be due to errors in your sitemap; you're probably including a directory you don't want. You can test this in GWT by entering a URL you want to keep in the index into the box at the bottom, and checking whether any URLs are being blocked by your robots.txt.
If you want, you can post your robots.txt here along with the URIs you want indexed, without revealing the domain, so they won't be public. Hope this helps.
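The GWT check described above can also be reproduced locally with Python's standard-library robots.txt parser, which is handy for testing a fixed file before uploading it. A minimal sketch; the rules and URLs below are hypothetical examples, not the poster's site:

```python
from urllib import robotparser

# Hypothetical robots.txt contents to test (replace with your own file's text).
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
"""


def blocked_urls(robots_txt: str, urls: list[str], agent: str = "Googlebot") -> list[str]:
    """Return the subset of urls that the given robots.txt rules disallow for agent."""
    parser = robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return [u for u in urls if not parser.can_fetch(agent, u)]


if __name__ == "__main__":
    pages = [
        "https://www.example.com/products/widget",
        "https://www.example.com/private/admin",
    ]
    # Only the /private/ URL should be reported as blocked under the rules above.
    print(blocked_urls(ROBOTS_TXT, pages))
```

Running this against the corrected robots.txt for each key URL should return an empty list; any URL that still appears is one Google will also refuse to crawl.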
-
OK, resubmitted it, but even with the updated file it gives a lot of errors: 20,016 warnings. I think it takes some time.
I have not added a noindex attribute in my header region; it was all a mess with the robots.txt file. Does that mean that, with the site still showing up in the SERPs, the ranking will probably stay the same, or has it been deranked?
-
Go into GWMT and resubmit your sitemap.xml files (with the URLs you want indexed) for recrawling, and Google will digest the sitemaps again. Instead of waiting for Googlebot to come around on its own, you are requesting that it come around. Also reference those new sitemap files in your robots.txt file.
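Referencing sitemaps from robots.txt is done with the `Sitemap` directive; a placeholder example (domain and file names are hypothetical):

```
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
Sitemap: https://www.example.com/sitemap-news.xml
```

The `Sitemap` lines are independent of the `User-agent` groups and can list any number of sitemap files.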
-
In Google Webmaster Tools, go to Health -> Fetch as Google. Then add the previously blocked URL and click Fetch. Once you've done that, refresh the page and click "Submit to index". That should get Google indexing those pages again.
Getting external links to your pages also helps get them crawled and indexed, so it may be worth submitting your pages to social bookmarking sites, or getting other types of backlinks to the previously blocked pages if possible.
-
Since you fixed your robots.txt file, you should be good to go. It will probably take a few days for Google to recrawl your site and update the index with the URLs it is now allowed to crawl.
Blocked URLs can still show up in SERPs if you haven't defined the noindex attribute in your <head> section.
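The noindex attribute mentioned here is a robots meta tag placed in the page's <head>; a minimal example:

```html
<head>
  <!-- Tells crawlers not to keep this page in the search index -->
  <meta name="robots" content="noindex">
</head>
```

Note that for the tag to take effect, the page must not be blocked by robots.txt: if crawlers cannot fetch the page, they never see the noindex instruction, which is why blocked pages can linger in the SERPs.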