Can blocked URL parameters still be crawled and indexed by Google?
-
Hi guys,
I have two questions, and one might be a dumb question, but here it goes. I just want to be sure that I understand:
If I tell Webmaster Tools to ignore a URL parameter, will Google still index and rank my URL?
Is it OK if I don't append the brand filter to the URL structure? Will I still rank for that brand?
Thanks,
PS: ok 3 questions :)...
-
If you want to permanently remove URLs from the index, this is the basic process:
Have your developer implement a NoIndex, Follow robots meta tag on all pages that have the URL parameter you want removed. For example, if the URL contains categoryFilter= (like above), then add the NoIndex, Follow tag to the <head> of the page. Do this for all URL parameters you want removed from the index.
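As a sketch, the tag that step refers to (reusing the hypothetical categoryFilter= example) sits in the <head> of each parametered page:

```html
<!-- In the <head> of every page whose URL contains categoryFilter= -->
<!-- noindex removes the page from the index; follow still lets link equity flow -->
<meta name="robots" content="noindex, follow">
```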
Make sure Google is allowed to crawl those pages. If they are blocked by robots.txt, or if Google has been told not to crawl them via Google Webmaster Tools, Google will not be able to see the newly implemented NoIndex, Follow tag.
Then, give it some time and wait. It may take Google a long time to crawl all of these parametered URLs again. Falling out of the index may be slow.
Once the URLs are gone, consider blocking them from being crawled via robots.txt or GWT parameter handling.
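For that last step, a minimal robots.txt sketch, assuming the same hypothetical categoryFilter= parameter (Google supports the * wildcard in Disallow paths):

```
# Add only AFTER the parametered URLs have dropped out of the index,
# otherwise Google can no longer see the noindex tag
User-agent: *
Disallow: /*categoryFilter=
```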
-
Hi Anthony,
What if we are trying to permanently remove e-commerce website URLs that have multiple parameters from the Google index? How would we apply noindex to all these URLs with parameters?
The aim is to recrawl and rebuild the index of the whole website using appropriate robots directives, canonicals, and meta tags, rather than using GWT.
Many thanks
-
Parameter handling in Google Webmaster Tools won't get a URL out of the index if it is already indexed.
You need to use the NoIndex robots meta tag in the <head> of your page. Once you add this tag, be sure you are allowing Google to crawl the page. Make sure it is not blocked via robots.txt or with parameter handling.
Once the pages have left the index, you can block them from being crawled.
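If editing the page templates for every parameter combination is impractical on a large e-commerce site, one alternative is to send the same directive as an HTTP response header instead of a meta tag. This is only a sketch, assuming Apache 2.4+ with mod_headers enabled and the hypothetical categoryFilter= parameter from above:

```apacheconf
# Send a noindex header for any URL whose query string contains the filter parameter
<If "%{QUERY_STRING} =~ /categoryFilter=/">
    Header set X-Robots-Tag "noindex, follow"
</If>
```

As with the meta tag, the URLs must remain crawlable until they fall out of the index, or Google never sees the header.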
-
If you don't want a page or URL crawled, you should use the robots.txt file and robots meta tags. Then, in WMT, make sure those same pages are actually not being crawled.
Hope that answers your question.
Related Questions
-
Unsolved Google Search Console Still Reporting Errors After Fixes
Hello, I'm working on a website that was too bloated with content. We deleted many pages and set up redirects to newer pages. We also resolved an unreasonable number of 400 errors on the site. I also removed several ancient sitemaps that listed content deleted years ago that Google was crawling. According to Moz and Screaming Frog, these errors have been resolved. We've submitted the fixes for validation in GSC, but the validation repeatedly fails. What could be going on here? How can we resolve these errors in GSC?
Technical SEO | tif-swedensky
-
Google indexing is slowing down?
I have up to 20 million unique pages, and so far I've only submitted about 30k of them in my sitemap. We had a few load-related errors during Google's initial visits, and it thought some pages were duplicates, but we fixed all that. We haven't gotten a crawl-related error for 2 weeks now. Google appears to be indexing fewer and fewer URLs every time it visits. Any ideas why? I am not sure how to get all our pages indexed if it's going to operate like this... would love some help, thanks!
Technical SEO | RyanTheMoz
-
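With up to 20 million unique pages, a single sitemap file cannot hold everything: the sitemap protocol caps each file at 50,000 URLs (or 50 MB uncompressed), so large sites submit a sitemap index that points at many child sitemaps. A minimal sketch, with hypothetical filenames:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap><loc>https://example.com/sitemaps/pages-0001.xml</loc></sitemap>
  <sitemap><loc>https://example.com/sitemaps/pages-0002.xml</loc></sitemap>
  <!-- ...one entry per child sitemap, each child holding up to 50,000 URLs -->
</sitemapindex>
```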
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such we fire POST requests back to our backends every few seconds when enough user-initiated actions have happened (think about scrolling, for example). In order to eliminate bots from distorting statistics we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests in its JavaScript crawling. In a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyhow, and it is quite a number, we considered reducing this for bots. Though, we had several questions about this:
1. Do these requests count towards crawl budgets?
2. If they do, and we'd want to prevent this from happening: what would be the preferred option? Either preventing the request in the frontend code, or blocking the request using a robots.txt line? The latter question is given by the fact that an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking. The latter is slightly less convenient from a development perspective, as all logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers. However these requests do not change anything in the pages' behaviour, and purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
-
Google Indexing Pages with Made Up URL
Hi all, Google is indexing a URL on my site that doesn't exist, and never existed in the past. The URL is completely made up. Does anyone know why this is happening and, more importantly, how to get rid of it? Thanks 🙂
Technical SEO | brian-madden
-
Google Indexing of Site Map
We recently launched a new site - on June 4th we submitted our sitemap to Google and almost instantly had all 25,000 URLs crawled (yay!). On June 18th, we made some updates to the title and description tags for the majority of pages on our site and added new content to our home page, so we submitted a new sitemap. So far the results have been underwhelming and Google has indexed a very low number of the updated pages. As a result, only a handful of the new titles and descriptions are showing up on the SERP pages. Any ideas as to why this might be? What are the tricks to having Google re-index all of the URLs in a sitemap?
Technical SEO | Emily_A
-
How long after google crawl do you need 301 redirects
We have just added 301s after we moved our site. Google has done a crawl and spat back a few errors. How long do I need to keep those 301s in place? I may need to change some. Thanks
Technical SEO | Paul_MC
-
Slash at end of URL causing Google crawler problems
Hello, We are having some problems with a few of our pages being crawled by Google, and it looks like the slash at the end of the URL is causing the problem. Would appreciate any pointers on this. We have a redirect in place that redirects the "no slash" URL to the "slash" URL for all pages. The obvious solution would be to try turning this off; however, we're unable to figure out where this redirect is coming from. There doesn't appear to be an instruction in our .htaccess file doing this, and we've also tried using "DirectorySlash Off" in the .htaccess file, but that doesn't work either. (If it makes a difference, it is a 302 redirect doing this, not a 301.) If we can't get the above to work, then the other solution would be to somehow reconfigure the page so that it is recognizable with the slash at the end by Google. However, we're not sure how this would be done. I think the quickest solution would be to turn off the "add slash" redirect. Any ideas on where this command might be hiding, and how to turn it off, would be greatly appreciated. Or any tips from people who have had similar crawl problems with Google and any workarounds would be great! Thanks!
Technical SEO | onetwentysix
-
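One thing worth grepping for is a catch-all rewrite rule that appends the slash. A hedged sketch of what such a rule often looks like in an .htaccess or vhost config (the 302 mentioned in the question hints at a rule like this rather than Apache's DirectorySlash, which issues 301s):

```apacheconf
# A typical "append trailing slash" rule to search for;
# [R] without a status code defaults to 302, matching the behaviour described
RewriteEngine On
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_URI} !/$
RewriteRule ^(.*[^/])$ /$1/ [R,L]
```

If no such rule exists in the server config, the redirect may be coming from the CMS or application code, or from a proxy or CDN in front of the web server.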
How to remove a sub domain from Google Index!
Hello, I have a website with many subdomains that carry the same copy of the content. I think it's harming my SEO for that site, since the abc and xyz subdomains have the same content. I have already deleted the DNS records for the subdomains in question; now how do I get those pages removed from the Google index as well? The DNS records no longer exist for those subdomains.
Technical SEO | anand2010