Best way to handle indexed pages you don't want indexed
-
We've had a lot of pages indexed by Google that we didn't want indexed. They relate to an AJAX category filter module that works fine for front-end customers, but under the bonnet Google has been following all of the links.
I've put a rule in the robots.txt file to stop Google from following any dynamic pages (anything with a ?) as well as any AJAX pages, but the pages are still indexed on Google.
At the moment there are over 5,000 indexed pages that I don't want on there, and I'm worried it's causing issues with my rankings.
Would a redirect rule work, or could someone offer any advice?
-
Gavin, since you have added the noindex to the pages, the best way is to let Google crawl those pages, see the noindex, and remove them. The other option is to keep everything as is and request removal of these parameter pages via your Google Webmaster Console.
Option 1: you never know how long it will take.
Option 2: this should happen relatively fast.
I would therefore suggest keeping everything as is and doing a removal request.
-
Right... We think we've been able to get the noindex code into the dodgy pages. The only way we could think of doing it without breaking the user interface was to put this rule into the PHP:
if (!empty($_SERVER['HTTP_X_REQUESTED_WITH']) && strtolower($_SERVER['HTTP_X_REQUESTED_WITH']) == 'xmlhttprequest')
{
    // genuine AJAX request from the category filter - run the normal code
}
else
{
    // anything requesting the ?ajax= URL directly (e.g. Googlebot) gets a bare
    // page whose main job is to carry the noindex tag, roughly:
    echo '<html>';
    echo '<head>';
    echo '<meta name="robots" content="noindex" />';
    echo '</head>';
    echo '<body>';
    echo '404';
    echo '</body>';
    echo '</html>';
}
It's rendering OK for us on the front end, if anyone would like to test... I'm just hopeful it will work for Google?
http://www.outdoormegastore.co.uk/cycling/cycling-clothing/protective-clothing.html?ajax=1
One thing I am not sure about is how Google is going to revisit the said pages. I have put various rules into the robots.txt file, as well as URL parameter handling in Webmaster Tools, to prevent any future pages from being followed... Would these rules need to be removed?
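For reference, the robots.txt rules I mean look roughly like this (a simplified sketch of the pattern, not the exact file):
User-agent: *
# block any dynamic URL (anything with a query string)
Disallow: /*?
# block the AJAX filter URLs specifically
Disallow: /*ajax=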
-
The AJAX URLs are used by the site, though, right (for visitors)? If you 404 them, you may be breaking the functionality and not just impacting Google.
Another problem is that, if these pages are no longer crawlable and you add a page-level directive (whether it's a 404, 301, canonical, NOINDEX, etc.), Google won't process those new instructions, so the pages could get stuck in the index. If that's the case, it may actually be more effective to block the "ajax=" parameter with parameter handling in Google Webmaster Tools (there's a similar option in Bing).
If you know the crawl path is cut off and this isn't a recurring problem, that could be the fastest short-term solution. You do need to monitor it, though, as the pages can re-enter the index later.
-
Gavin, that's a more generic response. In this scenario, unless you can make a 404 happen, it won't work and therefore isn't applicable. Noindex and/or the canonical tag are the choices, and I would try to get those going if possible.
-
Thanks for all of the replies... My best option seems to be the meta noindex rule, but the pages that are getting indexed are just one long AJAX string with no access to the header area. I hope I have already 'prevented' Google from following the links in the future by adding the rules to robots.txt, but I'm now desperate to clean up (cure) the existing ones.
My next thought would be to put a rule in .htaccess and redirect anything with ajax in the URL to a 404 page?
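Something along these lines is what I'm thinking of (a rough, untested sketch: it returns a 410 'Gone' directly instead of redirecting to a 404 page, and tries to leave the filter's own XMLHttpRequest calls alone):
# send 410 (Gone) when a URL with ajax= in the query string is requested directly
RewriteEngine On
RewriteCond %{HTTP:X-Requested-With} !XMLHttpRequest [NC]
RewriteCond %{QUERY_STRING} (^|&)ajax= [NC]
RewriteRule ^ - [G]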
I'm worried that this may have even worse side effects on rankings, but it's based on this article that Google publishes: https://support.google.com/webmasters/bin/answer.py?hl=en&answer=59819
"To remove a page or image, you must do one of the following:
- Make sure the content is no longer live on the web. Requests for the page must return an HTTP 404 (not found) or 410 status code."
What would your thoughts be on this?
-
Definitely review George's comment as you need to figure out why they're being crawled. As Andrea said, any solution takes time, I'm sorry to say. Robots.txt is not a good solution for getting pages removed that are already indexed, especially in bulk. It's better at prevention than cure.
META NOINDEX can be effective, or you could rel=canonical these pages to the appropriate non-AJAX URL - I'm not sure exactly how the structure is set up. Those are probably the two fastest and most powerful approaches. Google parameter handling (in Webmaster Tools) is another option, but it's a bit unpredictable whether they honor it and how quickly.
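For what it's worth, the tags themselves are simple - something like the following, using your protective-clothing URL as an assumed example of the non-AJAX target:
<!-- option 1: keep the AJAX version out of the index entirely -->
<meta name="robots" content="noindex">
<!-- option 2: point the AJAX URL at its non-AJAX equivalent -->
<link rel="canonical" href="http://www.outdoormegastore.co.uk/cycling/cycling-clothing/protective-clothing.html">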
You can only do mass removal if everything is in a folder, if I recall. There's no way to bulk remove unless all of the pages are structurally under one root URL.
-
I'm not sure if you're aware or not, but I think I know why Google is indexing these pages.
Right now, you are outputting full URLs into the source code of your page in the form of a JavaScript function call similar to the following:
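It's roughly this shape (the numbers here are just placeholder slider values) - the important part is that the full, crawlable URL is embedded as a string in the page source:
// illustrative call - the finished URL sits right in the HTML for Googlebot to find
initSlider(0, 0, 500, 'http://www.outdoormegastore.co.uk/cycling/cycling-clothing/protective-clothing.html?ajax=1');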
I believe this is because your page (and this function call) is programmatically created. Instead of outputting the whole URL to the page, you could output only what needs to be there.
For example:
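Again the values are only placeholders (the 'default' store name is an assumption), but the idea is to pass just the pieces:
// only the parts are output - the URL itself never appears in the page source
initSlider(0, 0, 500, 'cycling', 'cycling-clothing', 'protective-clothing', 'default', 'ajax=1');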
Then change the signature of the JavaScript function so that it accepts this new input and builds the URL from your inputs:
function initSlider(price, low, high, category, subcategory, product, store, ajax /* , ... */) {
    // build URL
    var URL = 'http://www.outdoormegastore.co.uk/' + category + '/' + subcategory + '/' + product + '.html?_' + store + '&' + ajax;
    // continue...
}
Right now, because that URL is being output to the page, I think Google sees it as a URL it should follow and index. If you build the URL inside the function in an external JavaScript file, I don't think it will be indexed.
Your developer(s) should know what I'm talking about.
Hope this helps!
-
If they are already indexed, it's going to take time for Google to recrawl, read the tag and get them to fall out, so patience will be key. It's not a quick thing to undo.
If the pages are all in one location, you can add a robots.txt disallow for that folder and then request removal of the entire folder in Webmaster Tools, but again, the damage is already done, so you are going to have to wait for all those pages to fall out.
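For example, if every filter page lived under one hypothetical folder such as /ajax-filter/, the robots.txt entry would simply be:
User-agent: *
Disallow: /ajax-filter/
and you could then request removal of that whole directory in one go.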
-
Thanks for the quick reply! I'm desperate to get these removed as soon as possible now. I've got Webmaster Tools access, but requesting over 5,000 pages to be removed one by one would take too long. You can't do page removal in bulk, can you?
I'm going to work on the noindex option.
-
OMG, that does not look good. I completely understand. The best way in my opinion would be to add a noindex meta tag on these pages and let Google crawl them. Once Google recrawls them and sees the noindex, that should take care of the problem. However, be careful: you want to make sure the noindex tag does not appear on your real pages, just the AJAX ones.
Another option might be to consider the canonical tag, but technically these pages are not duplicate pages - they just should not exist. Are you verified and using the Google Webmaster Console? If yes, see if you can get some of these pages excluded via the URL removal tool. The best way, in my opinion, is to add the noindex tag.