Removing duplicate &var=1 etc. URLs from Google
-
Hi, I had a huge drop in traffic around the 11th of July, over 50% down with no recovery as yet: roughly 5,000 organic visits per day down to barely over 2,500.
I fixed a problem that one script was introducing which had caused high bounce rates.
Now I have identified that Google has indexed the entire news section four times: the same content, but with var=0, var=1, var=2, etc. appended, around 40,000 URLs in total.
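For illustration, duplicates like these can be spotted by stripping the offending query parameter and grouping URLs that collapse to the same canonical form. A minimal Python sketch, assuming the parameter is literally named `var` as in the question (the example URLs are made up):

```python
from urllib.parse import urlparse, urlunparse, parse_qsl, urlencode

def canonicalize(url, drop_params=("var",)):
    """Return the URL with the duplicate-causing query parameters removed."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in drop_params]
    return urlunparse(parts._replace(query=urlencode(kept)))

urls = [
    "http://www.example.com/news/story?id=7&var=0",
    "http://www.example.com/news/story?id=7&var=1",
    "http://www.example.com/news/story?id=7&var=2",
]
# All three variants collapse to a single canonical URL
print({canonicalize(u) for u in urls})
```

Running a crawl export through something like this would show how many distinct pages are hiding behind the 40,000 indexed URLs.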
This has to be causing problems.
I have fixed the problem and those URLs now return 404; there is no need for 301s as they are not linked to from anywhere.
How can I get them out of the index? I can't do it one by one with the URL removal request, and I can't remove a directory with the removal tool because the regular content is still there.
If I ban those URLs in robots.txt, won't Google never try to crawl them again and thus never discover that they are 404ing?
These URLs are no longer linked to from anywhere, so how can Google ever reach them by crawling and find them 404ing?
-
yes
-
Hi, thanks. So if Google can't find a page and finds no more links to it, does that mean it should drop out of the index within a month?
-
The definition of a 404 page is a page which cannot be found. So in that sense, no, Google can't find the page.
Google's crawlers follow links. If there is no link to the page, then there is no issue. If Google locates a link, it will attempt to follow it.
-
Hi, thanks. So if a page is 404ing but not linked to from anywhere, will Google still find it?
-
Hi Adam.
The preferred method to handle this issue would have been to serve only one version of the URL. Now that you have realized the other versions were live, you have a couple of options to deal with the problem:
Use a 301 to redirect all versions of the page to the main URL. This would have allowed your existing Google listings to keep working: users would still reach the correct page, and Google would notice the 301 and adjust its index.
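As a sketch of what that 301 could look like in Apache mod_rewrite (the /news/ path, and the assumption that var= sits at the end of the query string, are guesses based on the question, not the actual setup):

```apache
RewriteEngine On
# Match query strings that end in &var=<digits>, as described in the question
RewriteCond %{QUERY_STRING} ^(.+)&var=\d+$
# Redirect to the same path with the var parameter stripped
RewriteRule ^news/(.*)$ /news/$1?%1 [R=301,L]
```

The exact pattern would need adjusting to the real URL structure, but the principle is one rule that permanently redirects every variant to its clean equivalent.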
Another option to consider, IF the pages were helpful, would be to keep them and use the canonical tag to indicate the URL of the primary page. This method offers the same advantages mentioned above.
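Concretely, the canonical tag goes in the head of each var=N variant and points at the primary URL (the URL below is a made-up example):

```html
<!-- placed in the <head> of every duplicate variant -->
<link rel="canonical" href="http://www.example.com/news/story?id=7" />
```

Google then consolidates the variants under the one URL you declare, instead of indexing each of them separately.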
By removing the pages and letting them 404, everyone loses for the next month or so. Users who click on a search result will be taken to a 404 page rather than the content they seek, and Google won't be offering the results users are looking for. You will experience a high bounce rate, as many users do not like 404 pages, and it will take about a month for an average site to be fully recrawled and the issue corrected.
If you block the pages in robots.txt, then Google won't attempt to crawl those links at all. In general, your robots.txt should not be used in this manner.
My recommendation is to fix this issue with proper 301s. If that is not an option, be sure your 404 page is as helpful and user-friendly as possible: include a site search option along with your main navigation. Google will crawl a small percentage of your site each day, so you will notice the number of 404 links diminish over time.