Getting rid of duplicate content remaining from old misconfiguration
-
Hi Friends,

We recently (about a month ago) launched a new website, and during the review of that site we spotted a serious misconfiguration of our terrible old WP site. This misconfiguration, which may have come from sitemaps, internal links, or both, led to our French, German, and English sites displaying on each other's domains. This should be solved now, but the pages still show in the SERPs.

The big question is: what's the best way to safely remove those from the SERPs? We haven't performed as well as we wanted for a while, and we believe this could be one of the issues.

Try searching, for instance, "site:pissup.de stag do -junggesellenabschied" to find English pages on our German domain; each link returns either a 301 or a 404. This was cleaned up to return 301 or 404 when we launched our new site 4 weeks ago, but I can still see the results in the SERPs, so I assume they still count negatively?

Cheers!
-
Yep, this one I fixed just now as you sent it.
I think the issue with the wrong redirects is mostly me not spotting them all, rather than the redirects I've already set up not working correctly.
I expect there to be a thousand-plus wrong pages, but when I use site:domain.tld plus a word in the wrong language, for instance "evg" (the French word for bachelor party), Google only shows up to 300 results (suspiciously, the same maximum for all sites).
-
Hey Rasmus - I honestly think it's an issue with the redirects. I would double check them.
I did just visit https://www.pissup.de/package/basic-survival-4/ and it looks like it's redirecting. Were you able to get those shored up? If you're still having trouble, I would contact your web host to make sure.
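If it helps, here's a minimal sketch (Python standard library only, not any particular tool mentioned in this thread) for spot-checking which status code a URL actually returns, without following redirects, so a 301 shows up as a 301 rather than the final page:

```python
import urllib.error
import urllib.request

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following redirects so we see the 301/302 itself."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None

def check_status(url):
    """Return the HTTP status of the first response for url (redirects not followed)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        with opener.open(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 301/302/404 etc. all land here once redirects aren't followed
        return e.code

# Example (needs network access); the URL is just the one from this thread:
# print(check_status("https://www.pissup.de/package/basic-survival-4/"))
```

Run it over the full list of old URLs and anything that prints 200 is a redirect you missed.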
-
Hi John
Yes, the idea is that https://www.pissup.de/package/basic-survival-4/ should redirect to a German equivalent where we have one.
It's strange that it isn't, as it's been no more than a week since I uploaded all the redirects. Perhaps this is down to the site: search not providing all results? If it caps the number of results, then when some are removed it may start showing others that weren't visible before.
-
Hey Rasmus,
Just so I understand: a URL like https://www.pissup.de/package/basic-survival-4/ should not be displaying on the German site. The German site should only have German content, right?
I found that page doing the site search listed in your initial question.
What's interesting is that this page isn't redirecting. Let me know your thoughts. I have feedback but I want to make sure of a few things before I share it.
Thanks!
John
-
Hi John
Thanks for taking the time to answer!
The URLs were already returning 301 or 404 when we discovered them after launching the new site.
What we did so far was this:
- Set up a 301 redirect from pissup.com/german-url to pissup.com/english-equivalent where available, or to the closest similar page
- Added a sitemap with these URLs in the hope they'd be crawled faster
- Waited
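As a side note on the sitemap step: a minimal sitemap of the redirected URLs can be generated with a few lines of Python. This is just an illustrative sketch of the sitemap format, with the one URL from this thread as a placeholder:

```python
from xml.etree import ElementTree as ET

SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

def build_sitemap(urls):
    """Return a minimal sitemap XML string listing the given URLs."""
    urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return '<?xml version="1.0" encoding="UTF-8"?>\n' + ET.tostring(
        urlset, encoding="unicode"
    )

# Placeholder list; in practice this would be every old URL you redirected.
print(build_sitemap(["https://www.pissup.de/package/basic-survival-4/"]))
```

Submitting a sitemap like this in Search Console only encourages a recrawl of the old URLs so Google sees the 301s sooner; it doesn't remove anything by itself.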
We were advised it was better to redirect than to ask for removal. Do you disagree with this advice, and if so, why?
We're not really seeing a decrease yet for these issues in the SERPs. Some drop by 5-10%, but some don't. Could it be because we're not seeing them all in the SERPs? In that case, is there anywhere else we could find them (all URLs indexed by Google on our domain)?
-
Hey Rasmus,
In finding these indexed pages, I'm assuming that you did the following:
1. No-indexed the pages on the domain you are concerned about
2. Disallowed them in robots.txt (just another step to help speed things up)
3. Used the URL removal tool in Google Search Console
Unfortunately, it does take time for Google to process these URLs out of the SERPs. Hopefully you are seeing a decrease in the URLs shown in the SERPs.
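For the robots.txt step, here's a minimal sketch (Python's standard-library `urllib.robotparser`) for double-checking which of the old URLs a given robots.txt actually disallows. The `Disallow` rule below is a hypothetical placeholder, not taken from the real site:

```python
from urllib import robotparser

def blocked_urls(robots_txt, urls, agent="Googlebot"):
    """Return the subset of urls that the given robots.txt disallows for agent."""
    rp = robotparser.RobotFileParser()
    rp.parse(robots_txt.splitlines())
    return [u for u in urls if not rp.can_fetch(agent, u)]

# Hypothetical rules; the real robots.txt would list the actual paths.
robots = """\
User-agent: *
Disallow: /package/
"""
print(blocked_urls(robots, [
    "https://www.pissup.de/package/basic-survival-4/",
    "https://www.pissup.de/",
]))
# → ['https://www.pissup.de/package/basic-survival-4/']
```

This makes it easy to verify that every wrong-language URL you know about is actually covered by the rules before waiting on Google to recrawl.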
Also, don't forget to do this via Bing Webmaster Tools too!