Anyone else seeing increased duplication of domains since Penguin?
-
Hi, is it just me, or are the Google SERPs showing more duplication of domains since the Penguin update? As an example, if I search for "business Christmas cards" on google.co.uk, then results 2, 3 and 17 are from the same domain. Similarly, results 4, 20, 21 and 22 are from the same domain. All the results are "reasonable" in that they are designed to catch traffic for variations on this term, BUT I'm sure Google used to filter this duplication pre-Penguin. Am I imagining this increased duplication of domains? Gary
-
I just thought I would share an astonishingly good (or is that poor?) example of the problem at the moment...
Do a search on www.google.co.uk for "wedding venues in buckinghamshire" (no quotes needed) and most of the results from page two onwards are coming from a single domain!
Simply terrible results, IMHO! How broken is that!
-
That's encouraging. Even if the results are a bit dodgy at the moment, I'm sure Google will sort this out as it creates such a poor experience.
-
Hi Gary, there is a lot of chat about this on the Google Webmaster Forum. It's happening all over. Some people are seeing the same domain dominate 8 out of the first 10 search results on page 1. Great if that's your domain; not so good for everyone else. Also not such a great user experience, in my opinion.
-
Thanks for this. Funny, I'm looking from the perspective of "breaking in" right now. It's just a bit of a shame because many of the duplicated pages are really only designed for search engines and are variations on a theme. Hopefully this is an unintended consequence of the latest updates and we will see it reversed. I don't think it improves the results. Gary
-
**Is it just me or are the Google SERPs showing more duplication of domains since the Penguin update?**
I noticed this too. I believe that Google is making it much more difficult to get two (or more) listings on the first page of the SERPs. However, the number you can get on other pages has really gone up.
I was reading about a study that looked at the first 1,000 places in Google. Usually, only about 200 domains are present in the top 1,000. When I think about that, I wonder how a new website in some moderately competitive niche is supposed to break in.
-
Yeah, I have noticed the same. While searching for the keyword "sugar daddy oregon" on google.com, I noticed the SEO work I had done had secured the top 3 positions for seekingarrangement.com.
This only happened after those major updates.
Related Questions
-
Duplicate content question
Hey Mozzers! I received a duplicate content notice from my Cycle7 Communications campaign today. I understand the concept of duplicate content, but none of the suggested fixes quite seems to fit. I have four pages with HubSpot forms embedded in them. (Only two of these pages have shown up so far in my campaign.) Each page contains a title (Content Marketing Consultation, Copywriting Consultation, etc.), plus an embedded HubSpot form. The forms are all outwardly identical, but I use a separate form for each service that I offer. I'm not sure how to respond to this crawl issue:
1. Using a 301 redirect doesn't seem right, because each page/form combo is independent and serves a separate purpose.
2. Using a rel=canonical link doesn't seem right, for the same reason that a 301 redirect doesn't.
3. Using the Google Search Console URL Parameters tool is clearly contraindicated by Google's documentation (I don't have enough pages on my site).
Is a meta robots noindex the best way to deal with duplicate content in this case? Thanks in advance for your help. AK
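For what it's worth, if the noindex route does turn out to be the right fit, the tag itself is just one line in each page's head. A minimal sketch (not a prescription for this specific site):

```html
<!-- In the <head> of each form page you want kept out of the index.
     "follow" lets crawlers still pass link equity through the page's links. -->
<meta name="robots" content="noindex, follow">
```

Each of the four form pages would carry its own copy of the tag; the pages stay live for visitors but drop out of the duplicate-content report once recrawled.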
Technical SEO | AndyKubrin
-
Duplicate Content
I am trying to get a handle on how to fix and control a large amount of duplicate content I keep getting in my Moz reports. The main areas where this comes up are duplicate page content and duplicate title tags ... thousands of them. I partially understand the source of the problem. My site mixes free content with content that requires a login. I think if I were to change my crawl settings to eliminate the login and index the paid content, it would lower the quantity of duplicate pages and help me identify the true duplicates, because a large number of them occur at the site login. Unfortunately, it's not simple in my case, because last year I encountered a problem when migrating my archives into a new CMS. The app in the CMS that migrated the data caused a large amount of data truncation, which means that I am piecing together my archives of approximately 5,000 articles. It also means that much of the piecing-together process requires me to keep the former app that manages the articles, to find where certain articles were truncated, then copy the text that followed the truncation and complete the articles. So far, I have restored about half of the archives, which is time-consuming, tedious work. My question: does anyone know a more efficient way of identifying and editing duplicate pages and title tags?
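One way to triage thousands of flagged pages is to group them by normalized title outside the crawl tool, so you can work through one cluster at a time. A rough sketch in Python, assuming you can export the crawl report as a CSV with "URL" and "Title" columns (adjust the column names to whatever your export actually uses):

```python
from collections import defaultdict

def find_duplicate_titles(rows):
    """Group pages by normalized title; return only titles shared by 2+ URLs."""
    groups = defaultdict(list)
    for row in rows:
        # Normalize whitespace and case so near-identical titles cluster together.
        groups[row["Title"].strip().lower()].append(row["URL"])
    return {title: urls for title, urls in groups.items() if len(urls) > 1}

# For a real report, feed it the export, e.g.:
#   import csv
#   with open("crawl_export.csv", newline="") as f:
#       dupes = find_duplicate_titles(csv.DictReader(f))
```

Sorting the resulting clusters by size would let you fix the biggest offenders (likely the login-wall pages) first.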
Technical SEO | Prop65
-
Handling of Duplicate Content
I just recently signed up and joined the moz.com system. The initial report for our web site shows we have lots of duplicate content. The web site is real estate based, and we are loading IDX listings from other brokerages into our site. Even though these listings look alike, they are not: each has its own photos, description and address. So why do they appear as duplicates? I would assume that they are all too closely related. Lots for sale, primarily, and it looks like lazy agents have 4 or 5 lots and input the same description for each. Unfortunately for us, part of the IDX agreement is that you cannot pick and choose which listings to load, and you cannot change the content. You are either all in or you cannot use the system. How should one manage duplicate content like this? Or should we ignore it? Out of 1,500+ listings on our web site, it shows 40 of them are duplicates.
Technical SEO | TIM_DOTCOM
-
Redirect non www. domain to WWW. domain for established website?
Hey guys, The website in question has been online for more than 5 years, but there are still 2 versions of the website. Both versions are indexed by Google and, of course, this will result in duplicate content. Is it necessary to redirect the non-www domain to the www domain? What are the advantages and disadvantages? Will the www links replace the non-www links when it comes to keyword rankings? Thanks.
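The standard fix is a site-wide 301 at the server level, so every non-www URL answers with a permanent redirect to its www twin. A minimal sketch for an Apache .htaccess (assuming mod_rewrite is enabled; example.com is a placeholder for the real domain):

```apache
# 301-redirect every request on the bare domain to the www host,
# preserving the requested path and query string.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

On other servers (nginx, IIS) the equivalent is a host-level permanent redirect rather than rewrite rules.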
Technical SEO | BruLee
-
Migrating a better-performing domain to a worse-performing domain
I have a customer who has many domain names and assets, but she's wanting to consolidate some of them to help simplify things for her customers; mostly, she wants to build up the website through which she sells products.
Grief Reflection - www.griefreflection.com is a personal journal that she's keeping to process the impending death of her husband; it's also linked to her business website, which sells healing-from-grief types of products.
Storybooks for Healing - www.storybooksforhealing.com is the website through which she sells workbooks and memory books for people who want to keep the memory of a loved one alive after they've gone.
I've set up both of these domains as campaigns and have been looking at the metrics for both. The Grief Reflection blog outperforms the Storybooks for Healing website. If we merge the two, then the Grief Reflection blog would likely become a subdirectory under www.storybooksforhealing.com and be more fully integrated, which she thinks will help her visitors not get confused while navigating her website.
www.griefreflection.com has 12,637 links, while www.storybooksforhealing.com has 1,462. Also, Google has indexed 380 pages of Grief Reflection and only 100 pages for Storybooks for Healing, though that may be because there are fewer pages to index. Grief Reflection also has a 4.36 mozRank and 5.30 mozTrust, where Storybooks has 4.13 mozRank and 5.15 mozTrust.
Should I counsel her to keep these domains separate? If not, would simply setting up 301 redirects from the www.griefreflection.com domain name to the new subdirectory under www.storybooksforhealing.com be the way to go? Thank you ever so much for any wisdom anyone can provide.
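If the merge goes ahead, the usual approach is a wildcard 301 on the old domain that maps each URL to the matching path under the new subdirectory, so the blog's existing links carry over. A sketch for Apache on the old domain (assuming mod_rewrite; the /griefreflection/ subdirectory name is illustrative and would need to match the actual migrated structure):

```apache
# On griefreflection.com: send every URL to the equivalent path
# under a hypothetical /griefreflection/ subdirectory of the new site.
RewriteEngine On
RewriteRule ^(.*)$ https://www.storybooksforhealing.com/griefreflection/$1 [R=301,L]
```

A blanket redirect of everything to the new homepage would also work technically, but path-preserving redirects keep old deep links and bookmarks landing on the right content.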
Technical SEO | ChristiMc
-
Redirect Multiple Domains
This is a follow-up to a question posted earlier this month. I can't link to it because it's a private question, so I'm summarizing it below. We have a number of domains (about 20, e.g. www.propertysharp.com) that point to our main domain's IP address (www.propertyshark.com) and share the same content. This is no black-hat strategy whatsoever; the domains were acquired several years ago to help people who mistyped the website's URL reach their desired destination. The question was whether to redirect them to our main domain or not. The pros were the reportedly millions of incoming links from these domains; the cons were that lots of duplicate content issues could arise, and we actually saw some pages from these domains ranking in the search engines. We were advised to redirect them, but to take it gradually. I have a simple question: what does "gradually" mean? One domain per week, or per month?
Technical SEO | propertyshark
-
Is this considered as duplicate content?
One of my clients has a template page that they reuse each time they have a new news item. The template includes a two-paragraph customer quote/testimonial for the company. So, they now have 100+ pages with the same customer quote. The rest of the page content / body copy is unique. Is there any likelihood of this being considered duplicate content?
Technical SEO | bjalc2011
-
Sub Domain SEO
I am thinking of adding subdomains to get better rankings for local searches, so I will develop city-specific sites in the relevant language. For example, qatar.wisnetsol.com will be in Arabic. Will my good standing and ranking on Google for wisnetsol.com help my subdomain rank better? If we set up wisnetsol.com/qatar, how can it target Qatar in Google Webmaster Tools? Are links for qatar.wisnetsol.com and wisnetsol.com counted separately? What do you think about this strategy? Is it good or bad?
Technical SEO | Khuram