Issue with duplicate content
-
Hello guys, I have a question about duplicate content. Recently I noticed that Moz's system reports a lot of duplicate content on one of my sites. I'm a little confused about what I should do with it, because this content is created automatically. All the duplicate content comes from a subdomain of my site where we share cool images with people. This subdomain actually points to our Tumblr blog, where people re-blog our posts and images a lot.
I'm really confused about how all this duplicate content is created and what I should do to prevent it. Please tell me whether I need to "noindex" or "nofollow" that subdomain, or suggest something better to resolve the issue.
Thank you!
-
Peter, I'm trying to PM you but I have no idea what to put in the "recipient" field. Thank you for your assistance.
-
We only crawl your own site, so we wouldn't surface a duplicate with Tumblr, unless something really, really weird is going on. This is why I need to look at the campaign - what you're describing shouldn't happen, in theory, so I have a feeling this is a pretty unusual situation.
-
Hello Peter, thank you for helping!
Peter, why do you say that neither Moz nor Webmaster Tools would detect the duplicates between my subdomain and Tumblr? Moz is detecting it now. Can you elaborate on that?
Thanks
-
My gut feeling is that you have two or more issues going on here. Neither Moz nor Webmaster Tools is going to detect duplicates between your subdomain and Tumblr, so we/they must be seeing duplicates within the subdomain itself. This sounds like an overly complex setup that is likely causing you some harm, but without seeing it in play, it's really hard to diagnose.
Could you PM me with the domain? I can log into Moz Analytics as you and check, but you have a few campaigns set up.
(Sorry, I originally was logged in as Marina and my reply posted - I apologize for the confusion)
-
Hello Kane,
Thank you for trying to help me!
I added a link to three screenshots. Two of them are from my Moz account: one shows the exponential increase in duplicate content, and the other shows the subdomain that duplicate content is coming from. The third screenshot is from my Gmail account, showing a notification from GWT about a deep-links issue. I'm not sure whether these two issues have anything in common, but I feel that they do. Please let me know what you think.
Thanks
-
Hi Marina, a few questions for you:
Can you possibly post screenshots of the Google Webmaster Tools warning that you're seeing?
Does your website have an app associated with it?
Assuming your Tumblr content isn't reposted somewhere on your main domain, it doesn't seem like a duplicate content issue to me, it seems like the GWT message is related to deep linking for a mobile app. I can't imagine why you'd get that if you don't have an app.
-
Thank you for the help!
Answering your questions:
My subdomain looks like this: photos.domain.com, and it was pointed to the Tumblr platform (our blog on Tumblr) because Tumblr is a very image-friendly platform and it hosts all the images.
-
We use this subdomain only for images posting. We don't use this content on our root domain at all.
I'm really confused about what Android app they are talking about. Do they consider Tumblr an Android app?
Thanks
-
Hi there
Do you have a web development team or a web developer? What I would do is pass this notification over to them, along with your notifications from Moz, and see if they have a means to correct these issues. I am assuming that Google passed along resources in their notification; I would look into those and see what your options are.
If you do not have a web development team, I would check out the Recommended List to find a company that does web development as well as SEO that can assist in this. What it sounds like to me is that you are linking off to an app with a subdomain and it's creating a different user experience than the one generated by your website.
If I were you, I would find a suitable blogging platform that you can bring your sharing capabilities onto, and create a consistent and seamless experience for your users. Two questions:
- Is your subdomain blog.domain.com? Or is it named differently?
- Do you have your blog posts on your website and copied word for word on your subdomain?
Here are a couple of more resources to review with your team:
- App Indexing for Google Search: Overview (What is App Indexing?)
- App Indexing for Google Search: Technical Details (Enabling Deep Links for App Content)
Let me know if any of this helps or if you have any more comments - good luck!
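For context on what those resources describe: Google's app indexing associates a web page with an app page via a rel="alternate" link in the page's head. A rough sketch of what that markup looks like, where com.example.app and the URLs are placeholders rather than real values:

```html
<!-- On the web page: declares which app screen corresponds to this URL. -->
<!-- Sketch only - package name and URLs are hypothetical placeholders. -->
<head>
  <link rel="alternate"
        href="android-app://com.example.app/http/photos.example.com/post/1234" />
</head>
```

If your site never published markup like this (and has no app), it's worth finding out where Google got the association from before anything else.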
-
Thank you for replies!
I'm fairly well aware of duplicate content issues in general, but I have never faced this particular one. As Lesley said, I don't have access to the head section of each post, because those posts aren't really on my property but on Tumblr's. And I have no idea how the duplication is created. I assume it's caused by Tumblr's feature that lets users re-blog my posts.
Moreover, I've just received a warning from Google Webmaster Tools pertaining specifically to this subdomain. I'm really confused. Please help:
Fix app deep links to ....com/ that don't match the content of the web pages
Dear webmaster,
When indexing the deep links to your app, we detected that the content of 1 app page doesn't match the content of the corresponding web page. This is a bad experience for your users because they won't find what they were looking for on your app page. We won't show deep links for these app pages in our smartphone search results. This is an important issue that needs your immediate attention.
Take these actions to fix this issue:
- Check the Android Apps section of the Crawl Errors report in Webmaster Tools to find examples of app URIs whose content doesn't match their corresponding web page.
- Use these examples to debug the issue:
- Open the corresponding web page to have it ready.
- Use Android debug bridge to open the app page.
- Make sure the content on both your web page and your app page is the same.
- If necessary, change the content in your app (or change your sitemap / rel=alternate element associations to make sure each app page is connected to the right web page).
- If necessary, change your robots.txt file to allow crawling of relevant resources. This mismatch might also be due to some of the resources associated with the app page being disallowed from crawling through robots.txt.
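(For reference, the "Android debug bridge" step in that message translates to a command along these lines - a sketch only, where com.example.app and the URL are hypothetical placeholders, and which requires a connected Android device or emulator:)

```shell
# Open the app page that should correspond to a given web page,
# so its content can be compared against the web page side by side.
adb shell am start \
  -a android.intent.action.VIEW \
  -d "http://photos.example.com/post/1234" \
  com.example.app
```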
-
I am not very experienced with Tumblr personally, but I am pretty sure it can't be done, because they don't give you access to what you would need. You would need access to the head section of each page so that you could put the canonical tag in.
One thing that MIGHT work, though it would be tricky and I would consult someone else about it too: if the URLs are the same apart from the subdomain, you could get Apache to send a rel="canonical" in an HTTP Link header on the response. I do not know whether Google would respect this, so I would ask for others' advice.
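For completeness: if you did control the server for the subdomain (which isn't the case with Tumblr), the header-based approach described above could be sketched roughly like this in Apache config. This is a sketch under assumptions - the duplicate and original URLs differ only by subdomain, mod_headers and mod_rewrite are enabled, and www.example.com is a placeholder for the main domain:

```apache
RewriteEngine On
# Capture the request URI into an environment variable...
RewriteRule .* - [E=REQ_URI:%{REQUEST_URI}]
# ...and echo it back in a canonical Link header pointing at the
# main-domain copy of the same path.
Header set Link "<http://www.example.com%{REQ_URI}e>; rel=\"canonical\""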
-
Hi there
The only time you should noindex a site is if it's not supposed to be seen by search engines - if that's the case, then noindex it.
However, if this content is supposed to be seen by search engines, I would add canonical tags on the subdomain pages and point them to the original content on the main domain.
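For reference, a canonical tag is a single line in the head of the duplicate page pointing at the original - the domains here are placeholders, not your real URLs:

```html
<!-- On http://photos.example.com/post/1234 (the duplicate copy): -->
<head>
  <link rel="canonical" href="http://www.example.com/post/1234" />
</head>
```

The catch, as discussed elsewhere in this thread, is that you need the ability to edit the head of each page for this to be an option.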
I would also think of a way to build a community on your website - it sounds like you have opportunities to do so, and you're getting some attention from your audience in how they share your posts and information.
Also, look into sitemap opportunities with your images and how you can help crawlers understand the information on your website.
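As an example of the sitemap idea: Google supports an image extension to the standard sitemap protocol, so each URL entry can list the images it contains. A minimal sketch, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://photos.example.com/post/1234</loc>
    <image:image>
      <image:loc>http://photos.example.com/images/1234.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```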
You can read more about duplicate content here.
Hope this helps a bit! Let me know if you have any questions or comments!
-
Hello Lesley,
Thank you for the response! Well, the subdomain points to our Tumblr blog. I have access to both the main domain and Tumblr. Where should I add the canonical? Thanks
-
Do you have control over the subdomain, so that you can add a canonical to it and point it to the original content?