Issue with duplicate content
-
Hello guys, I have a question about duplicate content. Recently I noticed that Moz's system reports a lot of duplicate content on one of my sites. I'm a little confused about what I should do with it, because this content is created automatically. All the duplicate content comes from a subdomain of my site where we share cool images with people. This subdomain actually points to our Tumblr blog, where people re-blog our posts and images a lot.
I'm really confused about how all this duplicate content is created and what I should do to prevent it. Please tell me whether I need to "noindex" or "nofollow" that subdomain, or suggest something better to resolve the issue.
Thank you!
-
Peter, I'm trying to PM you but I have no idea what to put in the "recipient" field. Thank you for your assistance.
-
We only crawl your own site, so we wouldn't surface a duplicate with Tumblr unless something really, really weird is going on. This is why I need to look at the campaign - what you're describing shouldn't happen, in theory, so I have a feeling this is a pretty unusual situation.
-
Hello Peter, thank you for helping!
Peter, why do you say that neither Moz nor Webmaster Tools will detect the duplicates between my subdomain and Tumblr? Moz is detecting it now. Can you elaborate on that?
Thanks
-
My gut feeling is that you have 2+ issues going on here. Neither Moz nor Webmaster Tools is going to detect duplicates between your subdomain and Tumblr, so we/they must be seeing duplicates within the subdomain itself. This sounds like an overly complex setup that is likely causing you some harm, but without seeing it in play, it's really hard to diagnose.
Could you PM me with the domain? I can log into Moz Analytics as you and check, but you have a few campaigns set up.
(Sorry, I originally was logged in as Marina and my reply posted - I apologize for the confusion)
-
Hello Kane,
Thank you for trying to help me!
I added a link to three screenshots. Two of them are from my Moz account, showing the exponential increase in duplicate content and the subdomain that the duplicate content is coming from. The third screenshot is from my Gmail account, showing a notification from GWT about a deep links issue. I'm not sure whether these two issues have anything in common, but I feel that they do. Please let me know what you think.
Thanks
-
Hi Marina, a few questions for you:
Can you possibly post screenshots of the Google Webmaster Tools warning that you're seeing?
Does your website have an app associated with it?
Assuming your Tumblr content isn't reposted somewhere on your main domain, it doesn't seem like a duplicate content issue to me; the GWT message seems to be about deep linking for a mobile app. I can't imagine why you'd get that if you don't have an app.
-
Thank you for your help!
Answering your questions:
My subdomain looks like this: photos.domain.com, and it is pointed at the Tumblr platform (our blog on Tumblr) because it is a very image-friendly platform and they host all the images.
-
We use this subdomain only for posting images. We don't use this content on our root domain at all.
I'm really confused about what Android app they are talking about. Do they consider Tumblr an Android app?
Thanks
-
Hi there
Do you have a web development team or a web developer? I would pass this notification over to them, along with your notifications from Moz, and see if they have a means to correct these issues. I am assuming that Google passed along resources in their notification; I would look into those and see what your options are.
If you do not have a web development team, I would check out the Recommended List to find a company that does web development as well as SEO and can assist with this. It sounds to me like you are linking off to an app with a subdomain, and it's creating a different user experience than the one generated by your website.
If I were you, I would find a suitable blogging platform that you can bring your sharing capabilities onto, and create a consistent and seamless experience for your users. Two questions:
- Is your subdomain blog.domain.com? Or is it named differently?
- Do you have your blog posts on your website and copied word for word on your subdomain?
Here are a couple of more resources to review with your team:
- App Indexing for Google Search Overview: What is App Indexing?
- App Indexing for Google Search Technical Details: Enabling Deep Links for App Content
Let me know if any of this helps or if you have any more comments - good luck!
-
Thank you for replies!
I'm fairly well aware of duplicate content issues, but I have never faced this particular one before. As Lesley said, I don't have access to the head sections of each post, because those posts are effectively not on my property but on Tumblr's. And I have no idea how the duplication is created; I assume it is caused by Tumblr's feature that allows users to re-blog my posts.
Moreover, I've just received a warning from Google Webmaster Tools specifically pertaining to this subdomain. I'm really confused. Please help:
Fix app deep links to ....com/ that don't match the content of the web pages
Dear webmaster,
When indexing the deep links to your app, we detected that the content of 1 app page doesn't match the content of the corresponding web page. This is a bad experience for your users because they won't find what they were looking for on your app page. We won't show deep links for these app pages in our smartphone search results. This is an important issue that needs your immediate attention.
Take these actions to fix this issue:
- Check the Android Apps section of the Crawl Errors report in Webmaster Tools to find examples of app URIs whose content doesn't match their corresponding web page.
- Use these examples to debug the issue:
- Open the corresponding web page to have it ready.
- Use Android debug bridge to open the app page.
- Make sure the content on both your web page and your app page is the same.
- If necessary, change the content on your app (or change your sitemap or rel=alternate element associations to make sure that each app page is connected to the right web page).
- If necessary, change your robots.txt file to allow crawling of relevant resources. This mismatch might also be due to the fact that some of the resources associated with the app page are disallowed from crawling through robots.txt.
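For reference, the rel=alternate association that message mentions is a link element on the web page that ties it to a screen in the Android app. This is only a sketch with a made-up package name and URLs, not values from my site:

```html
<!-- Hypothetical example: on the web page, declare the matching Android app
     deep link so Google can associate the app screen with this URL.
     "com.example.app" and the paths are placeholders. -->
<link rel="alternate"
      href="android-app://com.example.app/http/example.com/some-page" />
```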
-
I am not very experienced with Tumblr personally, but I am pretty sure it cannot be done, because they don't give you access to what you would need. You would need access to the head section of each page so that you could put the canonical tag in.
One thing that MIGHT work, though it would be tricky and I would consult with someone else about it first: if the URLs are the same apart from the subdomain, you could get Apache to write a canonical link into the actual response header and send it over. I do not know if Google would respect this, so I would ask others for advice.
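A rough sketch of what that Apache rule might look like, assuming you actually control the server for the subdomain and its paths match the main domain one-to-one (the domain names below are placeholders):

```apache
# Copy the request URI into an environment variable so mod_headers can read it,
# then emit an HTTP Link header pointing the canonical at the main domain.
# Placeholder domains; requires mod_rewrite and mod_headers to be enabled.
RewriteEngine On
RewriteRule .* - [E=REQ_URI:%{REQUEST_URI}]
Header set Link "<http://www.example.com%{REQ_URI}e>; rel=\"canonical\""
```

Google does document support for rel="canonical" sent as an HTTP header, but whether that helps in a Tumblr-hosted setup, where you don't control the server, is exactly the open question.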
-
Hi there
The only time you should noindex a site is if it's not supposed to be seen by search engines - if that's the case, then noindex it.
However, if this content is supposed to be seen by search engines, I would make use of your canonical tags on the subdomain and point it to the original content on the domain.
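In case it's useful, a canonical tag in the head of a subdomain page would look something like this; the URLs are placeholders, and it assumes you can edit the subdomain's page templates:

```html
<!-- On http://photos.example.com/some-post, point search engines at the
     original version of the content. Both URLs are hypothetical. -->
<link rel="canonical" href="http://www.example.com/some-post" />
```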
I would also think of a way to build a community on your website - it sounds like you have opportunities to do so and are getting attention from your audience in how they share your posts and information.
Also, look into sitemap opportunities with your images and how you can help crawlers understand the information on your website.
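For example, an image sitemap entry looks roughly like this (placeholder URLs); the image namespace lets crawlers discover the images tied to each page:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical image sitemap entry; all URLs are placeholders. -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:image="http://www.google.com/schemas/sitemap-image/1.1">
  <url>
    <loc>http://photos.example.com/some-post</loc>
    <image:image>
      <image:loc>http://photos.example.com/images/cool-image.jpg</image:loc>
    </image:image>
  </url>
</urlset>
```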
You can read more about duplicate content here.
Hope this helps a bit! Let me know if you have any questions or comments!
-
Hello Lesley,
Thank you for the response! Well, the subdomain is pointing to our Tumblr blog. I have access to both the main domain and Tumblr. Where should I add the canonical? Thanks
-
Do you have control over the subdomain, so that you can add a canonical to it and point it at the original content?