Sub-category considered duplicate content?
-
Hello,
My crawl diagnostics from my PRO account are telling me that the following two links have duplicate content and duplicate title tags:
-
http://www.newandupcoming.com/new-blu-ray-releases (New Blu-ray Releases)
-
http://www.newandupcoming.com/new-blu-ray-releases/action-adventure (New Action & Adventure Releases | Blu-ray)
I am really new to the SEO world, so I am stuck trying to figure out the best solution. My question is: how should I fix this issue?
I guess I could put a canonical tag on all sub-categories, but I was worried that search engines would then not crawl the sub-categories and index potentially valuable pages.
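For reference, this is the tag I mean — it would go in the `<head>` of each sub-category page and point back to the parent category (using my Action & Adventure page as the example):

```html
<!-- On http://www.newandupcoming.com/new-blu-ray-releases/action-adventure -->
<!-- Tells search engines that the parent category is the preferred (canonical) URL -->
<link rel="canonical" href="http://www.newandupcoming.com/new-blu-ray-releases" />
```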
Thanks for all the help.
-
Thank you very much for your thoughtful explanation and your answer to my problem.
I just want to put my thoughts into words here so that this conversation might help others in the same boat. The answer to the question you raised, "do users benefit from your site?", is an idea I am still trying to prove. You are correct: the site does not provide truly unique content beyond trying to improve the user experience of exploring recent product releases on Amazon.com.
This is a niche product because most people are interested only in the popular releases rather than in browsing hundreds of releases (popular or not) by release date, but I believe there are enthusiasts who may benefit from this site. I want to serve the people who are not just looking for the popular releases. Again, though, this is yet to be proven.
Thanks for sharing some great ideas to help improve my site. I will keep the review and user-generated content concepts in mind.
-
The challenge you face is that your site does not provide any original content. It appears to be an affiliate site for Amazon.com.
The question is: do users benefit from your site? When a user enters a search term in Google, are they ever better off for having found your site, or would they be better off going directly to Amazon?
I am not criticizing your site personally, just sharing Google's view. If you offered a good, independent review of each movie, that would be great. Encouraging users to generate content by asking for their opinions is another positive way to go. But simply linking to thousands of movies on another site does not offer value to users or to the internet.
The first step is answering the question: what does my site offer users? Why would users feel fortunate to have found my site? Once you have that answer, build upon it.
For your category page problem, you have two options. The first is to add the "noindex, follow" meta tag to each sub-category page. This tag tells search engines not to index the page (since it does not provide unique content) while still following the links on it.
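A minimal sketch of that tag, placed in the `<head>` of each sub-category page:

```html
<!-- Do not index this page, but do follow (and pass value through) its links -->
<meta name="robots" content="noindex, follow">
```

The "follow" part matters for the concern you raised: search engines will still crawl the product links on the page even though the page itself stays out of the index.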
The other option is to add unique content to the category pages. For your Action and Adventure page, you could describe what movies are part of that genre: "The Action and Adventure category includes movies ranging from Die Hard to Indiana Jones. Any fast-paced, heart-pumping action movie fits well into this category..." Your content should be at least a few hundred words, written by you, and something users might find helpful.