Minimising duplicate content
-
From a duplicate-content perspective, is it best to create each blog post with a single tag so that Google doesn't treat the same post, returned via different tag searches, as duplicate content? Or doesn't it matter? For example, the URLs below return the same blog post:
http://www.ukholidayplaces.co.uk/blog/?tag=/stay+in+Margate
http://www.ukholidayplaces.co.uk/blog/?tag=/Margate+on+a+budget
are the same posts...
thanks
-
Hi!
A little late to the party here - thanks, Geoff, for helping out!
While creating excerpts for the tag pages would certainly help, I'd also suggest doing a crawl of your own site with something like Screaming Frog SEO Spider.
I just did a crawl and saw a bunch of issues needing attention:
- Just about all of your meta descriptions are exactly the same
- Your H1s are all the same
- Bunch of duplicate titles (because for example, all the author archive subpages are being given the same title)
- I don't see any meta robots or canonical tags in use at all; these would help you control which pages get indexed or receive ranking value.
- You have tons of meta keywords, mostly all duplicates; the meta keywords tag shouldn't be used anymore.
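For reference, the meta robots and canonical tags mentioned above live in the page's `<head>`. A minimal sketch (the URL and description text here are just illustrative, not from your site):

```html
<head>
  <title>Stay in Margate | UK Holiday Places Blog</title>
  <!-- Each page should get its own unique description, not a shared one -->
  <meta name="description" content="A unique description for this specific post.">
  <!-- Point duplicate copies of a post back at the original URL -->
  <link rel="canonical" href="http://www.ukholidayplaces.co.uk/blog/?post=stay-in-margate">
  <!-- Or keep thin archive/tag pages out of the index while still crawling their links -->
  <meta name="robots" content="noindex, follow">
</head>
```

You'd use the canonical tag on pages that duplicate a post, and the noindex robots tag on archive pages you don't want indexed; they address different cases, so don't blanket-apply both.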
You've got some additional issues to work out besides just the tags thing.
Check Google Webmaster Tools to confirm this as well; it will show you everything you need to fix!
-Dan
-
You're welcome Jonathan.
Feel free to look at how other successful organisations implement this on their blogs. Take Mashable, for example: their topics pages are essentially what blog articles are tagged with, and it looks like they cut their snippets off at about 170 characters.
Also, ensure that you're using the canonical link element on blog article pages too, to let search engines know that those are the originals and where you want the weight placed.
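As a sketch of that last point (the URL is made up for illustration), the article page itself carries a self-referencing canonical, so any tag or archive URL serving the same content can declare the original:

```html
<!-- In the <head> of the original blog post: a self-referencing canonical -->
<link rel="canonical" href="http://www.ukholidayplaces.co.uk/blog/?post=stay-in-margate">

<!-- In the <head> of any tag URL that serves the same full post,
     the canonical points at that same original, not at the tag URL -->
```

That way all the duplicate-URL variants consolidate their weight onto one address.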
-
Thanks Geoff,
I wasn't sure after the recent updates.
Copyscape finds loads of matches, but Google didn't...
-
No, assigning multiple tags to pages on your website is good practice (providing they're relevant, of course).
What you should consider doing is displaying only excerpts on tag / search result pages so that they don't get flagged as duplicate content. You don't need to display the entire post(s) on a tag page; a small snippet with a 'Read More' or similar link will ensure the full original only ever exists at one location, its specific URI.
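As a sketch of that idea (the URL, heading, and snippet text are invented for illustration), one entry on a tag page would render only an excerpt:

```html
<!-- One entry on a tag archive page: a short snippet only, never the full post -->
<article class="excerpt">
  <h2><a href="/blog/?post=stay-in-margate">Stay in Margate</a></h2>
  <p>Margate has plenty to offer on a budget, from the beach to the Old Town&hellip;</p>
  <a href="/blog/?post=stay-in-margate">Read More</a>
</article>
```

The full article then lives only at its own URL, and the tag pages become distinct index pages rather than duplicates of it.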