Duplicate content and ways to deal with it.
-
Problem
I queried back a year for the portal, and as you can see in the attached images, the SEO juice is split between the uppercase and lowercase versions of the URLs.
Solutions:
1) Quick fix: change the links on the pages above to lowercase.
2) Use the canonical link tag: http://www.seomoz.org/blog/canonical-url-tag-the-most-important-advancement-in-seo-practices-since-sitemaps
The tag goes in the HTML head of a web page, the same section where you'd find the title tag and meta description tag. In fact, this tag isn't new; like nofollow, it simply uses a new rel parameter. For example:
<link rel="canonical" href="http://www.darden.virginia.edu/MBA" />
"This would tell Yahoo!, Live & Google that the page in question should be treated as though it were a copy of the URL http://www.darden.virginia.edu/MBA and that all of the link & content metrics the engines apply should technically flow back to that URL."
3) See if there are any Google Analytics filters at the site level I can apply. I will check into this and get back to you.
What do you all think?
-
Because that just filters the data in your reports; it won't stop the duplication from happening.
-
I think (2) - the canonical tag - is a solid solution if just a few URLs are out of whack, but if you're using the mixed-case version internally, then you may need to change your structure as well. If you change your structure, then I'd probably look at a full-scale system of 301-redirects to preserve inbound link-juice.
It sounds like you're linking to mixed-case internally, so you may need to set up the redirects. Make sure that, depending on your platform, the case-specific redirects work properly (and don't create an endless loop). There is some risk to making the switch, so I'd probably only do it if you're seeing this happen a lot. Unfortunately, mixed-case URLs are often more trouble than they're worth.
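If your platform happens to be Apache, one common way to sketch the case-specific redirect is with a `RewriteMap` (note this is an assumption about your stack; `RewriteMap` must be defined in the server config or virtual host, not in `.htaccess`):

```apache
# In httpd.conf / the virtual host (RewriteMap is not allowed in .htaccess)
RewriteEngine On
RewriteMap lowercase int:tolower

# If the requested path contains any uppercase letter,
# 301-redirect to the all-lowercase version.
RewriteCond %{REQUEST_URI} [A-Z]
RewriteRule ^(.*)$ ${lowercase:$1} [R=301,L]
```

Because the rule only fires when the path still contains an uppercase letter, the redirected (all-lowercase) request no longer matches, which avoids the endless-loop problem mentioned above.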
-
Why would I not just do this?
http://support.google.com/googleanalytics/bin/answer.py?hl=en&answer=90397
-
I would stick with the rel=canonical tag.
You could also check the URL parameter handling tool in Google Webmaster Tools.
With these, you will be able to:
1. Recognize duplicate content on your website.
2. Determine your preferred URLs.
3. Apply 301 permanent redirects where necessary and possible.
4. Implement the rel="canonical" link element on your pages where you can.
5. Use the URL parameter handling tool in Google Webmaster Tools where possible.
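As a rough illustration of step 1 (recognizing the duplicates), here's a hypothetical Python sketch that groups URLs whose host and path differ only by letter case; the example.com URLs are placeholders:

```python
from urllib.parse import urlsplit
from collections import defaultdict

def find_case_duplicates(urls):
    """Group URLs whose host/path differ only by letter case."""
    groups = defaultdict(set)
    for url in urls:
        parts = urlsplit(url)
        # Hostnames are case-insensitive; paths usually are not,
        # which is exactly what causes this kind of duplication.
        key = (parts.netloc.lower(), parts.path.lower())
        groups[key].add(url)
    # Keep only the groups that actually contain duplicates.
    return {k: sorted(v) for k, v in groups.items() if len(v) > 1}

urls = [
    "http://example.com/MBA",
    "http://example.com/mba",
    "http://example.com/contact",
]
print(find_case_duplicates(urls))
# → {('example.com', '/mba'): ['http://example.com/MBA', 'http://example.com/mba']}
```

You could feed this a URL list exported from your crawler or analytics to see which pages need a canonical tag or redirect.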
Further reading: http://googlewebmastercentral.blogspot.co.uk/2009/10/reunifying-duplicate-content-on-your.html
I hope this helps,
Ally
-
Option 2, using rel=canonical, seems like the best course of action to me. You may also want to apply a 301 redirect.