Fixed "lower-case/mixed-case" Internal Links causing duplicate- Now What?
-
Hi,
So after a site re-launch, Moz crawled it and reported over 150 duplicate content errors. It turned out the cause was inconsistent capitalization in internal links. Using Screaming Frog, I found all 500+ internal links and fixed them to match the actual URLs. Now the site is 100% consistent across the board, as best I can tell.
I am unsure what to do next, though. We launched the site with all the internal link errors, and now many of the pages that are indexed and ranked use the incorrect URL form. Some have said to use a canonical tag, but how can I use a canonical tag on a page that doesn't even exist? Same thing with a 301: can I redirect /examplepage to /ExamplePage if only /ExamplePage actually exists?
I would really appreciate some advice on what to do. After I fixed the internal links, I waited a week and Moz crawled the site again and reported all the same errors, and then even more, all capitalization issues. It seems like a mess. After I did another Screaming Frog crawl, it showed no duplicates, so I know I was successful in fixing the internal links.
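For anyone wanting to script that consistency check rather than eyeball a crawl export, here is a minimal hypothetical sketch (the function name and sample URLs are made up; it assumes you have the crawled link targets and the list of real page URLs):

```python
# Hypothetical sketch: flag internal links whose path matches a real
# page URL except for letter case.
def find_case_mismatches(link_urls, canonical_urls):
    # Map each lower-cased URL to its single correct spelling.
    canonical_by_lower = {u.lower(): u for u in canonical_urls}
    mismatches = []
    for link in link_urls:
        correct = canonical_by_lower.get(link.lower())
        if correct is not None and link != correct:
            mismatches.append((link, correct))
    return mismatches

links = ["/examplepage", "/ExamplePage", "/About"]
real_pages = ["/ExamplePage", "/About"]
print(find_case_mismatches(links, real_pages))
# → [('/examplepage', '/ExamplePage')]
```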
Help!!
-
Thanks, this helps!
-
Just to add: you don't need a page to physically exist to add a 301 redirect, Kristin, so you can set up the redirects from these old pages fine.
-
My advice would be:
- Resubmit key pages to index through WMT (Google & Bing)
- Make sure your Sitemaps are up to date with the proper page URLs, and submitted to WMT
- Set up the 301s if you have the time. You can create redirects using your .htaccess file, as long as you have an Apache-based site. There are good guides to writing .htaccess commands, including 301s and others
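For the capitalization case in the question, a minimal .htaccess sketch (assumes Apache; the URLs are just the question's examples). mod_alias matching is case-sensitive, so only the wrong variant redirects and /ExamplePage itself is untouched:

```apache
# Hypothetical example: send the wrongly-cased indexed URL to the real page.
Redirect 301 /examplepage /ExamplePage
```

If there are hundreds of indexed variants, mod_rewrite with a `tolower` RewriteMap defined in the server config can generalize this instead of listing each URL by hand.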
I had a similar problem with a slightly different manifestation, related to trailing slashes on my URLs. It was hard, but after doing the first two steps and setting up redirects, it was just a question of waiting for Google and Moz to catch up. Redirects took care of bad SERP links, and at this point (just over two weeks after launch with bad links), everything has been updated with the correct page URLs. Good luck!
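For completeness, the trailing-slash flavour of the problem can be handled with a similar hypothetical .htaccess sketch (this assumes the trailing-slash URLs are the canonical ones and mod_rewrite is available):

```apache
RewriteEngine On
# Leave real files (e.g. /logo.png) alone, then append the missing slash.
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^(.*[^/])$ /$1/ [R=301,L]
```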
Related Questions
-
I'm setting up my online store in WordPress/WooCommerce and want to avoid duplicate content.
Hi Mozers. Apparently I'm using unique content in the short description area, and it displays on the pages next to the product photo, which is great as it is. But is adding an informational description that repeats on every product page going to hurt us in SEO? A. See here an actual product - (flagged for thin content in OSE)
B. This is how I would like to set up each product page to improve it: see here a sample product with additional information/content.
Here's my question: would setting my product pages to the B version be considered duplicate content by Google?
Can too many "noindex" pages compared to "index" pages be a problem?
Hello, I have a question for you: our website virtualsheetmusic.com includes thousands of product pages, and due to Panda penalties in the past, we have no-indexed most of the product pages hoping for a sort of recovery (not yet seen, though!). So, currently we have about 4,000 "index" pages compared to about 80,000 "noindex" pages. Now, we plan to add an additional 100,000 new product pages from a new publisher to offer our customers more music choice, and these new pages will still be marked "noindex, follow". At the end of the integration process, we will end up with something like 180,000 "noindex, follow" pages compared to about 4,000 "index, follow" pages. Here is my question: can this huge discrepancy between 180,000 "noindex" pages and 4,000 "index" pages be a problem? Can this kind of scenario cause any negative effect on our current organic search profile, or is it something that doesn't actually matter? Any thoughts on this issue are very welcome. Thank you! Fabrizio
How to avoid too many "On Page Links"?
Hi everyone I don't seem to be able to keep big G off my back, even though I do not engage in any black hat or excessive optimization practices. Due to another unpleasant heavy SERP "fluctuation" I am in investigation mode yet again and want to take a closer look at one of the warnings within the SEOmoz dashboard, which is "Too many on page links". Looking at my statistics this is clearly the case. I wonder how you can even avoid that at times. I have a lot of information on my homepage that links out to subpages. I get the feeling that even the links within the roll-over menus (or dropdown) are counted. Of course, in that case then you will end up with a crazy amount of on page links. What about blog-like news entries on your homepage that link to other pages as well? And not to forget the links that result from the tags underneath a post? What am I trying to get at? Well, do you feel that a bad website template may cause this issue i.e. are the links from roll-over menus counted as links on the homepage even though they are not directly visible? I am not sure how to cut down on the issue as the sidebar modules are present on every page and thus up the links count wherever you are on the site. On another note, I've seen plenty of homepages with excessive information and links going out, would they be suffering from the search engines' hammer too? How do you manage the too many on page links issue? Many thanks for your input!
To "Guest Blog" or "Ghost Blog"?
To "Guest Blog" or "Ghost Blog"? I've been wondering which would be better given G's "authorship" tracking program. "Onreact.Com" indirectly raised this issue in a recent blog post "Google Authorship Markup Disadvantages Everybody Ignores" as : "Google might dismiss your guest articles. Your great guest blogging campaign on dozens of other blogs might fail because Google will count the links all as one as the same author has written all the posts and linked to himself. So maybe the links won't count at all." Assuming all other things are equal, would you use "Guest Author" with G Authorship attribution (if allowed) or just ghost the article and include an in-text link without attribution to you as the author?
Penalised for duplicate content, time to fix?
Ok, I accept this one is my fault, but I'm wondering about timescales to fix... I have a website and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate content pages, and over a period of a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some three months later my site still will not rank for the keywords it was ranking for previously. It will not even rank if I type in the site's name (bright tights). I have searched for the name using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a drop-x-places penalty by Google for the duplicate content. What is the easiest way around this? I have no warnings about bad links or the like. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content? The goal of having the duplicate content store on the site was to be able to rank the store's category pages, which had unique content, so there were no problems there that I could foresee. Like Amazon et al, the categories would have lists of products (amongst other content) and you would click through to the individual product description - the duplicate page. Thanks for reading
Internal Search / Faceted Navigation
Hi there, I'm working on an e-learning site with the following content pages: main page, category pages, course pages, author pages, tag pages. We will also have an internal search for users to search by keyword for courses, authors, and categories. Is it still recommended to "noindex, follow" internal search results and disallow them in robots.txt? Or for a site like this, is it better to use faceted navigation? It seems that faceted navigation is mostly for e-commerce sites. What is the latest thinking on SEO best practices for internal search result pages?
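One nuance worth keeping in mind here: a crawler that is disallowed in robots.txt never fetches the page, so it never sees a "noindex" meta tag on it; the two directives are usually alternatives rather than a pair. A hypothetical robots.txt sketch (the /search path is an assumption):

```text
# Keep crawlers out of internal search results entirely.
User-agent: *
Disallow: /search
```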
Duplicate Content http://www.website.com and http://website.com
I'm getting duplicate content warnings for my site because the same pages are getting crawled twice: once at http://www.website.com and once at http://website.com. I'm assuming this is a .htaccess problem, so I'll post what mine looks like. I think installing WordPress in the root domain changed some of the settings I had before. My main site is primarily in HTML with a blog at http://www.website.com/blog/post-name
# BEGIN WordPress
<IfModule mod_rewrite.c>
RewriteEngine On
RewriteBase /
RewriteRule ^index\.php$ - [L]
RewriteCond %{REQUEST_FILENAME} !-f
RewriteCond %{REQUEST_FILENAME} !-d
RewriteRule . /index.php [L]
</IfModule>
# END WordPress
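The usual fix here, sketched under the assumption that www is the preferred host and Apache/mod_rewrite is available, is a host-canonicalizing 301 placed above the WordPress block:

```apache
# Hypothetical sketch: 301 the bare domain to www before WordPress routing.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^website\.com$ [NC]
RewriteRule ^(.*)$ http://www.website.com/$1 [R=301,L]
```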
Fixing Duplicate Content Errors
SEOmoz Pro is showing some duplicate content errors, and I wondered about the best way to fix them other than rewriting the content. Should I just remove the pages it found, or should I set up permanent redirects through to the home page in case there is any link value or there are visitors on these duplicate pages? Thanks.