Do you lose Link Equity when using RanDom CasE?
-
I've seen a site linking internally using caps from the home page to sub-pages, while the rest of the site links in lower-case. Are there any disadvantages in terms of link juice or duplication for doing this?
Example link from homepage: /blah/Doctors.aspx
Example link from other internal page: /blah/doctors.aspx
The site is on a Windows-based server, not Linux.
Thanks in advance
-
If your web server is set up to fix case issues (i.e. redirecting them all to lower case), then you've got no problem.
From a search engine's point of view, case in the URL matters (case in keywords does not; big difference). Each different version is considered a different resource, and would be crawled and indexed separately, leading to potential duplicate content penalties.
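One standard mitigation for the duplicate-content side of this (not mentioned in the answer above, and not a substitute for fixing the links themselves) is to declare a single canonical URL in the page's head, so that both case variants consolidate to one resource. A minimal sketch, assuming the lower-case version is the preferred one and using a placeholder domain:

```html
<!-- Served identically at /blah/Doctors.aspx and /blah/doctors.aspx;
     tells search engines to treat the lower-case URL as canonical. -->
<link rel="canonical" href="https://www.example.com/blah/doctors.aspx" />
```

Search engines treat rel=canonical as a hint rather than a directive, so a proper 301 redirect at the server level remains the more reliable fix.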
If you're trying to fix a large site that has mixed upper and lower case in its URLs, the easiest fix is to do it at the web server level: .htaccess if you're using Apache.
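On Apache, a lowercasing 301 redirect can be sketched with mod_rewrite as below. This is a minimal example, not a drop-in rule for any specific site, and note one gotcha: the RewriteMap directive itself cannot be declared in .htaccess and must live in the main server or virtual-host config.

```apache
# In httpd.conf or the <VirtualHost> block (RewriteMap is not allowed in .htaccess):
RewriteMap lc int:tolower

# In .htaccess (or the same server config):
RewriteEngine On
# Only rewrite when the requested path contains at least one upper-case letter
RewriteCond %{REQUEST_URI} [A-Z]
# 301-redirect to the lower-cased version of the path
RewriteRule (.*) ${lc:$1} [R=301,L]
```

Since the site in this question runs on a Windows server, the equivalent on IIS would be a rewrite rule in web.config using the URL Rewrite module, which provides a built-in {ToLower:} function for the same purpose.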
Related Questions
-
Recovering from spam links on MY site
Hey guys, I'm having a weird situation and wondering if anybody can help. I run a sizable WordPress site with a number of content writers. One of the writers' accounts was hacked and used to post several dozen spam posts with spun content and links to all sorts of shady sites. Recently the site has begun losing rankings on all sorts of pages. There's no manual penalty or anything, but I'm concerned that we're being penalized for having had these links on the site. Of course, as soon as we found the content, we immediately removed it, reset passwords, etc. But a decent number of the pages were indexed. Does anybody have any experience with this or ideas of what to do about it? Is there somewhere we can talk to Google about it or some way to show that we are not part of bad neighborhoods? Thanks so much for any thoughts, Yon
Intermediate & Advanced SEO | | yon230 -
What Links to Disavow?
I am looking through my website's link profile that I pulled directly from Google Webmaster Tools. What is the best way to determine the links to disavow? Maybe the Webmaster Tools list is not the best list for this process but I really need to clean up the links that are hurting the site's SEO. Does anyone have any insight?
Intermediate & Advanced SEO | | PartyStore0 -
Internal Links - Dofollow or Nofollow and why?
Hey there Mozzers, I have a question about internal links. If I am writing an article and want to link to another one of my articles on my blog, should I make that link nofollow or dofollow? If possible, tell me why as well. Thanks in advance
Intermediate & Advanced SEO | | Angelos_Savvaidis0 -
Question about using abbreviation
Hello, I have an abbreviation inside my domain name. Now, for a page URL, do you recommend using the full word (whose shortened form is in the domain name) in the page name? Or, when the domain name contains the abbreviation, is using the full word in a page name a bad idea? It comes down to how well Google recognizes the abbreviation as the full word and gives it the same value. Do I risk anything by not using the full word? Hope I made myself clear :) Thanks.
Intermediate & Advanced SEO | | mdmoz0 -
Dummy links in posts
Hi, Dummy links in posts. We use hundreds of sample/example links such as: http://<domain name>, http://localhost, http://192.168.1.1, http://some-site-name-used-as-an-example-which-is-not-available/sample.html, and many more. Is there any tag we can use to mark these as samples rather than real links, so that when we scan pages for broken links they are skipped and not reported as 404s, etc.? Thanks
Intermediate & Advanced SEO | | mtthompsons0 -
Excessive navigation links
I'm working on the code for a collaborative project that will eventually have hundreds of pages. The editor of this project wants all pages to be listed in the main navigation at the top of the site. There are four main dropdown (suckerfish-style) menus, and these have nested sub- and sub-sub-menus. Putting aside the UI issues this creates, I'm concerned about how Google will find our content on the page. Right now, we have over 120 links above the main content of the page and plan to add more as time goes on (as new pages are created). Perhaps of note, these navigation elements are within an HTML5 <nav> element: <nav id="access" role="navigation">. Do you think that Google is savvy enough to overlook the "abundant" navigation links and focus on the content of the page below? Will the <nav> element help us get away with this navigation strategy? Or should I reel some of these navigation pages into categories? As you might surmise, the site has a fairly flat structure, hence the lack of category pages.
Intermediate & Advanced SEO | | boxcarpress1 -
Meta Keywords: Should we use them or not?
I am working through our site and see that meta keywords are being used heavily and unnecessarily. Each of our info pages has 2 or 3 keyword phrases built into it. Should we just duplicate those keyword phrases into the meta keyword field, should we put in additional keywords beyond them, or should we not use the field at all? Thoughts and opinions appreciated
Intermediate & Advanced SEO | | Towelsrus1 -
How do you implement dynamic SEO-friendly URLs using Ajax without using hashbangs?
We're building a new website platform and are using Ajax as the method for allowing users to select from filters. We want to dynamically insert elements into the URL as the filters are selected so that search engines will index multiple combinations of filters. We're struggling to see how this is possible using the Symfony framework. We've used www.gizmodo.com as an example of how to achieve SEO- and user-friendly URLs, but this is only an example of achieving this for static content. We would prefer to go down a route that didn't involve hashbangs if possible. Does anyone have any experience using hashbangs and how they affected their site? Any advice on the above would be gratefully received.
Intermediate & Advanced SEO | | Sayers1