Foreign characters and capital letters in URLs?
-
Hello all,
We have 4 language domains for our website, and a number of our Spanish landing pages are written using Spanish characters, most notably ñ and ó.
We have done our research around the web and realised that many of the top competitors for keywords such as Diseño Web (web design) and Aplicación iPhone (iPhone application) DO NOT use these special characters in their URL structure.
Here is an example of our URLs:
EX: http://www.twago.es/expert/Diseño-Web/Diseño-Web
However, when I simply copy-paste a URL that contains a special character, it is automatically encoded.
EX: http://www.twago.es/expert/Aplicación-iPhone/Aplicación-iPhone
(Written out longhand it appears as: http://www.twago.es/expert/Aplicaci%C3%B3n-iPhone/Aplicaci%C3%B3n-iPhone)
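The two forms point at the same resource: browsers percent-encode non-ASCII characters (as their UTF-8 bytes) before sending the request. A minimal sketch of that encoding, using one of the URLs above:

```python
# Sketch: how "ñ" in a URL path becomes its percent-encoded form.
# The ñ character is UTF-8 bytes C3 B1, so it appears as %C3%B1.
from urllib.parse import quote, unquote

path = "/expert/Diseño-Web/Diseño-Web"
encoded = quote(path)  # default safe="/" keeps the path separators
print(encoded)                    # /expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
print(unquote(encoded) == path)   # True: the encoding round-trips
```

So the "translated" URL you see after pasting is not a different address, just the on-the-wire spelling of the same one.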
My first question: seeing how the overwhelming majority of website URLs DO NOT contain special characters (and even Spanish/German words are simply written using the standard Latin alphabet), is there a negative effect on our SEO rankings/efforts because we are using special characters?
- When we write anchor text for backlinks to these pages we USE the special characters in the anchor text (as do most of our competitors). Does the anchor text have to exactly match the URL?
I know most web browsers can understand the special characters, especially when returning search results to users who either do or do not type the special characters in their query. But if we were doing the right thing, why does everyone else do it differently?
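For reference, the "fold to ASCII" approach most competitors appear to use can be sketched like this (a hypothetical helper, not anything from the site in question): strip the accents via Unicode normalization, lowercase, and join words with hyphens.

```python
# Hypothetical slug helper illustrating the common accent-folding
# approach: NFKD splits "ñ" into "n" + a combining tilde, and encoding
# to ASCII with errors="ignore" drops the combining mark.
import re
import unicodedata

def slugify(text: str) -> str:
    folded = unicodedata.normalize("NFKD", text).encode("ascii", "ignore").decode("ascii")
    return re.sub(r"[^a-z0-9]+", "-", folded.lower()).strip("-")

print(slugify("Diseño Web"))         # diseno-web
print(slugify("Aplicación iPhone"))  # aplicacion-iphone
```

Note that folding changes the word ("diseño" becomes "diseno"), which is exactly the trade-off being asked about here.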
My second question is the same, but focusing on the use of Capital letters in our URL structure.
NOTE: When we do a broken-link check with some link tools (such as Xenu), the URLs that contain the Spanish special characters are marked as "broken". Is this a related issue?
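It may well be related: a raw ñ byte is not valid in an HTTP request line, so a link checker that does not percent-encode the path before requesting it can report a false "broken" result. A sketch of the fix (illustrative only; the request itself is not made here):

```python
# Sketch: percent-encode only the path of a URL before checking it,
# so a link checker sends a valid request instead of raw non-ASCII bytes.
from urllib.parse import quote, urlsplit, urlunsplit

def encode_url(url: str) -> str:
    parts = urlsplit(url)
    # Re-encode the path; leave scheme and host untouched. safe="/%"
    # keeps separators and any already-encoded sequences intact.
    return urlunsplit(parts._replace(path=quote(parts.path, safe="/%")))

print(encode_url("http://www.twago.es/expert/Diseño-Web/Diseño-Web"))
# http://www.twago.es/expert/Dise%C3%B1o-Web/Dise%C3%B1o-Web
```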
Any help anyone could give us would be greatly appreciated!
Thanks,
David from twago
-
Hi,
We have 4 foreign-language sites, one of them in Spanish. We removed the special characters across all of them and rank very highly, so there is no harm in doing so; if anything, keeping the special characters makes things harder.
I would stick with all lowercase, or at least apply the same logic throughout the URL. As long as it is consistent, it's no big deal.
No matter what you do, if you change any of this, make sure you 301 all of the old pages to their new versions; otherwise you will be starting from scratch!
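The 301 mapping above can be sketched as a simple old-path → new-path table, built by folding each segment of the legacy accented, mixed-case URLs (a minimal illustration using the URLs from the question; the actual redirects would live in your server config):

```python
# Sketch: build a redirect map so every legacy accented/mixed-case URL
# gets a 301 to its lowercase ASCII equivalent instead of 404ing.
import re
import unicodedata

def to_new_path(old_path: str) -> str:
    segments = []
    for seg in old_path.strip("/").split("/"):
        # Fold accents (ñ -> n), lowercase, hyphenate the leftovers.
        folded = unicodedata.normalize("NFKD", seg).encode("ascii", "ignore").decode("ascii")
        segments.append(re.sub(r"[^a-z0-9]+", "-", folded.lower()).strip("-"))
    return "/" + "/".join(segments)

redirects = {old: to_new_path(old) for old in [
    "/expert/Diseño-Web/Diseño-Web",
    "/expert/Aplicación-iPhone/Aplicación-iPhone",
]}
print(redirects["/expert/Diseño-Web/Diseño-Web"])  # /expert/diseno-web/diseno-web
```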
Hope this helps.