URLs containing special characters
-
Hello,
I am using URLs that contain special characters such as ' and ". In the Google Webmaster Tools report I found errors for URLs containing the ' character: Google has indexed only the partial URL, from the beginning up to the ' character, and cut off the rest. For example, I submitted the URL www.example.com/vietnam-visa-corp'-test-page.html, and Google then reported a NOT FOUND error for the URL www.example.com/vietnam-visa-corp
I don't know why this happens or how to fix it. Please help!
Thanks,
-
I am extremely happy I could be of help.
Sincerely,
Thomas
-
Thank you so much!
Both pieces of advice are helpful. I agree with both of you. I googled and found a useful article: http://www.isoc.org/inet95/proceedings/PAPER/016/html/paper.html#S4
I'd prefer to remove the special characters from my URLs, so I will ask a developer to do this.
Thanks again!
-
I completely agree you should have a developer do the work for you. I can make some recommendations: Web Dev Studios and Happy Cog are both fantastic. They are on the expensive side, but they deliver excellent work. Another great developer who is not as expensive is Greg: http://www.gregreindel.com/
I hope this helps.
Sincerely,
Thomas
-
You may want to consider rewriting those URLs. The character escape codes shared by Thomas above may meet your needs, but I'd rewrite URLs like that. Assuming you are on a Linux server (as opposed to Windows, which would require an ISAPI rewrite filter), you do this by editing the .htaccess file. I'd advise getting a developer to consult with you on this, as rewriting URLs often involves writing regular expressions that are easy to goof up.
Here's a good Moz article on the topic: http://moz.com/blog/down-and-dirty-write-your-own-url-rewrite
And here's another great introductory resource: http://coding.smashingmagazine.com/2011/11/02/introduction-to-url-rewriting/
If you want to try doing it yourself, feel free to post your website with the example URLs so other members can help you with the rewrite code. I strongly suggest working with a developer on this, though.
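As a minimal sketch of what such a rewrite might look like, assuming an Apache server with mod_rewrite enabled (the rule here is illustrative, not tailored to your site):

```apache
# Enable the rewrite engine (in .htaccess at the site root)
RewriteEngine On

# 301-redirect any URL containing an apostrophe to the same URL
# with the apostrophe removed, e.g.
#   /vietnam-visa-corp'-test-page.html -> /vietnam-visa-corp-test-page.html
RewriteRule ^(.*)'(.*)$ /$1$2 [R=301,L]
```

Note that the 301 redirect only works if the pages also exist at the apostrophe-free URLs, which is why pairing this with a developer actually renaming the pages is the safer route.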
-
Use the chart below to figure out which escape code to use in place of each special character:

Character          Escape Code
Ampersand (&)      %26
Single Quote (')   %27
Double Quote (")   %22
Greater Than (>)   %3E
Less Than (<)      %3C

For more details, see:
https://support.google.com/webmasters/answer/35653?hl=en
http://www.w3schools.com/html/
http://wordpress.org/support/topic/problem-with-urls-containing-special-character
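If your developer generates these URLs programmatically, most languages can produce the escape codes for you. A small sketch in Python, using the URL from the question as the example:

```python
from urllib.parse import quote, unquote

# Percent-encode a path segment containing an apostrophe.
segment = "vietnam-visa-corp'-test-page.html"
encoded = quote(segment)
print(encoded)  # vietnam-visa-corp%27-test-page.html

# Decoding recovers the original segment.
assert unquote(encoded) == segment
```

Letters, digits, hyphens, and dots are left untouched; only characters that are unsafe in URLs (like the apostrophe here) are replaced with their percent codes.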
I hope I have been of help,
Thomas