On-Page Issues with UTF-8 and ISO-8859-1
-
Hi guys,
I hope somebody can help me figure this out. On one of my sites I set the charset to UTF-8 in the Content-Type meta tag, and the file itself is also saved as UTF-8. But when I type German special characters like ä, ö, ß and the like, they get displayed as a tilted square with a question mark inside.
If I change the charset to ISO-8859-1, they display properly in the browser, but services like Twitter still have issues and stop "importing" content once they reach one of those special characters.
I would like to avoid having to HTML-encode all on-page content, so my preference would be to use UTF-8.
You can see it in action when you visit this URL for example: http://www.skgbickenbach.de/aktive/1b/artikel/40-minuten-fußball-reichen-nicht_1045?charset=utf-8
Remove the ?charset parameter and the charset is set to ISO-8859-1.
Hope somebody has an answer or can push me in the right direction.
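For context, the "tilted square with a question mark" is the Unicode replacement character (U+FFFD), which browsers substitute when the bytes they receive are not valid UTF-8 — i.e. when the declared charset doesn't match how the file was actually saved. A minimal sketch of that mismatch (Python is used here purely to show the byte-level behavior; the site itself is PHP):

```python
# Simulate a file actually saved as ISO-8859-1 but served with a
# UTF-8 charset declaration.
text = "Fußball"

# In ISO-8859-1, ß is the single byte 0xDF.
latin1_bytes = text.encode("iso-8859-1")
print(latin1_bytes)  # b'Fu\xdfball'

# A browser told to decode this as UTF-8 hits an invalid byte
# sequence at 0xDF and substitutes U+FFFD -- the "square with a
# question mark" symptom.
as_utf8 = latin1_bytes.decode("utf-8", errors="replace")
print(as_utf8)  # Fu�ball
```

The same logic explains the Twitter behavior: a strict consumer may simply stop reading at the first invalid byte instead of substituting a replacement character.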
Thanks in advance and have a great day all.
Jan
-
Hi Jan,
I'm following up on old questions. Did Theo's answer solve your problem?
-
Wow good answer!
-
Getting a web page to display your content as true UTF-8 requires everything to be set to the UTF-8 encoding. 'Everything' includes: your database, your database tables, your database fields, your connection from PHP to your database, your header as set by PHP, your header as set by HTML, your content itself, etc.
The following resources were extremely helpful to me when I was switching to UTF-8 (which is by far the better encoding compared to ISO-8859-1):
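To illustrate why every link in that chain matters (a byte-level sketch in Python, not Theo's actual setup): if the content is UTF-8 but even one layer — say, the database connection — reinterprets it as ISO-8859-1, every multi-byte character turns into two wrong characters:

```python
# Sketch: UTF-8 text read back through a layer that assumes
# ISO-8859-1 (Latin-1). One mismatched link corrupts the text.
text = "ä"
utf8_bytes = text.encode("utf-8")   # b'\xc3\xa4' -- two bytes

# Latin-1 maps each byte to its own character, producing the
# classic two-characters-per-umlaut "mojibake".
mojibake = utf8_bytes.decode("iso-8859-1")
print(mojibake)  # Ã¤
```

So if you see "Ã¤" instead of "ä" (rather than the replacement-character square), the bytes are fine but some layer is decoding them with the wrong charset.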
http://www.phpwact.org/php/i18n/utf-8
http://www.phpwact.org/php/i18n/utf-8/mysql
Bonus tip: make sure your content and files are saved as UTF-8 without a BOM (Byte Order Mark); this will save you a lot of trouble later!
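To expand on the BOM point: a UTF-8 BOM is the three bytes EF BB BF at the very start of a file. In PHP those bytes count as output sent before any `header()` call, which is a common source of "headers already sent" errors. A small sketch (Python for illustration) of detecting and stripping it:

```python
BOM = b"\xef\xbb\xbf"  # the UTF-8 Byte Order Mark

def strip_bom(data: bytes) -> bytes:
    """Drop a leading UTF-8 BOM if present; leave other data alone."""
    return data[len(BOM):] if data.startswith(BOM) else data

# A file saved as "UTF-8 with BOM" carries three extra bytes up front:
raw = BOM + "Hallo Fußball".encode("utf-8")
print(strip_bom(raw).decode("utf-8"))  # Hallo Fußball
```

The better fix is simply to re-save the files as "UTF-8 without BOM" in your editor, as suggested above.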