HTTP Compression -- Any potential issues with doing this?
-
We are thinking about turning on IIS 6 HTTP compression to help with page load times. Has anyone had any issues with doing this, particularly from an SEO or site-functionality standpoint? We just want to double-check before we take this step and see if there are any potential pitfalls we may not be aware of. Everything we've read seems to indicate it can only yield positive results.
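For a rough sense of what compression buys on a typical HTML page, here's a quick sketch using Python's standard `gzip` module. The markup is purely illustrative, not from any real site; real savings depend on how repetitive your pages are.

```python
import gzip

# A hypothetical chunk of repetitive HTML, standing in for a real page.
html = ("<div class='product'><span class='name'>Widget</span>"
        "<span class='price'>$9.99</span></div>\n" * 200).encode("utf-8")

# gzip thrives on repeated markup, which most HTML is full of.
compressed = gzip.compress(html)
ratio = len(compressed) / len(html)
print(f"original: {len(html)} bytes, gzipped: {len(compressed)} bytes "
      f"({ratio:.1%} of original)")
```

On repetitive markup like this, the compressed payload is a small fraction of the original, which is where the page-load win comes from.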
Any thoughts, advice, comments would be appreciated.
Thank you,
Matt & Keith
-
Thanks.
-
I am aware that IE6 is old and many sites have dropped support for it. Its usage will vary by market. If the fix requires 10 minutes of your time, wouldn't you do it for 1% or more of your potential customers?
If you have any Chinese users, for instance, you'd want to make it work. Or if you're targeting people who are less tech-savvy or older, your IE6 usage numbers are bound to be higher. I agree that for most sites it's probably not a huge issue; since I experienced it on our site, I thought I'd mention it. If there is an issue, there is also likely a published fix that would require minimal effort.
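The usual low-effort fix amounts to serving uncompressed responses to IE6 while still compressing for everyone else. A minimal sketch of that decision logic, assuming a simplified user-agent check (the browser strings are illustrative, and real-world sniffing has more edge cases):

```python
def should_gzip(accept_encoding: str, user_agent: str) -> bool:
    """Decide whether to gzip a response.

    Skips compression for IE6, which had known bugs handling
    compressed responses; the user-agent match is a simplified
    illustration, not production-grade browser detection.
    """
    # Normalize the Accept-Encoding token list, dropping q-values.
    encodings = {token.split(";")[0].strip().lower()
                 for token in accept_encoding.split(",")}
    if "gzip" not in encodings and "x-gzip" not in encodings:
        return False
    # IE6 identifies itself as "MSIE 6.0"; exclude it defensively.
    if "MSIE 6.0" in user_agent:
        return False
    return True

print(should_gzip("gzip, deflate",
                  "Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))  # False
print(should_gzip("gzip, deflate",
                  "Mozilla/5.0 (Windows NT 10.0) Chrome/120"))            # True
```

If you branch on User-Agent like this, remember to also emit a `Vary: User-Agent` (or at least `Vary: Accept-Encoding`) header so caches don't serve the wrong variant.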
-
You do realize that Microsoft has been trying to kill IE6 off, and just recently celebrated IE6 usage in the US dropping below 1%, right?
I wouldn't consider IE6 in your business plans.
-
Once you implement it, the first thing I'd check is whether Internet Explorer 6 likes it. I can't remember the details, but when we added compression on our site, there were instances where IE6 didn't handle it well.
-
According to Google's Webmaster blog, Googlebot supports gzip and deflate:
Googlebot: Sure. All major search engines and web browsers support gzip compression for content to save bandwidth. Other entries that you might see here include "x-gzip" (the same as "gzip"), "deflate" (which we also support), and "identity" (none).
An incompatible compression scheme would be the only real downside to turning on compression.
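Mapping that quote into code, here's a sketch of how a server might pick a content-coding from an `Accept-Encoding` header, treating "x-gzip" as "gzip" and falling back to "identity" (no compression). This is an illustration of the values Googlebot lists, not IIS 6's actual negotiation logic:

```python
def choose_encoding(accept_encoding: str) -> str:
    """Map an Accept-Encoding header to a content-coding.

    Mirrors the values in the Googlebot quote: "x-gzip" is treated
    the same as "gzip", "deflate" is supported, and "identity"
    means no compression.
    """
    # Strip q-values and whitespace from each offered token.
    offered = [token.split(";")[0].strip().lower()
               for token in accept_encoding.split(",") if token.strip()]
    for enc in offered:
        if enc in ("gzip", "x-gzip"):
            return "gzip"
        if enc == "deflate":
            return "deflate"
    return "identity"

print(choose_encoding("x-gzip"))         # gzip
print(choose_encoding("gzip, deflate"))  # gzip
print(choose_encoding("identity"))       # identity
```

As long as the server only ever responds with a coding the client advertised, there is no compatibility downside to turning compression on.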