HTTP Vary: User-Agent at the Server or Page Level?
-
Looking for any insights regarding the use of the Vary HTTP header, mainly around the idea that search engines will not like having a Vary HTTP header on pages that don't have a mobile version, which means the header will need to be implemented on a page-by-page basis.
Additionally, does anyone have experience with using the Vary HTTP header alongside CDNs like Akamai? Google still recommends using the header, even though it can present some challenges with CDNs.
Thanks!
-
Hey burnseo - if you're still getting notifications from this thread, would you happen to recall where you ended up finding info that Google recommends placing the Vary header at the page level? I'm running into the same question myself. If you have links you could post to where you found the answer, that'd be great. Thanks!
-
I would go by what Google recommends; I cannot imagine Akamai being bad for a website or overwhelming it. You may try using a CNAME to point your www subdomain straight to the CDN, and if you're using a mobile subdomain like m., have that go directly to your content delivery network as well.
I hope this helps.
sincerely,
Thomas
-
I found some information suggesting that it is recommended to avoid using the Vary HTTP header by User-Agent site-wide, because search engines (Google in particular) would assume the other version simply hadn't yet been discovered and would perhaps keep looking for it. There is also a recommendation to implement the Vary header at the page level only when a mobile version exists. This only applies to sites that serve mobile HTML content dynamically based on the user-agent. Additionally, there is some controversy around using the header when a CDN like Akamai is in place, because it can overload the site. Despite this controversy, Google still recommends using the header. These seem to be two important points to consider before implementing the Vary HTTP header.
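To make the page-level recommendation concrete: if mobile HTML is served dynamically by user-agent, the logic might be sketched like this in Python. This is a hypothetical illustration; the paths, set name, and function name are my own, not from the thread or any real site.

```python
# Hypothetical sketch: emit "Vary: User-Agent" only on pages that are
# actually served dynamically based on the requesting user-agent,
# rather than site-wide. Paths below are illustrative.

# Pages that have a dynamically served mobile variant.
DYNAMIC_MOBILE_PAGES = {"/", "/products", "/contact"}

def response_headers(path):
    """Build response headers, adding Vary: User-Agent page-by-page."""
    headers = {"Content-Type": "text/html"}
    if path in DYNAMIC_MOBILE_PAGES:
        # Tell caches (and search engine crawlers) that the response
        # body depends on the requesting User-Agent.
        headers["Vary"] = "User-Agent"
    return headers

print(response_headers("/products"))  # includes the Vary header
print(response_headers("/about-us"))  # no Vary header
```

Pages without a mobile variant simply never get the header, which matches the "page-level only when there is a mobile version" advice above.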
-
Very true, I should have completed it. I won't use a cell phone to answer Q&A again.
-
Thomas, it appears that this is taken from http://stackoverflow.com/questions/1975416/trying-to-understand-the-vary-http-header. Q&A is for original answers; if you are referring to another blog post, it's best to just put a link to the blog post and let people go there, rather than copy work (which may be copyrighted) and use that as your answer. Thanks for understanding!
-
-
The Cache-Control header is the primary mechanism for an HTTP server to tell a caching proxy the "freshness" of a response (i.e., how long, if at all, to store the response in the cache).
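As a rough illustration of the freshness mechanics described above, here is a small Python sketch that builds the relevant response headers using only the standard library. The function names and values are my own, chosen for illustration.

```python
# Sketch: two common Cache-Control postures. Values are illustrative.
from email.utils import formatdate

def cacheable_headers(max_age_seconds):
    """Headers telling a caching proxy how long the response stays fresh."""
    return {
        # Shared caches may store this response for max_age_seconds.
        "Cache-Control": f"public, max-age={max_age_seconds}",
        "Date": formatdate(usegmt=True),
    }

def uncacheable_headers():
    """Headers for small-traffic dynamic pages: never cache."""
    return {
        "Cache-Control": "no-cache, no-store",
        "Pragma": "no-cache",  # for legacy HTTP/1.0 caches
    }

print(cacheable_headers(3600)["Cache-Control"])  # public, max-age=3600
```

The second function matches the "no-cache, no-store" plus "Pragma: no-cache" combination the answer below recommends for small-traffic dynamic sites.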
In some situations, Cache-Control directives are insufficient. A discussion from the HTTP working group is archived here, describing a page that changes only with language. This is not the correct use case for the Vary header, but the context is valuable for our discussion. (Although I believe the Vary header would solve the problem in that case, there is a Better Way.) From that page:
Vary is strictly for those cases where it's hopeless or excessively complicated for a proxy to replicate what the server would do.
This page describes the header usage from the server perspective, and this one from a caching proxy perspective. It's intended to specify a set of HTTP request headers that determine uniqueness of a request.
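How a cache might use Vary to determine that uniqueness can be sketched as follows. This is a simplified illustration, not any real cache's implementation; real caches also normalize header values and treat "Vary: *" specially.

```python
# Sketch: deriving a cache key when a stored response carries a Vary
# header. The cache key is the URL plus the values of the request
# headers that the Vary header names.

def cache_key(url, request_headers, vary_header):
    """Uniqueness of a request = URL + the request headers named by Vary."""
    varied = []
    for name in vary_header.split(","):
        name = name.strip().lower()
        # Missing headers still participate in the key, as empty values.
        varied.append((name, request_headers.get(name, "")))
    return (url, tuple(sorted(varied)))

returning = cache_key("/home", {"cookie": "visits=5"}, "Cookie")
first_time = cache_key("/home", {}, "Cookie")
# Different Cookie values -> different cache entries for the same URL.
print(returning != first_time)  # True
```

With "Vary: Cookie", two requests for the same URL but different Cookie headers produce different keys, so the proxy stores and serves them separately, which is exactly the landing-page scenario in the example below.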
A contrived example:
Your HTTP server has a large landing page. You have two slightly different pages at the same URL, depending on whether the user has been there before. You distinguish between requests and a user's "visit count" based on cookies. But since your server's landing page is so large, you want intermediary proxies to cache the response if possible.
The URL, Last-Modified, and Cache-Control headers are insufficient to give this insight to a caching proxy, but if you add Vary: Cookie, the cache engine will add the Cookie header to its caching decisions.
Finally, for small-traffic, dynamic web sites I have always found the simple Cache-Control: no-cache, no-store and Pragma: no-cache sufficient.
Edit -- to more precisely answer your question: the HTTP request header Accept defines the Content-Types a client can process. If you have two copies of the same content at the same URL, differing only in Content-Type, then using Vary: Accept could be appropriate.
Update 11 Sep 12:
I'm including a couple of links that have appeared in the comments since this answer was originally posted. They're both excellent resources for real-world examples (and problems) with Vary: Accept; if you're reading this answer, you need to read those links as well.
The first, from the outstanding EricLaw, on Internet Explorer's behavior with the Vary header and some of the challenges it presents to developers: Vary Header Prevents Caching in IE. In short, IE (pre IE9) does not cache any content that uses the Vary header because the request cache does not include HTTP Request headers. EricLaw (Eric Lawrence in the real world) is a Program Manager on the IE team.
The second is from Eran Medan, and is an ongoing discussion of Vary-related unexpected behavior in Chrome: Backing doesn't handle Vary header correctly. It's related to IE's behavior, except the Chrome devs took a different approach -- though it doesn't appear to have been a deliberate choice.
-
-
Hey Thomas, thank you for your interest in answering my question. However, the question isn't really about using a CDN. It is more about how using the Vary HTTP header can affect CDN performance. In addition, I wanted to find guidance on where to implement the Vary HTTP header, as it was brought to my attention that search engines don't like it when this is implemented site-wide, even on pages that don't have a mobile version.
-
Hi Keri,
Thank you for the heads-up on that. I definitely was having some technical issues. I have cleaned it up; let me know if you think it needs any more work.
Thank you for letting me know.
Sincerely,
Thomas
-
Thomas, I think the voice recognition software botched some of your reply. Could you go through and edit it a little? There are some words that seem to be missing. Thanks!
-
Hi,
For insights regarding the usage of the Vary HTTP header, I would check out this blog post right here.
As far as content delivery networks go, I love them and have used quite a few. Depending on your budget, there is a wide range.
Use Anycast DNS with CDNs; here is what I think of the providers.
#1 DNS DynECT (my fav)
#2 DNS Made Easy (great deal $25 for 10 domains for the YEAR)
#3 UltraDNS
#4 VerisignDNS
Many CDNs have Anycast DNS built in already.
Check out this website; it will give you a good overview of what each CDN offers:
http://www.cdnplanet.com/cdns/
I don't know what you need in terms of data, but if you want a great CDN with support and a killer price, MaxCDN is only $39 for the first terabyte and outperforms Amazon CloudFront and Rackspace Cloud Files.
Here is my list of CDNs I would use. The cost is anywhere from $39 a year to $4,000 a month; if you are going to serve video it will cost more, as data adds up fast.
#1 Level 3, my personal favorite content delivery network
http://www.level3.com/en/products-and-services/data-and-internet/cdn-content-delivery-network/
http://www.edgecast.com/free-trial/
http://mediatemple.net/webhosting/procdn/ You get 200 GB a month for $20; it is 100% EdgeCast (just a reseller)
https://presscdn.com/ PressCDN is 50 GB for $10 a month and gives you FOUR CDNs (MaxCDN, EdgeCast, Akamai, and CloudFront); the price for 150 GB a month is $19
http://www.rackspace.com/cloud/files/
http://aws.amazon.com/cloudfront/
Look at http://cloudharmony.com/speedtest for speed testing.
However, please remember that coding makes a huge difference to website performance, so these tests are not really a fair depiction of speed on their own.
You could use CloudFlare; it is free, but I don't like it for anything other than site protection. It's not very fast, in my opinion, and is simply a reverse proxy server.
You can get CloudFlare with Railgun already enabled:
https://www.cloudflare.com/railgun The cost is now $200 a month (use Level 3 if you're paying that much).
EdgeCast is a great content delivery network. However, you will have to buy it through a third party if you want the full enterprise version. You can buy it through Media Temple, but you must use their DNS, and it is only $20 a month.
However, if you're going to spend over $20 a month, I would strongly consider talking to Level 3. They're notoriously high-priced, but they just lowered their prices and you can negotiate some very sweet deals.
If you don't have a content delivery network already, I would simply sign up for DNS Made Easy and MaxCDN; they're convenient and fast.
MaxCDN is also faster than AWS CloudFront and Rackspace Cloud Files, and faster than anything else I have compared it to at nearly double its price.
For an inexpensive setup, you get Anycast DNS for $25 and the CDN for $39, and that's for the year, not the month.
I hope this has been of help to you and answers your question. Please let me know if I can be of any more help.
Sincerely,
Thomas