Cloaking & IP Delivery - Has Rand Gone to the Dark Side?
Cloaking is a basic part of the SEO world, yet I think many of us ascribe its use to purely black-hat pursuits like spamming or tricking the search engines. In point of fact, cloaking can be used for entirely legitimate purposes. For example:
- Sites like the NYTimes, Salon.com & the Wall Street Journal will often show different content to search spiders than to human visitors. Advertisements, pop-ups and drop-down ads that hide content are frequently avoided just by changing your user agent to Googlebot.
- Occasionally, when issuing a 301 redirect, you may want only visitors to be sent elsewhere, while search engines see a 404. This can happen when you want humans routed to an appropriate resource, but no longer want search traffic coming your way for a given query or link value passed on to the forwarded page (there's a rough sketch of this after the list).
- Search engines aren't very good at filling out questionnaires or registration forms, so it's often good to let them bypass these features, particularly if you're giving visitors the option to skip them, too.
- If you're experimenting with multi-variate or A/B testing on your site, it's wise to show the search engines only a single version of a page, rather than having them see changed content every time they visit what is, in essence, a static URL. Once you've completed the testing, you can change the page for everyone, and search engines and visitors alike will see the same content.
- Unique tracking systems that measure which specific link a visitor clicked on a page to reach a URL might require complex URL strings or JavaScript actions that can severely affect search engine rankings and crawling. Cloaking here delivers a consistent experience to engines and visitors, while providing the data you want (and filtering out the data you don't care about - the automated visits).
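To make the redirect example above concrete, here's a rough sketch in Python of user-agent-based delivery. The helper names and the crawler token list are my own illustrative assumptions - real-world detection needs to be far more careful - but the basic branch is just this: humans get the 301, engines get the 404.

```python
# A rough sketch of user-agent based delivery for the 301/404 scenario above.
# The token list and function names are illustrative, not an official API.

CRAWLER_UA_TOKENS = ("googlebot", "slurp", "msnbot")

def is_search_engine(user_agent):
    """Crude check: does the User-Agent string mention a known crawler?"""
    ua = (user_agent or "").lower()
    return any(token in ua for token in CRAWLER_UA_TOKENS)

def handle_retired_page(user_agent, new_url):
    """Humans get a 301 to a useful page; crawlers get a 404 so the URL drops out of the index."""
    if is_search_engine(user_agent):
        return 404, {}
    return 301, {"Location": new_url}
```

The same sort of check works for letting engines skip a registration form or for always serving them the control version of an A/B test.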
What's the best way to do this?
The simple answer is IP delivery - another term usually reserved for the black hats. Showing content to bots based on their IP address is, generally speaking, the most efficient and problem-free way to make sure you don't have snafus in crawling. You can obtain lists of IP addresses from many sources, but a good starting point is usually your own weblogs or those of your host (many hosts sell this data). If you're concerned that you're not covering all the right IP blocks, contacting a specialist isn't a bad idea - it's something we've actually done when we've been in need.
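If you'd like to see how simple the mechanics are, here's a bare-bones sketch of IP delivery in Python. The CIDR blocks below are placeholders rather than an authoritative crawler list - you'd substitute the ranges you've compiled from your logs or from a specialist - and the file names are made up purely for illustration.

```python
# A bare-bones sketch of IP delivery; the networks and file names are placeholders.
import ipaddress

# Ranges you'd compile from your own logs, your host, or a specialist's list.
CRAWLER_NETWORKS = [
    ipaddress.ip_network("66.249.64.0/19"),  # placeholder block
    ipaddress.ip_network("72.30.0.0/16"),    # placeholder block
]

def is_crawler_ip(remote_addr):
    """True if the requesting IP falls inside one of the known crawler ranges."""
    ip = ipaddress.ip_address(remote_addr)
    return any(ip in network for network in CRAWLER_NETWORKS)

def select_template(remote_addr):
    """Serve the spider-friendly version to engines and the regular page to everyone else."""
    return "spider_version.html" if is_crawler_ip(remote_addr) else "visitor_version.html"
```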
While it may be odd to hear a "white hat" (as those of us at SEOmoz are generally known) issuing advice like this, it shouldn't come as a surprise. I remember being shocked when we realized that the only way to get a project done correctly and efficiently was to cloak, but in reality, cloaking is much more about intent than technology. Serving different data to search engines and visitors isn't evil unless you're baiting and switching or hurting the user/searcher experience. It's wise to remember how well-stocked your search marketing arsenal can be when you remove the limitations of negative terminology.
What's your take on IP delivery & cloaking?