Would reviews being served to a search engine user agent through a noscript tag (but not shown for other user types) be considered cloaking?
-
This one is tough, and I've asked it once here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel the response sided with the company.
As an SEO or digital marketer, it seems that if we are pulling in our reviews via iframe for our users, but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the source as a regular user, you'll never see the reviews, because they would only be delivered to the search engine bot.
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it. However, I will echo and agree with Takeshi on the other points, as they are the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP. I believe it's called "Cloud SEO". This is the simplest solution.
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As for reviews.mysite.com, you want to get that de-indexed as soon as you get the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), but we're going to the company with these questions:
- We need someone on the phone who understands SEO and making the BV installation on our site SEO-friendly; i.e. not a developer who knows about implementing BV, but an SEO person who understands the difference between cloaking and duplicate content on product pages.
- We want to know how we can get our product reviews onto our product pages so they can be seen in the HTML of the page; i.e. I can view source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this where it's automatic?
- Having the reviews sent over via javascript only when the bot requests the info seems to be in line with cloaking behavior that search engines consider bad.
- We don't want to add a ~1.5 second delay to getting the info pulled over for the bots to see it, as this will hurt our PageSpeed score. However, this seems to be the next best solution for getting up-to-date reviews in the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle this - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which is essentially duplicating my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by both users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer Takeshi! We were worried about manually doing it because the user wouldn't see their reviews instantly, but with how you're doing it, it doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
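(If you want to double-check that the noindex tag actually made it onto the subdomain pages before filing the removal request, a quick sanity check is to look for the meta robots tag in the fetched HTML. A minimal sketch below; it's a naive regex check that assumes the common name-before-content attribute order, not a full HTML parser, and the function name is just my own:)

```python
import re

def has_noindex(html: str) -> bool:
    """Detect a <meta name="robots" content="...noindex..."> tag.

    Naive check: assumes the name attribute comes before content,
    which is how BV and most platforms emit the tag.
    """
    pattern = r'<meta[^>]+name=["\']robots["\'][^>]+content=["\'][^"\']*noindex'
    return re.search(pattern, html, flags=re.I) is not None
```

Run it against a few pages on the reviews subdomain; if it comes back False, the tag isn't there yet and a removal request would only be temporary.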
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages they create over on the review.mysite.com sub-domain?
That is actually where we started looking at fixing things. I see that sub-domain they created as basically competing with our product pages. Why would a business ever want to operate their site that way? It doesn't make sense. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double checking before simply guessing on this one! Thanks Alan.
-
We use BazaarVoice reviews for our ecommerce site too. Right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens up a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact user experience much.
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which unfortunately our legacy Yahoo store does not).
-
Ah to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice and with that, we are more likely to be safe from the wrath of Google.
SO - for this question let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
Cloaking comes into play when you generate content purely for Googlebot (or for Googlebot and Bingbot) and no one else.
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
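(One way to sanity-check this yourself: fetch the page twice, once with a Googlebot User-Agent string and once with a normal browser one, and compare what each response carries inside its noscript blocks. A minimal sketch of the comparison step, assuming you already have both HTML responses saved; the function names are my own, not anything from BV:)

```python
import re

def extract_noscript_text(html: str) -> str:
    """Pull the text inside plain <noscript>...</noscript> blocks, tags stripped."""
    blocks = re.findall(r"<noscript>(.*?)</noscript>", html, flags=re.S | re.I)
    return re.sub(r"<[^>]+>", " ", " ".join(blocks)).strip()

def looks_like_cloaking(bot_html: str, user_html: str) -> bool:
    """True if noscript review text is served to the bot but never to users.

    That bot-only pattern is the one described above as a cloaking risk;
    identical noscript content for both audiences is the safe case.
    """
    return bool(extract_noscript_text(bot_html)) and not extract_noscript_text(user_html)
```

If both responses carry the same review text, you're in the "mirrors what visitors get" situation described above; if only the bot response does, that's the pattern to avoid.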
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic because only proper "reviews" specific structured markup should wrap around reviews content.