Would reviews served to a search engine user agent through a noscript tag (but not shown to other user types) be considered cloaking?
-
This one is tough. I've asked it once before here, http://www.quora.com/Search-Engine-Optimization-SEO/Is-having-rich-snippets-placed-below-a-review-that-is-pulled-via-javascript-considered-bad-grey-hat-SEO, but I feel the response was biased toward the company.
As an SEO or digital marketer, it seems to me that if we are pulling in our reviews via an iframe for our users but serving them through a noscript tag when the user agent is a search engine, this could be considered cloaking.
I understand that the "intent" may be to show the bots the same thing the user sees, but if you view the page source as a normal user, you'll never see the reviews, because they are only delivered to the search engine bot.
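To make the setup concrete, here's a stripped-down sketch of what I'm describing (the URLs, class names, and review text are all made up):

```html
<!-- What regular users get: the reviews arrive inside an iframe -->
<iframe src="https://reviews.example.com/widget?product=123"
        title="Customer reviews"></iframe>

<!-- What the search engine bot gets: the same reviews inside a
     noscript block, served only when the user agent is a bot, so
     "View Source" in a normal browser never shows this content -->
<noscript>
  <div class="product-reviews">
    <p>"Great product, arrived quickly." - Jane D.</p>
    <p>"Does exactly what it says." - Sam K.</p>
  </div>
</noscript>
```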
What do you think?
-
I can't speak to the BV implementation aspect, as I have no experience with it; however, I will echo and agree with Takeshi on the other points, as they describe the best-practice scenario.
-
BV does provide a newer format for their reviews if your server allows server-side scripting such as PHP; I believe it's called "Cloud SEO". This is the simplest solution.
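I don't have their docs in front of me, but conceptually the server-side approach looks something like this (a generic PHP sketch, not BV's actual API; the snapshot path and product ID are made up):

```php
<?php
// Generic sketch of server-side review inclusion: review HTML is read
// from a pre-generated snapshot file and echoed straight into the
// product page, so it appears in the page source for every visitor
// and every bot alike. No iframe, no JavaScript, no user-agent
// sniffing, so no cloaking question arises.
$productId    = '12345'; // hypothetical product identifier
$snapshotFile = __DIR__ . '/review-snapshots/' . basename($productId) . '.html';

if (is_readable($snapshotFile)) {
    echo file_get_contents($snapshotFile);
}
```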
If you can't run PHP, then I would recommend talking to YourStoreWizards (http://www.yourstorewizards.com/). They provide customized solutions that can automate the data pulling and updating process.
As far as reviews.mysite.com goes, you want to get that de-indexed as soon as you have the HTML reviews on your site. Otherwise, not only will the subdomain compete with your main site for traffic, but all the reviews on your site will be seen as duplicate content.
-
Alright, this is where we are with this. What Takeshi recommended is a workaround. Yes, it works, but it takes more man-hours to constantly upload the info. If someone wanted to do this more seamlessly, how could we do that? I don't have an answer quite yet (but wanted to post our follow-up in case someone else stumbles upon this Q&A), but we're going to the company with these questions:
- We need someone on the phone who understands SEO and whether the BV installation on our site is SEO-friendly; i.e., not a developer who knows how to implement BV, but an SEO person who understands the difference between cloaking and duplicate content on product pages.
- We want to know how we can get our product reviews onto our product pages so they can be seen in the HTML of the page; i.e., I can view the source and see the review content there. This is in line with Takeshi's workaround, but is there an easier way to do this automatically?
- Having the reviews sent over via JavaScript only when the bot requests the info seems to be in line with cloaking behavior that the search engines consider bad.
- We don't want to add a ~1.5-second delay to pulling the info over for the bots to see, as this will hurt our PageSpeed score. However, this seems to be the next best solution for getting up-to-date reviews into the code of the product page.
I know, not every tool is perfect, but if there is a gap here, I'd imagine that one of the largest companies in review management would be able to tackle it - don't you think?
To me, this feels like our content is being hijacked. I have my reviews in iframes (so to speak) on my product pages, but also at reviews.mysite.com/product-reviews, which essentially duplicates my product pages... we're competing with ourselves. Is the best fix to NOINDEX that subdomain and not let the reviews be seen at all, or to keep the pages up and just compete with ourselves in the SERPs? Or is there an easy way to get those reviews (our reviews) seen on our site by users and bots?
-
Perfect! Thanks again for the follow-up!
-
Yup. Once you have the GWT verification in the header, you should be able to deindex the entire subdomain instantly.
-
That sounds like the answer, Takeshi! We were worried about doing it manually because the user wouldn't see their reviews instantly, but with how you're doing it, that doesn't matter, and (like you said) it shouldn't muddy the user experience.
Are you referring to the "Remove URLs" tool in Google Webmaster Tools for deindexing?
-
Yes, we manually bulk upload the HTML reviews every couple of weeks or so to keep them fresh. We also had BV noindex the reviews subdomain so that it wasn't competing with us in the SERPs (have them add a noindex tag in the header, as well as your Google Webmaster Tools verification code, so you can instantly deindex all the pages).
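Concretely, the two tags in the subdomain's <head> would look something like this (the verification token is a placeholder):

```html
<head>
  <!-- Keeps the review subdomain's pages out of the index so they
       stop competing with the main site's product pages -->
  <meta name="robots" content="noindex">

  <!-- Your GWT verification token (placeholder value) so you can
       verify the subdomain and run Remove URLs on it yourself -->
  <meta name="google-site-verification" content="YOUR_TOKEN_HERE">
</head>
```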
-
Great idea, having a link to an HTML version. How do you keep those updated? Is it manual? And do you just block the pages that they create over on the review.mysite.com subdomain?
That is actually where we started looking at fixing things. I see that subdomain they created as basically competing with our product pages. Why would a business ever want to operate its site that way? It doesn't make sense. But all I keep hearing is name-drops of big brands. It's frustrating, really.
-
I'm pretty sure that it's structured markup, but I will definitely be double checking before simply guessing on this one! Thanks Alan.
-
We use BazaarVoice reviews for our ecommerce site too. What we do is, right below the iframe reviews, we have a link that says "click here to see more reviews". When you click the link, it opens a div with the HTML version of the reviews. So it's a similar idea to what you are proposing, but less "cloaky" than a noscript tag, and it doesn't impact the user experience much.
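Stripped down, it looks something like this (IDs, styling, and review text are placeholders):

```html
<!-- The plain-HTML reviews are in the page source from the first
     byte; they're just hidden until the visitor asks for them -->
<a href="#more-reviews"
   onclick="document.getElementById('more-reviews').style.display = 'block'; return false;">
  Click here to see more reviews
</a>

<div id="more-reviews" style="display: none;">
  <p>"Great product, arrived quickly." - Jane D.</p>
  <p>"Does exactly what it says." - Sam K.</p>
</div>
```

Since the content is present in the HTML for everyone, bots included, there's no user-agent-dependent serving to worry about.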
BazaarVoice can also do HTML reviews that are not iframed if you have a server that can handle server-side scripting like PHP (which, unfortunately, our legacy Yahoo store does not).
-
Ah, to have 100% guarantees for anything related to SEO.
Alas, that's not the world we live in. However, we can apply critical thinking to each choice, and with that we are more likely to be safe from the wrath of Google.
So, for this question, let's consider the following:
A "Noscript" version of a site is designed first and foremost for people who have scripts turned off, including those who have browsers set up for either security reasons or for visual impairment needs.
So if you provide content within a noscript block that essentially mirrors what visitors get when scripts are turned on, you are not likely in violation of any Google cloaking policy.
Cloaking comes into play when you generate content exclusively for Googlebot (or for Googlebot and Bingbot).
So if the content you are provided via that zip file (which I assume you then need to manually cut and paste into the noscript portion of the code) is pure content and not over-optimized, you can proceed with confidence that you'll be okay.
Where I DO have concern is this:
"The daily snapshot files contain UGC with SEO-friendly markup tags." (emphasis mine). Exactly what do they mean by that specific wording? That's the concern point. Are they referring to proper structured markup for reviews, from Schema.org or at the very least RDFa reviews markup? If not, that would be problematic because only proper "reviews" specific structured markup should wrap around reviews content.