Penalty for Mixing Microdata with Metadata
-
The folks that built our website have insisted on including microdata and metadata on our pages.
What we end up with is something that looks like this in the header:
<meta itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
Seems to me that this would be a bad thing; however, I can't find any info leaning one way or the other.
Can anyone provide insight on this?
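For context, here is a minimal sketch of the kind of head markup being described - a standard meta description alongside a schema.org itemprop carrying the same text. The content string comes from the example above; splitting it into two tags is an assumption for illustration:

```html
<head>
  <!-- Standard metadata: the classic meta description -->
  <meta name="description" content="Come buy your shoes from us, we've got great shoes.">
  <!-- Schema.org microdata: the same text exposed as an itemprop -->
  <meta itemprop="description" content="Come buy your shoes from us, we've got great shoes.">
</head>
```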
-
Worth noting that the meta description isn't one of those three markup styles. It is a completely different thing, so you aren't actually mixing schema formats in your example.
-
Thanks for sharing that link. That post is very informative.
-
Thanks for answering so quickly.
When I said "bad thing" I meant that I don't see how such redundancy could ever be beneficial.
Thank you for your thoughts.
-
I would read this post for more information: http://www.seomoz.org/blog/schema-examples
The post discusses how Google used to support three different styles of markup (microdata, microformats, and RDFa) but, with the creation of Schema.org, decided to use only that going forward. Websites with existing markup in the older formats would still be okay, though.
Google has also noted (as mentioned in the article above) that you should avoid mixing different markup formats on the same page, as it can confuse their parsers.
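To make the "don't mix formats" point concrete, here is a hypothetical sketch of the same product described twice on one page in two different formats (the pattern Google advises against), followed by the single-format alternative. The product name is made up:

```html
<!-- Avoid: microdata and RDFa describing the same item on one page -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Running Shoe</span>
</div>
<div vocab="http://schema.org/" typeof="Product">
  <span property="name">Running Shoe</span>
</div>

<!-- Prefer: pick one format (here, microdata) and use it consistently -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Running Shoe</span>
</div>
```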
-
Why do you think this would be a bad thing? I'd question how much benefit you'll gain in most areas by doing this, but I can't see it causing harm, and it's good to get in now rather than adding markup later (assuming you've backed the right format!).
Related Questions
-
Review Microdata if the product does not have a review
We have many products that do not have reviews, and they're causing warnings in Google's microdata validation. Is there anything you can do for products without a review so you don't get errors for aggregateRating or review?
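One common approach (a sketch, with a hypothetical product name) is to simply omit the review-related properties for products that have no reviews, rather than emitting empty or placeholder values:

```html
<!-- Product with reviews: aggregateRating is included -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Trail Runner 2000</span>
  <div itemprop="aggregateRating" itemscope itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> (<span itemprop="reviewCount">12</span> reviews)
  </div>
</div>

<!-- Product with no reviews: the rating block is omitted entirely -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Trail Runner 2000</span>
</div>
```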
-
Schema.org Microdata or Microformats - Which Should We Use?
Hi All, I'm wondering what would be the better alternative - schema.org microdata or microformats. I am aware that search engines such as Google, Yahoo, and Bing recognize Schema.org as the standard. The question is, will it have any negative effect? Our web developer here says that schema.org microdata may result in invalid HTML. I don't think that it will affect our SEO, but I guess that's also something to shed some light on. So, what's the consensus here - should we implement schema.org or go with microformats - or does it really make any difference?
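For anyone weighing the two, here is a rough side-by-side of the same person marked up in microformats (class-based) versus schema.org microdata (attribute-based); the names are made up. The microdata attributes (itemscope, itemprop) are valid HTML5 but will not validate under older doctypes, which may be what the developer's validation concern refers to:

```html
<!-- Microformats: meaning carried by class names (hCard) -->
<div class="vcard">
  <span class="fn">Jane Smith</span>, <span class="org">Example Shoes Ltd.</span>
</div>

<!-- Schema.org microdata: meaning carried by dedicated attributes -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jane Smith</span>,
  <span itemprop="worksFor">Example Shoes Ltd.</span>
</div>
```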
-
Could we run into issues with duplicate content penalties if we were to borrow product descriptions?
Hello, I work for an online retailer that has the opportunity to add a lot of SKUs to our site in a relatively short amount of time by borrowing content from another site (with their permission). There are a lot of positives for us to do this, but one big question we have is what the borrowed content will do to our search rankings (we normally write our own original content in house for a couple thousand SKUs). Organic search traffic brings in a significant chunk of our business and we definitely don't want to do something that would jeopardize our rankings. Could we run into issues with duplicate content penalties if we were to use the borrowed product descriptions? Is there a rule of thumb for what proportion of the site should be original content vs. duplicate content without running into issues with our search rankings? Thank you for your help!
-
[G Penalty?] Significant Traffic Drop From All Sources
My client's traffic started to decrease significantly around Nov 21 (Panda update 22). This includes traffic from all sources - search engines (G, B, & Y!), direct, AND referral. At first we thought it was a G penalty, but G answered our reconsideration request by stating that no manual penalty had occurred. It could be an algorithmic penalty, but again, the site has been hit across all sources. The client has done zero backlinking - it is all natural. No spam, etc. All of his on-site SEO is solid (700+ pages indexed, all unique content, unique titles and descriptions). On Oct 16, he switched from his old URL to a new URL and did proper redirects. (Last year - Dec 2011 - he switched his CMS to Drupal, and although there was a temporary decrease in traffic, it recovered within a month or so.) He does zero social on his site, and he has many ads above the fold. Nevertheless, the traffic decrease is not source-specific. In other words, all sources have decreased since Nov 21, 2012 and have not recovered. What is going on? What can explain a decrease in traffic across all sources? This would be easy to answer if it were only a Google organic decrease, but since direct and referral have also been hit, we cannot locate the problem. Please share your personal experiences as well as advice on where we should look. Could this be negative SEO? Where would we look? ANY ADVICE IS WELCOME !!!! Every bit counts Thanks!!
-
Duplicate Content & www.3quarksdaily.com, why no penalty?
Does anyone have a theory as to why this site does not get hit with a DC penalty? The site is great, and the information is good but I just cannot understand the reason that this site does not get hit with a duplicate content penalty as all articles are posted elsewhere. Any theories would be greatly appreciated!
-
Forum software penalties
I'm hoping to solicit some feedback on what people feel would be SEO best practices for message board/forum software. Specifically, while healthy message boards can generate tons of unique content, they also generate their fair share of thin content pages. These pages include calendar pages that can have a page for each day of each month for 10 years (that's like 3,650 pages of just links); user profile pages, which depending on your setup can tend to be thin (the board I work with has 20k registered members, hence 20k user profile pages); and user lists, which can run to several hundred pages. I believe Google is pretty good at understanding what is message board content, but there is still a good chance that one could be penalized for these harmless pages. Do people feel that the above pages should be noindexed? Another issue is that of unrelated content. Many forums have their off-topic areas (the Pub or Hangout or whatever). On our forum up to 40% of the content is off-topic (when I say content I mean number of posts versus raw word count). What are the advantages and disadvantages of such content? On one hand it expands the keywords you can rank for. On the other hand it might generate Google organic traffic that you might not want because of a high bounce rate. Does too much indexable content that is unique dilute your good content?
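A common way to handle the thin pages mentioned above (calendar days, user profiles, member lists) is a robots meta tag in the head of those templates - a generic sketch, not specific to any forum package:

```html
<!-- Keep the page out of the index but let crawlers follow its links -->
<meta name="robots" content="noindex, follow">
```

Unlike a robots.txt disallow, this allows already-indexed thin pages to drop out of the index the next time they are crawled.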
-
Is this a possible Google penalty scenario?
In January we were banned from Google due to duplicate websites caused by a server configuration error by our previous webmaster. Around 100 of our previously inactive domain names were pointed by default at the directory of our company website during a server migration, thus showing the exact same site 100 times... obviously Google was not game and banned us. At the end of February we were allowed back into the SERPs after fixing the issue and have since steadily regained long-tail keyword rankings, but in Google we are still missing our main keyword phrase. This keyword phrase brings in the bulk of our best traffic, so obviously it's an issue. We've been unable to get above position 21 for this keyword, but in Yahoo, Bing, and Yandex (Russian SE) we're in positions 3, 3, and 7 respectively. It seems to me there has to be a penalty in effect, as this keyword gets between 10 and 100 times as much traffic in Google as any of the ones we're ranked for - what do you think? EDIT: I should mention that in the 4-5 years prior to the ban we had ranked between 15th and 4th in Google, 80% of the time on the first page.
-
Penalty or Algorithm hit?
After the Google algorithm was updated, my site took a week-long hit in traffic. The traffic came back a week later and was doing well a week AFTER the algorithm change, and I decided that I should do a 301 redirect to make sure I didn't have duplicate content (www vs. non-www). I called my hosting company (I won't name names, but it rhymes with Low Fatty) and they guided me through the supposedly simple process. Well, they had me create a new (different) IP address and do a domain forward (sorry about the bad terminology) to the www version. This was in effect for approximately 2 weeks before I discovered it, and it came along with a subsequent massive hit in traffic. I then corrected the problem (I hope) by restoring the old IP address and setting up the .htaccess file to redirect everything to www. It is a couple weeks later and my traffic is still in the dumps. In WMT, instead of getting traffic from 10,000 keywords I'm getting it from only 2k. Is my site the victim of some penalty (I have heard of the sandbox), or is my site simply lower in traffic due to the new algorithm? (I checked analytics data and found that traffic only in the US is cut by 50%; it is the same outside the US.) Could someone please tell me what is going on?
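For reference, the www redirect described here is usually a few lines of Apache mod_rewrite in .htaccess. This is a generic sketch, not the poster's actual file; example.com stands in for the real domain:

```apache
RewriteEngine On
# Send any non-www host to the www version with a single 301 redirect
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```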