Schema.org Microdata or Microformats - Which Should We Use?
-
Hi All,
I'm wondering which would be the better alternative: schema.org microdata or microformats. I am aware that search engines such as Google, Yahoo, and Bing recognize Schema.org as the standard. The question is, will it have any negative effect? Our web developer here says that schema.org microdata may clutter the HTML. I don't think it will affect our SEO, but I guess that's also something to shed some light on.
So, what's the consensus here - should we implement schema.org or go with microformats - or, does it really make any difference?
-
I think a case can be made for any of the three main formats (microdata, RDFa, and microformats). All have their pros and cons, but Google recommends schema.org microdata:
https://support.google.com/webmasters/answer/99170?hl=en
Structured data offers search engines more information about the site: information they can use to evaluate the relevance of your site to a query, as well as the depth of its content, for a richer snippet. As far as it negatively affecting SEO, I would take the same precautions as you would with any other tactic: implement it as cleanly and honestly as possible. If it's done with (perceived) manipulation for the sole purpose of better rankings, it can negatively affect a website's SEO, but that's just a good rule of thumb regardless of the tactic.
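To make the comparison concrete, here's the same (made-up) contact snippet marked up in each of the three formats; the person and URL are hypothetical, and the property names come from the schema.org Person type and the hCard microformat:

```html
<!-- schema.org microdata: semantics added via itemscope/itemtype/itemprop attributes -->
<div itemscope itemtype="http://schema.org/Person">
  <span itemprop="name">Jane Doe</span>,
  <span itemprop="jobTitle">SEO Manager</span> at
  <a itemprop="url" href="http://example.com">Example Co</a>
</div>

<!-- microformats (hCard): semantics carried by agreed-upon class names -->
<div class="vcard">
  <span class="fn">Jane Doe</span>,
  <span class="title">SEO Manager</span> at
  <a class="url" href="http://example.com">Example Co</a>
</div>

<!-- RDFa Lite: semantics added via vocab/typeof/property attributes -->
<div vocab="http://schema.org/" typeof="Person">
  <span property="name">Jane Doe</span>,
  <span property="jobTitle">SEO Manager</span> at
  <a property="url" href="http://example.com">Example Co</a>
</div>
```

All three describe the same thing to machines; the practical difference is mainly which attributes or class names carry the semantics, and none of them requires adding new elements to markup you already have.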
-
Hi Pedram,
You are correct: Schema.org is recognized as the standard by the major search engines.
There's a great discussion on the subject here: http://programmers.stackexchange.com/questions/166612/schema-org-vs-microformats
Hope this helps!