Implementing Schema.org on a web page
-
Hi all,
As we know, implementing Schema doesn't change the look & feel of a web page for the users.
So here is my question:
Could we implement Schema markup on our web pages only for bots (but not visible to users in the source code) so that page load time doesn't increase?
-
Hello Anirbon,
You never want to show Google one thing in the code, and show everyone else something different. That is the very definition of cloaking.
Have you looked into using JSON-LD instead of inline Schema markup? Built Visible has a great article on microdata that includes a section about JSON-LD, which allows you to put the markup in a script block instead of wrapping the HTML.
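For illustration, a minimal JSON-LD block looks something like this (the organization name, URL and phone number here are placeholder values):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "Example Co",
  "url": "http://www.example.com",
  "telephone": "+1-555-0100"
}
</script>
```

The whole thing sits in one script tag in the head or body, so none of your visible HTML needs to be touched - and it's visible to users who view the source, so it isn't cloaking.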
-
Hi,
I am not saying that Schema is bad or that you shouldn't do it - it just seems that some big players only use Schema on the detail pages of an individual product, not on the overview pages. I found an example of a site using it on list pages - but in the SERPs only the average rating appears (example: http://www.goodreads.com/author/list/7779.Arthur_C_Clarke).
You can always test what the impact will be - as mentioned before, I guess even for 50 elements fully tagged with Schema the impact on page speed will be minimal. Check your current pages with webpagetest.org and look at the breakdown of load time. The HTML will probably account for only 10-20% of the load time - the rest being images, JavaScript & CSS files. Adding a few hundred lines of HTML will not fundamentally change this (text compresses quite well).
rgds
Dirk
-
Hi,
But using Schema to provide well-structured data will help bots understand what type of content/information is present on a page, and I think that will definitely help a page rank better in Google search, whether it's an SRP or a JD page.
Regards,
Anirban
-
Hi,
I am not sure that adding schema.org on a result page adds a lot of value. If you send 50 different blocks of structured data, how should search engines understand which piece would be relevant to show in the SERPs? I just did a check on 2 different sites (allrecipes.com & monster.com) - they only seem to use the Schema markup on the detail pages, not on the result pages.
If you would like to go ahead - you could always try to measure the impact by creating two (static) versions of a search result page, one with & one without markup, and test both versions with webpagetest.org & Google's PageSpeed analyser. An alternative would be to use "lazy loading": you first load the first x results (the part visible on screen), and when the user scrolls you load the next batch, and so on. This way, the impact on loading times would remain minimal.
In each case, I would not try to show different pages to users & bots.
rgds,
Dirk
-
Hello Dirk,
Thanks for the reply.
Agreed that the impact of adding a few extra lines of schema.org code will be close to zero on the load time of the pages. But it totally depends on what content you are going to show on a page.
I want to implement Schema.org on the search result pages, where a single page contains more than 50 listings with different information like job title, company name, skills, date posted etc. For each, I will have to use different properties as recommended by Google, so the load time of a page will definitely increase.
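To give a sense of scale, each listing would need a block roughly along these lines (JSON-LD shown, though microdata works too; all values here are made up):

```html
<script type="application/ld+json">
{
  "@context": "http://schema.org",
  "@type": "JobPosting",
  "title": "Senior Java Developer",
  "hiringOrganization": {
    "@type": "Organization",
    "name": "Example Corp"
  },
  "skills": "Java, Spring, SQL",
  "datePosted": "2015-06-01"
}
</script>
```

That's a few hundred bytes per listing, so even 50 of them add maybe 15-20 KB of highly compressible text to the page.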
Please let me know for the above listed case.
Thanks
-
Try adding schema with meta tags in the html, for example:
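A minimal sketch of what is meant (the phone number and coordinates are placeholder values):

```html
<div itemscope itemtype="http://schema.org/LocalBusiness">
  <meta itemprop="telephone" content="+1-555-0100" />
  <div itemprop="geo" itemscope itemtype="http://schema.org/GeoCoordinates">
    <meta itemprop="latitude" content="40.75" />
    <meta itemprop="longitude" content="-73.98" />
  </div>
</div>
```

The meta elements carry the itemprop values for bots without rendering anything on the page.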
This way you're telling bots your phone number with Schema, but it doesn't appear visibly to users. This is normally done with the latitude and longitude Schema tags, but you can use it for other properties as well. I wouldn't rely on this as a permanent long-term solution, though, as Google may change its policies on how it interprets content that is not visible to users.
-
It's a game of words. In the context of the question: if you provide the Schema tagging only to bots, the tagged info can still be listed in the SERPs, and the bots get a better understanding of what the page is all about. The final goal is of course to serve users the best answers when they search. On the page itself, however, the user doesn't see any difference whether the page is tagged with Schema or not.
Dirk
-
Dirk, I think you misunderstood my words. Schema for users means exactly what you wrote in your last lines: "Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Thanks
-
Hi Alick,
Schema.org is not for users - it is "a collection of schemas that webmasters can use to markup HTML pages in ways recognized by major search providers, and that can also be used for structured data interoperability (e.g. in JSON). Search engines including Bing, Google, Yahoo! and Yandex rely on this markup to improve the display of search results, making it easier for people to find the right Web pages."
Source: http://schema.org/
rgds,
Dirk
-
Hi Anirban,
I completely agree with Dirk. Second, I would like to know what the purpose is of showing Schema to bots only. In my limited understanding, we use Schema to show things like prices and offers to users, not to bots.
Thanks
-
Hi Anirban,
The impact of adding the few lines of extra code of schema.org will be zero on the load time of your pages.
Apart from that, serving different content to bots & human users could be considered cloaking by search engines.
Implementing schema.org on the normal pages should do just fine!
rgds,
Dirk