Structured Data Markup Helper in Webmasters
-
Once I set up the Article or Movie markup with star ratings etc. in the Structured Data Markup Helper, should I replace the whole generated code into my page? Or is there some other way?
-
Hi Oliver, you've received some great responses to your question. Did any of them help answer your question? Let us know, thanks!
Christy
-
I wouldn't remove the existing code from the page. In my experience, the Structured Data Markup Helper can miss certain pages and some markup, as its coverage isn't 100%. By keeping your existing code in place, you make sure coverage stays complete.
-
Ideally, just take the highlighted snippet of code from the output (whatever structured data you needed) and edit the code on your site. What CMS are you working with? Are you building from HTML, or are you using a CMS platform like WordPress/Drupal/Joomla? Without knowing your site's limitations, it's hard to answer this question.
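For example, the Markup Helper's output can often be reduced to a single JSON-LD block pasted into your page template, rather than replacing the page's HTML wholesale. A quick sketch of the kind of snippet involved (the movie name and rating values below are placeholders, not from your site):

```python
import json

# Hypothetical example values -- the real ones come from your own page.
movie_markup = {
    "@context": "https://schema.org",
    "@type": "Movie",
    "name": "Example Movie",
    "aggregateRating": {
        "@type": "AggregateRating",
        "ratingValue": "4.5",
        "ratingCount": "120",
    },
}

# The <script> block below is what you would paste into your template's
# <head> or <body>; the rest of your page's HTML stays untouched.
snippet = (
    '<script type="application/ld+json">\n'
    + json.dumps(movie_markup, indent=2)
    + "\n</script>"
)
print(snippet)
```

Because the block is self-contained, it can live in a template partial instead of being hand-edited into every page.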
Related Questions
-
Trying to mark multiple errors as fixed in Webmaster Tools
We have 44,249 errors, and I have set up a 301 redirect for most of the URLs. I know exactly which links are correctly redirected; my problem is that I don't want to mark each one as fixed individually. Is there a way to upload a URL list to Webmaster Tools so that it automatically marks them as fixed based on the list?
Technical SEO | | easyoffices
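As far as I know there's no supported way to upload a list and have Webmaster Tools bulk-mark errors as fixed; they drop out of the report on their own once Google recrawls the 301s. What you can automate is the cross-check of your own data. A hypothetical sketch (the example.com URLs are placeholders):

```python
# Hypothetical data: your exported 404 list and your redirect map.
not_found = [
    "http://example.com/old-page",
    "http://example.com/gone",
    "http://example.com/no-redirect-yet",
]
redirect_map = {
    "http://example.com/old-page": "http://example.com/new-page",
    "http://example.com/gone": "http://example.com/",
}

# URLs still missing a 301 target are the only ones left to act on;
# the rest will disappear from the error report as Google recrawls them.
unhandled = [url for url in not_found if url not in redirect_map]
print(unhandled)  # ['http://example.com/no-redirect-yet']
```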
New URL Structure
Hi guys, for our webshop we're considering a new URL structure, because the longtail keywords don't rank so well. Now we have:
/category/ (main focus keywords)
/product/the-product345897345123/ (nice to rank on, but not much volume)
Technical SEO | | Happy-SEO
We have over 500 categories, and every one of them is placed directly after our domain. I think it's better to work with a good structure, and I have worked out a way to make categories and sub-categories. The 500 top-level categories may be the reason why not every one of them is ranking so well, which is also why we're thinking about a new structure. So the new URL structure will be:
/category/ (main focus keywords)
/category/subcat/ (also main focus keywords)
Everything will be redirected (301, the good way), so I don't think there will be too many problems. I'm still thinking about what to do with the /product/ URLs, because they will now be on the same level as the subcategories, and I'm afraid that when they're on that level, Google will give the same value to both. The options I'm considering are:
Old way: /product/the-product-345897345123/
.html (seen this on big webshops): /product/the-product-345897345123.html
Level deeper (SKU): /product/the-product/345897345123/
What would you suggest? The new structure would be: 20 main categories, 500+ sub-categories divided under the main categories, and 5,000+ products. Thanks!
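Whichever option you pick, a migration of this size is easier to keep safe if the 301 map is generated rather than written by hand. A hypothetical sketch (the category names and the trailing-numeric-SKU pattern are illustrative assumptions, not from the thread):

```python
import re

# Example category -> parent assignment; in practice this comes from
# your own catalogue data, one entry per category.
parent_of = {"garden-chairs": "furniture"}

def new_url(old_path: str) -> str:
    """Map an old URL path to its new location (sketch only)."""
    # /product/the-product-345897345123/ -> /product/the-product/345897345123/
    m = re.match(r"^/product/(.+)-(\d+)/$", old_path)
    if m:
        return f"/product/{m.group(1)}/{m.group(2)}/"
    # /garden-chairs/ -> /furniture/garden-chairs/
    m = re.match(r"^/([^/]+)/$", old_path)
    if m and m.group(1) in parent_of:
        return f"/{parent_of[m.group(1)]}/{m.group(1)}/"
    # Anything unrecognised keeps its old path (no redirect emitted).
    return old_path

print(new_url("/product/the-product-345897345123/"))
print(new_url("/garden-chairs/"))
```

The generated pairs can then be dumped into whatever redirect mechanism your platform uses, and diffed against your crawl data before going live.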
Switching from HTTP to HTTPS and Google Webmaster Tools
Hi, I've recently moved one of my sites, www.thegoldregister.co.uk, to HTTPS. I'm using WordPress and put a permanent 301 redirect in the .htaccess file to force HTTPS for all pages. I've updated the settings in Google Analytics to HTTPS for the original site. All seems to be working well. Regarding Google Webmaster Tools, what needs to be done? I'm very confused by the Google documentation on this subject around HTTPS. Does all the crawl and indexing data from the HTTP site still stand and get inherited by the HTTPS version because of the redirects in place? I'm really worried I will lose all of this indexing data. I looked at the "change of address" setting in Webmaster Tools, but this seems to refer to changing the actual domain name rather than the protocol, which I haven't changed at all. I've also added the HTTPS version to the console, but it is showing a severe warning: "Is robots.txt blocking some important pages?". I don't understand this error, as it's the same robots.txt file as on the HTTP site, generated by All in One SEO Pack for WordPress (see below). The warning is against line 5, saying it will be ignored. What I don't understand is that I don't get this error in the console for the HTTP version, which uses the same file. Any help and advice would be much appreciated. Kind regards, Steve
Technical SEO | | lqz
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
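Line 5 of that file is the Crawl-delay directive, which Google ignores, but ignoring a directive doesn't block any pages. You can sanity-check the file offline with Python's standard-library parser (the file contents are from the question; the behaviour shown is standard robots.txt semantics):

```python
from urllib.robotparser import RobotFileParser

ROBOTS_TXT = """\
User-agent: *
Disallow: /wp-admin/
Disallow: /wp-includes/
Disallow: /xmlrpc.php
Crawl-delay: 10
"""

rp = RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Ordinary pages stay crawlable; only the three Disallow paths are blocked.
print(rp.can_fetch("Googlebot", "https://www.thegoldregister.co.uk/"))           # True
print(rp.can_fetch("Googlebot", "https://www.thegoldregister.co.uk/wp-admin/x")) # False
print(rp.crawl_delay("Googlebot"))                                               # 10
```

Since the same file parses identically for both protocols, the warning on the HTTPS property is about the directive being ignored, not about important pages being blocked.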
Location of Content within the Code Structure
Hi guys,
Technical SEO | | artdivision
When working with advanced, modern websites, achieving the desired look and feel often means pages with almost 1,000 lines of code or more. In some cases it is impossible to avoid if we are to meet the client's visual and technical specifications. Say the page is 1,000 lines of code and our content only starts around line 450; will that have an impact on Google's crawlability, and hence affect our SEO, making it harder to rank? Thoughts? Dan.
Structured Data Authorship
Hi, I've just successfully set up authorship for a client according to the rich snippet testing tool, although I'm a bit perplexed, since underneath the results there's a section called 'Extracted Structured Data'. The first section is marked "hatom feed", and under the 'Author' field it says in red:
Warning: At least one field must be set for Hcard.
Warning: Missing required field "name (fn)".
And then under the URL field it says:
Warning: Missing required field "entry-title".
Any ideas what this means, or even if it's important? I would have thought the tool wouldn't acknowledge authorship as being set up correctly if this were an issue, but that does beg the question: what is it doing there and what does it mean? There's another section after that called "rdfa node", which seems fine. It also says the page does not contain publisher markup, although I know publisher has been added to the home page. Is it best to add publisher to the head section of every page (as I have heard some people say), or just the home page? Many thanks, Dan
Technical SEO | | Dan-Lawrence
SEO-friendly site structure?
We are in the process of rewriting all the pages on one of our sites and will be changing some URLs around. I was just wondering whether dashes or underscores are better in the URLs, SEO-wise:
www.site.com/word-word-word/ or
Technical SEO | | 858-SEO
www.site.com/word_word_word/
I personally like the underscores better, but some colleagues tell me that dashes are better. Any tests out there on this issue? Thanks
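For what it's worth, Google's guidance is to use hyphens, because it treats them as word separators, while underscores join words together. A small illustrative slug generator (the helper name and logic are mine, not from any particular tool):

```python
import re

def slugify(title: str) -> str:
    """Lowercase a title and join words with hyphens, which search
    engines treat as word separators (underscores are treated as joiners)."""
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

print(slugify("SEO Friendly Site Structure?"))  # seo-friendly-site-structure
```

Note that it also normalises underscore-separated input, which is handy when migrating old URLs to the hyphenated style.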
Should we redirect 404 errors seen in Webmaster Tools with ... (dot, dot, dot)?
Lately I have seen lots of 404 errors showing in Webmaster Tools that are not really links, many of them from spammy pages (I did not put them there). One of the most common types is URLs that end in "..." (dot, dot, dot). These links are being sent from pages like this: http://www.the-pick.com/00_fahrenheit,2.html For example, a link like this would show up in Webmaster Tools as a 404 error: http://www.ehow.com/how_2352088_easily-... Are these worth redirecting? So far I have redirected some of them and found that it was not helpful and possibly harmful. Anyone else had the same experience? I'm also getting lots of partial URLs showing up from pages that reference my site, where the URL is cut off and the link is not active. Does Google really count these as links? Is redirecting a link from a spammy page acknowledging acceptance, and could it count against you?
Technical SEO | | KentH
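Before redirecting anything from a report like this, it can help to filter out display-truncated URLs first, since a URL ending in "..." was never a real page on anyone's site. A hypothetical sketch (the example.com entry is a placeholder):

```python
def is_truncated(url: str) -> bool:
    """Crawl reports sometimes list display-truncated URLs ending in '...';
    these are not real pages and aren't worth redirecting."""
    return url.rstrip("/").endswith("...")

reported = [
    "http://www.ehow.com/how_2352088_easily-...",
    "http://example.com/real-page/",
]
worth_reviewing = [u for u in reported if not is_truncated(u)]
print(worth_reviewing)  # ['http://example.com/real-page/']
```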
Search optimal Tab structure?
Good day, we are in the process of starting a website redesign/development. We will likely be employing a tabbed structure on our home page and would like to capitalize on the keyword content found across the various tabs. The tab structure will be similar to how this site achieves tabs: http://ugmo.com/ I've uploaded a screen grab of this page as seen by the Googlebot user agent. The text "Soil Intelligence for professional Turf Managers" clicks through to this page: http://ugmo.com/?quicktabs_1=1#quicktabs-1 So I'm thinking there could be some keyword dilution there. That said, Google is very much aware that the text on the quicktabs-1 page is related to the home page content: http://www.google.com/search?q=Up+your+game+with+precise+soil+moisture%2C+salinity+and+temperature+measurements.+And+in+the+process%2C+save+water%2C+resources%2C+money.+inurl%3Augmo.com&sourceid=ie7&rls=com.microsoft:en-us:IE-SearchBox&ie=&oe= Is this the best search-optimal way to add keyword density on a home page with a tab structure? Or is there a better means of achieving this?
Technical SEO | | Hershel.Miller