Structured Data Markup Helper in Webmasters
-
Structured Data Markup Helper in Webmasters. Once I set up the article or movie markup with star ratings etc., should I replace the whole code from the helper into my page?
Is there some other way?
-
Hi Oliver, you've received some great responses to your question. Did any of them help answer your question? Let us know, thanks!
Christy
-
I wouldn't remove the existing code from the page. In my experience the Structured Data Markup Helper can miss certain pages, and markup on some pages, as its coverage is not 100%. By keeping your existing code you make sure that everything is covered.
-
Ideally, just take the snippet of code from the output that's highlighted (whatever structured data you needed) and edit the code on your site. What CMS are you working with? Are you building from plain HTML, or are you using a CMS platform like WordPress/Drupal/Joomla? Without knowing your site's limitations it's hard to answer this question.
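To make "take the snippet" concrete: the Markup Helper can output a JSON-LD block you paste into the page's `<head>` or `<body>` without touching the rest of your template. A minimal sketch for the movie-with-star-rating case the original question mentions (all values here are hypothetical placeholders, not output from the tool):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Movie",
  "name": "Example Movie",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "ratingCount": "120"
  }
}
</script>
```

Because JSON-LD lives in its own script block, you don't need to replace your existing HTML at all — which is usually why it's the less disruptive option compared to inline microdata.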
Related Questions
-
Meta Data Question
Hi there, I am working on the Umbraco CMS and we have a Menu page which sits under one page in the CMS. When accessing this page on the front end and navigating between the food menu and drinks menu, the URL changes depending on which content you are on. However, I have only one place to input a meta title and description, meaning that they are seen as duplicate content, as both the drinks menu URL and the food menu URL show the same meta data. Hopefully this makes sense. Does anyone have anything similar, where a URL change happens when content within the page changes?
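One common pattern for this situation (sketched here with hypothetical URLs, since the actual site structure isn't shown) is to pick one of the two menu URLs as the canonical version and point the other at it, so the duplicated meta data stops counting against you:

```html
<!-- Placed in the <head> of BOTH /menu/food and /menu/drinks -->
<link rel="canonical" href="https://example.com/menu/food">
```

The alternative is to give each URL its own Umbraco node (or a template field keyed off the URL) so each variant can carry a unique title and description.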
Technical SEO | AlexStanleyGK
Received A Notice Regarding Spammy Structured Data. But we don't have any structured data or do we?
Got a message via Webmaster Tools that we have spammy structured data on our site, and we have no idea what they are referring to. We do not use any structured data with schema.org markup. Could they be referring to something else? The message was:

To: Webmaster of http://www.lulus.com/, Google has detected structured markup on some of your pages that violates our structured data quality guidelines. In order to ensure quality search results for users, we display rich search results only for content that uses markup that conforms to our quality guidelines. This manual action has been applied to lulus.com/. We suggest that you fix your markup and file a reconsideration request. Once we determine that the markup on the pages is compliant with our guidelines, we will remove this manual action.

What could we be showing them that would be interpreted as structured data, and/or spammy structured data?
Technical SEO | KentH
Am I using pagination markups correctly?
Hey Mozzers! I am receiving duplicate title tag errors from Search Console on paginated pages (blog.com/chlorine, blog.com/chlorine-2, blog.com/chlorine-3). I do not currently have a view-all page. If I were to create one, would I add all the content from chlorine-2 and chlorine-3 to the blog.com/chlorine page, then use rel=canonical on chlorine-2 and chlorine-3 pointing to blog.com/chlorine? If I move forward without the view-all page, I could implement the next/prev HTML markup, but can I do this without dev help? I am currently using the Yoast SEO plugin and do not see the option. Would I use the text editor to add the markup directly before the content? I think I have a grasp on this, but this will be my first time implementing and I want to double check first! Thanks!
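For reference, the two approaches described above are both a few `<link>` tags in the `<head>` of each paginated page — a sketch with the question's example URLs (exact placement depends on the theme/plugin setup):

```html
<!-- Option A: next/prev markup, shown here for page 2 of the series.
     Page 1 gets only rel="next"; the last page gets only rel="prev". -->
<link rel="prev" href="https://blog.com/chlorine">
<link rel="next" href="https://blog.com/chlorine-3">

<!-- Option B: with a view-all page, each paginated page instead
     canonicalizes to it -->
<link rel="canonical" href="https://blog.com/chlorine">
```

Note these are alternatives, not complements: with next/prev, each page normally keeps a self-referencing canonical rather than pointing at page 1.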
Technical SEO | localwork
Improving SEO Structure of a Page
Our site is an online marketplace for services. Naturally, we have a lot of unique content in the form of:
a) Job Posts
b) Profiles of Service Providers

We also have 2 very important pages:
a) The Job Listing Page
b) The Service Provider Page

The listing pages have very valuable H1 titles, but everything else is duplicate content. To capture the keywords currently in those H1s, we have created a different landing page for each category page, and we'll optimize around that, so these H1s are not that big of a deal any more. These landing pages are the key to our SEO strategy and we are building new content every day to help them rank.

I want to make the listing pages noindex, follow. This way they pass juice to the jobs and profiles, which have unique content but are not indexed themselves. Is this a bad idea? I have been thinking about doing this for over a year, but it never felt important enough to be worth the risk of accidentally screwing something up. We'll soon do a new on-page flow optimization, and that's why I am considering this again.

Thank you so much in advance,
Argyris

Technical SEO | Ideas2life
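The noindex, follow idea described in the question amounts to one robots meta tag in the `<head>` of each listing page — a minimal sketch:

```html
<!-- Keeps the listing page out of the index while still letting
     crawlers follow its links to the job and profile pages -->
<meta name="robots" content="noindex, follow">
```

One caveat worth knowing: pages that stay noindexed for a long time may eventually have their links treated like nofollow by Google, so the "pass juice while staying unindexed" effect should not be assumed to be permanent.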
Secure and non-secure Schema.org Markup?
Is it possible to have schema.org itemtypes for both secure and insecure ports? I run a static-ish site made in Jekyll, and am implementing Schema.org on the individual pages. As a result, I'm trying to use the following: This doesn't validate with Google's Rich Snippet Tool. It doesn't register the Items as existing. Is there a good way to implement Schema.org in a static page hosted on both SSL and non-SSL ports?
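One likely culprit here (assuming the stripped-out snippet used protocol-relative itemtype URLs such as `//schema.org/...` to work on both ports): Google's validator at the time only recognized fully-qualified schema.org itemtypes. The conventional fix is to keep the `http://schema.org` itemtype even on pages served over SSL — the itemtype is an identifier, not a fetched resource — as in this sketch:

```html
<!-- Works on both HTTP and HTTPS versions of the page; the itemtype
     URL is a vocabulary identifier and is never requested by the browser -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Widget</span>
</div>
```

If the markup still fails validation with fully-qualified itemtypes, the problem is elsewhere in the snippet rather than in the SSL/non-SSL split.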
Technical SEO | RoxBrock
Moved a site and changed URL structures: Looking for help with pay
Hi Gents and Ladies,

Before I get started, here is the website in question: www.moldinspectiontesting.ca. I apologize in advance if I miss any important or necessary details. This might actually read like several disjointed thoughts; it is very late where I am and I am very exhausted. Now on to this monster of a post.

**The background story:** My programmer and I recently moved the website from a standalone CMS to WordPress. The owners of the site/company were having major issues with their old SEO/designer at the time. They felt very abused and taken by this person (which I agree they were, financially, emotionally and more). They wanted to wash their hands of the old SEO/designer completely. They sought someone out to do a minor redesign (the old site did look very dated) and transfer all of their copy as affordably as possible. We took the job on. I have my own strengths with SEO, but on this one I am a little out of my element. Read on to find out what that is.

**Here are some of the issues, what we did and a little more history:** The old site had a terribly unclean URL structure, as most of it was machine-generated. The owners would make changes to one central location/page and the old CMS would then generate hundreds of service area pages that used long, parameter-heavy URLs (along with duplicate content). We could not duplicate this URL structure during the transfer and went with a simple, clean structure. Here is an example of how we modified the URLs:

Old: http://www.moldinspectiontesting.ca/service_area/index.cfm?for=Greater Toronto Area
New: http://www.moldinspectiontesting.ca/toronto

My programmer took to writing 301 redirects and URL rewrites (.htaccess) for all their service area pages (which tally in the hundreds). As I hinted above, the site also suffers from an overwhelming amount of duplicate copy, which we are very slowly modifying so that it becomes unique. It's also currently suffering from a tremendous amount of keyword cannibalization. This is also a result of the old SEO's work, which we had to transfer without fixing first (a hosting renewal deadline with the old SEO/designer forced us to get the site up and running in a very short window). We are working on both of these issues now.

SERPs have been swinging violently since the transfer, and understandably so; changes have cause and effect. I am a bit perplexed, though. Pages are indexed one day and ranking very well locally, and then apparently de-indexed the next. It might be worth noting that they had some de-indexing problems in the months prior to meeting us; I suspect this was in large part due to the duplicate copy. The ranking pages (on a URL basis) are also changing up: we will see a clean URL rank and then drop one week, and then an unclean version rank and drop off the next (for the same city, same web search). Sometimes they rank alongside each other. The terms they want to rank for are very easy to rank on because they are so geographically targeted; the competition is slim in many cases. This time last year, they were having one of the best years in the company's 20+ year history (prior to being de-indexed).

**On to the questions:** What should we do to reduce the loss in these ranked pages? With the actions we took, can I expect the old unclean URLs to drop off over time and the clean URLs to pick up the ranks? Where would you start in helping this site? Is there anything obvious we have missed? I planned on starting with new keyword research to diversify what they rank on, followed by fresh copy across the board.

If you are well versed in this type of problem/situation (URL changes, index/de-index status, analyzing these things, etc.), I would love to pick your brain or even bring you on board to work with us (paid).
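For the redirect step described above, a minimal .htaccess sketch for the example URL pair in the question (the pattern here is illustrative; the live query string arrives URL-encoded, and there would be one such rule per city):

```apache
RewriteEngine On

# Old:  /service_area/index.cfm?for=Greater%20Toronto%20Area
# New:  /toronto
# Match the space in its encoded (%20), plus-sign, and literal forms.
RewriteCond %{QUERY_STRING} ^for=Greater(%20|\+| )Toronto(%20|\+| )Area$ [NC]
RewriteRule ^service_area/index\.cfm$ /toronto? [R=301,L]
```

The trailing `?` on the target strips the old query string from the redirect (on Apache 2.4+ the `QSD` flag does the same). If the old and new URLs both keep ranking intermittently, it is worth confirming each old URL actually returns a 301 rather than a 200 or 302.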
Technical SEO | mattylac
SEOMOZ and Webmaster Tools showing Different Page Index Results
I am promoting a jewelry e-commerce website. The website has about 600 pages, and the SEOmoz page index report shows this number. However, Webmaster Tools shows about 100,000 indexed pages. I have no idea why this is happening, and I am sure it is hurting the page rankings in Google. Any ideas? Thanks, Guy
Technical SEO | ciznerguy
Is it worth adding schema markup to articles?
I know things like location, pagination, breadcrumbs, video, products etc. have value in using schema markup. What about things like articles, though? Is it worth all the work involved in having the pages marked up automatically? How does this affect SEO, and is it worthwhile? Thanks, Spencer
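For scale, "all the work involved" for articles is usually small if the CMS can template it — one JSON-LD block per post, populated from fields the CMS already has. A sketch with hypothetical values:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Example Article Title",
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  },
  "datePublished": "2015-01-01",
  "image": "https://example.com/images/example-article.jpg"
}
</script>
```

Since the headline, author, date, and image already exist as post metadata in most CMSs, marking articles up automatically is typically a one-time template change rather than per-page work.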
Technical SEO | MarloSchneider