Structured Data Markup Helper in Webmasters
-
Structured Data Markup Helper in Webmasters. Once I set up an Article or a Movie with star ratings etc., should I replace the whole of the code on my page with the code generated here?
Is there some other way?
-
Hi Oliver, you've received some great responses to your question. Did any of them help answer your question? Let us know, thanks!
Christy
-
I wouldn't remove the existing code from the page. In my experience the Structured Data Markup Helper can miss certain pages and some of the markup on a page, as its coverage is not 100%. By keeping your existing code in place you make sure everything stays covered.
-
Ideally, just take the snippet of code from the output that's highlighted (whatever structured data you needed) and edit the code on your site. What CMS are you working with? Are you building from HTML, or are you using a CMS platform like WordPress/Drupal/Joomla? Without knowing your site's limitations it's hard to answer this question.
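For example, if you were marking up a movie with star ratings, the highlighted snippet you copy into your page's HTML might look something like the sketch below. This is purely illustrative; the movie name, rating values, and counts are placeholders, not something the tool generated for your page.

<!-- Hypothetical example only: Movie markup with an aggregate star rating. -->
<!-- All names and values below are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Movie",
  "name": "Example Movie",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.5",
    "bestRating": "5",
    "ratingCount": "120"
  }
}
</script>

You would paste a snippet like this into the existing page template rather than replacing the page's code wholesale.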
Related Questions
-
Schema Markup Warning "Missing field "url" (optional)"
Hello Moz Team, I hope everyone is doing well. I need a bit of help regarding schema markup. I am facing an issue with the schema markup on my blog posts specifically: in the majority of my posts I find the warning "Missing field "url" (optional)".
Technical SEO | | JoeySolicitor
As this schema is generated by the Yoast plugin, I haven't applied any custom steps. Recently I published a post, https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/, and I tested it on two schema testing platforms:
1. Validator.Schema.org
2. Search.google.com/test/rich-results
The validator shows no errors (see attachment "Schema without error.PNG"), whereas the Google rich results test gives me the warning "Missing field "url" (optional)" (see attachment "Schema with error.PNG"). Is this really going to be an issue for my ranking? Please help, thanks!
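For what it's worth, warnings about optional fields generally don't block rich results, and this one usually clears once the BlogPosting node carries a url value. A minimal, hypothetical sketch of the relevant piece, using the post URL from the question (the headline is a placeholder, and with Yoast this would normally be adjusted through the plugin rather than hand-edited):

<!-- Hypothetical sketch: adding the optional "url" field to the BlogPosting node. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "Example headline (placeholder)",
  "mainEntityOfPage": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/",
  "url": "https://dailycontributors.com/kisscartoon-alternatives-and-complete-review/"
}
</script>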
Structured Data Mark Up Helper 404?
Whenever I put our URL into the Markup Helper, it returns a 404 Not Found.
Technical SEO | | RayflexGroup
I've tried this for different pages and different categories, and it all returns the same "404 not found". I also tried other websites to see if it was an issue with the Markup Helper, but everything returned fine.
Has anyone else had this issue, or does anyone know how to resolve it?
How could you make a URL/Breadcrumb structure appear different in Google than when you click into site?
I'm seeing a competitor make their URL/breadcrumb structure appear different in Google than it is on the site. Google shows a 3-4 category silo for the page, but once clicked, the page sits right off the root. How could you do this?
Technical SEO | | TicketCity
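One mechanism that can produce this effect (and may or may not be what this competitor is doing) is breadcrumb structured data: Google can show the marked-up breadcrumb trail in the snippet in place of the literal URL path, even when the page itself sits right off the root. A hedged sketch, with placeholder category names and URLs:

<!-- Hypothetical BreadcrumbList markup; names and URLs are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1, "name": "Tickets", "item": "https://www.example.com/tickets/" },
    { "@type": "ListItem", "position": 2, "name": "Sports", "item": "https://www.example.com/tickets/sports/" },
    { "@type": "ListItem", "position": 3, "name": "Example Event", "item": "https://www.example.com/example-event/" }
  ]
}
</script>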
Google Structured Data Problem
Hello everyone, About 1-2 weeks ago I implemented rich snippets (microdata) for the product pages of my e-commerce site. However, in Webmaster Tools, Google is saying that the crawlers did not detect any structured data on my site. I have also checked my pages using the Structured Data Testing Tool. You can see an example test result at the following address: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.tarzimon.com%2Fproduct%2Fnaif-tasarim-torr-aydinlatma-1031 What may be causing this problem? Thank you for your help.
Technical SEO | | hknkynr
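Since the question mentions microdata on product pages, a minimal, hypothetical example of Product microdata that the testing tool should pick up is sketched below (all names and values are placeholders). Comparing your template against something this small can help isolate whether the itemscope/itemprop attributes are being stripped or altered by the CMS.

<!-- Hypothetical minimal Product microdata; name, price, and currency are placeholders. -->
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Example Lamp</span>
  <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
    <span itemprop="price">199.00</span>
    <meta itemprop="priceCurrency" content="TRY" />
  </div>
</div>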
Moved a site and changed URL structures: Looking for help with pay
Hi Gents and Ladies. Before I get started, here is the website in question: www.moldinspectiontesting.ca. I apologize in advance if I miss any important or necessary details, and this might actually read like several disjointed thoughts. It is very late where I am and I am very exhausted. Now on to this monster of a post.
**The background story:** My programmer and I recently moved the website from a standalone CMS to WordPress. The owners of the site/company were having major issues with their old SEO/designer at the time. They felt very abused and taken by this person (which I agree they were, financially, emotionally and more). They wanted to wash their hands of the old SEO/designer completely. They sought someone out to do a minor redesign (the old site did look very dated) and transfer all of their copy as affordably as possible. We took the job on. I have my own strengths with SEO, but on this one I am a little out of my element. Read on to find out what that is.
**Here are some of the issues, what we did and a little more history:** The old site had a terribly unclean URL structure, as most of it was machine-written. The owners would make changes to one central location/page and the old CMS would then generate hundreds of service-area pages that used long, parameter-heavy URLs (along with duplicate content). We could not duplicate this URL structure during the transfer and went with a simple, clean structure. Here is an example of how we modified the URLs:
Old: http://www.moldinspectiontesting.ca/service_area/index.cfm?for=Greater Toronto Area
New: http://www.moldinspectiontesting.ca/toronto
My programmer took to writing 301 redirects and URL rewrites (.htaccess) for all their service-area pages (which tally in the hundreds). As I hinted above, the site also suffers from an overwhelming amount of duplicate copy, which we are very slowly modifying so that it becomes unique. It's also currently suffering from a tremendous amount of keyword cannibalization. This is also a result of the old SEO's work, which we had to transfer without fixing first (a hosting renewal deadline with the old SEO/designer forced us to get the site up and running in a very short window). We are currently working on both of these issues now.
SERPs have been swinging violently since the transfer, and understandably so; changes have cause and effect. I am a bit perplexed, though. Pages are indexed one day and ranking very well locally, and then apparently de-indexed the next. It might be worth noting that they had some de-indexing problems in the months prior to meeting us; I suspect this was due in large part to the duplicate copy. The ranking pages (on a URL basis) are also changing up. We will see a clean URL rank and then drop one week, and then an unclean version rank and drop off the next (for the same city, same web search). Sometimes they rank alongside each other. The terms they want to rank for are very easy to rank on because they are so geographically targeted; the competition is slim in many cases. This time last year, they were having one of the best years in the company's 20+ year history (prior to being de-indexed).
**On to the questions:** What should we do to reduce the loss in these ranked pages? With the actions we took, can I expect the old unclean URLs to drop off over time and the clean URLs to pick up the ranks? Where would you start in helping this site? Is there anything obvious we have missed? I planned on starting with new keyword research to diversify what they rank on and then following that up with fresh copy across the board. If you are well versed in this type of problem/situation (URL changes, index/de-index status, analyzing these things, etc.), I would love to pick your brain or even bring you on board to work with us (paid).
Technical SEO | | mattylac
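For readers in a similar spot, the kind of .htaccess rule described above might look roughly like the sketch below. This is an assumption-laden example, not the site's actual configuration: the city-to-path mapping (e.g. the "Greater Toronto Area" query value to /toronto) has to be listed explicitly per city, and the exact encoding of spaces in the old query strings may differ.

# Hypothetical sketch of one 301 rule; real parameter encoding may vary.
RewriteEngine On
RewriteCond %{QUERY_STRING} ^for=Greater(%20|\+)Toronto(%20|\+)Area$ [NC]
RewriteRule ^service_area/index\.cfm$ /toronto? [R=301,L]
# The trailing "?" drops the old query string from the redirect target.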
How to remove crawl errors in google webmaster tools
In my Webmaster Tools account it says that I have almost 8,000 crawl errors, most of which are HTTP 403 errors. The URLs are:
http://legendzelda.net/forums/index.php?app=members&section=friends&module=profile&do=remove&member_id=224
http://legendzelda.net/forums/index.php?app=core&module=attach&section=attach&attach_rel_module=post&attach_id=166
and similar URLs. I recently blocked crawl access to my members folder to remove duplicate errors, but I'm not sure how I can block access to these kinds of URLs, since it's not really a folder thing. Any idea how to?
Technical SEO | | NoahGlaser78
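If the goal is simply to keep Googlebot away from those module URLs, robots.txt rules can match on the query string as a prefix (and Google also honours * wildcards), so something along these lines might work. This is a hypothetical sketch based on the two example URLs above; blocked URLs stop generating new crawl errors, but the existing entries in the report will only age out over time.

# Hypothetical robots.txt rules based on the example URLs above.
User-agent: *
Disallow: /forums/index.php?app=members&section=friends&module=profile&do=remove
Disallow: /forums/index.php?app=core&module=attach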
What is the best way to change your site's folder structure?
Hi, our site was originally created with a very flat folder structure; most of the pages are at the top level. Because we will be adding more content, I want to tidy up the structure first. I just wanted to check what the best way to go about this was. Is it best to:
1. First configure all the new 301 redirects to point to the new pages, while leaving the actual links on our site pointing to the old pages, then change the links on the site after a few weeks; or
2. Configure the redirects and change the actual links on my website at the same time to point to the new locations?
My thinking is that if I go with option 1, I will give Google a chance to process all the redirects and change the locations in its index before I start pointing them to the new locations. But does it make any difference? What is the best way to go about making this sort of change to minimize any loss in rankings, PageRank, etc.? Thanks for the help.
Technical SEO | | Maximise
Are (ultra) flat site structures better for SEO?
Noticed that a high-profile site uses a very flat structure for their content. It essentially places most landing pages right under the root domain folder. So a more conventional site might use this structure:
www.widgets.com/landing-page-1/
www.widgets.com/landing-page-1/landing-page-2/
www.widgets.com/landing-page-1/landing-page-2/landing-page-3/
The site in question, a successful one, would deploy the same content like this:
www.widgets.com/landing-page-1/
www.widgets.com/landing-page-2/
www.widgets.com/landing-page-3/
So when you're clicking deeper into the nav options, the clicks always roll up to the "top level." Top-level pages are given more weight by search engines, but conventional directory structures are also seen as beneficial, even ideal. Why would a site take the plunge and organize content in this way? What was the clincher?
Technical SEO | | DisneyFamily