Using PURL.org/GoodRelations for Schema Markup
-
Hello awesome MOZ community!
Our agency uses JSON-LD for our local business schema markup. We validate our markup using Google's Structured Data Testing Tool. All good!
Recently, I discovered a competing agency using JSON-LD markup similar to ours (that's ok) plus "http://purl.org/goodrelations" markup.
The latter appears to be, potentially, black hat SEO. Why? According to Moz, "there is no conclusive evidence that this markup improves rankings."
BUT the purl.org markup has provided an opportunity for "keyword stuffing": using it, the agency has stuffed the same keyword 66 times into the validated markup.
I would love to get feedback from the Moz community. Can schema markup, of any kind, be used to "keyword stuff"? If so, why aren't sites getting penalized for this? Is this practice flying under the elusive algorithm radars?
Thanks! Your feedback, insight, and snarky remarks are welcome.
Cheers!
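For anyone comparing the two formats, here is a minimal sketch of a JSON-LD LocalBusiness block, built in Python so it can be serialized and inspected. The business name, address, and phone number are placeholders invented for illustration, not markup from either agency; the crude `keyword_count` helper just shows how one might spot the kind of repetition described above.

```python
import json

# Hypothetical minimal LocalBusiness markup; every value below is a
# placeholder, not real data from either agency.
local_business = {
    "@context": "http://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Plumbing Co.",
    "url": "http://www.example.com",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main St",
        "addressLocality": "Anytown",
        "addressRegion": "WA",
        "postalCode": "98101",
    },
    "telephone": "+1-555-555-0100",
}

jsonld = json.dumps(local_business, indent=2)

def keyword_count(markup: str, keyword: str) -> int:
    """Crude stuffing check: count how often a phrase repeats
    in the serialized markup (case-insensitive)."""
    return markup.lower().count(keyword.lower())

print(jsonld)
print(keyword_count(jsonld, "plumbing"))  # a sane page repeats a keyword once or twice
```

A validator will happily pass markup like this even if a property value repeats a keyword dozens of times, which is why "validated" and "non-spammy" are not the same thing.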
Related Questions
-
What do you think of this "SEO software" that uses Rand's "proven method" ?
I saw an ad on Search Engine Roundtable and the call to action was... "What is the #1 metric that Google uses to rank websites?" I thought, "I gotta know that!" (I usually don't click ads, but this one tempted me.) So I clicked in and saw a method "proven by Rand Fishkin" that will "boost the rankings of your website". This company has software that will use Rand's proven method (plus data from another unattributed test) to boost the rankings of your website. I am not going to use this software. The video made my BS meter ring. But if you want to see it.... http://crowdsearch.me/special-backdoor/ Rather than use this "software", I would suggest using kickass title tags that deliver the searcher to kickass content. That has worked really well for me for years. Great title tags and great content will produce the same results. The bonus for you is that the great content will give you a real website.
White Hat / Black Hat SEO | | EGOL1 -
I have deleted a couple of posts from my blog (I'm using WordPress), but they are still showing in search. How do I delete them?
Hey Guys, So I deleted a couple of pages from my blog, and when I search the keyword they are still showing. Do you guys have any idea how I can completely delete these from the search? Here is the page http://bit.ly/1cRR4qJ
White Hat / Black Hat SEO | | edward-may0 -
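A common fix for the situation above is to serve "410 Gone" (rather than a soft 404 or a redirect) for the deleted URLs, then request removal in Google's tools; search engines tend to drop a 410 faster than a 404. A minimal sketch of the idea in Python, where the list of deleted slugs is invented for illustration (in WordPress itself you would typically do this with a redirect plugin or server rewrite rules):

```python
# Hypothetical set of removed post slugs; in practice, list the
# actual permalinks of the deleted pages.
DELETED_SLUGS = {"/old-post-1/", "/old-post-2/"}

def status_for(path: str) -> int:
    """Return 410 Gone for deleted posts so search engines drop them
    quickly; every other path is served normally (200)."""
    return 410 if path in DELETED_SLUGS else 200

print(status_for("/old-post-1/"))  # deleted post
print(status_for("/about/"))       # live page
```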
Controlling crawl speed/delay through dynamic server-code and 503's
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS has a solution to centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on ranking, which is even worse than indexing latency. I'm curious about your expert opinions...
White Hat / Black Hat SEO | | internetwerkNU1 -
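The dynamic-503 idea the poster describes can be sketched as a small throttling function. This is only an illustration: the load thresholds, the bot signature list, and the linear ramp are all invented here, and a real implementation would sit in the web server or CMS front controller and read the actual machine load average.

```python
import random

# Hypothetical thresholds; tune to your own server.
SOFT_LOAD = 2.0   # start throttling bots above this load average
HARD_LOAD = 6.0   # reject (almost) all bot requests above this

# Illustrative signature list; extend with whatever bots hit your server.
BOT_SIGNATURES = ("googlebot", "bingbot", "ahrefsbot")

def is_bot(user_agent: str) -> bool:
    ua = user_agent.lower()
    return any(sig in ua for sig in BOT_SIGNATURES)

def bot_response_status(user_agent: str, load_avg: float) -> int:
    """Return 503 for a load-dependent fraction of bot requests, 200 otherwise.

    User traffic is never throttled; bot traffic is throttled progressively
    as the server's load average climbs from SOFT_LOAD toward HARD_LOAD.
    """
    if not is_bot(user_agent) or load_avg <= SOFT_LOAD:
        return 200
    # Fraction of bot requests to reject, ramping linearly from 0 to 1.
    reject_fraction = min(1.0, (load_avg - SOFT_LOAD) / (HARD_LOAD - SOFT_LOAD))
    return 503 if random.random() < reject_fraction else 200
```

When serving the 503, it is worth also sending a `Retry-After` header so well-behaved crawlers know when to come back; as the poster notes, prolonged 503s can delay indexing, so this only makes sense as a safety valve under genuine load.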
Is it okay to use eLocal services?
Is it okay to use a service like eLocal's 'reach the web' to clean up our company listings on website directories, or is it considered black hat? Our company name and address are inconsistent across many of the website directories, and we want to clean them up fast. eLocal has a service that can do this. I just want to make sure it's not considered bad to have a vendor do it. Thanks!
White Hat / Black Hat SEO | | KristyFord0 -
Strange Pingback/Blog Comment Links
On one of my sites I've noticed some strange links from Google Webmaster Tools recent links feature. They are pingbacks/blog comments but they are using keyword anchor text and linking to my site. I know we are not doing this. Should I be concerned about this possibly being negative SEO? Here's a sample (be careful, shady site)
White Hat / Black Hat SEO | | eyeflow0 -
Competitors using unsavoury methods of link building. How to combat?
A lot of my competitors are using unsavoury, old-fashioned SEO methods to build links and are actually doing really well from it. A few different competitors are buying links in directories, using blog-spam comments and forum posts, and buying links in other places. The problem is, they all seem to be doing very well with it! What I've always been taught is that these methods are out and could actually harm you, yet I haven't seen this happen to my competitors. Should I be using these spammy methods too, or just concentrate on building quality content and high-quality links?
White Hat / Black Hat SEO | | charliedouglas1230 -
Advice on using the disavow tool to remove hacked website links
Hey Everyone, Back in December our website suffered an attack which created links to other hacked websites with anchor text such as "This is an excellent time to discuss symptoms, fa", "Open to members of the nursing/paramedical profes", "The organs in the female reproductive system incl". The links were only visible when looking at the cache of the page. We got these links removed and removed all traces of the attack, such as pages which were created in their own directory on our server. Three months later I'm finding websites linking to us with anchor text similar to the examples above; however, they're linking to the pages that were created on our server during the attack, and those pages have been removed. So one of my questions is: does this affect our site? We've seen some of our best-performing keywords drop over the last few months, and I have a feeling it's due to these spammy links. Here's a website that links to us: http://www.fashion-game.com/extreme/blog/page-9 . If you view source or look at the cached version, you'll find a link right at the bottom left corner. We have 268 of these links from 200 domains. Contacting these sites to have the links removed would be a very long process, as most of them probably have no idea those links even exist, and I don't have the time to explain to each one how to remove the hacked files. I've been looking at using the Google Disavow tool to solve this problem, but I'm not sure if it's a good idea or not. We haven't had any warnings from Google about our site being spam or having too many spam links, so do we need to use the tool? Any advice would be very much appreciated. Let me know if you require more details about our problem.
White Hat / Black Hat SEO | | blagger
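If the poster does go the disavow route, the file format Google expects is plain text: `#` comment lines plus one URL or `domain:` directive per line. A small sketch that builds such a file from a list of spammy domains; the second domain and the comment wording are placeholders for illustration.

```python
# fashion-game.com comes from the question above; the other domain is a
# placeholder. In practice, export the full list from your link data.
spam_domains = ["fashion-game.com", "example-spam-site.com"]

def build_disavow(domains):
    """Build the plain-text disavow file format: '#' comments and one
    'domain:' directive per line (domain: covers every URL on the host)."""
    lines = ["# Links created by a December hack; site owners unreachable."]
    lines += [f"domain:{d}" for d in domains]
    return "\n".join(lines) + "\n"

print(build_disavow(spam_domains))
```

Using `domain:` rather than individual URLs is the usual choice for hacked-link cleanup, since one compromised site often links from many pages.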
Attracta.com / "weekly submissions to top 100 search engines"
I recently received an offer from Attracta.com because I have a HostGator account. They are offering different levels of service for submitting XML sitemaps on a weekly basis. Is this a good idea? Thanks for your feedback! Will PS see graphic: Screen Shot 2012-02-08 at 10.06.56 PM.png
White Hat / Black Hat SEO | | WillWatrous0