"Ghost" errors on blog structured data?
-
Hi,
I'm working on a blog, and its Search Console account is reporting a large number of errors in its structured data:
But when I run it through https://developers.google.com/structured-data/testing-tool/ it tells me "all is ok":
Any clue?
Thanks in advance,
-
Hi Everett,
Yes it seems that this is the way.
Thanks a lot.
-
Yes it is.
Well, it's a Magento site with a WordPress blog.
Thank you very much
-
Webicultors,
Read this thread on Google's Product Forums. Let us know if it answers your question. If not, at least you're not alone...
Having read several similar threads on various forums and Q&A sites, it appears this is a very common occurrence resulting from a disparity between what the two tools count as an "error". The testing tool seems to be limited to errors in syntax/markup, while GSC may also treat missing elements as errors.
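For instance, a snippet like this hypothetical one (not taken from your site) is syntactically valid, so the testing tool passes it, yet Search Console could still flag it for missing recommended properties such as author, datePublished, or image:

```json
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "headline": "An example post title"
}
```

The syntax is fine, which is all the testing tool checks; the "errors" GSC reports are about fields it expects to see but doesn't find.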
-
Hi,
I've just added a couple more screenshots to illustrate that I've already checked the information you're telling me about.
But the test tool keeps telling me: All OK
Thanks
-
Like Kristen mentions, you should be able to see an overview of the Schema.org implementations with the number of errors each has individually. That's why you're not seeing any issues on the one URL you were testing. In the list you should easily be able to identify the pages with problems.
Related Questions
-
We switched the domain from www.blog.domain.com to domain.com/blog.
We moved the blog from www.blog.domain.com to domain.com/blog. This was done so that the backlinks would benefit our main website as well as our blog. Since the move, our organic traffic has dropped sharply and we have lost the value of the old backlinks, even though the old URLs are all being 301-redirected. Kindly suggest changes to bring back the traffic.
Technical SEO | | arun.negi0 -
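For anyone comparing setups: a subdomain-to-subdirectory move is normally handled with one-to-one 301 rules along these lines (an illustrative Apache sketch, not the poster's actual configuration; the hostnames and paths are assumptions):

```apache
# Hypothetical .htaccess served for the old blog.domain.com host.
# Redirects every old blog URL to its new /blog/ path, preserving slugs.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?blog\.domain\.com$ [NC]
RewriteRule ^(.*)$ https://domain.com/blog/$1 [R=301,L]
```

If traffic dropped despite 301s, it's worth confirming the redirects map page-to-page like this, rather than sending every old URL to the blog home page.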
Google Webmaster Tools is saying "Sitemap contains urls which are blocked by robots.txt" after Https move...
Hi Everyone, I really don't see anything wrong with our robots.txt file after our https move that just happened, but Google says all URLs are blocked. The only change I know we need to make is changing the sitemap URL to https. Anything you all see wrong with this robots.txt file?

# This file is to prevent the crawling and indexing of certain parts
# of your site by web crawlers and spiders run by sites like Yahoo!
# and Google. By telling these "robots" where not to go on your site,
# you save bandwidth and server resources.
# This file will be ignored unless it is at the root of your host:
# Used:    http://example.com/robots.txt
# Ignored: http://example.com/site/robots.txt
# For more information about the robots.txt standard, see:
# http://www.robotstxt.org/wc/robots.html
# For syntax checking, see:
# http://www.sxw.org.uk/computing/robots/check.html

# Website Sitemap
Sitemap: http://www.bestpricenutrition.com/sitemap.xml

# Crawlers Setup
User-agent: *

# Allowable Index
Allow: /*?p=
Allow: /index.php/blog/
Allow: /catalog/seo_sitemap/category/

# Directories
Disallow: /404/
Disallow: /app/
Disallow: /cgi-bin/
Disallow: /downloader/
Disallow: /includes/
Disallow: /lib/
Disallow: /magento/
Disallow: /pkginfo/
Disallow: /report/
Disallow: /stats/
Disallow: /var/

# Paths (clean URLs)
Disallow: /index.php/
Disallow: /catalog/product_compare/
Disallow: /catalog/category/view/
Disallow: /catalog/product/view/
Disallow: /catalogsearch/
Disallow: /checkout/
Disallow: /control/
Disallow: /contacts/
Disallow: /customer/
Disallow: /customize/
Disallow: /newsletter/
Disallow: /poll/
Disallow: /review/
Disallow: /sendfriend/
Disallow: /tag/
Disallow: /wishlist/
Disallow: /aitmanufacturers/index/view/
Disallow: /blog/tag/
Disallow: /advancedreviews/abuse/reportajax/
Disallow: /advancedreviews/ajaxproduct/
Disallow: /advancedreviews/proscons/checkbyproscons/
Disallow: /catalog/product/gallery/
Disallow: /productquestions/index/ajaxform/

# Files
Disallow: /cron.php
Disallow: /cron.sh
Disallow: /error_log
Disallow: /install.php
Disallow: /LICENSE.html
Disallow: /LICENSE.txt
Disallow: /LICENSE_AFL.txt
Disallow: /STATUS.txt

# Paths (no clean URLs)
Disallow: /*.php$
Disallow: /*?SID=
Disallow: /*?cat=
Disallow: /*?price=
Disallow: /*?flavor=
Disallow: /*?dir=
Disallow: /*?mode=
Disallow: /*?list=
Disallow: /*?limit=5
Disallow: /*?limit=10
Disallow: /*?limit=15
Disallow: /*?limit=20
Disallow: /*?limit=25
Technical SEO | | vetofunk0 -
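If it helps anyone debugging the same thing: you can sanity-check whether specific sitemap URLs are blocked by a robots.txt without waiting for Search Console, e.g. with Python's standard library. A rough sketch — the rules and URLs below are made-up stand-ins for the file above, and note that Python's parser only does simple prefix matching, not Google's full wildcard handling:

```python
import urllib.robotparser

# Minimal sketch: check a few sample sitemap URLs against robots.txt
# rules. The rules and URLs here are a small hypothetical subset,
# not the full file from the question.
ROBOTS_TXT = """\
User-agent: *
Allow: /index.php/blog/
Disallow: /index.php/
Disallow: /checkout/
Disallow: /customer/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

sitemap_urls = [
    "https://www.example.com/some-product.html",
    "https://www.example.com/index.php/blog/a-post/",
    "https://www.example.com/checkout/cart/",
]

for url in sitemap_urls:
    verdict = "ok" if parser.can_fetch("*", url) else "BLOCKED"
    print(f"{verdict:8} {url}")
```

If URLs that should be indexable come back BLOCKED, the problem really is in the file; if they all come back ok, the GSC error is more likely stale or about the protocol/property mismatch after the https move.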
How should I deal with "duplicate" content in an Equipment Database?
The Moz Crawler is identifying hundreds of instances of duplicate content on my site in our equipment database. The database is similar in functionality to a site like autotrader.com: we post equipment with pictures, and our customers can look at the equipment and make purchasing decisions. The problem is that, though each unit is unique, units often have similar or identical specs, which is why Moz (and presumably Google/Bing) identifies the content as "duplicate". In many cases, the only differences between listings are the pictures and mileage; the specifications and year are the same. Ideally, we wouldn't want to exclude these pages from being indexed because they could have some long-tail search value. But, obviously, we don't want to hurt the overall SEO of the site. Any advice would be appreciated.
Technical SEO | | DohenyDrones0 -
Structured Data verses Data Highlighter Tool
Hello, I would like to implement structured data on a site, and I am a newbie at this. (I know, hard to believe considering it's 2015.) I'm slightly confused about why I would use structured data if I can just tag the different elements with the Data Highlighter tool in Google Webmaster Tools. Should I focus on one or the other? The sites I would be working on would be considered a "local business" and have the basics (i.e. phone, address, hours, etc.).
Technical SEO | | sumoleap0 -
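For context, the structured data route means putting markup like this on the page itself (a hypothetical JSON-LD example — swap in the real business details), whereas the Data Highlighter only tells Google how to read the existing HTML and helps no other search engine:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Business",
  "telephone": "+1-555-555-0100",
  "openingHours": "Mo-Fr 09:00-17:00",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "123 Example St",
    "addressLocality": "Anytown",
    "addressRegion": "CA",
    "postalCode": "90210"
  }
}
```

Markup in the page travels with the site and works across search engines; Highlighter mappings live only in that one Google Webmaster Tools account.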
Duplicate title/content errors for blog archives
Hi All. Would love some help; I'm fairly new at SEO and using SEOMoz. I've looked through the forums and have just managed to confuse myself. I have a customer with a lot of duplicate page title/content errors in SEOMoz. It's an Umbraco CMS, and a lot of the errors appear to be blog archives and pagination, i.e.:
http://example.com/blog
http://example.com/blog/
http://example.com/blog/?page=1
http://example.com/blog/?page=2
and then also:
http://example.com/blog/2011/08
http://example.com/blog/2011/08?page=1
http://example.com/blog/2011/08?page=2
http://example.com/blog/2011/08?page=3 (empty page)
http://example.com/blog/2011/08?page=4 (empty page)
This continues for different years, months, and blog entries, and creates hundreds of errors. What's the best way to handle this for the SEOMoz report and the search engines? Should I rel=canonical the /blog page? I think this would probably affect the SEO of all the blog entries. Use robots.txt? Sitemaps? URL parameters in the search engines? Appreciate any assistance/recommendations. Thanks in advance, Ian
Technical SEO | | iragless0 -
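One common pattern for archives like these (illustrative tags only, not Umbraco-specific) is to canonicalize the page-1 variants to the clean URL and link the rest of the series together with rel="prev"/"next":

```html
<!-- On http://example.com/blog/?page=1, a duplicate of /blog: -->
<link rel="canonical" href="http://example.com/blog" />

<!-- On http://example.com/blog/?page=2, linking the series together: -->
<link rel="prev" href="http://example.com/blog" />
<link rel="next" href="http://example.com/blog/?page=3" />
```

Canonicalizing every paginated page to /blog itself is the move to avoid, since it tells search engines the deeper pages (and the posts linked only from them) don't matter.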
Rel="external"
Hi all, I got a link from a site and it's marked up with rel="external". Is this a followed or nofollowed link? Does it pass link juice? Thanks
Technical SEO | | Sharer0 -
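For reference, rel is a space-separated list of tokens, and only an explicit nofollow token marks a link as nofollowed; "external" on its own just describes the link target (illustrative markup):

```html
<!-- Followed: "external" says the target is off-site, nothing more. -->
<a href="https://example.com/" rel="external">a followed link</a>

<!-- Nofollowed: the nofollow token has to be present explicitly. -->
<a href="https://example.com/" rel="external nofollow">a nofollowed link</a>
```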
Domain "Forwarded"?
Hi SEOMoz! The company I work for has a website, www.accupos.com, but they also have an old domain which is no longer used, http://accuposretail.com/. These two sites had duplicate content, so I requested that the old site (http://accuposretail.com/) be redirected to accupos.com to eliminate the dupe content. Unfortunately, I don't completely understand what happened, but when they performed this forwarding, the accuposretail.com URL stayed in use: it now displays EXACTLY what accupos.com displays, with the old URL still showing in the address bar. The tech team told me it is forwarded, but I can't help noticing the URL never changes. Is this acceptable? The URL actually has to redirect and change to the accupos.com URL in order not to be duplicate content, correct? I have limited experience in this. Please let me know if we are good to go, or if I need to tell them more action is required. Thanks! Derek M
Technical SEO | | DerekM880 -
Too many links on your blog?
In all of my campaigns, I have a lot of URLs with too many links on the page (defined loosely as around or over 100 links per page); these links are virtually all found on blog pages. The link count shoots up quickly when you start using things like tag clouds and showing all the tags/categories a post is in, in addition to all the cross-linking that's typical of blog posts. My question is: does this matter? Do you work to get blog pages down under that 100-link limit, or just assume most blogs are like this and move along? If you think it does matter, what strategies have you used to cut down the number of links while still keeping popular elements like tag clouds?
Technical SEO | | AdoptionHelp0