JSON-LD parsing error
-
I was wondering if anyone had a non-schema-markup solution for the following. I ran my WordPress homepage HTML (after I implemented schema markup) through the Structured Data Testing Tool and received an error parsing the JSON-LD. The error appears at the beginning of the code, and I know it comes from the slider banner plugin. We are trying to see if there is an alternative to implementing schema within the JS. Also, does anyone know if Google Search will simply skip over the broken code and parse the rest of the HTML? Thank you for your help.
-
You can definitely leave it as it is; it probably won't influence much. It's additional information that could help you in the long run, but it won't harm you.
-
So this source code is from a WordPress site, and the script is for a slider banner. Is it necessary to implement schema for it, or will Google simply skip over it? Is it just that the testing tool itself doesn't know what to make of the code? Are there any long-term metric issues we might run into if we leave it as is? Thanks again for your input.
-
Yes, this makes total sense: the JSON-LD is not in the right place, and the content inside it is far from valid JSON-LD. The script tag is being opened as a JSON-LD block, which should contain only the JSON itself, but in your case there is plain HTML inside it, which would definitely throw a parsing error.
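To illustrate, here is a minimal sketch of the difference (the markup below is hypothetical, not copied from your actual plugin output):

    <!-- Invalid: plain HTML inside a script tag declared as JSON-LD -->
    <script type="application/ld+json">
      <div class="slider-banner">Slide markup here</div>
    </script>

    <!-- Valid: the script tag contains only JSON (placeholder values) -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "WebPage",
      "name": "Homepage"
    }
    </script>

The testing tool fails on the first pattern because the contents of an application/ld+json block are parsed as JSON, and an HTML tag is not valid JSON.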
-
Usually Google will skip over the broken part of your code and still try to read the rest, but I would still see if you could fix the parts that are currently broken. Would you be able to share the URL we're talking about?
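If it helps to pin down which block is the broken one, here is a small sketch you could paste into the browser console on the affected page (a generic check, not anything specific to your plugin or theme):

    // Find every JSON-LD script block on the page and report which ones fail to parse.
    document.querySelectorAll('script[type="application/ld+json"]').forEach(function (el, i) {
      try {
        JSON.parse(el.textContent);
        console.log('JSON-LD block ' + i + ': parses OK');
      } catch (err) {
        console.warn('JSON-LD block ' + i + ': parse error - ' + err.message);
      }
    });

Any block that logs a parse error is the one the testing tool is complaining about, and that should confirm whether it really is the slider plugin's output.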
Related Questions
-
SEO: High intent organic revenue down in Europe
Our team is stumped and we are hoping some of you might have some insight! We are seeing a drop in Europe organic revenue and we can't seem to figure out what the core cause of the problem is. What's interesting is that high-intent traffic is increasing across the business, as is organic-attributed revenue. And in Europe specifically, other channels appear to be doing just fine. This seems to be a Europe high-intent SEO problem.

What we have established:
- Revenue was at a peak in Q4 2017 and Q1 2018
- Revenue dips in mid-to-late Q2 2018 and again in Q4 2018, where it has stayed low since
- Organic traffic has gone up, conversion rate has gone down, purchases have gone down
- Paid search traffic has gone up, conversion rate has gone down slightly, submissions have gone up
- Currency changes are minimal
- We cannot find any site load issues

What we know happened during this time frame (January 2018 onward):
- Updates to the website (homepage layout, some text changes) at the end of April 2018
- GDPR at the end of May 2018
- Google Analytics stops being able to track Firefox

Europe is a key market for us and we can't figure out what might be causing this to happen (again, only in Europe). Beyond GDPR and the changes we've made on our site, is there anything else major that we're missing that could be causing this? Or does anyone have any insights as to where we should look? Thank you in advance!
Algorithm Updates | RS-Marketing0
-
Our Site's Organic Traffic Went Down Significantly After The June Core Algorithm Update, What Can I Do?
After the June Core Algorithm Update, the site suffered a loss of about 30-35% of traffic. My suggestions to try to get traffic back up have been to add metadata (since the majority of our content is lacking it), as well as adding links where possible, adding keywords to image alt attributes, and expanding and adding content where it's thin. I know that from a technical standpoint there are a lot of fixes we can implement, but I do not want to suggest anything as we are onboarding an SEO agency soon. Last week, I saw that traffic for the site went back to "normal" for one day and then saw a dip of 30% the next day. Despite my efforts, traffic has been up and down, but the majority of organic traffic has dipped overall this month. I have been told by my company that I am not doing a good job of getting numbers back up, and have been given a warning stating that I need to increase traffic by 25% by the end of the month and keep it steady, or else. Does anyone have any suggestions? Is it realistic and/or possible to reach that goal?
Algorithm Updates | NBJ_SM2
-
Is using REACT SEO friendly?
Hi guys, is React SEO friendly? Has anyone used React, and what were the results? Or do you recommend something else that is better suited for SEO? Many thanks for your help in advance. Cheers, Martin
Algorithm Updates | martin19700
-
Is anyone else's ranking jumping?
Rankings have been jumping across 3 of our websites since about 24 October. Is anyone seeing similar? For example, a ranking jumps from position 5 to 20 on one day, then back to 5 for 3 days, and then back to 20 for a day. I'm trying to figure out if it's algorithm based or if my rank checker has gone mad. I can't replicate the same results if I search incognito or in a new browser; everything always looks stable in the SERPs if I do the search myself.
Algorithm Updates | Marketing_Today0
-
Google Webmaster Tools shows an error under Manual Actions while there is no error in the Structured Data Testing Tool.
It is showing the error below:

"Spammy structured markup: Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google's Spammy Structured Markup guidelines."

Meanwhile, the Structured Data Testing Tool doesn't show any error.
Algorithm Updates | infinitemlm0
-
I'm Pulling Hairs! - Duplicate Content Issue on 3 Sites
Hi, I'm an SEO intern trying to solve a duplicate content issue on three wine retailer sites. I have read the Moz blog posts and other helpful articles that are flooded with information on how to fix duplicate content. However, I have tried using canonical tags for duplicates and redirects for expiring pages on these sites, and it hasn't fixed the duplicate content problem. My Moz report indicated that we have thousands of duplicate content pages.

I understand that it's a common problem among e-commerce sites, and the way we create landing pages and apply dynamic search results pages kind of conflicts with our SEO progress. Sometimes we'll create landing pages with the same URLs as an older landing page that expired. Unfortunately, I can't get around this problem, since this is how customer marketing and recruitment manage their offers and landing pages. Would it be best to nofollow these expired pages or redirect them?

I also tried self-referencing canonical tags, and canonical tags on search results pages that point to the higher-authority page, and even though that worked for some pages on the site, it didn't work for a lot of the other search results pages. Is there something we can do to these search results pages that will let Google understand that they are original pages?

There are a lot of factors that I can't change, and I'm concerned that the three sites won't rank as well and will drive traffic that won't convert. I understand that Google won't penalize your site for duplicate content unless it's spammy. So if I can't fix these errors (since the company I work for conducts business in a way where we won't ever run out of duplicate content), is it worth moving on to other SEO priorities like keyword research and on/off-page optimization? Or should we really concentrate on fixing these technical issues before doing anything else? I'm curious to know what you think. Thanks!
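For reference, here is roughly what the two canonical approaches being described look like in the page head (the URLs are placeholders, not the actual sites):

    <!-- Self-referencing canonical on a search results page -->
    <link rel="canonical" href="https://www.example.com/search?q=red-wine">

    <!-- Canonical on an expired or duplicate landing page, pointing to the preferred page -->
    <link rel="canonical" href="https://www.example.com/offers/red-wine-sale/">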
Algorithm Updates | drewstorys0
-
Remove spam url errors from search console
My site was hacked some time ago. I've since redesigned it and, obviously, removed all the injection spam. Now I see in Search Console that I'm getting hundreds of URL errors (from the spam links that no longer work). How do I remove them from Search Console? The only option I see is "mark as fixed", but obviously they are not "fixed", rather removed. I've already uploaded a new sitemap and fetched the site, as well as submitted a reconsideration request that has been approved.
Algorithm Updates | rubennunez0
-
Website traffic dropped 50% after 14th November, the same day GWT reported a DNS error
Hi there. On 14th November, GWT reported a DNS error on my site. I checked with my hosts, but they said there was nothing wrong. I then went searching for answers and found it happened to a lot of people on that specific day; see http://moz.com/blog/was-there-a-november-14th-google-update. After that, my website traffic dropped by 50% over the period of a week and is still sitting at 50% of what it was.

I then moved the site to a VPS, had a few DNS errors (caused by my hosts), so I moved back to shared hosting last weekend and now the DNS issues are solved. However, the cause of the original DNS issue is still unknown; I don't know what went wrong and want to rectify the issue.

I don't sell ads, and I write original content 4 times a day. I have a minuscule bounce rate, my site speed is okay, and I don't stuff keywords in my content, although I was careless with my alt tags, so they could be considered keyword stuffing (and I have up to 10 images in one post). I have been removing all the keywords from my image alt tags, but there are over 3,000 posts, so it's taking time. My impressions have dropped from 10,000 a day to 2,500 and I have no idea why.

My website had been building traffic consistently for the last 2 years; only now has it crashed. Do you have any advice on what I can do to solve this problem, or rather, to find the cause of the issue? I'm not a professional SEO and do this blog in my free time, so I am not an expert on data analysis. Thanks a lot.
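For what it's worth, here is a hypothetical before/after of the alt-text cleanup being described (the file name and text are made up):

    <!-- Keyword-stuffed alt text (the kind being removed) -->
    <img src="photo-01.jpg" alt="best blog top blog free tips best tips daily tips">

    <!-- Descriptive alt text -->
    <img src="photo-01.jpg" alt="Sunrise over the valley, taken from the east ridge">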
Algorithm Updates | mutant20080