Data Highlighter doesn't show page
We have an event-related website, http://www.sbo.nl, and I wanted to use Data Highlighter because most of our event pages share the same layout. But Data Highlighter doesn't show those pages; I see only an empty page.
For example: http://www.sbo.nl/veiligheid/brandveiligheid-gebouwen/
Does anyone understand what is going on? Data Highlighter does show the homepage.
I'm wondering whether it might be caused by the tabbed browsing or the chat widget on that page.
You can see a screenshot of Data Highlighter here: http://www.clipular.com/c?6659030=2X2gQv4O8_9RzcZ1Hk_7xGtCPYo&f=d40975c80bdd11dc357f050cafa73a80
I hope someone can help, because I'm lost.
Cheers, Ruud
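If Data Highlighter keeps refusing to load the event pages, one workaround is to skip the tool entirely and add schema.org Event markup to the shared template instead; since the pages all use the same layout, the markup only has to be generated once per event. A minimal sketch in Python that emits a JSON-LD block (the event details here are hypothetical placeholders, not taken from sbo.nl):

```python
import json

# Hypothetical event details -- on the real site these would come from the event record.
event = {
    "@context": "https://schema.org",
    "@type": "Event",
    "name": "Brandveiligheid Gebouwen",
    "startDate": "2014-06-12",
    "location": {
        "@type": "Place",
        "name": "Example Venue",
        "address": "Utrecht, Netherlands",
    },
}

# Wrap the JSON-LD in the script tag that belongs in the page's <head>.
snippet = f'<script type="application/ld+json">\n{json.dumps(event, indent=2)}\n</script>'
print(snippet)
```

Unlike Data Highlighter, markup embedded in the page doesn't depend on Google's tool being able to render it.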
Related Questions
SERP Review Features show on a non-product page?
When reviewing my campaign's SERP Features, I notice that one of my competitors is gaining a lot of Review Features that I'm missing. I'm ranking high for the keywords that are showing the review features, but not with my product page; I'm ranking for those keywords with blogs and other pages. Is there a way to earn those review features with the pages I currently rank, or should I be trying to rank for those keywords with my product page? I appreciate any insight into this situation.
Technical SEO | LearningStuff0
SERPs started showing the incorrect date next to my pages
Hi Moz friends, I've noticed that since Tuesday, November 9, half of my posts' dates have changed in terms of what appears next to the post in the search results. Although they were published this year, some are showing a random date in 2010! (The domain was born in 2013, which makes this even more odd.) This is harming the CTR of my posts, and traffic is decreasing: some posts have gone from 200 hits a day to merely 30. On our end of the website, we have not made any changes to schema markup, rich snippets, etc. We have not edited any post dates. We have actually not added new content since about a week ago, and these incorrect dates just started appearing on Tuesday. The only changes have been updating certain plugins as maintenance. This is occurring on four of our websites now, so it is not specific to one. All websites use WordPress and the Genesis theme. It looks like only half of the posts are showing weird dates we've never seen before (far off from the original published date as well as the last updated date; again, dates like 2010, 2011, and 2012, when none of our websites even existed until 2013). We cannot think of a correlation as to why certain posts show weird dates and others the correct ones. The only related change we can think of is that back in June we changed our posts to show a Last Updated date, to give our readers insight into when we last changed them (since it's evergreen content). Google started to use that date in the SERPs, which was great; it actually increased traffic. I'm hoping it's a glitch and a recrawl may soon sort it out. Does anybody have experience with this? I've noticed Google fluctuates between showing our last updated date and not showing a date at all, seemingly at random. We're super confused here. Thank you in advance!
Technical SEO | smmour2
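One way to start debugging a problem like the one above is to audit exactly which date signals each post exposes, since Google can pick a date from JSON-LD, meta tags, or visible text. A minimal sketch that pulls `datePublished`/`dateModified` out of a page's JSON-LD; the HTML here is a hypothetical stand-in for a fetched post:

```python
import json
import re

# Hypothetical post source -- on a live audit this would be the downloaded HTML.
html = """<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article",
 "datePublished": "2021-03-04", "dateModified": "2021-11-01"}
</script>"""

# Pull every JSON-LD block out of the page and collect the date fields Google could read.
blocks = re.findall(r'<script type="application/ld\+json">(.*?)</script>', html, re.S)
dates = {}
for block in blocks:
    data = json.loads(block)
    for key in ("datePublished", "dateModified"):
        if key in data:
            dates[key] = data[key]
print(dates)
```

If the dates declared in the markup are correct but the SERPs still show 2010-era dates, the mismatch is on Google's side and a recrawl is the likeliest fix.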
Received a notice regarding spammy structured data, but we don't have any structured data... or do we?
Got a message via Webmaster Tools that we have spammy structured data on our site, and we have no idea what they are referring to. We do not use any structured data with schema.org markup. Could they be referring to something else? The message was: To: Webmaster of http://www.lulus.com/, Google has detected structured markup on some of your pages that violates our structured data quality guidelines. In order to ensure quality search results for users, we display rich search results only for content that uses markup that conforms to our quality guidelines. This manual action has been applied to lulus.com/. We suggest that you fix your markup and file a reconsideration request. Once we determine that the markup on the pages is compliant with our guidelines, we will remove this manual action. What could we be showing them that would be interpreted as structured data, or spammy structured data?
Technical SEO | KentH0
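Structured data is broader than schema.org microdata: Google also reads JSON-LD, RDFa, and older microformats (hReview, hCard), and a theme or plugin can inject any of these without the site owner knowing. A quick first check is to scan the page source for the telltale attributes; a rough sketch, using a hypothetical HTML snippet (a real audit should also run the pages through Google's structured data testing tool):

```python
import re

# Hypothetical page source -- in practice, the fetched HTML of a flagged page.
html = """<div itemscope itemtype="https://schema.org/Product">
  <span itemprop="name">Dress</span>
</div>
<div class="hreview"><span class="rating">5</span></div>"""

# Attribute patterns that mark each structured-data format Google recognizes.
signals = {
    "microdata": r'\bitemtype=',
    "json-ld": r'application/ld\+json',
    "RDFa": r'\bvocab=|\btypeof=',
    "microformats": r'class="h?review\b|class="hcard\b',
}
found = [name for name, pattern in signals.items() if re.search(pattern, html)]
print(found)
```

A plugin-injected hReview block, for example, would count as structured data even though no schema.org markup was ever added deliberately.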
Will a Robots.txt 'disallow' of a directory, keep Google from seeing 301 redirects for pages/files within the directory?
Hi! I have a client who had thousands of dynamic php pages indexed by Google that shouldn't have been. He has since blocked these php pages via a robots.txt disallow. Unfortunately, many of those php pages were linked to by high-quality sites multiple times (instead of the static URLs) before he put up the php 'disallow'. If we create 301 redirects for some of these php URLs that are still showing high-value backlinks and send them to the correct static URLs, will Google even see these 301 redirects and pass link value to the proper static URLs? Or will the robots.txt keep Google away, so we lose all these high-quality backlinks? I guess the same question applies if we use the canonical tag instead of the 301: will the robots.txt keep Google from seeing the canonical tags on the php pages? Thanks very much, V
Technical SEO | Voodak0
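The short answer to the question above is that a robots.txt Disallow stops Googlebot from requesting the URL at all, so a 301 (or a canonical tag) placed on a blocked php page is never fetched and can't pass link value; the block has to be lifted for the redirected URLs first. Python's standard robotparser can illustrate the behavior, using hypothetical robots.txt content and URLs:

```python
import urllib.robotparser

# Hypothetical robots.txt blocking the dynamic php directory.
robots_txt = """User-agent: *
Disallow: /dynamic/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(robots_txt.splitlines())

# Googlebot obeys the same rule: the php URL is never fetched,
# so a 301 placed on it is never seen.
print(rp.can_fetch("Googlebot", "http://example.com/dynamic/page.php"))  # False
print(rp.can_fetch("Googlebot", "http://example.com/static-page/"))      # True
```

Lifting the Disallow for just the php URLs that carry backlinks lets Google recrawl them, follow the 301s, and consolidate the link value at the static targets.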
Google Cache can't keep up with my 403s
Hi Mozzers, I hope everyone is well. I'm having a problem with my website and 403 errors shown in Google Webmaster Tools. The problem arises because we "unpublish" one of the thousands of listings on the site every few days; this leaves a URL that returns a 403. At the same time we also run some code that removes any links to these pages. So far so good. Unfortunately Google doesn't notice that we have removed these internal links, and so tries to access these pages again. This results in a 403. These errors show up in Google Webmaster Tools, and when I click on "Linked From" I can verify that there are no links to the 403 page; it's just Google's cache being slow. My question is a) How much is this hurting me? b) Can I fix it? All suggestions welcome, and thanks for any answers!
Technical SEO | HireSpace1
Google showing https:// page in search results but directing to http:// page
We're a bit confused as to why Google shows a secure https:// URL in the results for some of our pages. This includes our homepage. But when you click through, it doesn't take you to the https:// page, just the normal unsecured page. This isn't happening for all of our results; most of our deeper content results are not showing as https://. I thought this might have something to do with Google conducting searches behind secure pages now, but this problem doesn't seem to affect other sites and our competitors. Any ideas as to why this is happening and how we get around it?
Technical SEO | amiraicaew0
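A common cause of the mismatch described above is that Google has discovered https:// URLs (via links or protocol-relative references) while the on-page signals point at http://. One way to audit which scheme each page actually declares is to read its rel=canonical tag; a minimal sketch with Python's standard library, using a hypothetical page source:

```python
import re

# Hypothetical homepage HTML -- the canonical tag tells Google which URL to index.
html = """<head>
<title>Home</title>
<link rel="canonical" href="http://www.example.com/" />
</head>"""

match = re.search(r'<link\s+rel="canonical"\s+href="([^"]+)"', html)
canonical = match.group(1) if match else None
print(canonical)
print(canonical.startswith("https://"))  # False: this page tells Google to prefer http
```

Making the canonical tags and a sitewide 301 both point at a single scheme, whichever one you actually serve, usually resolves the inconsistency.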
Google couldn't access your site because of a DNS error
Hello, We've been doing SEO work for a company for about eight months, and it's been working really well; we've got lots of top-three and first-page rankings. Or rather, we did. Unfortunately the web host the client uses (which we recommended against) has had severe DNS problems. For the last three weeks Google has been unable to access and index the website. I was hoping this would be resolved quickly and everything would return to normal. However, this week their listings have totally dropped: 25 page-one rankings have become none, and Google Webmaster Tools says 'Google couldn't access your site because of a DNS error'. Even searching for their own domain no longer works! Does anyone know how this will affect the site in the long term? Once the host sorts it out, will the rankings bounce back? Is there any sort of strategy for handling this problem? Ideally we'd move host, but I'm not sure that is possible, so any other options, or advice on how it will affect long-term rankings so I can report to my client, would be appreciated. Many thanks, Ric
Technical SEO | BWIRic0
How do I use the Robots.txt "disallow" command properly for folders I don't want indexed?
Today's sitemap webinar made me think about the disallow feature; it seems the opposite of sitemaps, but it also seems both are ignored in varying ways by the engines. I don't need help semantically; I got that part. I just can't seem to find a contemporary answer about what should be blocked using the robots.txt file. For example, I have folders containing site comps for clients that I really don't want showing up in the SERPs. Is it better to not have these folders on the domain at all? There are also security issues I've heard of that make sense: simply look at a site's robots.txt file to see what it is hiding, which makes it easier to hunt for files when you know the directory they are contained in. Do I need to concern myself with this? Another example is a folder I have for my XML sitemap generator. I imagine Google isn't going to try to index this or count it as content, so do I need to add folders like this to the disallow list?
Technical SEO | SpringMountain0
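For the comp and sitemap-generator folders described above, the disallow rules themselves are simple; a sketch with hypothetical folder names:

```
User-agent: *
Disallow: /client-comps/
Disallow: /sitemap-generator/
```

The security concern in the question is real, though: robots.txt is publicly readable, so it advertises the very paths it hides. For genuinely sensitive material like client comps, password protection or an `X-Robots-Tag: noindex` response header is safer than relying on robots.txt alone.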