Publisher is verified but no microdata in search results
-
Any ideas why this is the case?
-
Do you have feedback on my answer? I believe it answers the question.
I think you are confusing local/business results with G+ pages appearing on the right rail.
-
Yes Garden!
I understand that you have already used that, but as I commented above, have you used microdata markup on your website? Please go through that link once.
-
The right-hand side of the SERPs doesn't show company information immediately. Some sites I've worked with, with over 10 million visits a month and a domain authority of 60, didn't get it until recently.
Also note that Google has stated they will not show rich snippets for sites that are new or have thin content. This was announced in recent months, and I think the figure they stated was about 15% fewer rich snippets being shown.
To get there you would need to be really active and interact with G+ users through your G+ page. The more interactions you have on G+ overall, the more likely your page is to appear on the right rail.
-
I think you will find all your answers here: http://www.google.com/webmasters/tools/richsnippets?q=http%3A%2F%2Fwww.convertmedia.com
This page will help too. You should implement microdata markup as Google has described it in detail; you can check it out here: https://support.google.com/webmasters/answer/99170?hl=en
There, Google explains in detail which type of structured markup your website should have and which format you should implement according to your website's niche.
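As an illustration of the kind of markup that documentation covers, a minimal schema.org Organization snippet in microdata format might look like this (the company name and URLs below are placeholders, not taken from this thread):

```html
<!-- Minimal schema.org Organization markup using microdata.
     "Example Company" and the example.com URLs are placeholders. -->
<div itemscope itemtype="http://schema.org/Organization">
  <span itemprop="name">Example Company</span>
  <a itemprop="url" href="http://www.example.com/">Home</a>
  <img itemprop="logo" src="http://www.example.com/logo.png" alt="Example Company logo">
</div>
```

You can then run the marked-up page through the rich snippets testing tool linked above to confirm that Google parses the properties you added.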
Related Questions
-
Page Replication on Search
Hi. We recently created a Christmas category page on our eCommerce website (christowhome.co.uk). Earlier today, I Googled 'Christow Christmas Silhouette Lights' (Christow being the name of our website, and Christmas silhouette lights being one of the sub-categories we recently created). I was curious to see how the page appeared in search. Bizarrely, the page appeared multiple times in the search results (if you click on the link above, it should show you the results). As you can see, multiple meta titles and descriptions have been created for the same page. This is affecting a number of our Christmas category pages, and I don't quite understand why it has happened. We recently added filters to the category; could the filters be responsible? Any idea how I can prevent this from happening, and how I can stop Google indexing these weird replica pages? Many thanks, Dave
Technical SEO | | Davden0 -
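If the duplicates are generated by filter parameters, one common fix (a sketch with a placeholder URL, not specific to christowhome.co.uk's setup) is to point every filtered variant back at the base category page with a canonical link:

```html
<!-- Placed in the <head> of each filtered variant of the category page.
     The href is a hypothetical canonical category URL. -->
<link rel="canonical" href="https://www.example.co.uk/christmas-silhouette-lights/">
```

With this in place, Google consolidates the filtered variants onto the canonical URL instead of indexing each parameter combination separately.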
Search Console has found over 18k 404 errors in my site, should I redirect?
Most of them were old URLs pointed to from a really old domain that we have just shut down. If the pages didn't receive any traffic, should we redirect? If I follow this https://mza.seotoolninja.com/learn/seo/http-status-codes, we shouldn't.
Technical SEO | | pablo_carrara0 -
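If you do decide a subset of the old URLs is worth redirecting (for example, pages that still receive traffic or have backlinks), per-URL 301s in an Apache .htaccess file are one common approach. This is a sketch with placeholder paths and a hypothetical destination domain:

```apache
# Hypothetical examples: 301-redirect only the old URLs that still earn
# traffic or links to their closest equivalent on the current site.
Redirect 301 /old-category/old-page https://www.newdomain.com/new-category/new-page
Redirect 301 /old-blog-post https://www.newdomain.com/blog/new-post

# URLs with no traffic and no links can simply be left to return 404 (or 410).
```

Redirecting everything indiscriminately to the homepage is generally not worthwhile; Google treats bulk irrelevant redirects much like soft 404s.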
Google Search Console Site Map Anomalies (HTTP vs HTTPS)
Hi. I've just done my usual Monday morning review of a client's Google Search Console (previously Webmaster Tools) dashboard and was disturbed to see that for one client the Site Map section is reporting 95 pages submitted yet only 2 indexed (when I looked last week it was reporting an expected level of indexed pages). It says the sitemap was submitted on the 10th March and processed yesterday. However, the 'Index Status' section shows a graph of growing indexed pages up to and including yesterday, when they numbered 112 (so it looks like all pages are indexed after all). Also, the 'Crawl Stats' section shows 186 pages crawled on the 26th.
It then lists sub-sitemaps, all of which are non-HTTPS (http), which seems very strange since the site has been HTTPS for a few months now and the main sitemap index URL is HTTPS: https://www.domain.com/sitemap_index.xml
The sub-sitemaps are:
http://www.domain.com/marketing-sitemap.xml
http://www.domain.com/page-sitemap.xml
http://www.domain.com/post-sitemap.xml
There are no 'Sitemap Errors' reported, but there are 'Index Error' warnings for the above post-sitemap, copied below:
"When we tested a sample of the URLs from your Sitemap, we found that some of the URLs were unreachable. Please check your webserver for possible misconfiguration, as these errors may be caused by a server error (such as a 5xx error) or a network error between Googlebot and your server. All reachable URLs will still be submitted."
Also, for the below sitemap URLs there is the warning "Some URLs listed in this Sitemap have a high response time. This may indicate a problem with your server or with the content of the page" for:
http://domain.com/en/post-sitemap.xml
https://www.domain.com/page-sitemap.xml
https://www.domain.com/post-sitemap.xml
I take it from all the above that the HTTPS sitemap is mainly fine, that despite the reported 0 pages indexed in the GSC sitemap section the pages are in fact indexed as per the main 'Index Status' graph, and that somehow some HTTP sitemap elements have been accidentally attached to the main HTTPS sitemap and are causing these problems.
What's the best way forward to clean up this mess? Resubmitting the HTTPS sitemap sounds like the right option, but seeing as the master URL indexed is an HTTPS URL, I can't see it making any difference until the HTTP aspects are deleted/removed. But how do you do that, or even check that that's what's needed? Or should Google just sort this out eventually? I see the graph in 'Crawl > Sitemaps > WebPages' shows a consistent blue line of submitted pages, but the red line of indexed pages drops to 0 for 3-5 days every 5 days or so: fully indexed pages are reported for 5-day stretches, then zero for a few days, then indexed again for another 5 days, and so on!?
Many thanks, Dan
Technical SEO | | Dan-Lawrence0 -
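For reference, a cleaned-up sitemap index that references only the HTTPS sub-sitemaps would look something like this (domain.com is the placeholder already used in the question):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <sitemap>
    <loc>https://www.domain.com/marketing-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.domain.com/page-sitemap.xml</loc>
  </sitemap>
  <sitemap>
    <loc>https://www.domain.com/post-sitemap.xml</loc>
  </sitemap>
</sitemapindex>
```

Once the http:// entries are corrected at the source (in many CMS sitemap plugins they follow the site URL setting), resubmitting the index in Search Console is harmless and prompts Google to re-fetch the HTTPS versions.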
How is IT handling multi-page search results for this url?
How is the IT team handling multi-page results? The URL is the same, without any parameters, but the content changes. Is this the best way to handle it from an SEO perspective?
Technical SEO | | S.S.N0 -
Searching on root domain words = ranking on > page 10 in SERP
Hello, our website wingmancondoms.com (a new condom brand) is not ranking in Google for the keywords "wingman condom", and I don't know why. In Yahoo and Bing everything is all right. I saw on this forum that it may be best to change my language URLs to wingmancondoms.com/nl, /de and /fr instead of a direct URL like http://www.wingmancondoms.com/wingman-kondome (German translation). But is this our problem, or are there more problems? Google is indexing our page well, no errors, etc. Any other possibilities?
Technical SEO | | jogo0 -
Sitelinks suddenly vanished in Google search
Hello, my site is designzzz.com. If you search in Google for the term "designzzz", it used to return sitelinks below the result, but now it doesn't. What could be the reason? Cheers
Technical SEO | | wickedsunny10 -
Image search and CDNs
Hi, our site has very high domain strength. Although it ranks well for general search phrases, we rank poorly in image search (even though the site has very high-quality images). Our images are hosted on a separate CDN with a different domain. Although there are a number of benefits to doing this, since the images are on a different domain, are we unable to capitalize on our main site's domain strength? Is there any way to associate our CDN with our main site via Google Webmaster Tools? Has anyone researched the impact on search rankings of storing images on a CDN when domain strength is very high? Curious about people's thoughts.
Technical SEO | | NicB10 -
Similar category names result in similar urls and duplicate anchor texts
Hi all, I'm working on an e-commerce website about car tuning and car parts. There are main categories (Aerodynamics, Power Tuning, Interior, Wheels, Tires, etc.), and the products are organized in sub-categories representing the product manufacturer, car manufacturer and car model + modification. Unfortunately this kind of structure creates duplicate sub-category names. For example, we can have parts for the Audi A4 8K in both Aerodynamics and ABT, and at the same time Power Tuning from the same manufacturer and for the same car, or sport brakes for the same car by different manufacturers. So here is how some links look:
/alfa-romeo-147-c1070-en
/alfa-romeo-147-c234-en
/alfa-romeo-147-c399-en
These are totally different categories with the same anchor text and almost the same URLs (the only difference is the category id). Could this be affecting the site's indexation, and what would be a better way to create the internal link structure?
Technical SEO | | mdimov0