Mobile SERP Thumbnail Image Control
-
Is there any way we can control the image that Google selects to show next to our mobile SERP listings? For a few of our results, the thumbnail Google picks is not conducive to a high CTR.
-
Hi Gareth,
Whatever schema.org markup you're using on that page, you should have the option to include an "image" property to define the image. It's hard to get more specific without seeing the page, unfortunately. But maybe check out the Schema.org list of recognized types and see which feels like the best match for the content on this page (although as a rule, category pages are a little tricky - usually you'd be using Product or Article markup on each individual item in the list, so I'm not sure where an image would fit into that).
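For example, a minimal JSON-LD block with an "image" property might look like this (Product is just one possible type, and the name and URLs below are placeholders rather than details from the actual page):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example product name",
  "image": "https://www.example.com/images/preferred-thumbnail.jpg",
  "description": "Placeholder description of the product."
}
</script>
```

Bear in mind that this is only a suggestion to Google - it may still choose a different image from the page.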
-
Hi Bridget,
It's actually remedied now - it was a server loading issue with some images and now it's showing correctly.
It was a category page, and I can't share details I'm afraid as it's a client's site. However, if you had the schema markup for thumbnails in mobile search at category level, that would be really helpful - thanks so much for your time.
B
-
Hi Gareth, you can certainly use Schema.org markup to suggest a thumbnail image for a SERP result - could you share a little more context about the type of result it is? (e.g. a product page, a video thumbnail, etc.) If you're comfortable sharing a screenshot, that would also be helpful.
-
Hi Bridget - sorry to jump in on an old Q&A.
I have a thumbnail image showing in the mobile SERPs which is incorrect. Is there a Schema markup we can add to suggest Google show the product rather than the current image, which is unrelated?
many thanks
b
-
Hi Gareth, I'm not sure which image you're referring to - a video snippet or recipe snippet type image? You should be able to suggest a 'featured image' in the structured data markup for those, but Google is not required to follow that suggestion.
With a bit more specific info I might be able to debug a bit more for you.
Related Questions
-
Is a Link Wheel Safe If I Control the Wheel?
Hi, folks. Our company operates over 50 disease-specific niche websites. Currently, we're building resource/landing pages for some therapies and other related topics. One experimental therapy is being investigated across four different disease types: cystic fibrosis, muscular dystrophy, hemophilia, and cancers. We have sites for all of them, and have created original landing pages for each site. Question: is it safe / does it make sense to "link wheel" these pages, especially since the wheel is composed of all our own sites? The other option of course is to simply interlink all of them, but will I get more visibility with a cyclical linking scheme? I'd love to hear your thoughts on this. Thanks!
White Hat / Black Hat SEO | Michael_Nace1
-
Controlling crawl speed/delay through dynamic server code and 503s
Lately I'm experiencing performance trouble caused by bot traffic. Although Googlebot is not the worst offender (it's mainly bingbot and ahrefsbot), they cause heavy server load from time to time. We run a lot of sites on one server, so heavy traffic on one site impacts other sites' performance. The problem is that 1) I want a centrally managed solution for all sites (per-site administration takes too much time), which 2) takes into account total server load instead of only one site's traffic, and 3) controls overall bot traffic instead of controlling traffic for one bot. IMO, user traffic should always be prioritized higher than bot traffic. I tried "Crawl-delay:" in robots.txt, but Googlebot doesn't support that. Although my custom CMS can centrally manage robots.txt for all sites at once, it is read by bots per site and per bot, so it doesn't solve 2) and 3). I also tried controlling crawl speed through Google Webmaster Tools, which works, but again it only controls Googlebot (and not other bots) and is administered per site. No solution to all three of my problems. Now I've come up with a custom-coded solution to dynamically serve 503 HTTP status codes to a certain portion of the bot traffic. The traffic portion for each bot can be calculated dynamically (at runtime) from total server load at that moment. So if a bot makes too many requests within a certain period (or breaks whatever other coded rule I invent), some requests will be answered with a 503 while others will get content and a 200. The remaining question is: will dynamically serving 503s have a negative impact on SEO? OK, it will delay indexing speed/latency, but slow server response times do in fact have a negative impact on rankings, which is even worse than indexing latency. I'm curious about your expert opinions...
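A rough sketch of the kind of dynamic throttle described above - the bot signatures, window size, and thresholds here are all illustrative assumptions, not a tested implementation, and `server_load` would come from something like `os.getloadavg()` in practice:

```python
import time
from collections import defaultdict, deque

BOT_SIGNATURES = ("bingbot", "ahrefsbot", "googlebot")
WINDOW_SECONDS = 60        # sliding window per bot
MAX_BOT_REQUESTS = 30      # per-bot budget within the window, once busy
LOAD_THRESHOLD = 4.0       # load average above which we start throttling

# Per-bot deque of recent request timestamps.
_request_log = defaultdict(deque)

def status_for_request(user_agent, server_load, now=None):
    """Return 503 for a bot that exceeds its budget while the server
    is under load; return 200 (serve normally) in every other case."""
    now = time.time() if now is None else now
    ua = user_agent.lower()
    bot = next((b for b in BOT_SIGNATURES if b in ua), None)
    if bot is None:
        return 200  # user traffic is always served
    log = _request_log[bot]
    log.append(now)
    # Drop timestamps that have fallen out of the sliding window.
    while log and log[0] < now - WINDOW_SECONDS:
        log.popleft()
    if server_load >= LOAD_THRESHOLD and len(log) > MAX_BOT_REQUESTS:
        return 503  # ask the bot to retry later
    return 200
```

Because the decision is made per request from a server-wide load reading, this addresses points 2) and 3) centrally rather than per site or per bot.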
White Hat / Black Hat SEO | internetwerkNU1
-
Can image links help improve my backlinking profile?
I recently spent some time looking at the backlink profile of a leading UK food & clothing retailer and noticed that a high number of their backlinks for very competitive search phrases consisted entirely of image backlinks. 50% of the links contained no alt text, and the other 50% contained a mix of just the targeted keyword or a phrase containing one mention of the targeted keyword. Has anyone had any experience of this type of marketing producing a positive effect on SEO or search engine rankings?
White Hat / Black Hat SEO | BigJonOne0
-
Solved PayDay hack - but SERPs show URLs - what should I do?
We had the PayDay hack - and solved it completely. The problem is that the SERPs show over 3,000 URLs pointing to 404s on our website, all of which look like this: www.onssi.com/2012/2/post1639/payday-loan-companies-us What should I do? Should I disavow every one of the 3,000? Nofollow them?
White Hat / Black Hat SEO | Ocularis0
-
Spam report duplicate images
Should I file a spam report if a competitor's site has copied my clinical case images and presented them as their own clinical cases? That site also does not have a privacy policy or a medical doctor credited on those images. My site: http://www.propdental.es/carillas-de-porcelana/
White Hat / Black Hat SEO | maestrosonrisas0
-
Why do websites use different URLs for mobile and desktop?
Although Google and Bing have recommended that the same URL be used for serving desktop and mobile websites, portals like Airbnb use different URLs to serve mobile and web users. Does anyone know why this is done even though it is not good for SEO?
White Hat / Black Hat SEO | razasaeed0
-
Mobile SEO best practices : Should my mobile website be located at m.domain.com or domain.com/mobile?
I'd like to know if there's any difference between using m.domain.com/pages and domain.com/mobile/pages for a mobile website. Which one is better, and why? Does Google treat the two differently? As you can see, I'm new to this! This is my first time working on a mobile website, so any links/resources would be highly appreciated. Thanks!
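Whichever structure is chosen, Google's documented setup for separate mobile URLs relies on annotations linking the two versions; a sketch of those annotations looks roughly like this (example.com and the media query breakpoint are placeholders):

```html
<!-- On the desktop URL (e.g. https://www.example.com/page): -->
<link rel="alternate"
      media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">

<!-- On the corresponding mobile URL (https://m.example.com/page): -->
<link rel="canonical" href="https://www.example.com/page">
```

The rel="alternate" tag tells crawlers where the mobile version lives, and the rel="canonical" on the mobile page points ranking signals back to the desktop URL.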
White Hat / Black Hat SEO | GroupeDSI0
-
Impressions in the Google SERPs have declined from 3500 to 1600 after 5-25-2012. Is it Penguin?
It's about the website http://www.apartments-houseboats-amsterdam.com/ Visitors have declined from 270 to 150 per day. Is this caused by the Google Penguin update? If so, what can I do to solve the problem? Thank you for your time and effort,
White Hat / Black Hat SEO | letsbuilditnl0