GoogleBot Mobile & Depagination
-
I am building a new site for a client and we're discussing their inventory section. What I would like to accomplish is to have all their products load on scroll (or swipe on mobile). I have seen suggestions to load all content in the background at once and show it as the user swipes, lazy loading the product images. This will work fine for the user, but what about how GoogleBot Mobile crawls the page?
Will it simulate swiping? Will it load every product at once, killing page load times because of all the images it must fetch? What are considered SEO best practices when loading inventory with this technique?
I worry about this because it's possible for 2,000+ results to be returned, and I don't want GoogleBot to try to load all those results at once (with their product thumbnail images). I know you will say to break those products up into categories, etc., but I want the "swipe for more" experience. 99.9% of our users will click a category or filter the results, but if someone wants to swipe through all 2,000 items on the main inventory landing page, they can. I would rather offer that than "Page 1 of 350".
I like option #4 in this question, but I'm not sure how Google will handle it.
I asked Matt Cutts to answer this, in case you want to upvote the question:
https://www.google.com/moderator/#11/e=adbf4&u=CAIQwYCMnI6opfkj -
What you ideally want to do is set up the mobile site as a standard paginated site, then use JavaScript to call each page dynamically, in an order driven by the user's actions.
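To make the idea concrete, here is a minimal sketch of the trigger logic. It assumes the inventory already exists as normal paginated URLs in a hypothetical `/inventory/page/N` pattern (the post doesn't specify one), with the scroll threshold as a tunable assumption:

```javascript
// Build the URL of the next paginated chunk of inventory.
// The /page/N pattern is an assumption for illustration.
function nextPageUrl(basePath, currentPage) {
  return basePath + '/page/' + (currentPage + 1);
}

// Decide whether the visitor has scrolled close enough to the bottom
// of the document that the next page should be fetched and appended.
function shouldLoadNext(scrollTop, viewportHeight, documentHeight, thresholdPx) {
  return scrollTop + viewportHeight >= documentHeight - thresholdPx;
}
```

In the browser you would call `shouldLoadNext` from a scroll (or touch) handler and, when it returns true, fetch `nextPageUrl(...)` via Ajax and append the results, so each "swipe for more" is really a request for an already indexable page.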
This has two benefits:
-
SEO and SERPs. The pages will be indexed as they should be. If you have one huge page, you are still limited to targeting two or three keywords, as always. A well-built infinite scroll website is not actually one page; it only looks that way because JavaScript calls additional pages at triggers that have been set.
-
Graceful fallback for users without JavaScript (or fall-forward, as the plain paginated site is actually the native state). If you have one huge page that lazy loads with JavaScript and the visitor's browser does not support it, you get 2,000 products' worth of images loading at one time, which is otherwise known as a bounce.
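One common way to get that fallback for free is to always render a plain "next page" link in the pagination markup and have the script read it to know what to fetch; without JavaScript, the link simply works as a link. A hedged sketch, where the `rel="next"` anchor shape is an assumption rather than anything prescribed in the post:

```javascript
// Find the href of the server-rendered "next page" link inside a chunk
// of pagination markup. Returns null when there is no further page,
// which is also the signal to stop the infinite scroll.
function findNextLink(html) {
  const match = html.match(/<a[^>]*rel="next"[^>]*href="([^"]+)"/);
  return match ? match[1] : null;
}
```

The script removes or hides the pagination nav and drives fetching from `findNextLink`; users and crawlers without JavaScript still see and follow the ordinary links.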
Build out the site with no consideration for the infinite scrolling (except in design, e.g. tileable backgrounds for a smooth, non-stop flow), then apply the script once you have a logical site structure with siloed categories. Googlebot, Googlebot Mobile, and users without JavaScript will all have a usable site, and the SERPs will rank pages as they should.
Tip: keep any page-wide bars or graphic styles in the header or footer of the page. You will normally pull only the content or article portion of each page into the infinite scroll, so the site flows without interruption.
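Pulling "only the content portion" can be as simple as extracting one container from each fetched page before appending it, so the repeated header and footer never enter the scroll. A minimal sketch, assuming a hypothetical `id="content"` wrapper (the post names no specific markup):

```javascript
// Given the full HTML of a fetched page, return just the inner HTML of
// the named content container, dropping header, footer, and everything
// else around it. Assumes the container id appears once per page.
function extractContent(pageHtml, containerId) {
  const re = new RegExp('<div id="' + containerId + '">([\\s\\S]*?)</div>');
  const m = pageHtml.match(re);
  return m ? m[1] : '';
}
```

Each Ajax-fetched page goes through `extractContent` and only the product tiles get appended, which is why page-wide bars belong in the header or footer rather than in the content area.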
Hope this helps.
I know you're not using WordPress, but I assume you are using some sort of templated PHP script for a 2,000-product store. The Infinite Scroll plugin (http://www.infinite-scroll.com/) is easy to understand and is what I first used to grasp the concept. Also, if you want to go a more Pinterest-like route, look into the Masonry JavaScript library.