Does Lazy Loading Create Indexing Issues for Products?
-
I have a store with 5,000+ products in one category, and I'm using lazy loading. Does this affect the indexing of these 5,000 products? Google says they index or read a maximum of 1,000 links on one page.
-
Hello Vinay,
Please see Mashable for an example:
http://mashable.com/2013/4/
They have pagination links at the bottom of the page and use lazy loading / infinite scroll.
Adam Sherk has a good post about this:
http://www.adamsherk.com/seo/seo-tips-for-infinite-scrolling/
-
Everett, I got your point: you mean AJAX for users and pagination for spiders. Can you show me one example? That would help me a lot.
Thanks
vinay
-
View the source of the cached page and look toward the bottom. Do all of your products/listings show in the source? If so, you're all good. If not, you may want to add pagination for the spiders, as mentioned above.
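If you want to automate that check, here is a minimal sketch; the `/product/` URL pattern is a hypothetical stand-in for your store's own. It lists the product links present in the raw HTML, i.e. what a crawler sees before any JavaScript runs:

```python
import re

def product_links_in_source(html, pattern=r'href="(/product/[^"]+)"'):
    """Return the unique product URLs present in the raw (pre-JavaScript) HTML."""
    return sorted(set(re.findall(pattern, html)))

# Inline HTML standing in for a fetched page; for a live check you would
# fetch the URL (or the Google cache of it) and pass the response body in.
html = '''
<a href="/product/widget-1">Widget 1</a>
<a href="/product/widget-2">Widget 2</a>
<script>/* lazy loader appends more products here at runtime */</script>
'''
print(product_links_in_source(html))
# Only the two hard-coded links show up; anything appended by the
# lazy loader after page load is missing from this list.
```

If your lazy-loaded products don't appear in this list for the live page but do appear for the cached page, the spiders are rendering more than the raw source shows.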
-
Thanks, Everett Sizemore.
I just checked the cache as you suggested, and I found that lazy loading is also working on the cached page. Does that mean everything is OK?
-
Great suggestion, Everett: "If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results."
-
Hello,
Where did you read that Google only indexes or reads a maximum of 1,000 links on a page? I think this is outdated information. However, it is best practice not to have that many links on a page, even if Google does crawl more than 1,000 links per page.
So to answer your question: yes, if you're loading additional product listings via JavaScript after the user scrolls down, it could cause Google to render only part of the page. However, this often does not cause "indexation issues" for your product pages, because they have other paths into them: sub-categories, related product links, external links, sitemaps, etc.
If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results. That should answer your question directly from Google.
I usually recommend adding pagination links to the page so Google, and users without JavaScript enabled, have a path to more product listings. If you like, you can set those paginated category pages to noindex,follow so they do not get indexed, but Google can still crawl them to find deeper products.
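As a rough sketch of the math, with a hypothetical base URL and page size: 5,000 products at 100 per page need 50 paginated category pages, each linked with a plain HTML anchor so crawlers have a JavaScript-free path, and each (from page 2 onward) carrying the noindex,follow robots meta tag:

```python
import math

def pagination_urls(base_url, total_products, per_page):
    """Build the spider-accessible paginated category URLs."""
    pages = math.ceil(total_products / per_page)
    return [f"{base_url}?page={n}" for n in range(1, pages + 1)]

# Meta tag for the paginated pages so they are crawled for product
# links but kept out of the index:
ROBOTS_META = '<meta name="robots" content="noindex,follow">'

urls = pagination_urls("http://example.com/category", total_products=5000, per_page=100)
print(len(urls))          # 50 paginated pages cover all 5,000 products
print(urls[0], urls[-1])
```

The exact URL scheme doesn't matter to the spiders, as long as the links are ordinary anchors in the HTML rather than loaded by script.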
-
I have never heard of any negative SEO effects from using lazy load. Our shop has nearly 100,000 products and we use lazy load as well. We recently raised the number of indexed pages from 5.5 million to 6.7 million. This is just one example, but to answer your question from my personal point of view: the answer would be no.
Related Questions
-
Can I create a new website to promote just one set of services from a list of several services?
Hi, I have a 10-year-old website where I promote all my services: around 30 of them under 5 main categories. For example, my current website promotes these services: A service, with a1, a2, a3 sub-services; B service, with b1, b2, b3; C service, with c1, c2, c3; D service, with d1, d2, d3; E service, with e1, e2, e3. Now I want to promote just "A service" and its sub-services on a separate website, as that service is in demand now and those keywords should be my main keywords. I want to connect my old website with the new one to increase trust among users. Can I do this? I hope I am not violating any Google rules by doing this. Please help with suggestions. Thanks, Jessi.
White Hat / Black Hat SEO | Sudsat0 -
Without prerender.io, is Google able to render & index geographically dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we had used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render and index geographically dynamic content? I'm assuming no. If no is the answer, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
White Hat / Black Hat SEO | imjonny1231 -
Malicious links on our site indexed by Google but only visible to bots
We've been suffering from some very nasty black hat SEO. In Google's index, our pages show external links to various pharmaceutical websites, but our actual live pages don't show them. It seems as though only certain user agents see the malicious links. Setting up the Screaming Frog SEO crawler with the Googlebot user agent also reveals the malicious links. Any idea what could have caused this or how it can be stopped? We scanned all files on our web server and couldn't find any of the malicious links. We've changed our FTP and CMS passwords; is there anything else we can do? Thanks in advance!
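For reference, this is roughly how we confirmed the cloaking outside Screaming Frog; the URLs below are made-up stand-ins, and the link diffing is a simplified sketch (in practice you would fetch the same page twice, once with a normal User-Agent header and once with the Googlebot string):

```python
import re

GOOGLEBOT_UA = ("Mozilla/5.0 (compatible; Googlebot/2.1; "
                "+http://www.google.com/bot.html)")

def external_links(html):
    """All absolute link targets found in an HTML snapshot."""
    return set(re.findall(r'href="(https?://[^"]+)"', html))

def cloaked_links(browser_html, googlebot_html):
    """Links served only to the Googlebot user agent."""
    return external_links(googlebot_html) - external_links(browser_html)

# Simulated responses for the two user agents:
browser_html = '<a href="https://example.com/about">About</a>'
googlebot_html = ('<a href="https://example.com/about">About</a>'
                  '<a href="https://spam-pharma.example/pills">buy</a>')
print(cloaked_links(browser_html, googlebot_html))
# The pharma link only appears in the Googlebot response.
```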
White Hat / Black Hat SEO | SEO-Bas0 -
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website from the homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567 Here's a sample category page: http://domain.com/city/area Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
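In case it helps, here is roughly how I'm double-checking that the product-ID links sit in the plain category-page HTML (the URL patterns below are simplified stand-ins for ours), since a link that is present before any JavaScript runs should be crawlable:

```python
import re

PRODUCT_ID = re.compile(r'^/[A-Z]{2}\d{7}$')  # e.g. /AB1234567

def crawlable_product_paths(html):
    """Product-ID links present in the static HTML, i.e. reachable
    without JavaScript execution."""
    return [h for h in re.findall(r'href="([^"]+)"', html)
            if PRODUCT_ID.match(h)]

category_html = '''
<a href="/city/area">Area page</a>
<a href="/AB1234567">Listing 1</a>
<a href="/AB1234568">Listing 2</a>
'''
print(crawlable_product_paths(category_html))
# Both product-ID links are in the static HTML, so a non-JS crawler can follow them.
```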
White Hat / Black Hat SEO | esiow20130 -
Loading websites faster
Which are the best plugins for loading a website faster? WP Smush.it + W3 Total Cache + a good theme like Thesis? Is this plugin good to have, or is W3 Total Cache enough: http://wordpress.org/plugins/db-cache-reloaded-fix/ Thanks! Best regards, Sebastian Papp & His Team
White Hat / Black Hat SEO | Sebastyan220 -
Page not being indexed or crawled and no idea why!
Hi everyone, There are a few pages on our website that aren't being indexed on Google right now, and I'm not quite sure why. A little background: we are an IT training and management training company with locations/classrooms around the US. To improve our search rankings and overall visibility, we made some changes to the on-page content, URL structure, etc. Let's take our Washington, DC location as an example. The old address was: http://www2.learningtree.com/htfu/location.aspx?id=uswd44 And the new one is: http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training All of the SEO changes aren't live yet, so just bear with me. My question is really about why the first URL is still being indexed and crawled, showing fine in the search results, while the second one (which we want to show) is not. The changes have been live for around a month now, plenty of time to at least be indexed. In fact, we don't want the first URL to show anymore; we'd like the second URL type to show across the board. Also, when I type into Google site:http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training I get a message that Google can't read the page because of the robots.txt file. But we have no robots.txt file. I've been told by our web guys that the two pages are exactly the same. I was also told that we've put in an order to have all the old links 301 redirected to the new ones. But still, I'm perplexed as to why these pages are not being indexed or crawled; I even manually submitted the URL in Webmaster Tools. So why is Google still recognizing the old URLs, and why are they still showing in the index/search results? And why is Google saying "A description for this result is not available because of this site's robots.txt"? Thanks in advance! Pedram
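One thing I tried, in case it's useful to anyone else: checking what the server actually returns for /robots.txt, since a CMS or proxy can generate one on the fly even when no file exists on disk. This is just a sketch, and the Disallow rule below is a hypothetical example of the kind of rule that would produce the message I'm seeing:

```python
from urllib.robotparser import RobotFileParser

# Paste the body actually returned by the site's /robots.txt URL here.
# This hypothetical rule would block the new URL structure:
robots_body = """
User-agent: *
Disallow: /htfu/
"""

rp = RobotFileParser()
rp.parse(robots_body.splitlines())
path = "http://www2.learningtree.com/htfu/uswd44/reston/it-and-management-training"
print(rp.can_fetch("Googlebot", path))  # False under the rule above
```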
White Hat / Black Hat SEO | CSawatzky0 -
Secretly back-linking from whitelabel product
Let's say a company (provider.com) offers a white-label solution which enables each client to have all of the content on their own domain (product.client.com), with no branding by the content provider. Now let's say that client.com is a site with a lot of authority, and to promote the launch of product.client.com, they put a lot of links from their main site to the subdomain. This can be very valuable link juice, and provider.com would like to be able to take advantage. The problem is that client.com wouldn't like it if provider.com put links on their white-label site. Suppose the following: all pages on product.client.com start to have a rel="canonical" link to themselves with a GET variable (e.g. product.client.com/page.htm -> product.client.com/page.htm?show_extra_link=true). When the page is visited with the extra GET parameter "show_extra_link", a link appears in the footer that points to provider.com. My question is: would this have the same effect for provider.com as placing a link on the non-canonical version of the pages on the white-label site would?
White Hat / Black Hat SEO | seoczar0 -
Indexing search results
One of our competitors indexes all searches performed by users on their site, automatically creating new pages/URLs based on those search terms. Is this a black hat technique? Do search engines specifically forbid this?
White Hat / Black Hat SEO | AEM131