Does Lazy Loading Create Indexing Issues for Products?
-
I have a store with 5,000+ products in one category, and I'm using lazy loading. Does this affect the indexing of these 5,000 products? Google says it indexes or reads a maximum of 1,000 links on one page.
-
Hello Vinay,
Please see Mashable for an example:
http://mashable.com/2013/4/
They have pagination links at the bottom of the page and use lazy loading / infinite scroll.
Adam Sherk has a good post about this:
http://www.adamsherk.com/seo/seo-tips-for-infinite-scrolling/
-
Everett, I got your point: you mean AJAX for users and pagination for spiders. Can you show me one example? That would help me a lot.
Thanks,
Vinay
-
View the source of the cached page and look toward the bottom. Do all of your products/listings show in the source? If so, you're all good. If not, you may want to add pagination for the spiders, as mentioned above.
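If you'd rather script that check than eyeball the source, here is a rough sketch using Python's built-in HTML parser. The `/product/` URL pattern is only an assumption for illustration; the point is that a source-level check sees exactly the links a crawler sees, and nothing that a lazy-loading script would inject later.

```python
from html.parser import HTMLParser

class ProductLinkCounter(HTMLParser):
    """Collects href values from anchor tags: the same raw links a spider sees."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and "/product/" in value:
                    self.hrefs.append(value)

# Example: only the first product is in the initial HTML; the second
# would be injected later by the lazy-loading script, so a plain
# source check will not find it.
page_source = """
<ul>
  <li><a href="/product/widget-1">Widget 1</a></li>
</ul>
<script>/* lazy-loads /product/widget-2 on scroll */</script>
"""

parser = ProductLinkCounter()
parser.feed(page_source)
print(parser.hrefs)  # only the links present in the raw source
```

If products that matter to you are missing from that list, that is the signal to add crawlable pagination links.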
-
Thanks, Everett Sizemore.
I just checked the cache as you suggested, and I found that lazy loading is also working on the cached page. Does that mean everything is OK?
-
Great suggestion, Everett: "If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results."
-
Hello,
Where did you read that Google only indexes or reads a maximum of 1,000 links on a page? I think this is outdated information. However, it is best practice not to have that many links on a page, even if Google does crawl more than 1,000 per page.
So to answer your question: yes, if you're loading additional product listings via JavaScript after the user scrolls down, it could cause Google to render only part of the page. However, this often does not cause "indexation issues" for your product pages, because they have other paths into them: sub-categories, related product links, external links, sitemaps, etc.
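One such path is an XML sitemap that lists every product URL directly, so discovery does not depend on the category page rendering fully. A minimal sketch of the standard sitemap format, with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/product/widget-1</loc>
  </url>
  <url>
    <loc>https://www.example.com/product/widget-2</loc>
  </url>
</urlset>
```

With all 5,000+ products listed here, Google can find every product page even if it only renders the first batch of lazy-loaded listings.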
If you're curious how much of the page they are rendering, just look at the cached version of the page from the search results. That should answer your question directly from Google.
I usually recommend adding pagination links to the page so that Google, and users without JavaScript enabled, have a path to the rest of the product listings. If you like, you can set those paginated category pages to noindex,follow so they do not get indexed, but Google can still crawl them to find deeper products.
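As a sketch of that setup (the URLs and markup here are made up for illustration, not taken from any particular platform): each paginated category page carries a noindex,follow robots meta tag, and the pagination itself uses plain anchor links that work without JavaScript.

```html
<!-- Hypothetical paginated category page, e.g. /category?page=2 -->
<head>
  <!-- Keep this page out of the index, but let Google follow its links
       to discover deeper products -->
  <meta name="robots" content="noindex,follow">
</head>
<body>
  <!-- ...product listings... -->

  <!-- Plain <a> links so spiders and no-JS users can reach every page,
       while the lazy-loading script serves JS-enabled visitors -->
  <nav>
    <a href="/category?page=1">1</a>
    <a href="/category?page=2">2</a>
    <a href="/category?page=3">3</a>
  </nav>
</body>
```

The AJAX/infinite-scroll experience stays in place for users; the pagination is simply a crawlable fallback.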
-
I have never heard of any negative SEO effects from using lazy loading. Our shop has nearly 100,000 products and we use lazy loading as well, and we recently raised the number of indexed pages from 5.5 million to 6.7 million. This is just one example, but to answer your question from my personal point of view: the answer would be no.
Related Questions
-
Without prerender.io, is Google able to render & index geographical dynamic content?
One section of our website is built as a single-page application and serves dynamic content based on geographical location. Before I got here, we had used prerender.io so Google could see the page, but now that prerender.io is gone, is Google able to render & index geographical dynamic content? I'm assuming no. If no is the answer, what are some solutions other than converting everything to HTML (which would be a huge overhaul)?
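One common fallback (not necessarily the right fit for every stack, and Google can often render SPAs on its own these days) is dynamic rendering: detect crawler user agents server-side and serve them a prerendered snapshot while normal visitors get the single-page app. A rough sketch of the routing decision; the bot token list and function name are illustrative, not a spec:

```python
# Dynamic rendering sketch: crawlers get a prerendered snapshot,
# everyone else gets the client-rendered single-page app.
CRAWLER_TOKENS = ("googlebot", "bingbot", "baiduspider", "yandex")

def should_prerender(user_agent: str) -> bool:
    """Return True when the request looks like a search engine crawler."""
    ua = user_agent.lower()
    return any(token in ua for token in CRAWLER_TOKENS)

print(should_prerender("Mozilla/5.0 (compatible; Googlebot/2.1)"))  # True
print(should_prerender("Mozilla/5.0 (Windows NT 10.0) Chrome/120"))  # False
```

In a real deployment this check would sit in front of the app (middleware or reverse proxy), with the prerendered snapshot generated by a headless browser on the server.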
White Hat / Black Hat SEO | imjonny123
-
Active, Old Large site with SEO issues... Fix or Rebuild?
Looking for opinions and guidance here; I would sincerely appreciate help. I started a site long, long ago (1996, to be exact) focused on travel in the US. The site did very well in the search results up until Panda, as I built it off templates using public databases to fill in the blanks where I didn't have curated content. The site currently indexes around 310,000 pages. I haven't been actively working on the site for years, and while user content has kept things somewhat current, I am jumping back into this site as it provides income for my parents (who are retired). My question is this: will it be easier to work through all my issues and repair, or rebuild as a new site so I can ensure everything is in order with today's SEO? And bonus points for this answer: how do you handle 301 redirects for thousands of incoming links? 😕
Some info to help:
CURRENTLY
- DA is in the low 40s
- some pages still rank on the first page of SERPs (long-tail mainly)
- URLs are dynamic (I have built multiple versions through the years, and the last major overhaul was prior to CMS popularity for a site this size)
- domain is short (4 letters) but not really what I want at this point
- lots of original content, but oddly that content has been copied by other sites through the years
WHAT I WANT TO DO
- get into a CMS so that anyone can add/curate content without needing tech knowledge
- change to a more relevant domain (I have a different vision)
- remove old, boilerplate content, but keep the original content
White Hat / Black Hat SEO | Millibit
-
New Service/Product SEO and rankings
Hello, fellow MOZers. We are a web design company, and we had SEO as a secondary service for years. Due to changes in the company, we started pushing SEO as one of our main services about 6 months ago. We have a separate page targeting that service, as well as case studies, supporting information pages, and even an SEO Center, which is like a blog about SEO only. We are not using black hat SEO; we do honest link earning and building, don't use keyword stuffing, and everything is by the book. I understand that SEO takes time, especially for a company which has a footprint as a web design company, not as an SEO company. We are ranking very well for web-design-related keyphrases; however, we don't see any improvements for SEO-related keywords. It always was, and is, between positions 25-30 in the SERPs. At the same time, the competitors who are ranking on the first page for SEO-related phrases look pretty bad, design-wise as well as blackhat-SEO-wise: everything is keyword stuffed, the UX is horrible, and the prices are ridiculous. So, do you guys have any thoughts/advice on how we can see results, or why we are not seeing results? Links: Google search result: https://www.google.com/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=seo%20houston Competitors: www.seohouston.com, www.graphicsbycindy.com Our pages: https://www.hyperlinksmedia.com/seo-houston.php, https://www.hyperlinksmedia.com/seo-houston/
White Hat / Black Hat SEO | seomozinator
-
Redirecting 86'd Brand Product Category Page
What would be the approach if my website is no longer selling products for a brand that is driving top organic traffic? Where should I redirect the traffic on the page? I'm trying to decide between the homepage or another similar brand product page.
White Hat / Black Hat SEO | JMSCC
-
Seeing URLs indexed that we don't want: how do we approach this?
Hey guys, I have seen a few pages from my site appearing in the SERPs, and some of these page URLs are actually AJAX calls that refresh the buttons on our site... These are important to our site but don't need to show up in the SERP results; can anyone recommend anything? Should I remove the URLs? Or exclude them from the sitemap? Or noindex them? Any advice would be much appreciated. Thanks
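One option for AJAX endpoints like these is an X-Robots-Tag: noindex response header, since a meta tag is awkward on non-HTML responses. The sketch below only illustrates the idea; the `/ajax/` prefix and function name are assumptions about the URL layout, not anything from your site:

```python
# Attach a noindex header to AJAX-only endpoints so they drop out of
# the index while remaining fully functional for the page's scripts.
def extra_headers(path: str) -> dict:
    """Return extra response headers for a given request path."""
    if path.startswith("/ajax/"):
        return {"X-Robots-Tag": "noindex"}
    return {}

print(extra_headers("/ajax/refresh-buttons"))  # {'X-Robots-Tag': 'noindex'}
print(extra_headers("/products/widget-1"))     # {}
```

One caveat worth knowing: the URLs must stay crawlable for the noindex to be seen, so don't also block them in robots.txt until they have dropped out of the index.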
White Hat / Black Hat SEO | edward-may
-
Does Google crawl and index dynamic pages?
I've linked a category page (static) to my homepage and linked a product page (dynamic) to the category page. I tried to crawl my website from my homepage URL with Screaming Frog, using Googlebot 2.1 as the user agent. Based on the results, it can crawl the product page, which is dynamic. Here's a sample product page, which is a dynamic page (we're using product IDs instead of keyword-rich URLs for consistency): http://domain.com/AB1234567. Here's a sample category page: http://domain.com/city/area. Here's my full question: does the spider result (from Screaming Frog) mean Google will properly crawl and index the product pages even though they are dynamic?
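For what it's worth, a crawler doesn't care whether a URL is "dynamic" or keyword-rich; it follows plain anchor links, so any page reachable through them gets crawled. A toy breadth-first crawl over an in-memory link graph, modeled on the URL shapes in the question:

```python
from collections import deque

# Toy link graph: homepage -> static category page -> dynamic product page.
# The URL shapes mirror the question's examples; the site itself is made up.
LINK_GRAPH = {
    "/": ["/city/area"],
    "/city/area": ["/AB1234567"],
    "/AB1234567": [],
}

def crawl(start: str) -> set:
    """Breadth-first traversal of every URL reachable from start."""
    seen, queue = {start}, deque([start])
    while queue:
        page = queue.popleft()
        for link in LINK_GRAPH.get(page, []):
            if link not in seen:
                seen.add(link)
                queue.append(link)
    return seen

print(sorted(crawl("/")))  # ['/', '/AB1234567', '/city/area']
```

The opaque-ID product page is discovered exactly like the static one, which is essentially what the Screaming Frog result is showing.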
White Hat / Black Hat SEO | esiow2013
-
Bot or Virus Creating Bad Links?
Hey everyone, we are getting ready to engage a client for some potential marketing/SEO, so in preparing for this we have run the site through Open Site Explorer. The site is relatively new and there are only two links under the inbound links section. They are relevant and add value; no issues there. Here is where it gets strange. When I look under the 'Just Discovered' section, there are many (hundreds of) new links going back about a month. Virtually all of them have the anchor text 'Louis Vuitton outlet'. Now, the client swears he has not engaged anyone for black hat SEO, so I'm wondering who could possibly be creating these links. They do sell some Louis Vuitton items on the site, so I'm wondering if some spam bot has picked up the site and begun to spam the web with links to the client's site. So far today, 50 or so new links have been created with said anchor text and the client's root URL, all on very poor-quality, and some foreign, blog sites. We would like to find out why this is happening and put a stop to it, for obvious reasons. Has anyone experienced something similar? Could this be a bot? Or maybe someone with an axe to grind against the client? Anyone could be doing this on their own, but it just seems strange for it to happen to a new site that does not even rank highly at the moment. Any advice or info is greatly appreciated. Thanks in advance.
White Hat / Black Hat SEO | Whebb
-
Oh sh@t, Wetherby Racecourse has been de-indexed by Google :-(
My goodness! Wetherby Racecourse (www.wetherbyracing.co.uk) has been de-indexed by Google. A re-indexing request has been made via Webmaster Tools, and the offending third-party banner ad has been stripped out. So my question, please: how long will it take, approximately, to re-index? And is it true that re-submitting an updated XML sitemap and firing tweets at the ailing site may spark it back into life? Many thanks, David
White Hat / Black Hat SEO | Nightwing