Lazy Loading of products on an E-Commerce Website - Options Needed
-
Hi Moz Fans.
We are in the process of re-designing our product pages and we need to improve the page load speed.
Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact.
We can have upwards of 50 associated products on a page, so we need a scalable solution.
So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version of the page to search engines.
The solution was developed by Google not only for lazy loading, but for indexing AJAX contents in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words, the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to jump to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an "ugly" URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the otherwise non-indexed image, in our case). It seems complicated, but it is not; let's use our gallery as an example.
- Every gallery thumbnail has to have a hyperlink like:
http://www.idea-r.it/...#!blogimage=<image-number>
- When the crawler finds this markup, it will change it to:
http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
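To make the rewrite concrete, here is a small sketch of the transformation a crawler applies under Google's AJAX crawling scheme (since deprecated by Google, but still illustrative). The function name and example URL are hypothetical, and real crawlers also percent-encode special characters in the fragment:

```javascript
// Sketch of the crawler-side rewrite: a pretty "#!" URL becomes the
// "ugly" URL with the fragment moved into _escaped_fragment_.
function toEscapedFragmentUrl(prettyUrl) {
  const hashBang = prettyUrl.indexOf('#!');
  if (hashBang === -1) return prettyUrl; // no hash-bang: nothing to rewrite
  const base = prettyUrl.slice(0, hashBang);
  const fragment = prettyUrl.slice(hashBang + 2);
  const separator = base.includes('?') ? '&' : '?';
  return base + separator + '_escaped_fragment_=' + fragment;
}

console.log(toEscapedFragmentUrl('http://www.example.com/gallery#!blogimage=12'));
// → http://www.example.com/gallery?_escaped_fragment_=blogimage=12
```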
Let's take a look at what you have to return on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will do:

```csharp
var fragment = Request.QueryString["_escaped_fragment_"];
if (!String.IsNullOrEmpty(fragment))
{
    var escapedParams = fragment.Split(new[] { '=' });
    if (escapedParams.Length == 2)
    {
        var imageToDisplay = escapedParams[1];
        // Render the page with the gallery showing
        // the requested image (statically!)
        ...
    }
}
```
What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side).
To make it perfect we have to give the user a chance to bookmark the current gallery image.
90% of it comes for free; we only have to parse the fragment on the client side and show the requested image:

```javascript
if (window.location.hash)
{
    // NOTE: remove initial #
    var fragmentParams = window.location.hash.substring(1).split('=');
    var imageToDisplay = fragmentParams[1];
    // Render the page with the gallery showing the requested image (dynamically!)
    ...
}
```
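The remaining 10% is writing the fragment back when the visitor navigates, so the address bar always reflects the current image. A hedged sketch (the helper names are hypothetical; in the browser you would assign the built value to `window.location.hash`):

```javascript
// Hypothetical helpers to keep gallery state and the URL fragment in sync.
function buildFragment(imageNumber) {
  // In the browser: window.location.hash = buildFragment(7) -> "#!blogimage=7"
  return '!blogimage=' + imageNumber;
}

function parseFragment(hash) {
  // Accepts window.location.hash ("#!blogimage=7"); returns the number or null.
  const match = /^#?!?blogimage=(\d+)$/.exec(hash);
  return match ? Number(match[1]) : null;
}
```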
The other option would be to use a recommendation engine to show a small selection of related products instead, cutting down the total number. The concern with this one is that we would be removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it is still content.
Any advice and discussion welcome
-
Ok, cool. To reiterate: with escaped_fragment you are just serving the same content in a tweaked format, and Google recommends it rather than frowning upon it. Good to be sure, though.
See you at SearchLove!
-
Hi Tom, thank you for the response.
The concern about serving an alternate version is that it would be frowned upon from an SEO perspective and might lead to a form of penalty.
I agree that escaped_fragment would be the best approach and just wanted to satisfy my own concerns before I get them working on this.
Thank you, and see you at SearchLove!
-
Hi,
I am not sure I follow your concerns around serving an alternative version of the page to search engines - is that based on worries it will be frowned upon, or on technical concerns?
Using the escaped_fragment methodology would work for your purposes, and would be the best approach. If you have technical concerns around creating the HTML snapshots you could look at a service such as https://prerender.io/ which helps manage this process.
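For context, the decision such a service automates is roughly: serve the prerendered static HTML when the request looks like a crawler's, otherwise serve the normal JavaScript-driven page. A rough sketch of that routing decision, with an illustrative bot list (this is not prerender.io's actual API):

```javascript
// Illustrative decision for serving prerendered snapshots to crawlers.
// The bot pattern is a hypothetical, deliberately incomplete example.
const BOT_PATTERN = /googlebot|bingbot|yandex|baiduspider/i;

function shouldServeSnapshot(url, userAgent) {
  return url.includes('_escaped_fragment_=') || BOT_PATTERN.test(userAgent || '');
}
```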
If that doesn't answer your question, please give more information so we can understand more specifically where your concerns are.