Google Pagination Changes
-
With Google recently coming out and saying they're essentially ignoring paginated pages, I'm reconsidering the link structure of our new, soon-to-launch ecommerce site (we're moving from an old site to a new one with an identical URL structure, minus a few 404s).
Currently our new site shows 20 products per page, but with this change by Google, any products on pages 2, 3 and so on will suffer, because Google treats each paginated page as an entirely separate page rather than as an extension of the first.
The way I see it, I have one option: show every product in each category on page 1.
I have Lazy Load installed on our new website, so it only loads what fits on the screen a user can see, and as they scroll down it loads more products. But how will Google interpret this? Will Google simply see all 50-300 products per category and give the site a bad page-load score because it doesn't know the Lazy Load is in place? Or will it know and account for it?
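One way to sanity-check this yourself is to take the raw, pre-JavaScript HTML of a category page and count the product nodes in it. A minimal sketch in Python's standard library (the page structure and product count here are made up for illustration): if the product markup is in the base source and only the *images* are deferred via `data-src`, a plain source-only crawler still sees every product.

```python
from html.parser import HTMLParser

class ProductCounter(HTMLParser):
    """Counts elements whose class attribute contains 'product'."""
    def __init__(self):
        super().__init__()
        self.count = 0

    def handle_starttag(self, tag, attrs):
        classes = dict(attrs).get("class", "")
        if "product" in classes.split():
            self.count += 1

# Hypothetical category page: all 300 products are present in the raw
# HTML; only the images are deferred via data-src (lazy loading).
base_source = "<ul>" + "".join(
    f'<li class="product"><a href="/p/{i}">Product {i}</a>'
    f'<img data-src="/img/{i}.jpg"></li>'
    for i in range(300)
) + "</ul>"

counter = ProductCounter()
counter.feed(base_source)
print(counter.count)  # 300: every product is visible without running JS
```

In practice you would feed this the output of "view source" (or a plain HTTP fetch), not a constructed string.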
Is there anything I'm missing?
-
It's likely that they will be valued a bit less, but the effects shouldn't be drastic. Even if you just had one massive page with all products on it, the ones at the top would likely get more juice anyway.
If it's a big concern, think about a custom method of sorting your products.
-
Thank you very much for taking the time to respond so eloquently.
> If all the products would be visible in the base, non-modified source code (right click page, then click "view source" - is the data there?) then there is a high likelihood that Google will see and crawl it.
I can confirm that each product does in fact appear in the page source, so, as you say, Google will crawl it, which is somewhat of a relief.
Does this then mean that, regardless of which page a product appears on, Google will simply ignore pagination and treat each product the same?
The thing I am trying to avoid is products on pages 2, 3 and so on being valued less.
-
This is a great, technical SEO query!
What you have to understand is that whilst Google 'can' crawl JS, they often don't. They don't do it for just anyone, and even then they don't do it all of the time. Google's main mission is to 'index the web', yet their index of the web's pages, whilst vast, is still far from complete.
Crawling JavaScript necessitates the use of a headless browser (if you were using Python to script such a thing, you'd be using the Selenium or Windmill modules). A browser must open (even if it does so invisibly) and 'run' the JavaScript, which creates more HTML; that HTML can be crawled only **after** the script has finished executing.
On average this takes 10x longer than scraping the basic, non-modified source code. Ask yourself: would Google take a 10x efficiency hit on an already incomplete mission, for 'everyone' on the web? The answer is no (I see evidence of this every day across many client accounts).
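A toy illustration of why this matters (the "rendering" step here is simulated with a string replacement, not a real JS engine, and all markup is invented): content that a script injects client-side simply does not exist in the base source a cheap crawler downloads.

```python
# Base (non-modified) source: what a source-only crawler downloads.
# The product list is an empty container that a script fills in later.
base_source = """
<div id="products"></div>
<script src="/render-products.js"></script>
"""

def simulate_js_render(html, products):
    """Stand-in for a headless browser: apply the DOM changes the page's
    JavaScript would make, producing the 'modified' source."""
    markup = "".join(f'<li class="product">{p}</li>' for p in products)
    return html.replace('<div id="products"></div>',
                        f'<div id="products"><ul>{markup}</ul></div>')

rendered = simulate_js_render(base_source, ["Widget A", "Widget B"])

print("Widget A" in base_source)  # False: invisible to a source-only crawler
print("Widget A" in rendered)     # True: visible only after JS execution
```

The gap between those two booleans is exactly the gap between "view source" and "inspect element" described below.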
Now let's answer your question. If all the products are visible in the base, non-modified source code (right click the page, then click "view source": is the data there?), then there is a high likelihood that Google will see and crawl it.
If the data (code) only exists under right click, Inspect Element, and not in "view source", then the data exists only in the 'modified' source code (not the base source). In that scenario, Google would be extremely unlikely to crawl it (or, at least, to crawl it consistently). If it's a very important page on a very important site (Coca-Cola, M&S, Barclays, Santander), then Google may go further.
For most of us, the best possible solution is to get the data we want crawled into the non-modified source code. This can be achieved by using JS only for visual changes (but not for structure), or by adopting SSR (Server-Side Rendering).
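The SSR option can be sketched in a few lines (the product names and template are hypothetical, and a real implementation would use a framework's renderer rather than string formatting): the server builds the product markup into the HTML it sends, so the data sits in the base source and no JavaScript has to run for a crawler to see it.

```python
# Hypothetical product data; in practice this would come from a database.
PRODUCTS = ["Red Chair", "Blue Table", "Green Lamp"]

def render_category_page(products):
    """Server-side render: product markup is baked into the HTML response,
    so it appears in 'view source' before any script executes."""
    items = "\n".join(f'  <li class="product">{p}</li>' for p in products)
    return f"<html><body>\n<ul>\n{items}\n</ul>\n</body></html>"

page = render_category_page(PRODUCTS)
print("Blue Table" in page)  # True: present in the base source
```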
Hope that helps