Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
-
I've just begun a new project auditing a news publisher's site. To increase pageviews, and thus advertising revenue, at some point in the past they implemented a feature that loads as many as five different articles per article page. All of the articles load at the same time, and judging by Google's cache and the errors flagged in Search Console, Google treats the result as one big mass of content rather than separate pages. It's also worth noting that as a user scrolls down, the URL does change when they reach the next article.
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar.
They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems that Google only reads the first article, which looks like an ideal solution. It also has the benefit of speeding up the page's initial load time.
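To illustrate the mechanics, here's a minimal sketch of that lazy-load pattern. It's not VentureBeat's actual code; the fragment endpoint and the data-next-url attribute are placeholders I've made up:

```html
<!-- The first article is rendered server-side, so crawlers that don't
     run the script still see a complete page. -->
<article data-next-url="/2016/11/12/hypothetical-next-article/">
  <!-- article body -->
</article>
<div id="load-trigger"></div>

<script>
  const trigger = document.getElementById('load-trigger');

  const observer = new IntersectionObserver((entries) => {
    if (!entries[0].isIntersecting) return;

    // Read the next URL from the last article on the page; each fetched
    // fragment is assumed to carry its own data-next-url attribute.
    const articles = document.querySelectorAll('article[data-next-url]');
    const nextUrl = articles[articles.length - 1].getAttribute('data-next-url');
    if (!nextUrl) return;

    observer.unobserve(trigger); // pause while the fetch is in flight

    fetch(nextUrl + '?fragment=1') // placeholder fragment endpoint
      .then((res) => res.text())
      .then((html) => {
        const wrapper = document.createElement('div');
        wrapper.innerHTML = html;
        document.body.insertBefore(wrapper, trigger);
        // Swap the visible URL so the new article reads as its own page
        // view, which matches the behaviour I described above.
        history.replaceState({}, '', nextUrl);
        observer.observe(trigger); // re-arm for the following article
      });
  });

  observer.observe(trigger);
</script>
```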
My question is: is VentureBeat's implementation actually SEO-friendly or not?
VentureBeat have 'sort of' followed Google's guidelines on implementing infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using rel=prev and rel=next tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?
Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
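For reference, the head markup that guidance implies would look something like the sketch below. The canonical URL is the example article above; the rel=prev and rel=next URLs are placeholders, not VentureBeat's actual tags:

```html
<!-- Hypothetical <head> for an article page in the feed: rel="next" points
     at the next article queued up by the infinite scroll, rather than at
     "page 2" of a true paginated series, which is what prompted my question. -->
<head>
  <link rel="canonical" href="http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/" />
  <link rel="prev" href="http://venturebeat.com/2016/11/10/hypothetical-previous-feed-article/" />
  <link rel="next" href="http://venturebeat.com/2016/11/12/hypothetical-next-feed-article/" />
</head>
```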
It would be interesting to hear from anyone who has dealt with this first-hand, or who just has an opinion.
Thanks in advance!
Daniel
-
Totally agreed, Daniel! I'd also say it's our job to set expectations and be clear about when something is a test vs. when something will more than likely work. Consulting is all about setting expectations!
-
Thanks a lot for your thoughts on this, John. Really appreciate you taking the time to look into it.
You make a great point about not always copying competitors without testing first. When something is rolled out on such a wide scale, it's always going to be a hard case to put to the client, knowing they'll lose out on advertising revenue in the short term. Regardless, I think it's our job as SEOs to propose the most SEO-friendly implementation possible first and foremost.
-
This is actually a really interesting question. I looked at their category pages (e.g. http://venturebeat.com/tag/ar-vr-weekly/), and those seem to be set up correctly to handle infinite scroll, since they point search engines to the next page in the series.
I've not come across this with infinite scroll on articles, though. I'm sure they've tested it extensively to figure out the best way to send search engines to future articles, but who really knows whether it's effective. If it's still in place, I'd assume they've seen positive signs, but it is definitely a non-standard implementation of rel=next/prev!
This does bring up a good point about copying (or not copying) a competitor's strategy. They have this implemented, but would it work for your own site/business? Maybe, but maybe not. We can't be sure until we test it ourselves (or speak with someone at VentureBeat who wants to share their learnings :-)). If you know when it was rolled out, you could benchmark from that date, check their organic visibility in SEMrush or another tool, and from there draw at least some correlation, if not causation.
Thanks for flagging this up! It's cool to see.
-
It depends on the application and other design aspects.
I have seen websites that implement the same thing and, like morons, keep a never-accessible footer there as well... you have no idea how impossible it was to get to the social bar/links at the bottom.
To be honest, you have to think of the user experience. While there may be good technical reasons for such a design, in the end you must consider what the user goes through and what they want to get out of the page. A/B testing these kinds of things wouldn't hurt either.
But honestly, only 'feeds' should work this way (a Facebook feed, a Twitter feed, a news feed), and even then each application should be considered with care.
Disclosure: I personally hate this behavior by default... basically the only places I find it acceptable are Facebook and Twitter.