Infinite Scrolling on Publisher Sites - is VentureBeat's implementation really SEO-friendly?
-
I've just begun a new project auditing a news publisher's site. To increase pageviews (and therefore advertising revenue), at some point in the past they implemented a setup where as many as five different articles load on each article page. All of the articles load at the same time, and judging by Google's cache and the errors flagged in Search Console, Google treats them as one big mass of content rather than separate pages. One more thing to note: as a user scrolls down, the URL does in fact change when they reach the next article.
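For context, that URL change on scroll is usually done with the History API rather than a real navigation. A minimal sketch of the idea (the helper name and the commented wiring are hypothetical, not the publisher's actual code):

```typescript
// Hypothetical sketch: swap the address-bar URL as the reader scrolls
// into the next stacked article, without reloading the page.

// Pure helper: given the vertical offset at which each article starts
// and the current scroll position, return the index of the article in view.
function articleInView(articleTops: number[], scrollY: number): number {
  let index = 0;
  for (let i = 0; i < articleTops.length; i++) {
    if (scrollY >= articleTops[i]) index = i;
  }
  return index;
}

// Browser wiring (illustrative only; `tops` and `urls` would come from
// the page's own article data):
// window.addEventListener("scroll", () => {
//   const i = articleInView(tops, window.scrollY);
//   if (location.pathname !== urls[i]) {
//     history.replaceState(null, "", urls[i]);
//   }
// });
```

Because the swap happens client-side, it does nothing by itself to tell Google the articles are separate pages, which is consistent with the "one big mass of content" you're seeing in the cache.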
My initial thought was to remove this functionality and just load one article per page. However, I happened to notice that VentureBeat.com uses something similar.
They use infinite scrolling so that the other articles on the page (in a 'feed' style) only load when a user scrolls to the bottom of the first article. I checked Google's cached versions of the pages, and it seems Google also only reads the first article, which looks like an ideal solution. It obviously has the added benefit of speeding up page load time too.
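That deferred-loading behaviour is typically built with an IntersectionObserver watching a sentinel element at the end of the first article, so only the first article is in the initial HTML (and therefore in what Google caches). A hedged sketch, where the element ID, threshold, and `loadNextArticle` are made up for illustration:

```typescript
// Sketch of deferred ("infinite scroll") article loading: the next
// article is fetched only when the reader nears the bottom of the
// current one.

// Pure helper: decide whether to fetch the next article, given how far
// the sentinel sits from the top of the viewport and a prefetch margin.
function shouldLoadNext(
  sentinelTop: number,
  viewportHeight: number,
  marginPx: number = 400
): boolean {
  return sentinelTop <= viewportHeight + marginPx;
}

// Browser wiring (illustrative; "#next-article-sentinel" and
// loadNextArticle() are hypothetical):
// const sentinel = document.querySelector("#next-article-sentinel")!;
// new IntersectionObserver(
//   (entries) => {
//     if (entries.some((e) => e.isIntersecting)) loadNextArticle();
//   },
//   { rootMargin: "400px" }
// ).observe(sentinel);
```

Since crawlers generally don't scroll, content loaded this way stays out of the indexed version of the page unless it's also reachable at its own URL, which is why the rel=prev/next annotations below matter.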
My question is: is VentureBeat's implementation actually that SEO-friendly, or not?
VentureBeat have 'sort of' followed Google's guidelines on how to implement infinite scrolling (https://webmasters.googleblog.com/2014/02/infinite-scroll-search-friendly.html) by using rel="prev" and rel="next" tags for pagination (https://support.google.com/webmasters/answer/1663744?hl=en). However, isn't the point of pagination to list multiple pages in a series (i.e. page 2, page 3, page 4, etc.) rather than just other related articles?
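For anyone unfamiliar with the guideline being referenced: Google's 2014 recommendation is that each component page of an infinite-scroll series be reachable at its own URL and annotated with rel="prev"/rel="next" links pointing at its neighbours in the series. A small sketch of generating those tags (the function and URLs are illustrative, not VentureBeat's code):

```typescript
// Sketch: given the ordered URLs of a paginated series and the index of
// the page being rendered, build the <link rel="prev"/"next"> tags that
// Google's infinite-scroll guidance describes.
function paginationLinks(urls: string[], current: number): string[] {
  const tags: string[] = [];
  if (current > 0) {
    tags.push(`<link rel="prev" href="${urls[current - 1]}">`);
  }
  if (current < urls.length - 1) {
    tags.push(`<link rel="next" href="${urls[current + 1]}">`);
  }
  return tags;
}
```

Note the assumption baked into this: `urls` is meant to be an ordered series (page 1, 2, 3 of the same content). Pointing rel="next" at a merely related article, as Daniel describes, reuses the markup but not the semantics it was designed for.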
Here's an example - http://venturebeat.com/2016/11/11/facebooks-cto-explains-social-networks-10-year-mission-global-connectivity-ai-vr/
Would be interesting to know if someone has dealt with this first-hand or just has an opinion.
Thanks in advance!
Daniel
-
Totally agreed, Daniel! I'd also say it's our job to set expectations and be clear about when something is a test vs when something will more than likely work. Consulting is all about setting expectations!
-
Thanks a lot for your thoughts on this John. Really appreciate you taking the time to look into it.
You make a great point about not always copying competitors without testing first. If it's rolled out on such a wide scale, it's always going to be a hard case to put to the client, knowing that they'll lose out on advertising revenue in the short term. Regardless, I think it's our job as SEOs to first and foremost propose the most SEO-friendly implementation possible.
-
This is actually a really interesting question. I looked at their category pages (e.g. http://venturebeat.com/tag/ar-vr-weekly/) and those seem to be set up correctly to handle infinite scroll, as they send search engines to the next page.
I've not come across this with infinite scroll on articles, though. I'm sure they've tested it extensively to figure out the best way to send search engines to future articles, but who really knows whether it's being effective. If it's still there, I'd assume they've seen positive signs, but it's definitely a non-standard implementation of rel=next/prev!
This does bring up a good point about copying (or not copying) a competitor's strategy. They have this implemented, but would it work for your own site/business? Maybe, but maybe not. We can't be sure until we test it ourselves (or speak with someone at VentureBeat who wants to share their learnings :-)). If you know when it was rolled out, you could benchmark from there, track their organic visibility in SEMrush or another tool, and draw at least some correlation, if not causation.
Thanks for flagging this up! It's cool to see.
-
It depends on the application and other design aspects.
I have seen websites that implement the same thing and, like morons, keep a never-reachable footer there as well... you have no idea how impossible it was to get to the social bar/links at the bottom.
To be honest, you have to think of the user experience: while there may be good technical reasons for such a design, in the end you must consider what the user goes through and wants to get out of it. A/B testing these kinds of things wouldn't hurt either.
But honestly, only "feeds" (a Facebook feed, a Twitter feed, a news feed) should behave this way, and even then each application should be considered with care.
Disclosure: I personally hate this behavior by default... basically the only places I find it acceptable are Facebook and Twitter.