Double Listings On Page One
-
I've been noticing a trend over the past month and a half. My sites that used to get more than one page listed in certain SERPs are now being adjusted. It almost looks manual, but I know it is most likely a change in the algorithm. For example, in a SERP where my site used to show two different sub-pages at #4 and #6, one page is now being pushed up to #3 while the other is being pushed back past the first page.
I'm not worried about penalties or loss of value. I have been seeing this across many of my clients' sites. I just wanted to confirm that others are seeing it as well (so I'm not going crazy) and/or whether Google has made any announcements or leaks regarding this shift.
Maybe it's just my sites coming of age or something, but I would love to be able to explain it more knowledgeably than with a "Google might be doing this".
BTW - this is not affecting any of my brand SERPs.
-
I used to have lots of #1 - #2 and even #1 - #2 - #3 - (sometimes #4) listings.
I still have some - but not as many.
Over the past few months Google has still been allowing some of these, but it is much harder to get two of your pages listed in the top ten positions of the SERPs.
You can really stack them up on the second and third page... but Google seems to be forcing more domain diversity in the top ten positions.
-
The Google Penguin update had two major changes that impacted the algorithm.
1. It penalized many sites that it felt were gaming the rankings.
2. It rewarded trusted sites with better rankings.
The net result of these two changes is that trusted sites not only obtain several rankings on the first page, but also get multiple rankings on subsequent search results pages. This doesn't leave a lot of SERP space for the rest of the competition.
-
That's it!!! I'm not crazy.
Now I am happy. I really have to pay more attention to that main blog.
-
Yeah, I remembered reading something on their blog.
"More domain diversity. [launch codename "Horde", project codename "Domain Crowding"] Sometimes search returns too many results from the same domain. This change helps surface content from a more diverse set of domains."
http://insidesearch.blogspot.com/2012/05/search-quality-highlights-53-changes.html
-
Have you come across any documented change in the way they are returning SERPs?
-
I've seen this as well. It seems like Google wants more diversity.
Related Questions
-
Key webpage fluctuating between page 2 and page 6 of Google SERP
Hi, We have found that one of our key webpages has been fluctuating between page 2 and page 6 of Google SERP for around 2 weeks. Some days it will be on page 6 in the morning and then page 2 in the afternoon. We have recently updated some copy on the page and wondered if this could be the cause. Has anyone else experienced this? If so how long was it before the page settled? https://www.mrisoftware.com/uk/products/property-management-software/ Thanks.
Algorithm Updates | nfrank
-
Need only tens of pages indexed out of hundreds: Is robots.txt okay for Google to proceed with?
Hi all, We have two subdomains with hundreds of pages, where we need only 50 important pages to get indexed. Unfortunately, the CMS of these subdomains is very old and does not support deploying a "noindex" tag at the page level. So we are planning to block the entire sites via robots.txt and allow only the 50 pages we need. But we are not sure if this is the right approach, as Google has been suggesting to rely mostly on "noindex" rather than robots.txt. Please suggest whether we can proceed with the robots.txt file. Thanks
Algorithm Updates | vtmoz
-
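A minimal sketch of the robots.txt approach described above (the paths are hypothetical placeholders, not the actual site's URLs):

```
# Block the whole subdomain by default, then explicitly allow
# the pages that should remain crawlable. Google resolves
# conflicts in favor of the more specific matching rule.
User-agent: *
Disallow: /
Allow: /important-page-1/
Allow: /important-page-2/
```

One caveat worth keeping in mind: robots.txt blocks crawling, not indexing. A blocked URL can still appear in results (with no snippet) if other sites link to it, which is exactly why Google recommends "noindex" where the CMS supports it.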
Should one end URLs with or without a slash?
Moz, I am noticing that I need to go back and update my outbound links to your site. There are a lot of them because your content is so great and we love you guys. Could you explain your logic for making the change? Example on my Valid JSON-LD image sizes page: [https://mza.bundledseo.com/blog/state-of-searcher-behavior-revealed/](https://mza.bundledseo.com/blog/state-of-searcher-behavior-revealed/) redirected to: [https://mza.bundledseo.com/blog/state-of-searcher-behavior-revealed](https://mza.bundledseo.com/blog/state-of-searcher-behavior-revealed)
Algorithm Updates | jessential
-
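Whichever form you standardize on, the usual mechanics are a site-wide 301 from one form to the other. A hedged sketch in nginx, assuming the slash-less form is canonical (as in the Moz example above):

```
# Permanently redirect any URL ending in a trailing slash
# to its slash-less form. "permanent" issues a 301;
# $1 captures the path without the trailing slash.
rewrite ^/(.*)/$ /$1 permanent;
```

The choice of form matters less than picking one and redirecting consistently, so link equity consolidates on a single URL per page.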
Latest Best Practices for Single Page Applications
What are the latest best practices for SPA (single page application) experiences? Google is obviously crawling Javascript now, but is there any data to support that they crawl it as effectively as they do static content? Considering Bing (and Yahoo) as well as social (FB, Pinterest, etc) - what is the best practice that will cater to the lowest-common denominator bots and work across the board? Is a prerender solution still the advised route? Escaped fragments with snapshots at the expanded URLs, with SEO-friendly URL rewrites?
Algorithm Updates | edmundsseo
-
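For reference, the escaped-fragment approach mentioned above was signaled with a single tag under Google's (since-deprecated) AJAX crawling scheme:

```
<!-- Tells crawlers that support the AJAX crawling scheme that an
     HTML snapshot of this page is available at the same URL with
     ?_escaped_fragment_= appended -->
<meta name="fragment" content="!">
```

A crawler seeing this tag would request the page with `?_escaped_fragment_=` appended and expect the server to return a prerendered HTML snapshot in place of the JavaScript-driven page.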
Do you think this page has been algorithmically penalised or is it just old?
Here is the page: http://www.designquotes.com.au/business-blog/top-10-australian-business-directories-in-2012/ It's fairly old, but when it was first written it hit #1 for "business directories". After a while it dropped but was receiving lots of traffic for long-tail variations of "business directories Australia". As of the 4th of October (Penguin 2.1) it lost traffic and rankings entirely. I checked its link profile and there isn't anything fishy: From Google Webmaster https://docs.google.com/spreadsheet/ccc?key=0AtwbT3wshHRsdEc1OWl4SFN0SDdiTkwzSmdGTFpZOFE&usp=sharing In fact, two links are entirely natural http://blog.businesszoom.com.au/2013/09/use-customer-reviews-to-improve-your-website-ranking/ http://dianajones.com.au/google-plus-local-equals-more-business-blog/ Yet when I search for a close match of the title in Google AU, the article doesn't appear within even the first 4 pages. https://www.google.com.au/#q=top+10+Australian+Business+Directories&start=10 Is this simply because it's an old article? Should I rewrite it, update the analysis, and use a rel=canonical on the old article pointing to the new?
Algorithm Updates | designquotes
-
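If the rewrite route is taken, the rel=canonical mentioned above is a single tag in the old article's head (the href shown is a hypothetical new location, not a real URL):

```
<!-- On the old article: point search engines at the updated version -->
<link rel="canonical" href="https://www.example.com/business-blog/top-australian-business-directories/">
```

If the old URL no longer needs to exist on its own, a 301 redirect is generally the stronger signal than a cross-page canonical, since the canonical tag is only a hint.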
Why do I have 7 URLs from the same domain ranking on the 1st page?
I have a client that has individual pages for authorized dealers of their product (say "Car Dealers"). When you search for "brand name + location", Google returns 7 "dealership" pages from the parent company's domain as the first 7 results, but there is one that gets pushed off to the 5th page of the SERPs. The formatting of content, geo-targeting, and meta data on the page is identical on every single one. None of them have external links and there is not one extremely distinguishable thing to assess why the one page doesn't get placed on that first SERP. Why is the one getting pushed so far down? I know this may be a bit confusing, but any thoughts would be greatly appreciated. Thanks!
Algorithm Updates | MichaelWeisbaum
-
Trying to figure out why one of my popular pages was de-indexed from Google.
I wanted to share this with everyone for two reasons: 1. To try to figure out why this happened, and 2. To make everyone aware of this so you can check some of your pages if needed. Someone on Facebook asked me a question that I knew I had answered in this post. I couldn't remember what the URL was, so I googled some of the terms I knew were on the page, and the page didn't show up. I did some more searches and found out that the entire page was missing from Google. This page has a good number of shares, comments, Facebook Likes, etc. (i.e. social signals) and there are certainly no black/gray hat techniques being used on my site. This page received a decent amount of organic traffic as well. I'm not sure when the page was de-indexed, and wouldn't have even known if I hadn't tried to search for it via Google, which makes me concerned that perhaps other pages are being de-indexed. It also concerns me that I have done something wrong (without knowing) and perhaps other pages on my site are going to be penalized as well. Does anyone have any idea why this page would be de-indexed? It sure seems like all the signals are there to show Google this page is unique and valuable. Interested to hear some of your thoughts on this. Thanks
Algorithm Updates | NoahsDad
-
Are FB Likes and G+ on a page transferred?
Hi, As we all know, social signals like FB Likes and G+ give a boost to search engine rankings. I have several pages where we have garnered hundreds of FB Likes and tens of +1s. Now I want to reorganize my site so that the blog content will have different URLs, so I have planned 301 redirects from the older content to the newer. But do the FB Likes and +1s also move along with the 301 redirect? If not, what is the best way to handle this? Warm Rgds Avinash MB
Algorithm Updates | ShoutOut
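A hedged sketch of the 301s described above, in Apache .htaccess form (the paths and domain are hypothetical placeholders):

```
# Permanently redirect each old blog URL to its new location.
# Link equity passes through the 301, but social counters are a
# separate question: FB Like counts are keyed to the exact URL
# Facebook originally crawled, so as far as I know they do not
# automatically carry over to the new URL.
Redirect 301 /old-blog/my-post https://www.example.com/blog/my-post
```

One common mitigation is to keep the og:url meta tag on the new page pointing at the URL the social counts are attached to, though that trades away accumulating new counts on the new URL.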