Why do we have so many pages scanned by bots (over 250,000) when our biggest competitors have about 70,000? It seems like something is very wrong.
-
We are trying to figure out why, last year, we had a huge (80%) and sudden (within two days) drop in our Google search traffic. The only outlier we can find on our site is the huge number of pages Moz reports as scanned by search engines. Is this a problem? How did we end up with so many scanned pages? What can we do to bring the number back down to a normal level?
BT
-
Hi. A mystery indeed! Have you recently changed or upgraded your web platform, or changed what you are using for your site navigation?
-
Stewart_SEO
Thanks for your quick response. We did review our competitors' robots.txt files. Not line by line (they took surprisingly different approaches to robots.txt), but there were the usual exclusions for wish lists, etc. We've gone back and tightened up our own robots.txt and haven't yet seen any changes. Several months ago we were at about 600,000 pages, and the number is still dropping. Very mysterious.
-
Have you looked at your competitors' robots.txt files? They are probably blocking the very same crawls you are talking about. If there is a particular bot that you don't want visiting your site (a Chinese crawler such as Baidu, for example), you can block it in robots.txt with the User-agent directive, as in the example below.
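For reference, a minimal robots.txt entry that blocks Baidu's crawler entirely looks like this. Note that the User-agent line by itself blocks nothing; it only names which bot the following rules apply to, and the Disallow directive is what actually restricts crawling:

    User-agent: Baiduspider
    Disallow: /

Disallow: / blocks the whole site for that bot; a narrower path such as Disallow: /wishlist/ would block only that section.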
Related Questions
-
Meta robots on every page rather than robots.txt for blocking crawlers? How will pages get indexed if we block crawlers?
Hi all, The suggestion to use the meta robots tag rather than the robots.txt file is meant to ensure pages do not get indexed if hyperlinks to them are available anywhere on the internet. I don't understand how the pages would be indexed if the entire site is blocked. Even if links to the pages are available, will Google really index them? One of our sites was blocked via the robots.txt file, and although internal links to it have been available on the internet for years, the pages have not been indexed. So technically the robots.txt file is quite enough, right? Please clarify and guide me if I'm wrong. Thanks
Algorithm Updates | vtmoz
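A quick illustration of the distinction the question above is asking about: robots.txt blocks crawling, while the meta robots tag blocks indexing, and the tag only works if the crawler is allowed to fetch the page and read it. A minimal example of the tag, placed in a page's head:

    <meta name="robots" content="noindex, nofollow">

Because of this, combining the two on the same page is self-defeating: if robots.txt blocks the page, Google never sees the noindex instruction, and the URL can still be indexed from external links (typically with no snippet).
-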
Log-in page ranking instead of homepage due to high traffic on the log-in page! How to avoid this?
Hi all, Our log-in page is ranking in the SERP instead of our homepage, and sometimes both pages rank for the primary keyword we targeted. Our rankings have even dropped. I am looking for a solution. Three points to consider: 1) Our log-in page is the most visited page and top landing page on the website, and whether or not the primary keyword appears on that page, the same scenario continues. 2) The log-in page is the first link bots touch when crawling any page of our website, as it is linked in the top navigation menu. 3) If we move the log-in page to a subdomain, will it work? I worry that we would lose the large share of traffic the log-in page brings, as it would be taken away to the subdomain. Please guide me with your valuable suggestions. Thanks
Algorithm Updates | vtmoz
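One commonly suggested remedy, offered here as a sketch rather than a confirmed fix for this site: keep the log-in page on the main domain (avoiding the subdomain traffic worry) but mark it noindex so it stops competing with the homepage. For a server running Apache with mod_headers enabled, and assuming the log-in page is the hypothetical file login.php, the X-Robots-Tag response header could be set like this:

    <Files "login.php">
      Header set X-Robots-Tag "noindex, follow"
    </Files>

The same effect can be had with a meta robots tag in the page itself; the header approach just keeps the page markup untouched.
-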
Indexed Pages Increase and Major Drop June 25th and July 16th?
I am seeing information regarding a possible Google algorithm update that may have taken place on June 25th, and I saw the total number of pages indexed in GSC increase (cool!). BUT then, starting July 16th, I'm seeing a consistent drop (BIG DROP) in pages indexed, not only on our site but on several others. Does anyone have any insight into this, or is anyone experiencing the same issue?
Algorithm Updates | kwilgus
-
The evolution of Google's 'Quality' filters - Do thin product pages still need noindex?
I'm hoping that Mozzers can weigh in with any recent experiences with eCommerce SEO. I like to assume (perhaps incorrectly) that Google's 'Quality' filters (formerly known as Panda) have evolved with some intelligence since Panda first launched and started penalising eCommerce sites for having thin product pages. On this basis I'd expect that the filters are now less heavy-handed and recognise that product pages with little or no product description can still be a quality user experience for people who want to buy that product. Therefore my question is this: do thin product pages still need noindex, given that more often than not they are a quality search result for those using a product-specific search query? Has anyone experienced a penalty recently (in the last 12 months) on an eCommerce site because of a high number of thin product pages?
Algorithm Updates | QubaSEO
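If one did choose to noindex only the thinnest pages rather than all product pages, the decision can be automated in the page template. A hypothetical sketch in Python follows; the word-count threshold and the product field names are illustrative assumptions, not anything from the question above:

    # Decide whether a product page is thin enough to warrant noindex.
    # Threshold and field names are hypothetical, for illustration only.
    MIN_DESCRIPTION_WORDS = 50

    def robots_meta_for(product: dict) -> str:
        """Return a robots meta tag for thin pages, or '' to allow indexing."""
        description = product.get("description", "")
        if len(description.split()) < MIN_DESCRIPTION_WORDS:
            # "follow" keeps internal link equity flowing even when noindexed.
            return '<meta name="robots" content="noindex, follow">'
        return ""

    print(robots_meta_for({"description": "Blue widget."}))           # noindexed
    print(robots_meta_for({"description": " ".join(["word"] * 60)}))  # indexable

-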
Page 2 to page 1
I've found that a lot of the time it does not take much activity to get a keyword from ranking on page 3 of Google (or further down) up to page 2, but there seems to be a hurdle from page 2 to page 1. It is very frustrating to be between positions 11 and 15 but not be able to make that push to 9 or 10. Has anyone got or seen any data to justify this?
Algorithm Updates | S_Curtis
-
Wrong Google pin locations
Did something happen recently that would affect pin locations on Google Maps? I've been updating Google Places pages but not touching addresses or pins, yet I received a phone call from one of my locations saying that their pin location changed in the past month and is now wrong. Meanwhile, another department recently had MomentFeed update the pins for accuracy. Thoughts?
Algorithm Updates | SSFCU
-
Sudden drop in rankings and indexed pages!
Over the past few days I have noticed some apparent major changes. Before I explain, let me say this: checking my analytics and GWT, there is an increase in traffic (even via Google organic), there is no drop in impressions or clicks, and there is no drop in indexed pages in GWT. Having said that: when I check my indexed pages using site:www.mywebsite.com, I see only 30 results as opposed to the 120K I was seeing before, and that count had been steadily climbing. Indexed pages had increased three-fold in the past year because of the increase in pages, updates, and products on the site. I also see a sudden drop in rankings for major keywords that had been steadily rising. For example, some major keywords that were on pages 7-8 are now on page 20+ or not ranking at all, and the page that used to show in the rankings has changed. I have only done white-hat guest blogging in the past year for link building, on a small scale (maybe 20-30 links in a year). The only other recent changes are that we are posting products on Houzz and Pinterest daily and adding our site to all local directories (White Pages, Yelp, Citysearch, etc.). My site got hit by Penguin more than a year ago, but we have done everything right since, and our traffic via organic results has more than doubled since the Penguin release. What the hell is going on? Should I be concerned?
Algorithm Updates | inhouseseo
-
What was the biggest challenge you faced as an SEO in 2012?
As an SEO (in-house, freelance, consultant, agency, entrepreneur), what was the biggest challenge you faced in 2012? Please be as specific as you can, and let us all know what you are doing to overcome this challenge in 2013. For me personally, I would have to say the biggest challenge I had to deal with was Google+ Local. Obviously Google is putting a lot into G+L, but it has been so messy that at times I have just thrown my arms up in the air, especially when it comes to multi-state locations and losing reviews.
Algorithm Updates | clarktbell