Sudden Increase In Number of Pages Indexed By Google Webmaster When No New Pages Added
-
Greetings MOZ Community:
On June 14th Google Webmaster Tools indicated an increase in the number of indexed pages, going from 676 to 851. No new pages had been added to the domain in the previous month. The number of pages blocked by robots.txt also increased in that period, from 332 (June 1st) to 551 (June 22nd), yet the number of indexed pages still rose to 851.
The following changes occurred between June 5th and June 15th:
-A redesigned version of the site was launched on June 4th, with links to social media and the blog removed from some pages, but with no new URLs added. The platform was and is WordPress.
-Google Tag Manager (GTM) code was added to the site.
-Our hosting company made an exception to ModSecurity on our server (for iframes) to allow GTM to function.
In the last ten days my web traffic has declined about 15%; more importantly, the quality of traffic has declined enormously and the number of new inquiries we get is off by around 65%. Pages per visit have declined from about 2.55 to about 2.
Obviously this is not a good situation.
My SEO provider, a reputable firm endorsed by MOZ, believes the extra 175 pages indexed by Google, pages that do not offer much content, may be causing the ranking decline.
My developer is examining the issue. They think there may be some tie-in with the installation of GTM. They have also noticed an additional issue: the site's Contact Us form will not work while the GTM script is enabled. They find it curious that both issues appeared around the same time.
Our domain is www.nyc-officespace-leader.com. Does anyone have any idea why these extra pages are appearing and how they can be removed? Has anyone had experience with GTM causing issues like this?
Thanks everyone!!!
Alan -
Yes, and I appreciate it!
Alan -
I did what you asked me to do.
-
- in my first post and repeated frequently.
-
Hi Egol:
How did you locate this duplicate or re-published content?
Obviously what you have pointed out is a major source of concern, so I ran a Copyscape search this afternoon for duplicate content and did not locate any of the URLs you mention in the "this" and "this" links above. It appears you entered the URL of the blog post in Google's search bar. Would that work? This method would be pretty slow going with 600 URLs.
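One way to speed that up: instead of searching each URL, search Google for a distinctive sentence from each post, in quotes, since an exact-phrase match surfaces verbatim copies directly. A minimal sketch of the sentence-picking step (the sample text is made up, and fetching the 600 pages is left out):

```python
# Sketch: pick a distinctive sentence from a post's text to paste into
# Google in quotes. The longest sentence is usually unique enough that
# an exact-phrase search finds verbatim copies of the page.
import re

def distinctive_sentence(text: str) -> str:
    """Return the longest sentence in the text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return max(sentences, key=len)

# Hypothetical post text, not taken from the actual site.
sample = ("We lease office space. Midtown Manhattan asking rents rose "
          "four percent in the second quarter of the year. Call us.")
print('"%s"' % distinctive_sentence(sample))
```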
Thanks,
Alan -
Those are the 448 URLs from your website that have been filtered.
You should find garbage in them like shown below.
Have you done what I have suggested three times above? Do that if you want to identify the problem pages.
-
www.nyc-officespace-leader.com/wp-content/plugins/...
A description for this result is not available because of this site's robots.txt – learn more.
-
www.nyc-officespace-leader.com/wp-content/plugins/...
A description for this result is not available because of this site's robots.txt – learn more.
-
www.nyc-officespace-leader.com/wp-content/plugins/...
A description for this result is not available because of this site's robots.txt – learn more.
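Those "no description available" entries are the telltale sign that robots.txt only blocks crawling, not indexing: Google can still index a blocked URL from links alone. One way to actually drop the plugin URLs from the index is an X-Robots-Tag noindex header. This is only a sketch, assuming an Apache server with mod_setenvif and mod_headers; the robots.txt block would also have to be lifted so Googlebot can fetch the files and see the header:

```apache
# Sketch for the site's .htaccess (assumes mod_setenvif + mod_headers).
# Tells crawlers not to index anything served from wp-content/plugins.
<IfModule mod_headers.c>
    SetEnvIf Request_URI "^/wp-content/plugins/" plugin_asset
    Header set X-Robots-Tag "noindex" env=plugin_asset
</IfModule>
```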
-
Hi Egol:
Thanks for the suggestion.
When I click on "repeat the search with the omitted results included" I get 448 results, not the entire 859. Seems very strange. Some of these URLs have light content but I don't believe they are duplicates. I don't see any content from outside our website when I click this.
Am I doing something wrong? I would think the total of 859 would appear, not 448 URLs.
Thanks!!
Alan -
I don't know. You should ask someone who knows a lot about canonicalization.
Did you drill down through all of those indexed pages to see if you can identify all of them?
I've suggested it twice.
-
Hi Egol:
In the context of launching an upgraded site, could canonicalization have been implemented incorrectly? That could account for the sudden appearance of 175 new pages, since the thin content has been there for some time.
I am particularly suspicious of canonicalization because there was an issue involving multi-page URLs of property listings when the site was migrated from Drupal to WordPress last summer.
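For reference, a correct canonical tag sits in the head of every variant of a listing (page 2, tracking-parameter versions, and so on) and points at the one preferred URL. A sketch with a made-up listing path, not one from the actual site:

```html
<!-- on every duplicate or paginated variant of the listing -->
<link rel="canonical" href="http://www.nyc-officespace-leader.com/listings/example-listing/" />
```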
Thoughts?
Thanks, Alan
-
Apparently infitter24.rssing.com/chan-13023009/all is poaching my content, taking my original content and adding it to their site. I am not quite sure what to do about that.
You can have an attorney demand that they stop, or you can file DMCA complaints. Be careful
**However, it does not explain the sudden appearance of the 175 pages in Google's index.**
-
Do this query: site:www.nyc-officespace-leader.com
-
Start drilling down the SERPs. One page at a time. Look for content that you didn't make. Look for duplicates.
-
Get a spreadsheet that has all of your URLs. Drill down through the SERPs, checking every one of them. Can you account for your pagination? You have a lot of it, and that type of page is usually rubbish in the index. Combine, canonicalize, or get rid of them.
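The accounting step can be rough-sketched in a few lines: bucket each indexed URL by pattern and count. The patterns below are guesses at a typical WordPress permalink structure and the sample URLs are made up; adjust them to the real export:

```python
# Sketch: account for every indexed URL by bucketing it. Anything that
# is not plain "content" is a candidate to combine, canonicalize, or
# remove. The URL patterns are assumptions, not from the actual site.
import re
from collections import Counter

def bucket(url: str) -> str:
    """Classify a URL into a rough bucket for the accounting exercise."""
    path = re.sub(r"^https?://[^/]+", "", url)
    if "/wp-content/plugins/" in path:
        return "plugin file"
    if "/tag/" in path:
        return "tag archive"
    if "/category/" in path:
        return "category archive"
    if re.search(r"/page/\d+", path):
        return "pagination"
    return "content"

def account_for(urls):
    """Count how many URLs land in each bucket."""
    return Counter(bucket(u) for u in urls)

# Hypothetical export, one URL per line in practice.
indexed = [
    "http://www.nyc-officespace-leader.com/blog/a-real-post/",
    "http://www.nyc-officespace-leader.com/tag/midtown/page/2/",
    "http://www.nyc-officespace-leader.com/wp-content/plugins/foo/bar.php",
]
print(account_for(indexed))
```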
-
Hi Egol:
Thanks so much for taking the time for your thorough response!!
Apparently infitter24.rssing.com/chan-13023009/all is poaching my content, taking my original content and adding it to their site. I am not quite sure what to do about that.
You have pointed out something very useful, and I appreciate it and will act upon it. However, it does not explain the sudden appearance of the 175 pages in Google's index that did not appear at the end of May and somehow coincided with the launch of the new version of our website in early June. Any ideas???
Thanks,
Alan -
-
Do this query: site:www.nyc-officespace-leader.com
-
Start drilling down the SERPs. One page at a time. Look for content that you didn't make. Look for duplicates.
-
When you drill down about 44 pages you will find this...
In order to show you the most relevant results, we have omitted some entries very similar to the 440 already displayed.
"If you like, you can repeat the search with the omitted results included."
The bad stuff is usually behind that link. Google doesn't want to show that stuff to people. It could be thin, it could be duplicate, it could be spammy; they just might not like it.
- Find out what is in there.
Possible problems that I see....
I see dupe content like this and this. Either your guys are grabbin' somebody else's content or they are grabbin' yours. That can get you in trouble with Panda. You need original and unique content. Anything that is not original and unique should be deleted, noindexed, or rewritten.
A lot of these pages are really skimpy. Thin content can get you into trouble with Panda. Anything that is skimpy should be deleted, noindexed, or beefed up.
I see multiple links to tags on lots of these posts. That can cause duplicate content problems.
The tag pages are paginated with just a few posts on each. These can generate extra pages that are low value, suck up your link juice, or compound duplicate content problems.
You have archive pages, category pages, and more pagination problems.
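Acting on the tag, archive, and pagination points usually means noindexing those templates; most WordPress SEO plugins expose a setting for it. Once set, the source of a tag or archive page should show something like this in its head (a sketch of the expected output, not something from the thread):

```html
<!-- in the <head> of tag, archive, and deep pagination pages -->
<meta name="robots" content="noindex,follow">
```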
-