Help, a certain directory is not being indexed
-
Before I start, don't expect this to be too easy. This really has me puzzled, and I'm surprised I have yet to find a solution for it. Get ready.
We have a WordPress website, launched over 6 months ago, and we have never had an issue getting content such as pages, posts and categories indexed. However, somewhat recently (about 2 months ago) I installed a directory plugin (Business Directory Plugin) which lists businesses via unique URLs that are accessible from a sub-folder. It's these business listings that I absolutely cannot get indexed.
The index page of the directory, which links to the business pages, is indexed, but for some reason Google is not indexing the listing pages that are linked to from it. I don't think it's an issue of the content being uncrawlable: when I run crawlers on my site, such as XML sitemap crawlers, they find all the pages including the directory pages, so I am sure it's not a matter of the search engines being unable to find the content.
I have created XML sitemaps and uploaded them to Webmaster Tools. It recognises that there are many pages in the sitemap, but Google continues to index only a small percentage (everything but my business listings).
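For reference, each listing appears in the sitemap as a standard entry along these lines (a minimal sketch using one of the listing URLs; only the required tag is shown):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <!-- Absolute URL of the listing page, as the sitemap protocol requires -->
    <loc>http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/</loc>
  </url>
</urlset>
```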
The directory has been there for about 8 weeks now, so I know there is an issue; it should have been indexed by now.
See our main website at www.smashrepairbid.com.au and the business directory index page at www.smashrepairbid.com.au/our-shops/
To throw in a curve ball: while looking into this issue and setting up Webmaster Tools, we noticed a lot of 404 error pages (nearly 4,000). We were very confused about where these were coming from, as they were only being generated by search engines; humans could not access the 404s, so we are guessing the crawlers were firing some JavaScript code to generate them, or something else weird. We could see the 404s in the logs, so we know they were legitimate, but again they appeared to come only from search engines. This was validated when we added some rules to robots.txt and saw the errors in the logs stop. We put the rules in the robots.txt file to try to stop Google from indexing the 404 pages, as we could not find any way to fix the site/code (no idea what is causing them). If you do a site: search in Google you will see all the pages that are omitted from the results.
Since adding the rules to robots.txt, our impressions shown in Webmaster Tools have jumped right up (increased five-fold), so we took this as a good sign of improvement, but we are still not getting the results we want.
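For context, the rules we added were along these lines (a sketch; /shops/ is the old directory that was generating the 404 errors):

```
User-agent: *
Disallow: /shops/
```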
Does anyone have any clue what's going on, or why Google and other search engines are not indexing this content? Any help would be greatly appreciated, and if you need any other information to assist, just ask.
Really appreciate anyone who can spare their time to help me, I sure do need it.
Thanks.
-
OK issue resolved!
Lynn, thank you. It was the relative URL in the canonical tag that was playing havoc. Changing it to an absolute URL has got the pages indexed.
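For anyone who hits the same problem, the change was along these lines (using one of our listing URLs as the example):

```html
<!-- Before: relative URL in the canonical tag (the problem) -->
<link rel="canonical" href="/our-shops/1263/bakker-towing/" />

<!-- After: absolute URL (pages now being indexed) -->
<link rel="canonical" href="http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/" />
```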
Lesson learnt.
-
Hey Kane,
The /shops URL was an old URL that had a directory in it. We blocked it in robots.txt as it was generating tons of 404 errors. In Webmaster Tools we can see thousands of 404 errors within that directory, so we deleted it all and tried to block search engines from triggering the errors (as I described in my initial post).
A number of those listings do have very little information, but there are a bunch that have great content, which is why I am not sure that is the cause. I will keep an eye on this though, and will also check the logs and let you know what they say.
-
Thanks Lynn.
I have taken your recommendation on board and changed the canonical tag to be absolute. Thanks for your help; we will see how it goes.
-
As Lynn said, relative canonical tags could absolutely cause issues. That said, I'm seeing absolute URLs in the canonical tag now, so you may have fixed that in the past few days.
Also, I do see the Our Shops pages indexed when I search for site:smashrepairbid.com.au, but I don't see any other pages in the /our-shops/ directory aside from www.smashrepairbid.com.au/our-shops/?action=search
Your robots.txt is currently blocking /shops/. I don't think that would cause an issue, but it would be nice to remove it if it's not needed.
There's almost zero content on the pages I glanced at, e.g. http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/ and http://www.smashrepairbid.com.au/our-shops/1616/coastal-towing-service/. When you look at it from Google's perspective, there's very little value being added by these pages: no unique photos, no phone number, no website, etc. There are a million local business scrapers with more content than this, so why should Google bother indexing these pages?
Try pulling up your logs and seeing if these URLs have been requested by Google's spiders. Here's a good guide from Ian Lurie on how to do that in Excel: http://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the spiders are crawling those shop URLs but aren't indexing them, I think the first thing to do is add way more content to the pages.
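A rough way to do that log check from the command line (a sketch; `access.log` is a stand-in for your server's actual log path, and the sample lines below are fabricated just to show the format):

```shell
# Build a tiny sample log so the commands can be demonstrated;
# real access logs follow the same general line format.
printf '%s\n' \
  '66.249.66.1 - - [10/May/2012:10:00:00] "GET /our-shops/1263/bakker-towing/ HTTP/1.1" 200 "-" "Googlebot/2.1 (+http://www.google.com/bot.html)"' \
  '192.0.2.10 - - [10/May/2012:10:00:05] "GET /our-shops/ HTTP/1.1" 200 "-" "Mozilla/5.0"' \
  > access.log

# Keep only Googlebot requests, then narrow to the shop listing URLs
grep -i "googlebot" access.log | grep "/our-shops/"
```

If the listing URLs never show up in that filtered output, the spiders aren't even fetching them, which is a different problem from crawled-but-not-indexed.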
-
Hi Trent,
Having a quick look, I saw that you have relative URLs in your canonical tag, and this could be problematic. I think it would be worth making those URLs absolute to avoid any confusion on Google's part in determining which page or page version should be indexed.
Cannot say for sure if this is the problem, but worth looking into.
Hope that helps!