Help, a certain directory is not being indexed
-
Before I start, don't expect this to be too easy. This one really has me puzzled, and I'm surprised I have yet to find a solution for it. Get ready.
We have a WordPress website, launched over six months ago, and we have never had an issue getting content such as pages, posts, and categories indexed. However, somewhat recently (about two months ago) I installed a directory plugin (Business Directory Plugin) which lists businesses via unique URLs that are accessible from a subfolder. It's these business listings that I absolutely cannot get indexed.
The index page of the directory, which links to the business pages, is indexed, but for some reason Google is not indexing the listing pages it links to. I don't think it's an issue of the content being uncrawlable: when I run crawlers on my site, such as XML sitemap crawlers, they find all the pages, including the directory pages, so I am sure it's not a case of the search engines not finding the content.
I have created XML sitemaps and uploaded them to Webmaster Tools. Tools recognises that there are many pages in the XML sitemap, but Google continues to index only a small percentage (everything but my business listings).
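For reference, the sitemap entries for these pages look roughly like this (a simplified sketch using the directory index and one real listing URL; the actual file has many more entries):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- Directory index page (this one is being indexed) -->
  <url>
    <loc>http://www.smashrepairbid.com.au/our-shops/</loc>
  </url>
  <!-- Individual business listing (these are not being indexed) -->
  <url>
    <loc>http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/</loc>
  </url>
</urlset>
```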
The directory has been there for about eight weeks now, so I know there is an issue, as it should have been indexed by now.
See our main website at www.smashrepairbid.com.au and the business directory index page at www.smashrepairbid.com.au/our-shops/
To throw in a curve ball: while looking into this issue and setting up Tools, we noticed a lot of 404 error pages (nearly 4,000). We were very confused about where these were coming from, as they were only being generated by search engines; humans could not access the 404s, so we are guessing the crawlers were firing some JavaScript code to generate them, or something else weird. We could see the 404s in the logs, so we know they were legitimate, but again it seemed to be only search engines. This was validated when we added some rules to robots.txt and saw the errors in the logs stop. We put the rules in the robots.txt file to try to stop Google from indexing the 404 pages, as we could not find any way to fix the site / code (no idea what is causing them). If you do a site: search in Google you will see all the pages that are omitted from the results.
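The rules we added were essentially of this form (a simplified sketch; the path shown is the old directory that was generating the 404s):

```
User-agent: *
Disallow: /shops/
```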
Since adding the rules to robots.txt, our impressions shown in Tools have jumped right up (increased five times), so we thought this was a good indication of improvement, but we are still not getting the results we want.
Does anyone have any clue what's going on, or why Google and other search engines are not indexing this content? Any help would be greatly appreciated, and if you need any other information to assist, just ask.
Really appreciate anyone who can spare the time to help; I sure do need it.
Thanks.
-
OK issue resolved!
Lynn, thank you. It was the relative URL in the canonical tag that was playing havoc; changing it to absolute is now getting the pages indexed.
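For anyone else who runs into this, the change was as simple as the following (illustrated with one of our listing URLs; not the exact plugin markup):

```html
<!-- Relative canonical (what we had: this was the problem) -->
<link rel="canonical" href="/our-shops/1263/bakker-towing/" />

<!-- Absolute canonical (the fix) -->
<link rel="canonical" href="http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/" />
```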
Lesson learnt.
-
Hey Kane,
The /shops URL was an old URL that had a directory in it. We blocked it in robots.txt as it was generating tons of 404 errors. In Webmaster Tools we can see thousands of 404 errors within that directory, so we deleted it all and tried to block search engines from throwing the errors (as I described in the initial post).
A number of those listings do have very little information, but there are a bunch that have great content, which is why I am not sure that is the cause. I will keep an eye on this, though, and also check the logs and let you know what they say.
-
Thanks Lynn.
I have taken on your recommendation and changed the canonical tag to be absolute. Thanks for your help; we will see how it goes.
-
As Lynn said, relative canonical tags could absolutely cause issues. That said, I'm seeing absolute URLs in the canonical tag now, so you may have fixed that in the past few days.
Also, I do see the Our Shops pages indexed when I search for site:smashrepairbid.com.au, but I don't see any other pages in the /our-shops/ directory aside from www.smashrepairbid.com.au/our-shops/?action=search
Your robots.txt is currently blocking /shops/. I don't think that would cause an issue, but it would be nice to remove it if it's not needed.
There's almost zero content on the pages I glanced at, e.g. http://www.smashrepairbid.com.au/our-shops/1263/bakker-towing/ and http://www.smashrepairbid.com.au/our-shops/1616/coastal-towing-service/. When you look at it from Google's perspective, there's very little value being added by these pages: no unique photos, no phone number, no website, etc. There are a million local business scrapers with more content than this, so why should Google bother indexing these pages?
Try pulling up your logs and seeing if these URLs have been requested by Google's spiders. Here's a good guide from Ian Lurie on how to do that in Excel: http://www.portent.com/blog/analytics/how-to-read-a-web-site-log-file.htm
If the spiders are crawling those shop URLs but aren't indexing them, I think the first thing to do is add way more content to the pages.
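If it helps, here's a rough command-line version of that check (this assumes a common combined log format; the sample log lines below are made up purely for illustration, so point the grep at your real access log instead):

```shell
# Create a tiny made-up access log in the combined format, purely to
# illustrate the check; substitute your real log file in practice.
cat > access.log <<'EOF'
66.249.66.1 - - [10/May/2011:10:00:00 +1000] "GET /our-shops/1263/bakker-towing/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
203.0.113.5 - - [10/May/2011:10:01:00 +1000] "GET /our-shops/ HTTP/1.1" 200 8900 "-" "Mozilla/5.0 (Windows NT 6.1)"
EOF

# List the /our-shops/ URLs that Googlebot has requested, with counts
grep 'Googlebot' access.log | grep -o '"GET /our-shops/[^ ]*' | sed 's/"GET //' | sort | uniq -c
```

If the shop URLs show up here but still aren't in the index, you know crawling isn't the problem.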
-
Hi Trent,
Having a quick look, I saw that you have relative URLs in your canonical tags, and this could be problematic. I think it would be worth making those URLs absolute to avoid any confusion on Google's part in determining which page or page version should be indexed.
I can't say for sure that this is the problem, but it's worth looking into.
Hope that helps!
Related Questions
-
Should I nofollow certain outbound links?
Hi, all of our outbound links are currently follow. I've read that the only cases where nofollow should be used, according to Google, are paid links, crawl prioritization, and untrustworthy sites. The kinds of websites we are linking to from our blog include:
- Websites with great content and high authority that are relevant to the topic we've written about and would enhance the user experience
- Partners/companies we have a strong relationship with who also have decent authority
- Social media profiles of industry people not within our organisation
- Websites with definitions, Wikipedia, SoundCloud, published statistics, news sites, and other large well-known sites
I am wondering if we should put the nofollow attribute on the last two points. Or do you think it's totally fine that all our links are follow, provided the websites are legitimate, trustworthy, and have some authority? Thanks in advance!
-
Can lazy loading of images affect indexing?
I am trying to diagnose a massive drop in Google rankings for my website and noticed that the date of the ranking and traffic drop coincides with Google suddenly indexing only about 10% of my images, whereas previously it was indexing about 95% of them. Wondering if the addition of a lazy-load script to the images (so they don't load from the server until visible in the browser) could be causing this index blocking?
-
Magento Duplicate Content Question - HELP!
In Magento, when entering product information, does the short description have to be different than the meta description? If they are both the same is this considered duplicate content? Thanks for the help!!!
-
Noindex pages being indexed
Hi all. Wondering if anyone could offer a pointer on a problem I am having, please. I am developing an affiliate store, and to prevent problems with duplicate content I have added <meta name="robots" content="noindex,follow" /> to all the product pages to avoid Google penalties. However, Google appears to be indexing the product pages. When I do a site: search I see a few hundred product pages in the engine. This is odd, as the site has always had noindex on these pages. Even viewing the cache of an indexed page shows the noindex meta tag to be in place. I'm at a loss as to why these pages are being indexed and could do with removing them ASAP to stop any penalties on the site. Many thanks for any help.
-
Help an SEO-DUMMY : ) Established hyphenated domain...redirect?!...new domain?!
Hello, everybody. I am definitely not an SEO specialist. My family owns a transportation business (since 2010) and I am the one responsible for the website (until we find a good SEO company).

My question: several years ago I did not know much about SEO and chose the domain name www.airporttransportation-limo.com (it is not the actual domain, just an example; I'm not sure if I can post the real website here) and another domain that is just the name of our company (it also has a hyphen in it). Both websites are still doing well and we receive quite a bit of traffic, but I read more and more about how hyphenated domains and domains with more than two words can be bad for your SEO/business/traffic. I feel like the websites are stuck and not moving up any more; could that be because of the hyphens? I registered another domain that is the name of our company (which is well known by now) without any hyphens.

Now I have no idea what to do. Should I redirect both old domains (the old websites are different and do not have duplicate content) to the new one? Or should I just redirect the old domain (the name of our company with a hyphen) to the new one (without a hyphen) and leave www.airporttransportation-limo.com as is? Or maybe I should register another domain without any hyphens (two words only) and redirect www.airporttransportation-limo.com to it? I am very nervous about making any changes and losing all the traffic. My family will kill me. Please help! I'm lost!
-
Disallow indexing of ALL subdomains
I'm using www.domain.com as my development hosting. Each website I'm developing gets a temporary URL like this: project1.domain.com, project2.domain.com, project3.domain.com, and so on. Now I'd like to set things up so that ALL of these subdomains cannot be indexed by Google. Currently I have to do this manually for each subdomain's site, and when a site goes live I have to change the robots.txt again. So I would like to make things a bit easier for myself. Is this possible?
-
Need help ranking my site
Hi, Can anyone help me out? I am trying to get this site ranked for "Villa General Belgrano". It was on the first page of Google and then it disappeared. Did I over optimize the anchor text? http://www.opensiteexplorer.org/anchors?site=www.lawebdelvalle.com.ar
-
Is it ok to point internal links to index.html home page rather than full www
I thought I saw this somewhere on SEOmoz before, but I was so busy by the time I got around to working on my site's SEO that I can't recall if it is a problem that takes away from my ranking. If my www.website.com is ranking well but I have internal menu links pointing to www.website.com/index.html instead of www.website.com, will that take away from my www.website.com rankings? Should I change all my menu links that point to /index.html to the full URL path www.website.com?