Getting pages indexed by Google
-
Hi SEOmoz,
I relaunched a site back in February of this year (www.uniquip.com) with about 1 million URLs.
Right now I'm seeing that Google is not going past 110k indexed URLs (based on sitemaps).
Do you have any tips on what I can do to make the site more appealing to Google and get more URLs indexed?
All the part pages can be browsed to by going to: http://www.uniquip.com/product-line-card/suppliers/sw-a/p-1
I've tried to make the content as unique as possible by adding random testimonials and random "related part numbers"; see here: http://www.uniquip.com/id/246172/electronic-components/infineon/microcontrollers-mcu/sabc161pilfca
Do I need to wait longer and be more patient with Google? It just seems like I'm only getting a few thousand URLs indexed per day at most.
Would it help me if I implemented a breadcrumb on all part pages?
Thanks,
-Carlos
-
Carlos
Actually, it is in the index, but if you do site:www.uniquip.com as a search in Google, it is not the first result, which it should be. If you do "site:www.uniquip.com" (with the quotes), it is still only the second result when it should be the first. That's a sign that something is weird and would require some more investigation.
-Dan
-
Thanks for both tips, Dan. I will look into them, although number 2 may be a bit tough, since there are hundreds of suppliers/categories (a bit tough to put in a menu).
As for the page not being indexed, I'm showing that it is: http://webcache.googleusercontent.com/search?sourceid=navclient-ff&ie=UTF-8&q=cache%3Ahttp%3A%2F%2Fwww.uniquip.com%2F
Is this not the case for you?
-
Thanks for the feedback, Brent.
I already have my sitemaps set up in groups of 30k with a master sitemap index file: http://www.uniquip.com/sm/smindex.xml
I will talk to the developer so we can look into regrouping them and perhaps pick up on an indexing pattern.
-
Hi Carlos
It appears there are two major issues with your site not getting indexed.
1. Off-site - Go to Open Site Explorer and type in your URL. You will see there are many suspicious-looking links pointing back to your domain. If you have really low-quality backlinks pointing to your domain (and no good ones to balance things out), nothing you do on-site will help. You'll want to do whatever you can to clean these up (get them removed), and work on getting some GOOD backlinks in conjunction with clearing out the bad ones. There are many resources here on SEOmoz on backlinking strategies.
2. On-site - Your product pages are 4 or more clicks away from the homepage. You really want this to be 3 or fewer (ideally 1, if possible, for some products and categories).
I also noticed that your homepage does not appear to be indexed in Google.
We could go WAY more in depth with this, but in general those seem to be the two major issues.
-Dan
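Dan's click-depth point can be checked with a small script: a breadth-first search from the homepage over the internal link graph gives each page's minimum number of clicks from home. A minimal sketch, assuming the internal links have already been crawled into a dict (the paths below are shortened, illustrative stand-ins, not the site's real structure):

```python
from collections import deque

def click_depths(links, home="/"):
    """BFS over an internal-link graph {page: [linked pages]};
    returns each reachable page's minimum click depth from `home`."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Toy graph: a product page 4 clicks deep, as described above.
graph = {
    "/": ["/product-line-card"],
    "/product-line-card": ["/suppliers/sw-a"],
    "/suppliers/sw-a": ["/suppliers/sw-a/p-1"],
    "/suppliers/sw-a/p-1": ["/id/246172"],
}
depths = click_depths(graph)
# depths["/id/246172"] == 4, i.e. deeper than the suggested 3-click maximum
```

Any page that never appears in the result is unreachable from the homepage by internal links at all, which is worth knowing on a million-URL site.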
-
Carlos,
There is a great thread you should read about this topic: http://www.seomoz.org/q/can-you-push-too-many-urls-via-sitemaps
As for me, I would recommend creating multiple XML sitemaps so you can track which URLs are being indexed and also control the submission quantity. You don't want to publish a ton of URLs at one time. For the URLs that aren't in Google's index, I would noindex them for now and slowly submit a small group on a monthly basis, each in a different XML sitemap for tracking.
You have to think as if you were Google: is the content you're submitting valuable, new information? There is no way that 1 million new pages are so valuable or new that they aren't already out there. A nice, steady submission stream will give you the best results in the long run.
I hope this helps.
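The chunked-sitemap approach described above can be sketched as follows. The 30,000-per-file grouping mirrors what Carlos already uses; the file names and base URL are illustrative assumptions, not requirements (the sitemap protocol's hard limit is 50,000 URLs per file):

```python
# Sketch: split a large URL list into numbered sitemap files plus a
# sitemap index, so each chunk's indexation can be tracked separately.
# Chunk size, file names, and base URL are illustrative assumptions.
from xml.sax.saxutils import escape

CHUNK_SIZE = 30_000  # well under the protocol's 50,000-URL per-file limit

def build_sitemaps(urls, base="http://www.example.com/sm"):
    """Return {filename: xml_text} for the chunk files and the index."""
    files = {}
    for i in range(0, len(urls), CHUNK_SIZE):
        chunk = urls[i:i + CHUNK_SIZE]
        body = "\n".join(
            f"  <url><loc>{escape(u)}</loc></url>" for u in chunk
        )
        files[f"sitemap-{i // CHUNK_SIZE + 1}.xml"] = (
            '<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>"
        )
    index_body = "\n".join(
        f"  <sitemap><loc>{base}/{name}</loc></sitemap>" for name in files
    )
    files["smindex.xml"] = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{index_body}\n</sitemapindex>"
    )
    return files

files = build_sitemaps([f"http://www.example.com/id/{n}" for n in range(70_000)])
# 70,000 URLs -> 3 chunk files (30k, 30k, 10k) plus the index
```

Submitting each chunk file individually in Webmaster Tools then shows a per-chunk submitted-vs-indexed count, which is the tracking Brent describes.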
Related Questions
-
Google is showing erroneous results on SERPs page
Hello, All, In April, two months ago, we caught a hack on a client's website. It created about 40 pages in what looked to be a black-hat link tactic. We removed the pages, resubmitted the sitemap.xml (it reprocessed) and ran it through Screaming Frog to confirm all the pages were gone, but the forty pages still show up in the search results for a site search. We have both the www and non-www versions of the site claimed and have set a preference. Nothing is awry with the robots.txt. We're not really sure what to do to resolve it. We asked Google to recrawl (fetch) the site. I'm not sure what's going on with it. The website's name is fortisitsolutions.com The site search bringing up the pages from the hack is below. site:www.fortisitsolutions.com Any ideas?
On-Page Optimization | Cazarin-Interactive
-
Will it upset Google if I aggregate product page reviews up into a product category page?
We have reviews on our product pages and we are considering averaging those reviews out and putting them on specific category pages in order for the average product ratings to be displayed in search results. Each averaged category review would be only for the products within its category, and all reviews are from users of the site, no 3rd-party reviews. For example, averaging the reviews from all of our boxes product pages, and listing that average review on the boxes category page. My question is, will this be doing anything wrong in the eyes of Google, and if so, how so? -Derick
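A sketch of the aggregation arithmetic itself, in case it helps: pooling every individual rating gives a properly weighted category average, whereas averaging the per-product averages would let a product with 2 reviews count as much as one with 200. The category name and ratings below are made up for illustration:

```python
def category_average(product_ratings):
    """Average a category's reviews by pooling every individual rating
    (not by averaging the per-product averages, which would over-weight
    products that have only a few reviews)."""
    all_ratings = [r for ratings in product_ratings.values() for r in ratings]
    if not all_ratings:
        return None, 0
    return round(sum(all_ratings) / len(all_ratings), 2), len(all_ratings)

# Hypothetical "boxes" category with three products
boxes = {
    "small-box": [5, 4, 4],
    "large-box": [3, 5],
    "gift-box": [5],
}
avg, count = category_average(boxes)  # avg = 4.33 over 6 reviews
```

The review count matters for markup too: rating schemas generally expect the number of ratings alongside the average, so it's worth carrying both values through to the category page.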
On-Page Optimization | Deluxe
-
Which is better? One dynamically optimised page, or lots of optimised pages?
For the purpose of simplicity, we have 5 main categories in the site - let's call them A, B, C, D, E. Each of these categories has sub-category pages, e.g. A1, A2, A3. The main area of the site consists of these category and sub-category pages. But as each product comes in different woods, it's useful for customers to see all the products that come in a particular wood, e.g. walnut. So many years ago we created 'woods' pages. These pages replicate the categories & sub-categories but only show what is available in that particular wood. And of course, they're optimised much better for that wood. All well and good, until recently: these specialist pages seem to have dropped through the floor in Google. Could be temporary, I don't know, and it's only a fortnight - but I'm worried. Now, because the site is dynamic, we could do things differently. We could still have landing pages for each wood, but instead of spinning off to their own specifically optimised wood sub-category pages, they could link to the primary sub-category page with a ?search filter in the URL. This way, the customer is still getting to see what they want. Which is better: one page per sub-category, dynamically filtered by search, or lots of specific sub-category pages? I guess at the heart of this question is: does having lots of specific sub-category pages lead to a large overlap of duplicate content, and is it better keeping that authority juice on a single page, even if the URL changes (with a query in the URL) to enable whatever filtering we need to do?
On-Page Optimization | pulcinella2uk
-
Page Title & Meta Description Getting Cut Off In The SERPs
Hi Guys, I am trying to figure out why my page titles and meta description tags are getting cut off in Google SERPs. My page titles are 70 characters or under (including spaces) and my meta descriptions are 155 characters or under (including spaces), so I cannot work out why they are getting cut off. Is there something I have missed?! Thanks, Meaghan
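One likely explanation: Google truncates titles and descriptions by pixel width, not character count, so a 70-character title full of wide letters can still be cut off. A character-count pre-check is therefore only a rough proxy; the limits in this sketch are conservative rules of thumb, not official Google numbers:

```python
# Rough pre-check for title/description length. Google actually truncates
# snippets by pixel width, not characters, so these character limits are
# conservative assumptions, not guarantees against truncation.
TITLE_MAX = 60        # wide letters can overflow even below 70 chars
DESCRIPTION_MAX = 155

def length_warnings(title, description):
    warnings = []
    if len(title) > TITLE_MAX:
        warnings.append(f"title is {len(title)} chars (> {TITLE_MAX})")
    if len(description) > DESCRIPTION_MAX:
        warnings.append(
            f"description is {len(description)} chars (> {DESCRIPTION_MAX})"
        )
    return warnings

length_warnings("W" * 70, "ok")  # flags the title even though it is <= 70 chars
```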
On-Page Optimization | StoryScout
-
Noindex pages being indexed
Hi all Wondering if anyone could offer a pointer on a problem I am having, please. I am developing an affiliate store, and to prevent problems with duplicate content I have added <meta name="robots" content="noindex,follow" /> to all the product pages to avoid Google penalties. However, Google appears to be indexing product pages. When I do a site: search I see a few hundred product pages in the engine. This is odd, as the site has always had noindex on these pages. Even viewing the cache of the indexed page shows the noindex meta tag to be in place. I'm at a loss as to why these pages are being indexed and could do with removing them asap to stop any penalties on the site. Many thanks for any help.
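A quick way to verify the directive Google actually sees is to parse the served HTML for the robots meta tag; a stdlib-only sketch below (the sample page string is illustrative). It is also worth checking whether robots.txt blocks these product pages: if Google cannot crawl a page it never sees the noindex, and may still index the bare URL, which is a common cause of this symptom.

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collect the content values of <meta name="robots" ...> tags."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.directives.append(a.get("content", "").lower())

def has_noindex(html):
    """True if any robots meta tag in the HTML contains 'noindex'."""
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.directives)

page = '<html><head><meta name="robots" content="NOINDEX,FOLLOW" /></head></html>'
# has_noindex(page) -> True
```

Run against the live pages (fetched however you like) rather than templates, since a plugin or CDN layer can strip the tag between the CMS and the served HTML.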
On-Page Optimization | carl_daedricdigital
-
Where does Google say this?
Just came across this article: http://www.searchmarketingstandard.com/tips-for-avoiding-thin-content And it states, "Google says that it will ignore pages with less than 200 words of body text." I submitted a comment to the author, but was wondering in the meantime if anyone knows where Google says this?
On-Page Optimization | nicole.healthline
-
Do product pages need unique content or does having duplcate content hurt on those pages?
We are adding products rapidly to our website, but this requires allowing duplicate content to exist on our product pages at furniture-online.com. From an SEO standpoint, do we need to make this content unique for each product? Since we aren't link building to specific product pages and we don't anticipate product pages being found in a search result, are we OK leaving the duplicate content in place and spending our dollars elsewhere?
On-Page Optimization | gallreddy
-
Optimization of home page
Hi there I have an issue which, despite searching hard, I simply cannot find the right solution for. We have an index page that used to rank pretty well for a main industry keyword. However, following a revamp of the site last year, the keyword slipped and no longer brings in decent traffic levels. The problem seems to be that the old static site had a sprinkling of variable anchor text links that brought value to the home page. Instead of the main anchor being "home" we would revert to "main keyword" and variations across the site, sometimes in the content but mainly on the nav bars. However, the new CMS design structure restricts us considerably with anchor distribution, so instead we opted for the site logo on the masthead to have an ALT tag for "main keyword" but, so as not to game Google too much, we added .."home" to the tag. Probably pointless, but we figured it could do no harm. This ALT text is site-wide. Problem now is that we have lost the spread of internal nav bar anchors and variety, we have slipped in the SERPs for "main keyword", and I can't help thinking we are not maximising the anchors as we should. So what I'm coming to is this: how can we tell if Google is picking up the ALT tag anchor as the main anchor to rank the site at the expense of all internal text anchors? Despite retaining lots of embedded anchors, according to the Moz metrics these are not being picked up, because OSE suggests the ALT tag anchor is taking precedence. The SERPs probably support this view as well. Should we: a) vary the masthead ALT if there is no way of avoiding this being the most important link/anchor on the page; b) remove the ALT anchor and instead opt for content links high on the page (we do have nav bar links saying "Home" site-wide as well, which may override the embedded links?); or c) leave the ALT alone and still push for content anchors as described in b)? What is the best way to handle this? Best wishes and thanks Morch
On-Page Optimization | Morch