Effects of significant cross-linking between subdomains
-
A client has noticed in recent months that their traffic from organic search has been declining, little by little.
They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain!
The client is wondering whether this is what is causing the decline in traffic, and whether to change the whole structure of their site.
Interested to hear the thoughts of the community on this one!
-
Helen,
I know people who have had success in reducing the number of links within mega menus by turning some of them (after the first two levels, for instance, but you could get much more sophisticated if you wanted) into JavaScript links. If the JavaScript is not too complex, Google will still have no trouble getting to those pages, but the links won't be "hrefs" and therefore won't spend PageRank on pages that are less important relative to the others. The upside is that the links are still there for users, assuming that is a good thing.
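For illustration, a minimal sketch of the difference (the URL, class name and product name here are hypothetical, not from any real menu):

    <!-- Standard menu link: a crawlable href that passes PageRank -->
    <a href="/widgets/deluxe-widget">Deluxe Widget</a>

    <!-- JavaScript-driven link: still clickable for users, but no href to pass PageRank through -->
    <span class="js-menu-link" data-url="/widgets/deluxe-widget">Deluxe Widget</span>
    <script>
      // Attach a click handler to every deep menu item rendered this way
      document.querySelectorAll('.js-menu-link').forEach(function (el) {
        el.addEventListener('click', function () {
          window.location.href = el.getAttribute('data-url');
        });
      });
    </script>

Bear in mind Google keeps getting better at executing JavaScript, so treat this as PageRank sculpting, not a guaranteed block.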
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences a slow, steady traffic drop, I always look into the uniqueness of their product copy. That kind of decline is often a sign that they are sharing product copy with other sites, either by using manufacturer descriptions or by publishing feeds to third-party sites like Amazon, eBay or price comparison shopping engines.
Good luck!
-
You mentioned it's a large site. Google only goes so deep into a site, but as that's an irrelevant detail here it doesn't matter. Have you tried blocking some of the unused pages with robots.txt, and/or implementing tags like rel=canonical and/or the pagination tags? (See the sketch below the links.)
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
https://support.google.com/webmasters/answer/139394?hl=en
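By way of illustration, those might look something like this (the subdomain, paths and URLs are made up, not the client's):

    # robots.txt - stop crawling of low-value pages, e.g. search and sorted URLs
    User-agent: *
    Disallow: /search/
    Disallow: /*?sort=

    <!-- In the <head> of page 2 of a paginated category, per the Google post above -->
    <link rel="canonical" href="http://shoes.example.com/trainers?page=2" />
    <link rel="prev" href="http://shoes.example.com/trainers?page=1" />
    <link rel="next" href="http://shoes.example.com/trainers?page=3" />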
You could look in Google Trends for a rough idea of search volume over the years, but that won't help your site as you mentioned. You can try tracking your SERP rank in Moz or other software like Serpbook etc.
Sounds like you've dropped down in your SERP to me.
-
Hi Chris,
Thanks for your reply. The issue isn't that Google hasn't indexed those pages, though - it has. I'm not sure what you mean by 'Google won't index huge sites it just doesn't have time', as it clearly does index plenty of huge sites. The site is pretty much fully indexed, so it's not that Google can't find the pages.
We have also, of course, tried using the client's Analytics to identify the issue, as you describe, but the client accidentally deleted all the historical data older than about six months (oops), so I can't do a lot of the analysis I would normally do. I have one or two odd old printouts showing some historical Analytics and ranking data, plus their sales data, to go on, and these do tend to suggest that organic traffic has indeed dropped off (for reasons other than seasonal ones) and that there has been some decline in their search engine rankings for some key phrases. But I can't tell a lot more than that.
What I'm looking for is to see whether anyone else has had experience of this or a similar issue - whether anyone has seen excessive links between subdomains have a negative impact on rankings and traffic. I've been working in SEO for ten years and have never come across anyone with quite this many links within their own website, so it's not something I've encountered before.
Anyone else out there come across this before?
-
The first thing I always do is pretend I don't work for the company: go to the site as a user and see how easy it is to navigate. Can I find the product I need easily? (Try to imagine you want a particular product before going on.) Can I get back to the home page easily? I try to make sure I can get back to the home page (or the main products) within 3 pages (5 max). Google won't fully index huge sites, it just doesn't have time, so if your structure is bad it may be that Googlebot gives up before it can get all the way down to where you actually want it to go.
If you find yourself lost in "megamenus", imagine the user or Googlebot: can you reduce the menus and still achieve a good result?
Another factor could be behind the decline in traffic: has there been a decline in your SERP placement, or is it seasonal traffic? Although not a permanent fix, PPC can help top up traffic to your site whilst you jiggle things a bit.
I hope some of the questions above help you look at the site in a different light. There are obviously other things it could be, but first off I would look into your SERP placement and seasonal dips. You can use GA to look at users' drop-off points too, to see where they are getting bored or getting lost!
Best of luck!
Related Questions
-
Subdomain 403 error
Hi Everyone, A crawler from our SEO tool detects a 403 error on a link from our main domain to a couple of subdomains. However, these subdomains are perfectly accessible. What could be the problem? Is this error caused by the server, the crawlbot or something else? I would love to hear your thoughts. Jens
Technical SEO | WeAreDigital_BE
-
Is it a problem to have an image + link in your menu
Hi, My menu has an image with links to some of the main pages on the site and text underneath it explaining what the banner is. Will it be beneficial or harmful to have the text hyperlinked to the same pages the images go to?
Technical SEO | theLotter
-
Too many links in header menu
I'm working on a few clients who are starting to get big header menus. Their sites now easily exceed the 100-links-per-page recommendation. Normally I would recommend that they cut down on the links, but in this case these sites have menus that make navigation easier. I honestly think these menus add value for the users. The dilemma is that I think the menus provide value from a UX standpoint, but I'm not sure about the SEO standpoint. Any recommendations for this dilemma? Some examples: http://moodsofnorway.com/no/ http://www.gmax.no/ http://www.flust.no/
Technical SEO | Inevo
-
How is Google finding our preview subdomains?
I've noticed that Google is able to find, crawl and index preview subdomains we set up for new client sites (e.g. clientpreview.example.com). I know now to use meta robots tags and robots.txt to block the search engines from crawling these subdomains. My question, though, is how is Google finding these subdomains? We don't link to these preview domains from anywhere else, so I can't figure out how Google is even getting there. Does anybody have any insight on this?
Technical SEO | ZeeCreative
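(For reference, blocking a preview subdomain typically looks something like this; clientpreview.example.com is the stand-in name from the question:)

    <!-- On every page of the preview subdomain -->
    <meta name="robots" content="noindex, nofollow">

    # robots.txt served at clientpreview.example.com/robots.txt
    User-agent: *
    Disallow: /

Note that if robots.txt blocks crawling, Google may never fetch the pages and see the meta tag, so many people pick one mechanism or simply password-protect the preview.
-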
Google Shows 24K Links b/w 2 sites that are not linked
Good Morning, Does anyone have any idea why Google WMT shows me that I have 24,101 backlinks from one of my sites ( http://goo.gl/Jb4ng ) pointing to my other site ( http://goo.gl/JgK1e )? These sites have zero links between them, as far as I can see/tell. Can someone please help me figure out why Google is showing 24k backlinks? Thanks
Technical SEO | Prime85
-
Campaigns Domain and Subdomain... ?
I made two separate campaigns before I understood the meaning of "subdomain". I made one campaign for my www.com and another for my .com. I now realize I should have made the .com the domain and the www the subdomain in the same campaign. Is there a way to edit this? Thanks!
Technical SEO | musicforkids
-
Too many on page links
Hi All, As we all know, having too many links on a page is an obstacle for search engine crawlers in terms of the crawl allowance. My category pages are flagged as pages with too many on-page links by the SEOmoz crawler. This probably comes from the fact that each product on the category page has multiple links (on the image and the model number). Now my question is, would it help to set up a text link with a clickable area as big as the product area? This means every product gets just one link (see the sketch below). Would this help get the crawlers deeper into these pages and distribute the link juice better? Or is Google already smart enough to figure out that two links to the same product page shouldn't be counted as two? Thanks for your replies guys. Rich
Technical SEO | Horlogeboetiek
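(A minimal sketch of the one-link-per-product idea; the product URL and class name are hypothetical:)

    <!-- Wrap the whole product card in a single anchor, so image and model number share one link -->
    <a href="/watches/model-123" class="product-card">
      <img src="/img/model-123.jpg" alt="Model 123">
      <span class="model">Model 123</span>
    </a>
-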
How not to lose link juice when linking to thousands of PDF guides?
Hi All, I run an e-commerce website with thousands of products. In each product page I have a link to a PDF guide of that product. Currently we link to it with a "nofollow" <a href> tag. Should we change it to window.open in order not to lose link juice? Thanks
Technical SEO | BeytzNet
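(For reference, the two approaches being compared look roughly like this; the PDF path is hypothetical:)

    <!-- Current approach: a real link, marked rel="nofollow" -->
    <a href="/guides/product-123.pdf" rel="nofollow">Product guide (PDF)</a>

    <!-- Proposed approach: no href at all, opened via JavaScript -->
    <button onclick="window.open('/guides/product-123.pdf')">Product guide (PDF)</button>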