Effects of significant cross-linking between subdomains
-
A client has noticed in recent months that their traffic from organic search has been declining, little by little.
They have a large ecommerce site with several different categories of product - each product type has its own subdomain. They have some big megamenus going on, and the end result is that if you look in their Webmaster Tools for one of their subdomains, under Links to your Site, it says they have nearly 22 million links from their own domain!
The client is wondering whether this is what's causing the decline in traffic, and whether they should change the whole structure of their site.
Interested to hear the thoughts of the community on this one!
-
Helen,
I know people who have had success reducing the number of links within mega menus by turning some of them (everything after the first two levels, for instance, though you could get much more sophisticated if you wanted) into JavaScript links. If the JavaScript isn't too complex, Google will still have no trouble getting to those pages, but the links won't be hrefs and therefore won't waste PageRank on pages that are less important relative to the others. The upside is that the links are still there for users, assuming that's a good thing.
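A minimal sketch of that approach, assuming a hypothetical menu renderer (the renderMenuItem helper, the markup, and the two-level cutoff are all illustrative, not a specific implementation):

```javascript
// Render the first two menu levels as real, crawlable <a href> links,
// and deeper levels as JavaScript-driven elements with no href, so deep
// entries stay clickable for users without hrefs passing PageRank.
function renderMenuItem(item, depth) {
  if (depth <= 2) {
    return '<a href="' + item.url + '">' + item.label + '</a>';
  }
  // No href attribute: navigation is handled by a click listener instead.
  return '<span class="js-link" data-url="' + item.url + '">' +
    item.label + '</span>';
}

// In the browser, one delegated listener makes the spans navigate:
// document.addEventListener('click', function (e) {
//   var el = e.target.closest('.js-link');
//   if (el) window.location.href = el.getAttribute('data-url');
// });
```

The cutoff at two levels just mirrors the suggestion above; the right depth depends on the menu.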
As someone else mentioned, consider whether having those links there really is good for the users, or if they'd rather see a simpler menu. The search engines are beside the point in that case.
Any time an eCommerce site experiences a slow, steady traffic drop, I always look into the uniqueness of their product copy. A drop like that is often a sign that they're sharing product copy with other sites, either because they use manufacturer descriptions or because they publish feeds to third-party sites like Amazon, eBay, or price comparison shopping engines.
Good luck!
-
You mentioned it's a large site. Google only goes so deep into a site, though that detail may not matter here. Have you tried blocking some of the unused pages with robots.txt and/or implementing tags like rel=canonical and/or the pagination tags?
http://googlewebmastercentral.blogspot.co.uk/2011/09/pagination-with-relnext-and-relprev.html
https://support.google.com/webmasters/answer/139394?hl=en
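For reference, the tags those two links describe look like this in a page's head (the shoes.example.com URLs are made up for illustration):

```html
<!-- On page 2 of a hypothetical paginated category: -->
<link rel="prev" href="https://shoes.example.com/boots?page=1">
<link rel="next" href="https://shoes.example.com/boots?page=3">

<!-- On duplicate or filtered versions of a page, point at the preferred URL: -->
<link rel="canonical" href="https://shoes.example.com/boots">
```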
You could look at Google Trends for a rough idea of search volume over the years, though that won't help your site by itself, as you mentioned. You can try tracking your SERP rank in Moz or other software like SERPBook, etc.
Sounds to me like you've dropped in the SERPs.
-
Hi Chris,
Thanks for your reply. The issue isn't that Google hasn't indexed those pages, though; it has. I'm not sure what you mean by saying Google won't index huge sites because it doesn't have time, as it clearly does index plenty of huge sites. The site is pretty much fully indexed, so it's not that Google can't find the pages.
We have also, of course, tried using the client's Analytics to identify the issue, as you describe, but the client accidentally deleted all the historical data beyond about the six-month mark (oops), so I can't do a lot of the analysis I would normally do. I have one or two odd old printouts showing some historical Analytics and ranking data, plus their sales data, to go on. These do tend to suggest that organic traffic has indeed dropped off (for reasons other than seasonal ones) and that there has been some decline in their search engine rankings for some key phrases, but I can't tell much more than that.
What I'm looking for is to see whether anyone else has had experience of this or a similar issue - whether anyone has seen excessive links between subdomains have a negative impact on rankings & traffic. I've been working in SEO for ten years and never come across anyone who has quite this many links within their own website, so it's not something I've encountered before.
Anyone else out there come across this before?
-
The first thing I always do is pretend I don't work for the company: go to the site as a user and see how easy it is to navigate. Can I find the product I need easily? (Try to imagine you want a particular product before you start.) Can I get back to the home page easily? I try to make sure every page is only three clicks from the home page (or the main products), five at most. Google won't fully crawl huge sites, as it only has so much time per site, so if your structure is bad it may be Googlebot giving up before it gets all the way down to where you actually want it to go.
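One way to sanity-check that rule of thumb, sketched against a made-up link map (the URLs and the clickDepths helper are hypothetical, not from any particular crawler):

```javascript
// Breadth-first search over an internal-link map: how many clicks is each
// page from the home page? Pages deeper than ~3 (5 at most), or missing
// from the result entirely, are the ones a bot may struggle to reach.
function clickDepths(links, start) {
  const depths = { [start]: 0 };
  const queue = [start];
  while (queue.length > 0) {
    const page = queue.shift();
    for (const target of links[page] || []) {
      if (!(target in depths)) {
        depths[target] = depths[page] + 1;
        queue.push(target);
      }
    }
  }
  return depths;
}

// Hypothetical structure: home -> category -> subcategory -> product.
const exampleLinks = {
  '/': ['/shoes', '/bags'],
  '/shoes': ['/shoes/boots'],
  '/shoes/boots': ['/shoes/boots/item-42'],
  '/bags': [],
};
```

In practice you'd feed this the link graph from a crawl of the site rather than a hand-written map.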
If you find yourself lost in "megamenus", imagine the user or Googlebot. Can you reduce the menus and still achieve a good result?
Another thing to consider is the cause of the decline in traffic: has there been a decline in your SERP placement, or is it seasonal? Although not a permanent fix, PPC can help top up traffic to your site whilst you jiggle it a bit.
I hope some of the questions above help you look at the site in a different light. There are obviously other things it could be, but first off I would look into your SERP placement and seasonal dips. You can also use GA to look at users' drop-off points and see where they're getting bored or lost!
Best of luck!