Navigation
I've been wrestling with this one for a while. Take a standard small website navigation with nav links for:

- Products
- Solutions
- Support
- Learning Center
I believe having drop-downs to show the sub-pages of each category provides a better user experience, but it also bloats my links per page in the navigation from 4 to 24. Most of the additional links are useful for user experience, but not for search purposes. So, two years after Google changed how it treats nofollow (which used to be the easy answer to this question), what is considered best practice?
A) Go ahead and add the full 24 nav links on each page. The user experience outweighs the SEO benefits of fewer links and Google doesn't worry too much about nav links relative to main body links.
B) Stick to only 4 nav options. Having 20 additional links on every page is a big deal, and removing them is worth the user-experience hit. I can still reach all levels of this small site within 2-3 clicks and use cross-category linking to mitigate silos.
C) Use some technical voodoo with JS links or iframes to hide the nav links from Google and get the best of both worlds.
D) Do something that is not one of the first three choices.
Does anyone feel strongly about any of the above options or is this a user-preference type of situation where it doesn't make much difference which option you choose on a small 100-200 page site?
I'm really looking forward to everyone's thoughts on this.
-DV
-
Thanks, Alan, you captured the dilemma perfectly. UI is important and SEO is important, so how does one quantify the pros and cons of each in the planning stages of a site? It's really kind of an educated guess.
I tend to lean towards your assessment for all of the reasons you cite. I'm in a competitive keyword space. So while I put a lot of weight on UI issues, I'm not inclined to ignore SEO opportunities for just minimal UI gains.
-
Derek
There are hundreds of ranking factors, so there's no way to know with 100% certainty in advance what the SEO hit of the nav change would be. Any single change could have a significant impact, yet if all other core SEO factors are optimal, it might have no impact at all. Without testing, there's no way to know.
Personally, I prefer to avoid the possible hit when I recommend navigation to clients, just because I do want to squeeze out every last ounce of value. And honestly, if a site really only has 20-30 nav links, it's not such an inconvenience for users to click a main nav link and then find the sub-nav in each section, as long as it's presented well visually.
If the information provided is highly relevant, that one extra click is not going to hurt the site.
-
Interesting point about duplicate content. I suspect I'd have less of an issue with 20-30 links, but I could definitely see where DC could become a problem with larger navs. I like the breadcrumbs idea too.
I'm wondering if I'm placing too much emphasis on the too-many-links issue. To me, though, it seems like a huge advantage to funnel link juice where I want it with well-placed internal links in the body of content. I've had good success with this before. I would think these targeted links would carry significantly more juice if I can reduce the number of links per page from 30 to 10, all by eliminating 20 nav links to less important pages. It just feels like a big SEO performance hit to have 200% more links in the nav. Am I wrong? Does Google not flow much link juice through nav links?
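To put rough numbers on that intuition, here's a toy model of the classic even-split view of link equity. To be clear, this is the old simplified PageRank picture, not a claim about how Google actually weights nav vs. body links today, and the numbers are made up for illustration:

```javascript
// Toy model: a page's outbound link value is split evenly across its links.
// This is the simplified classic PageRank intuition only -- NOT a statement
// of how Google actually weights navigation links vs. body links.
function perLinkValue(pageValue, totalLinks) {
  return pageValue / totalLinks;
}

// With 30 links per page (10 targeted body links + 20 nav links):
const diluted = perLinkValue(1.0, 30); // roughly 0.033 per link

// After trimming the 20 extra nav links, the same 10 targeted links remain:
const focused = perLinkValue(1.0, 10); // 0.10 per link

// In this toy model, each remaining internal link carries about 3x the share.
console.log(focused / diluted);
```

Under that (admittedly crude) model, cutting 30 links to 10 triples the share each remaining link passes, which is the size of effect I'm hoping for.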
-
Derek
I encounter this scenario a lot on sites with not 24 but 100 or more links in that drop-down setup. Here's what I've found.
A) User experience MAY be improved, but only heat-map and click testing can prove whether that's the case, and only when you A/B split test the two versions of the navigation. Sometimes giving people these choices is only marginally helpful unless you also supplement it with additional user-experience signals that help someone know where they actually are, especially when they land directly in the middle of the site.
B) From an SEO standpoint, it's not so much a "too many links" issue of distributing individual link value. It's more that if you have all those links on every page of the entire site, then at the code level every page becomes slightly more diluted (all the extra text and words in the code) from a single-page topical-focus standpoint. You also have more duplicate-content potential (the top area of every page now has a lot more "content" that's not unique across the site).
The way to address this, if you believe the site-wide drop-down nav is important, is by taking the following action:
1) Be sure you have properly microformat-coded breadcrumb navigation directly at the top of the main content area on each page. This not only helps users know more readily where they are in the site's content grouping and topical-separation scheme, it also reinforces signals to search engines about the content relationships you lose with the top drop-downs.
2) Even with the top drop-downs, it's still beneficial to include section-specific navigation in a sidebar. When a user has so many choices on every page of the site, it's easy to get lost in knowing which drop-down to use. Not always, yet it can be an issue. Giving them an alternative that's always visible within each section, and unique to each section, reinforces ease of navigation for those who prefer it. It also communicates to Google the topical relationships between all the pages in the section where those side-navigation links appear.
3) You may need to increase the depth of the unique, descriptive paragraph content in the actual content area.
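On point 1, the breadcrumb trail can also be expressed as schema.org structured data alongside the visible markup. Here's a minimal sketch of building BreadcrumbList JSON-LD; the domain, category names, and the `breadcrumbJsonLd` helper are made-up examples, not anything specific to this site:

```javascript
// Build schema.org BreadcrumbList JSON-LD from a page's ancestor trail.
// Domain and labels below are hypothetical examples.
function breadcrumbJsonLd(trail) {
  return {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    itemListElement: trail.map((crumb, i) => ({
      "@type": "ListItem",
      position: i + 1, // schema.org positions are 1-based
      name: crumb.name,
      item: crumb.url,
    })),
  };
}

// Example: a sub-page two levels deep under "Products".
const jsonLd = breadcrumbJsonLd([
  { name: "Home", url: "https://example.com/" },
  { name: "Products", url: "https://example.com/products/" },
  { name: "Widgets", url: "https://example.com/products/widgets/" },
]);

// The resulting object would be embedded in the page's <head> or body
// inside a <script type="application/ld+json"> tag.
console.log(JSON.stringify(jsonLd, null, 2));
```

The same trail should match the visible breadcrumb in the content area, so users and search engines see a consistent hierarchy.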
-
I see where you are coming from, but with each required click you lose users. There are definitely times when I would sacrifice some usability for the sake of traffic, but accessibility is everything.
You are right on the mark with the give and take of SEO and user experience.
-
Thanks for the reply, but please allow me to play devil's advocate.
I generally subscribe to the users-first mantra too, but is using sub-navs on category pages vs. full drop-downs on every page a huge user-experience hit? Taken to extremes, always choosing either a users-first approach or an SEO-first approach is not optimal. There has to be some measure of balance, even if you lean heavily toward users first (as I generally do).
Is there no meaningful benefit to removing, from every page, 20 links that provide no additional SEO benefit and only serve to dilute the impact of other, more important links? Or do you think that full drop-down navs provide a truly significant user-experience benefit?
-
A. You have to create it with users in mind. It will also help the site's connectivity which is good for SEO. Above all else the site should be easy to navigate for users.
-
I should add that the 24 link scenario already includes consolidation of related topics, so the answer can't be more consolidation!