Should we always avoid drop-down menus?
-
In Google's SEO Starter Guide, they advise avoiding drop-down menus (page 12): http://static.googleusercontent.com/external_content/untrusted_dlcp/www.google.com/en/us/webmasters/docs/search-engine-optimization-starter-guide.pdf
But is this always true? What if you create the drop-down purely with HTML & CSS? Is it fine to use a bit of JavaScript to build the drop-down menu, or should it be HTML & CSS only?
-
Ensure you add a sitemap and you should be fine.
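Even a bare-bones XML sitemap covers it - something along these lines (the URLs below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal sitemap: list the pages that are otherwise only reachable
     through the drop-down menu (URLs are placeholders). -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category/widgets/</loc>
  </url>
  <url>
    <loc>http://www.example.com/category/gadgets/</loc>
  </url>
</urlset>
```

That way, anything reachable only through the menu is still listed somewhere crawlers are guaranteed to find.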
-
Agreed with Simon. Look at plenty of huge online retailers, like Zappos - it just needs to be done right and you're fine.
-
Hi Michelle
Drop-down menus are usually fine for SEO, so long as the navigational links within them are text links that search engine spiders can crawl and follow.
HTML and CSS are usually the preferred choice; JavaScript can sometimes be troublesome for bots, though it certainly has its useful place in web design. So long as the links within the menus are text links, you'll be fine.
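For example, a pure HTML and CSS drop-down with plain text links might look something like this (just a rough sketch - the URLs and class names are made up):

```html
<!-- Rough sketch: the sub-menu links are ordinary text links in the markup,
     and CSS only controls when they are shown on hover. -->
<ul class="main-nav">
  <li>
    <a href="/shoes/">Shoes</a>
    <ul class="dropdown">
      <li><a href="/shoes/running/">Running shoes</a></li>
      <li><a href="/shoes/boots/">Boots</a></li>
    </ul>
  </li>
</ul>

<style>
  .main-nav .dropdown { display: none; }
  .main-nav li:hover .dropdown { display: block; }
</style>
```

Because the sub-menu links sit right there in the markup as plain anchors, spiders can crawl them even though visitors only see them on hover.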
You could always run them through a spider simulator to make sure.
Regards
Simon
Related Questions
-
Site hacked and now being spammed and rankings dropping
I have a client's site that was hacked in December, and a whole bunch of spam pages were created. We've since removed those pages and they now serve a 404 error. But I pick up a new link almost every day from another spammy site linking back to the pages that were created. I'm updating the disavow file as I see the new links reported in Ahrefs, but what else can I do to stop these buggers, and how can I recover rankings other than building links and creating content?
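The disavow file I keep appending to is just a plain text list along these lines (the domains below are made-up examples):

```text
# Made-up example entries
# Drop every link from these spammy domains
domain:spam-site-example.com
domain:link-farm-example.net
# Individual URLs can also be listed, one per line
http://another-spam-example.org/fake-page.html
```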
Intermediate & Advanced SEO | Marketing_Today
-
Why is my Bing traffic dropping?
In the middle of September we launched a redesigned version of our site. The URLs all stayed the same. Since the site launch, Google traffic has steadily increased, but Bing traffic has dropped by about 50%. Any ideas on what I should look at?
Intermediate & Advanced SEO | EcommerceSite
-
My Domain Authority dropped 9 points... Does anyone have any suggestions to fix this significant drop?
My domain authority dropped by 9 points and I haven't done anything differently since the last scan. What is going on?
Intermediate & Advanced SEO | infotrust2
-
Traffic drop after Facebook push
Hi all, we experienced a strange phenomenon after a Facebook push: it appears that Google organic traffic was all but dead for the five days after. Totally not sure why! It has since returned to about 80% of previous levels. http://postimg.org/image/3n1b7m7hf/
Intermediate & Advanced SEO | ScottOlson
-
Avoiding Duplicate Content with Used Car Listings Database: Robots.txt vs Noindex vs Hash URLs (Help!)
Hi guys, we have developed a plugin that allows us to display used vehicle listings from a centralized, third-party database. The functionality works similarly to autotrader.com or cargurus.com, and there are two primary components:

1. Vehicle Listings Pages: the page where the user applies various filters to narrow the vehicle listings and find the vehicle they want.
2. Vehicle Details Pages: the page where the user actually views the details about a given vehicle. It is served up via Ajax, in a dialog box on the Vehicle Listings Pages. Example functionality: http://screencast.com/t/kArKm4tBo

The Vehicle Listings pages (#1) we do want indexed and ranking. These pages have additional content besides the vehicle listings themselves, and those results are randomized or sliced/diced in different and unique ways. They're also updated twice per day. We do not want #2, the Vehicle Details pages, indexed, as these pages appear and disappear all the time based on dealer inventory, and they don't have much value in the SERPs. Additionally, other sites such as autotrader.com, Yahoo Autos, and others draw from this same database, so we're worried about duplicate content. For instance, entering a snippet of dealer-provided content for one specific listing that Google indexed yielded 8,200+ results (example Google query). We did not originally think that Google would even be able to index these pages, as they are served up via Ajax. However, it seems we were wrong, as Google has already begun indexing them. Not only is duplicate content an issue, but these pages are not meant for visitors to navigate to directly! If a user were to navigate to the URL directly from the SERPs, they would see a page that isn't styled right.

Now we have to determine the right solution to keep these pages out of the index: robots.txt, noindex meta tags, or hash (#) internal links.

Robots.txt advantages:
- Super easy to implement.
- Conserves crawl budget for large sites.
- Ensures the crawler doesn't get stuck. After all, if our website only has 500 pages that we really want indexed and ranked, and vehicle details pages constitute another 1,000,000,000 pages, it doesn't seem to make sense to make Googlebot crawl all of those pages.

Robots.txt disadvantages:
- Doesn't prevent pages from being indexed, as we've seen, probably because there are internal links to these pages. We could nofollow these internal links, thereby minimizing indexation, but this would mean 10-25 nofollowed internal links on each Vehicle Listings page (will Google think we're PageRank sculpting?).

Noindex advantages:
- Does prevent vehicle details pages from being indexed.
- Allows ALL pages to be crawled (advantage?).

Noindex disadvantages:
- Difficult to implement: the vehicle details pages are served via Ajax, so there is no <head> in which to place a meta tag. The solution would have to involve the X-Robots-Tag HTTP header and Apache, sending a noindex directive based on querystring variables, similar to this stackoverflow solution (see the rough sketch below). This means the plugin functionality is no longer self-contained, and some hosts may not allow these types of Apache rewrites (as I understand it).
- Forces (or rather allows) Googlebot to crawl hundreds of thousands of noindexed pages. I say "forces" because of the crawl budget required; the crawler could get stuck/lost in so many pages, and may not like crawling a site with 1,000,000,000 pages, 99.9% of which are noindexed.
- Cannot be used in conjunction with robots.txt. After all, the crawler never reads the noindex tag if it's blocked by robots.txt.

Hash (#) URL advantages:
- By using hash (#) URLs for links from Vehicle Listings pages to Vehicle Details pages (such as "Contact Seller" buttons), coupled with JavaScript, the crawler won't be able to follow/crawl these links.
- Best of both worlds: crawl budget isn't overtaxed by thousands of noindexed pages, and the internal links used to index robots.txt-disallowed pages are gone.
- Accomplishes the same thing as "nofollowing" these links, but without looking like PageRank sculpting (?).
- Does not require complex Apache stuff.

Hash (#) URL disadvantages:
- Is Google suspicious of sites with (some) internal links structured like this, since it can't crawl/follow them?

Initially, we implemented robots.txt, the "sledgehammer solution." We figured that we'd have a happier crawler this way, as it wouldn't have to crawl zillions of partially duplicate vehicle details pages, and we wanted it to be like these pages didn't even exist. However, Google seems to be indexing many of these pages anyway, probably based on internal links pointing to them. We could nofollow the links pointing to these pages, but we don't want it to look like we're PageRank sculpting or something like that.

If we implement noindex on these pages (and doing so is a difficult task in itself), then we will be certain these pages aren't indexed. However, to do so we will have to remove the robots.txt disallow in order to let the crawler read the noindex tag on these pages. Intuitively, it doesn't make sense to me to make Googlebot crawl zillions of vehicle details pages, all of which are noindexed, and it could easily get stuck/lost/etc. It seems like a waste of resources, and in some shadowy way bad for SEO.

My developers are pushing for the third solution: using the hash URLs. This works on all hosts and keeps all functionality in the plugin self-contained (unlike noindex), and it conserves crawl budget while keeping vehicle details pages out of the index (unlike robots.txt). But I don't want Google to slap us 6-12 months from now because it doesn't like links structured like this.

Any thoughts or advice you guys have would be hugely appreciated, as I've been going in circles, circles, circles on this for a couple of days now. Also, I can provide a test site URL if you'd like to see the functionality in action.
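For reference, the X-Robots-Tag idea would look very roughly like this in Apache config (a sketch only - the querystring parameter name is invented, and the real rules would depend on how the plugin builds its URLs):

```apache
# Rough sketch: send a noindex header for the Ajax-served vehicle details
# responses, identified here by a made-up "vehicle_id" querystring parameter.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteCond %{QUERY_STRING} (^|&)vehicle_id= [NC]
  RewriteRule .* - [E=VEHICLE_DETAIL:1]
</IfModule>
<IfModule mod_headers.c>
  # "noindex, follow" keeps these responses out of the index without blocking crawling
  Header set X-Robots-Tag "noindex, follow" env=VEHICLE_DETAIL
</IfModule>
```

That said, whether a given host even allows this sort of rewrite is exactly the portability worry described above.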
Intermediate & Advanced SEO | browndoginteractive
-
Big Drop in Traffic, No change in Position
Penguin 2.0 was a great update for one of my biggest clients. A website that was using terrible black-hat techniques and ranked first on the most important keyword in my client's niche got kicked from the SERPs, and my client jumped from 4th to 1st. The jump in traffic was enormous, and on top of that 5% of the traffic converted instead of the usual 2.5%-3% on other traffic. Until July 2nd. Traffic from the keyword dropped by 80% while we were still in position 1. After a lot of digging, I think I found what caused it: Google booted the keyword from their autofill. My question is whether anyone has seen a removal from the autocomplete making that big of a difference in search volume.
Intermediate & Advanced SEO | Laurensvda
-
Why our site dropped in rank for a main keyword
Hello, Our site nlpca(dot)com dropped in rank for a few terms, including the main term "NLP". Could you look at our site and tell us what might be the cause? Thank you so much, Bob
Intermediate & Advanced SEO | BobGW
-
Ranking & Traffic drops in last month
Over the last month, our rankings have been in a slow slide - that is, until this week, when they absolutely crashed. Here are some example phrases:

Phrase | 11-Mar | 5-Mar
bug shields | 24 | 9
floor mats | 25 | 14
nerf bars | 23 | 12
running boards | 61 | 14
snow plows | 25 | 18

For the life of me, I can't see what would have caused such drastic changes. Our site is almost completely unique content. Some things, like warranty and install instructions, are from the manufacturer to protect us from liabilities. We come up with our own feature text, and we have custom-written articles, blog posts, research guides, etc. We also appear to be the only one among our competitors being affected in this fashion. Any thoughts would be helpful. Domain is realtruck.com.
Intermediate & Advanced SEO | ShawnHerrick