Is Keyword Density Still Relevant?
-
Good afternoon everyone!
I wanted to ask everyone here a question, one that has been debated around my office with a lot of different sides being taken.
Does Keyword Density matter? If it does, what percentage do you try to have your keyword hit?
-
In my experience, it's important to mention your keyword as often as you can, provided that each mention actually makes sense in context, i.e. it should be surrounded by relevant text. I run a shopping site, and department pages with, say, a 500-800 word block of text that mentions the key phrase 5-10 times rank much better than those without.
-
This is excellent, thank you! I had no idea that mentioning a keyword 15 times in 500 words of copy is something that gets flagged by Moz.
-
I was actually also looking for an answer to this and found threads from as far back as 2013 with the same advice: keyword density doesn't matter. What I found using Moz Pro is that target keywords should be mentioned in the meta title, description, H1, URL (if possible), and at least once in the copy without breaking the keyword phrase apart. I also found that mentioning a target keyword more than 15 times in 500 words of copy is flagged by Moz as stuffing. I just use these as a rough guide. When I'm particularly worried that some keywords have been mentioned too much, I use a free keyword density analyser to see how many times they appear, and reduce if necessary (especially if the copy doesn't read well because of too much repetition). Hope this helps!
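If you'd rather not rely on an online analyser, the counting it performs is simple to replicate. Below is a minimal Python sketch; the sample copy, the target phrase, and the function name are all invented for illustration, and "density" here is just the share of words consumed by the phrase, which is one common convention among several.

```python
import re

def keyword_density(text, phrase):
    """Return (count, density %) of a multi-word phrase within a block of copy."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    n = len(phrase_words)
    # Count exact, in-order occurrences of the phrase (not broken apart).
    count = sum(
        words[i:i + n] == phrase_words
        for i in range(len(words) - n + 1)
    )
    # Density: words consumed by the phrase as a share of all words.
    density = 100.0 * count * n / len(words) if words else 0.0
    return count, density

copy = "Hockey skates and hockey sticks. We sell hockey skates for every level."
count, density = keyword_density(copy, "hockey skates")
print(count)  # 2
print(round(density, 1))
```

Against the rough guide above, you would then compare `count` to the 15-per-500-words threshold this thread mentions before deciding whether to trim repetitions.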
-
This is a great list to keep in mind.
-
Based on my experience, keyword density does not have a big impact on the ranking performance of a web page.
Don't get me wrong, you still need your keyword in the title tag, H1, etc. But there are other factors with more weight than keyword density, for example:
1. Domain Age
2. Keyword Appears in Top Level Domain
3. Keyword As First Word in Domain
4. Domain registration length
5. Keyword in Subdomain Name
6. Domain History
7. Exact Match Domain
8. Public vs. Private WhoIs
9. Penalized WhoIs Owner
10. Country TLD extension
-
I agree with what you have to say about KW stuffing and making sure that the content on our sites is useful, well-written and purposeful. But should we avoid focusing on keyword density entirely? Even if it's just hitting a 3-5% range?
-
No, you should not be stuffing keywords or worrying about density. Make your content user-friendly and useful, based on the keyword phrases you want to attract visitors with.
-
Related Questions
-
Welsh Language Keyword Research
Helping a friend with some keyword research; their business is based in Wales. I am not a Welsh speaker, so what's the best way to do keyword research?
Local Website Optimization | GrouchyKids
-
No Index, No Follow short (but relevant) content?
One of the sections of our blog is "Community Involvement." In this section, we post pictures of the event, what it was for, and what we did to help. We want our clients, and potential clients, to see that we do give back to our local community. However, these are all very short posts (maybe a few hundred words). I'm worried this might look like spam, or at the very least thin content, to Google, so should I noindex/nofollow the posts or just leave them as is? Thanks, Ruben
Local Website Optimization | KempRugeLawGroup
-
Google still indexing home page even after 301 - Ecommerce Website
Hi all,
We have a 301 redirect problem. Google seems to continue indexing our old home page even though it has a 301 redirect, even after months. We have a multi-language domain with subfolders:
www.example.com (old page, now redirecting to the right locale in the right country)
www.example.com/it/home (canonical)
www.example.com/en/home (canonical)
www.example.com/es/home (canonical)
www.example.com/fr/home (canonical)
www.example.com/de/home (canonical)
We still see the old page (www.example.com) in Google results, with old metadata in English, and only in some countries (e.g. France) do we see the correct result, the "new" home page www.example.com/fr/home, in first position.
The real problem is that Google is still indexing and showing www.example.com as the "real" and "trusted" URL, even though we set:
a 301 redirect
the right language for every locale in Google Search Console
a canonical tag to the locale URL
an hreflang tag inside the code
a specific sitemap with the hreflang tag specified for the new home pages
Our redirect process is currently the following (Italy example):
www.example.com --> 301
www.example.com/en/home --> default version --> 301
www.example.com/it/home --> 200
Every online tool, from Moz to bot simulators, sees that there is a 301. So that's correct. Google Search Console says that:
on www.example.com there is a 301 (correct)
in the internal links section of Google Search Console, www.example.com is still in first position with 34k links. Many of these links are coming from property subdomains. Should we change those links inside those third-level domains, from www.example.com to www.example.com/LOCALE/home?
the www.example.com/LOCALE/home pages are the real home pages; they return a 200 code
Do you know if there's a way to delete the old home page from Google results, since it is a 301? Do you think that, even after a 301 redirect, if Google sees too many internal links it decides to ignore the 301? Thanks for your help!
Davide
Local Website Optimization | David1986
-
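The hop sequence described here (every intermediate hop a 301, final URL a 200) is easy to sanity-check programmatically. Below is a minimal Python sketch; the hop list is hard-coded to mirror the example rather than fetched live, and the function name is my own invention, not part of any tool mentioned in the thread.

```python
def validate_chain(hops):
    """hops: list of (url, status_code) pairs in redirect order.
    Valid when every hop but the last returns 301 and the last returns 200."""
    *redirects, final = hops
    return all(status == 301 for _, status in redirects) and final[1] == 200

# Hard-coded chain mirroring the Italy example above.
chain = [
    ("https://www.example.com", 301),
    ("https://www.example.com/en/home", 301),  # default version
    ("https://www.example.com/it/home", 200),
]
print(validate_chain(chain))  # True
```

In practice you would feed this from an HTTP client that records each redirect hop; a 302 or 404 anywhere in the chain would make the check fail, which is exactly the kind of mistake that keeps an old URL indexed.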
Theory: Local Keywords are Hurting National Rankings?
I've read a good amount here and in other blog posts about strategies for national brands to rank locally as well, with local landing pages, citations, etc. I have noticed something strange that I'd like to hear if anyone else is running into, or if anyone has a definitive answer for.
I'm looking at a custom business printing company where the products can be, and often are, shipped out of state, so it's a national brand. On each product page, the client is throwing in a few local keywords near where the office is to help rank for local variations.
When looking at competitors that have a lower domain authority, a lower volume of linking root domains, less content on the page, and other standard signals, they are ranking nationally better than the client. The only thing they're doing that could be better is bolding and throwing in the page keyword 5-10 times (which looks unnatural). But when you search for keyword + home city, the client ranks better.
My hypothesis is that since the client is optimizing product pages for local keywords as well as national ones, it is actually hurting on national searches because the site is seen as a local-leaning business. Has anyone run into this before, or have a definitive answer?
Local Website Optimization | Joe.Robison
-
Benefits of adding keywords to site structure?
Hello fellow Mozzers,
This is kind of a hypothetical, but it might have implications for future projects. Do you think there would be any benefits (or drawbacks) to placing pages of a site into a directory named after a keyword? For example, if I had a local store that sold hockey equipment, and "hockey", "equipment", and "hockey equipment" were the main targets being optimized for, would it be better (assuming the actual pages were the same) to structure the site as
hypotheticalwebsite.com/about-us/
hypotheticalwebsite.com/hockey-skates/
hypotheticalwebsite.com/hockey-sticks/
hypotheticalwebsite.com/blog/
or
hypotheticalwebsite.com/hockey-equipment/about-us/
hypotheticalwebsite.com/hockey-equipment/hockey-skates/
hypotheticalwebsite.com/hockey-equipment/hockey-sticks/
hypotheticalwebsite.com/hockey-equipment/blog/
Additionally, would any of this change if the root domain or the individual pages ALSO used those keywords (or if both of them did)?
pseudonyms-hockey-gear.com/hockey-equipment/skates/
pseudonyms-penalty-box.com/hockey-equipment/hockey-skates/
pseudonyms-hockey-gear.com/hockey-equipment/hockey-skates/
I've got a hunch that some of these are overkill, but I'm not sure where the scale tips from helpful to negligible to actively counterproductive. Thanks, everyone!
Local Website Optimization | BrianAlpert78
-
Drastic changes in keyword rankings on a daily basis
Anybody ever seen keyword rankings for a site change drastically from day to day? I've got a client, a local furniture store, whose local keywords (furniture + city) rank consistently well without much change, but when it comes to broader keyword rankings (like "furniture" or "furniture store") in their zip code, they'll go from ranking at the top of Google one day to not being ranked at all the next (at least according to Raven Tools). My best guess is that it's just a reflection of personalized results from Google, but such a dramatic change day in and day out makes me wonder.
Local Website Optimization | ChaseMG
-
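One rough way to tell genuine volatility from tracking noise for a keyword like the one above is to log the daily position and look at the spread. Below is a minimal Python sketch; the rank series is invented, and treating "not ranked" as position 101 (i.e. beyond the tracked results) is my own assumption, not something any rank tracker prescribes.

```python
from statistics import mean, pstdev

def rank_volatility(ranks):
    """Ranks for one keyword over consecutive days; None = not ranked.
    Assumption: treat 'not ranked' as position 101 so the math stays defined."""
    filled = [r if r is not None else 101 for r in ranks]
    return mean(filled), pstdev(filled)

# Invented series: top of page one one day, unranked the next.
daily = [3, None, 2, None, 4, 1, None]
avg, spread = rank_volatility(daily)
print(round(avg, 1), round(spread, 1))
```

A stable local term would show a spread near zero, while a series like the one above produces a spread of tens of positions, which points at the tracker (or personalization) rather than a real ranking change.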
Is it possible to target a keyword which is English but aimed at google.com.tr users?
Hey, I want to know: is it possible to target a keyword that is in English, but aimed at the .com.tr market? For that purpose, must we get backlinks from sites written in English but targeted at Turkey, or from sites written in English but targeted anywhere? I know this question is a bit confusing, but my boss wants me to do this.
Local Website Optimization | atakala
-
Does Google play fair? Is 'relevant content' and 'usability' enough?
It seems there are 2 opposing views, and as a newbie this is very confusing. One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly. The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well. Which is closer to the truth? No one wants to have a great website that won't rank because Google wasn't sophisticated enough to see that they weren't being unfair.
Here's an example to illustrate one related concern I have: I've read that Google doesn't like duplicated content. But here are 2 cases in which it is more 'relevant' and 'usable' to the user to have duplicate content.
Say a website helps you find restaurants in a city. Restaurants may be listed by city region, and by type of restaurant. The home page may have links to 30 city regions. It may also have links for 20 types of restaurants. The user has a choice. Say the user chooses a region. The resulting new page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region.
In other words, there may be a 'mega-menu' at the top of the page which is duplicated on every page in the site, but is very helpful. Instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site even though arguably it is the most relevant and usable approach for someone that may or may not have a specific region or restaurant type in mind. Thoughts?
Local Website Optimization | couponguy