Should I avoid duplicate URL keywords?
-
I'm curious to know: can having a keyword repeated in the URL cause any penalties?
For example:
xyzroofing.com/commercial-roofing
xyzroofing.com/roofing-repairs
My competitors with the highest rankings seem to be doing it without any trouble, but I'm wondering if there is a better way.
Also, one of the problems I've noticed is that my /commercial-roofing page outranks my homepage for both residential and commercial search queries. How can this be straightened out?
-
Thank you, Boyd
-
There is no penalty for using the same keyword twice in a URL, especially if it's part of your domain name.
There are many examples of sites with a subfolder containing the same keyword as their domain name that have no problem ranking, including your competitors:
- runningwarehouse.com/mens-running-shoes.html ranks #2 for 'running shoes'
- seo.com/seo ranks #5 for 'professional seo'
- overthetopseo.com/professional-seo-services-what-to-expect/ ranks #2 for 'professional seo' (in fact, only three URLs ranking for that phrase don't repeat the term 'seo' in their URL)
- contentmarketinginstitute.com/what-is-content-marketing ranks #1 for 'content marketing'
- etc.
**Ranking the correct page:**
Whenever the wrong page ranks better than the one you want, you need to work on tweaking the onsite optimization of those pages. (And you may have to keep building links to the page you want to rank.)
Here is a list of changes I'd test (keep in mind that you can always revert a change if a test makes rankings go down):
- Test different title tags on the two pages, making one less optimized for the keyword and the other more optimized.
- Add more copy to the page you want to rank.
- Do an internal link audit. Make sure that any time you link from one page to another with a specific keyword as the anchor text, the link points to the page you want to rank for that phrase.
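The internal link audit in the last step can be partly automated. Here's a minimal sketch using only the Python standard library: given a page's HTML, it lists every same-domain link whose anchor text contains a target keyword, so you can spot anchors pointing at the wrong page. The domain, paths, and keyword below are hypothetical examples, and a real audit would fetch and loop over your full sitemap.

```python
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

class AnchorCollector(HTMLParser):
    """Collects (anchor_text, resolved_url) pairs for internal links only."""
    def __init__(self, base_url):
        super().__init__()
        self.base_netloc = urlparse(base_url).netloc
        self.base_url = base_url
        self.links = []    # (anchor text, absolute URL) pairs
        self._href = None  # href of the <a> tag we're currently inside
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                # Resolve relative hrefs like "/commercial-roofing".
                self._href = urljoin(self.base_url, href)
                self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            # Keep only links that stay on the same domain.
            if urlparse(self._href).netloc == self.base_netloc:
                text = " ".join("".join(self._text).split())
                self.links.append((text, self._href))
            self._href = None

def audit_anchors(html, page_url, keyword):
    """Return internal links whose anchor text contains the keyword."""
    parser = AnchorCollector(page_url)
    parser.feed(html)
    return [(text, url) for text, url in parser.links
            if keyword.lower() in text.lower()]

# Hypothetical page snippet: one internal keyword anchor, one internal
# non-keyword anchor, one external link that should be ignored.
html = """
<a href="/commercial-roofing">commercial roofing services</a>
<a href="/">home</a>
<a href="https://other-site.com/roofing">commercial roofing</a>
"""
for text, url in audit_anchors(html, "https://xyzroofing.com/", "commercial roofing"):
    print(text, "->", url)
```

Each line of output is a keyword anchor to review: if the URL isn't the page you want ranking for that phrase, that's an internal link worth changing.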
After you make a change, wait until Google re-caches the page and sees the update (which can sometimes take a few days or more), then check your rankings to see whether there was any movement.
Boyd