What is better for SEO: keywords in the folder or in the filename? (Plus a duplicate-filename question)
-
Hey folks,
I've got a question regarding URL structure. Which of the following is best for SEO, given that there will be millions of lawyer names and 4 pages per lawyer?
www.lawyerz.com/office-locations/dr-al-pacino
www.lawyerz.com/phone-number/dr-al-pacino
www.lawyerz.com/reviews/dr-al-pacino
www.lawyerz.com/ratings/dr-al-pacino
OR
www.lawyerz.com/office-locations-dr-al-pacino
www.lawyerz.com/phone-number-dr-al-pacino
www.lawyerz.com/reviews-dr-al-pacino
www.lawyerz.com/ratings-dr-al-pacino
OR
www.lawyerz.com/dr-al-pacino/office-locations
www.lawyerz.com/dr-al-pacino/phone-number
www.lawyerz.com/dr-al-pacino/reviews
www.lawyerz.com/dr-al-pacino/ratings
Also, concerning duplicate file names:
In the first example there are 4 duplicate file names containing the lawyer's name (would this cause Google to not index some of them?).
In the second example all file names are unique (would this look spammy to Google or to users?).
In the third example there are millions of duplicate file names: with 1 million lawyers, there would be 1 million files called "office-locations", etc. (could so many duplicate filenames cause ranking issues?).
Should the lawyer's name (the main keyword target) appear in the filename or in the folder? Which is better for SEO in your opinion? Thanks for your input!
-
I like all of the answers here, and I would definitely focus on how users are searching for the lawyers. If you have a site with millions of lawyers, each would have an area of practice, so it would make sense to develop a structure around that first:
lawyerz.com/practice-area/state/city/attorney-name
With this structure, a searcher who types in "estate planning lawyer" would land on the estate planning lawyers page and could then narrow down by city and lawyer name. I would attach the contact info and reviews directly to each lawyer's page.
Since your higher-volume keywords are going to be found within the practice areas, this seems like the natural next step after the main domain target of "attorney" or "lawyers". Location can come third; the attorney's name is most likely a lesser-searched keyword, but a URL segment such as "attorney-john-doe" reinforces it.
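A minimal sketch of how such a hierarchy could be generated (the helper names here are hypothetical, not anything the poster has built):

```python
import re

def slugify(text):
    """Lowercase the text and collapse non-alphanumeric runs into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")

def lawyer_url(practice_area, state, city, name):
    """Build a /practice-area/state/city/attorney-name path."""
    parts = [practice_area, state, city, "attorney-" + name]
    return "/" + "/".join(slugify(p) for p in parts)

print(lawyer_url("Estate Planning", "Nevada", "Las Vegas", "John Doe"))
# -> /estate-planning/nevada/las-vegas/attorney-john-doe
```

Every segment in the path then doubles as a browsable listing page: /estate-planning/ lists states, /estate-planning/nevada/ lists cities, and so on.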
I would LOVE to hear all the expert opinions about this as I am a newbie to seomoz but am finding some great experts and advice over here.
-
While pages with such file names can be indexed, the long-term view dictates avoiding platform-specific filenames in the URL, because you may convert to another framework in the future. They make a site less than ideal for portability.
For example, if every page ends in index.php or whatever.asp and you change platforms, every page will need a 301 redirect. So it's better to avoid that whenever possible.
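To illustrate the portability point, here is a rough sketch (my own illustration, not a specific CMS feature) of normalizing legacy, extension-bearing URLs to clean paths, which would then be the targets of those 301 redirects:

```python
def clean_path(path):
    """Strip platform-specific filename extensions (.php, .asp, ...)
    so the same URL can survive a platform change."""
    for ext in (".php", ".asp", ".aspx", ".html"):
        if path.endswith(ext):
            path = path[: -len(ext)]
            break
    # /services/index -> /services/ (directory-style URL)
    if path.endswith("/index"):
        path = path[: -len("index")]
    return path or "/"

# Legacy URL -> extension-free URL a 301 redirect would point to
print(clean_path("/services/index.php"))   # -> /services/
print(clean_path("/about-us.aspx"))        # -> /about-us
```

If the site uses extensionless URLs from day one, none of these redirects are ever needed.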
-
Although the filenames will be duplicates, the content on those pages is what matters. Google looks at the content on the page more than anything else. There are sites out there with odd file structures, like:
/index.php
/services/index.php
/products/index.php
Some CMSs will do this automatically, but those sites rank fine because they have quality content, even though index.php is technically a duplicate filename.
You should be fine with this method.
-
It's about users, for sure. The last set you show communicates that the lawyer's name is the most important/valuable element, which is the valid perspective, since all of those elements relate to that lawyer. If some users still want to find lawyers based on reviews, you can offer a filter for that in your database sorting. Same with locations.
On the other side of the coin, instead of "locations", if you had town names you could group by those, giving /town-name/lawyer-name/ where all lawyers in the same town fall under that town-name grouping. If it's just /locations/, that's an invalid sort hierarchy.
-
Yes, navigation-wise this definitely makes the most sense:
www.lawyerz.com/dr-al-pacino/office-locations
I guess what I'm mostly looking for is an answer on which is better for rankings, the keyword in the folder or in the file name, and whether duplicate file names will harm rankings.
thanks so much for your assistance guys.
-
OK, gotcha. Well, if that is the case, then think about how the user would navigate to the end result starting from the home page. Logically, you could assume the following.
If URL structure is as follows:
www.lawyerz.com/office-locations/dr-al-pacino
then /office-locations/ should contain links to all office locations of multiple lawyers.
But with this structure
www.lawyerz.com/dr-al-pacino/office-locations
/dr-al-pacino/ should contain links to the 4 other pages. **This option will probably be your best structure.**
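A quick sketch of what that hub page's internal links would look like (section names taken from the question; the helper itself is hypothetical):

```python
# The 4 sub-pages each lawyer profile has, per the original question
SECTIONS = ("office-locations", "phone-number", "reviews", "ratings")

def hub_links(lawyer_slug):
    """Links a /lawyer-slug/ hub page would expose to its sub-pages."""
    return ["/%s/%s" % (lawyer_slug, section) for section in SECTIONS]

for url in hub_links("dr-al-pacino"):
    print(url)
# /dr-al-pacino/office-locations
# /dr-al-pacino/phone-number
# /dr-al-pacino/reviews
# /dr-al-pacino/ratings
```

The hub page gives crawlers and users one obvious parent for all four sub-pages, which is the navigational advantage being described.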
-
If I'm not mistaken, it really depends on what users are searching for.
If they are only searching lawyers' names, then just find a structure that looks clean and has the lawyer's name in it.
But if there is any traffic data showing that people search for the city or phone number along with the lawyer's name, then it might be wise to have that in the URL structure.
Also, have you ever thought of using subdomains? I haven't seen that in a lawyer directory yet, but some of the major article sites switched to subdomains.
-
Assume there will be enough content on these pages to not get hit by Panda.
The reason for doing this is to hopefully secure more than one first-page result, since these are names with very low competition; we see some sites doing this successfully.
We will have location pages too, which will list all the docs in that city.
-
Is there any particular reason why office location, phone number, reviews, and ratings need to be on 4 separate pages? I could see there being a lot of thin content, which won't rank well or provide much user value. Can you give some more info on why this would be? I could easily see all 4 of these pages combined into one.
With that, you can focus your URL structure into categories or local regions or both, depending on how dynamic you want the site to be. For example:
http://www.lawyerz.com/nevada/personal-injury/dr-al-pacino
OR
http://www.lawyerz.com/personal-injury/dr-al-pacino
OR
http://www.lawyerz.com/nevada/dr-al-pacino
Unless there is something that I'm missing, I think that no matter how you structure your URLs, thin content just won't rank.