How to approach SEO for a national website that has multiple chapter/location websites all under different URLs
-
We are currently working with a client who has one national site - let's call it CompanyName.net - and multiple, independent chapter sites under different URLs, structured, for example, as CompanyNamechicago.org, and sometimes specific to neighborhoods, as in CompanyNamechicago.org/lakeview. The national umbrella site is a .net, while all the others are .orgs. These are not subdomains or subfolders, as far as we can tell. You can use a search function on the .net site to find a location near you and click through to that specific local website.
They are looking for help optimizing and increasing traffic to certain landing pages on the .net site...but similar landing pages also exist at the local level, and those appear to be competing with the national site. (Example: there is a landing page on the national .net umbrella site for a "dog safety" campaign they are running, but that campaign has also led to a landing page created independently on the local CompanyNameChicago.org website, which seems to rank higher for users searching for this info while located in Chicago.)
We are wondering if our hands are tied here since they appear to be competing for traffic with all their localized sites, or if there are best practices to handle a situation like this. Thanks!
-
Hey Kat. I'd tend to agree with Andy's response that moving this into one site makes sense. You could then redirect the local pages to the appropriate pages on the main site, creating a single authoritative domain. As you do this, you'd want to make sure each local page is truly unique...if the Chicago and Detroit local sites both contain similar pages about a dog safety campaign, you'd (probably) want to consolidate those into a single dog safety campaign page that could work for both locations, or find a way to distinguish the content for each location.
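To make the consolidation concrete, here is a minimal sketch of the kind of 301 redirect map you'd build when folding the local sites into the main domain. All domain names and paths below are hypothetical placeholders standing in for the real sites, and a production setup would normally live in server config (Apache/nginx rules) rather than application code:

```python
# Hypothetical mapping of legacy local-chapter URLs to their consolidated
# targets on the main domain. A permanent (301) redirect is what passes
# the most link equity to the new location.
REDIRECTS = {
    "companynamechicago.org/dog-safety": "companyname.net/dog-safety?location=chicago",
    "companynamedetroit.org/dog-safety": "companyname.net/dog-safety?location=detroit",
    "companynamechicago.org/about":      "companyname.net/chicago/about",
}

def resolve(old_url: str) -> tuple[int, str]:
    """Return the HTTP status code and target for an incoming legacy URL."""
    # Normalize case and trailing slashes so near-duplicate URLs still match.
    key = old_url.lower().rstrip("/")
    target = REDIRECTS.get(key)
    if target:
        return 301, target
    # Unmapped legacy pages fall back to the new home page rather than a 404.
    return 301, "companyname.net/"

# e.g. an old Chicago campaign link arriving at the redirect layer
status, target = resolve("CompanyNameChicago.org/dog-safety")
```

The key design point is that every legacy URL should land on the single page that now owns its topic, so the consolidated page inherits the local pages' inbound links instead of scattering them.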
Now, having gone through a few projects like these, I know full well that what is best for SEO or UX isn't always the most popular solution. Internal politics play a role, and I'm guessing you might be in a spot where some of the local chapters don't want to relinquish control over their sites. So, as an alternative, you could look for other ways to link these sites together. For instance, you could have a dog safety page on the main site that all the local chapters link to and, possibly, let the local chapters adjust the content slightly via the link. As in, if the URL includes a query string referencing Chicago (maincompanysite.net/dog-safety.php?location=Chicago), the content on that dog safety page could be programmed to show Chicago's phone number and address. That way you have a definitive page that can rank, but also a way for local chapters to share that content through their domain.
A lot of that alternative comes down to content governance rules: communication about who creates which pages, a clear understanding of how people link between sites, and probably some clear design/brand standards. It's a bit messy, but it's a workable reality if you can't convince all the stakeholders of the value of consolidating to a single domain.
Hope that helps!
-
Hi Kat,
I would be looking to bring everything into one site. What they are doing isn't really good for user experience, and, as you said, they are competing against themselves.
It shouldn't be too difficult to migrate the content and 301 the local sites to the relevant areas within the main site.
-Andy