Will having a big list of cities for the areas a client serves help or damage SEO on a page?
-
We have a client we inherited that has a flat text list of all the cities and counties they service on their contact page.
They service the entire Southeast, so the list just looks ridiculous.
--------- Example: ----
South Carolina:
Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc etc
------ end example ------
The question is, will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but perhaps the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
-
Right on! It worked for the tortoise.
-
Excellent suggestion. Slow and steady wins the race.
-
Scott,
Curious if the business in question has a blog? Could he blog about 'an engine I fixed for a client in Abbeville, SC', and put a content strategy in place to start blogging about his projects in his major cities? Maybe just start with the top 10 cities from which he gets orders for engine repair? Craft writeups of each project he accomplishes for a unique client in each city and make it a blog post. Then, move on to the 10 next-most-important cities. So, maybe he would be starting with the capitals of South Carolina, Florida, Georgia, Alabama, and then moving on to other busy cities.
Eventually, you could have a page on the site (or a menu area) designated Successful Project Showcase that would link permanently to these posts.
My goal here would be to find an authentic and natural approach for showcasing his work in a way that adds great content to the site and doesn't simply list every city in the Southeast. This strategy, in combination with his service area map, could work well, I believe.
-
That certainly solves the design problem, but would not help someone in Abbeville, South Carolina find the business (and the business certainly won't have a unique landing page for such a small city). Decisions, decisions. Thanks for the suggestions.
-
While I can't say this would result in an actual penalty, as you say, it looks spammy, so anything like that is on shaky ground.
Have you considered making a service area map instead, showing all of the client's service states/cities?
If he services every city in every state of the Southeast, I simply cannot find a logical justification for listing them all. A map would send the same message, but in a logical, visual manner.
-
Good answers. They do some seriously technical stuff with broken engines. They only have one location, but because it's so niche and there are so few competitors they have clients all over the country that ship their engines to the client in Florida for repairs.
It certainly looks spammy design-wise (and we'll find ways to rectify that with some jQuery drop-downs), but I'm more concerned with any potential penalty it might cause.
-
Hi Christopher,
Yes, I'd say that would end up looking pretty spammy if they've got a list like this for every state in the Southeast on their contact page. For the same reason that an e-commerce website wouldn't list all 1,000 items they carry on a single page, this is not something I'd recommend.
What's the business model? Virtual or local? If local, a more natural approach would be to have unique pages for each of their physical offices. I very much doubt they have an office in every one of those cities in South Carolina, right? But perhaps they have 10 offices throughout the Southeast and could have a unique page for each of them?
Maybe you could share a few more details about the type of business this is?
-
I would create a page called "Service Area" and use an unordered list (ul); it may look nicer and is less spammy. Without knowing the product or service, I'm not sure if that will work for you.
Ex:
South Carolina
- Abbeville
- Aiken
- Allendale
- Anderson
- etc.
Georgia
- Atlanta
- Blah
- Clah
- Dlah
Most importantly - DO NOT post that list in the footer or sidebar of every page. It will significantly dilute the effectiveness. Containing this information on a single page, and peppering the rest of the site with some of your larger markets, will likely be most effective for you.
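For reference, the nested markup that "Service Area" list implies could be generated from a simple data object like this. This is just a sketch with placeholder state/city data (not the client's real coverage), and the `buildServiceAreaHtml` helper is a name I'm making up for illustration:

```javascript
// Build a nested "Service Area" list: one <ul> of states,
// each state item containing a sub-<ul> of its cities.
// The data below is illustrative placeholder coverage.
const serviceArea = {
  "South Carolina": ["Abbeville", "Aiken", "Allendale", "Anderson"],
  "Georgia": ["Atlanta", "Augusta", "Savannah"]
};

function buildServiceAreaHtml(areas) {
  const states = Object.entries(areas).map(([state, cities]) => {
    // Each city becomes a nested list item under its state.
    const cityItems = cities.map((city) => `<li>${city}</li>`).join("");
    return `<li>${state}<ul>${cityItems}</ul></li>`;
  });
  return `<ul class="service-area">${states.join("")}</ul>`;
}

const html = buildServiceAreaHtml(serviceArea);
```

Keeping all of it on the one "Service Area" page, as suggested above, means the markup stays crawlable; the jQuery drop-downs mentioned earlier in the thread could then collapse each state's sub-list purely for presentation without removing it from the HTML.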