Will having a big list of cities for the areas a client services help or damage a page's SEO?
-
We have a client we inherited whose contact page has a flat text list of all the cities and counties they service.
They service the entire southeast so the list just looks crazy ridiculous.
--------- Example: ----
South Carolina:
Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc etc
------ end example ------
The question is, will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
-
Right on! It worked for the tortoise.
-
Excellent suggestion. Slow and steady wins the race.
-
Scott,
Curious if the business in question has a blog? Could he blog about 'an engine I fixed for a client in Abbeville, SC', and put a content strategy in place to start blogging about his projects in his major cities? Maybe just start with the top 10 cities from which he gets orders for engine repair? Craft writeups of each project he accomplishes for a unique client in each city and make it a blog post. Then, move on to the 10 next-most-important cities. So, maybe he would be starting with the capitals of South Carolina, Florida, Georgia, and Alabama, and then moving on to other busy cities.
Eventually, you could have a page on the site (or a menu area) designated Successful Project Showcase that would link permanently to these posts.
My goal here would be to find an authentic and natural approach for showcasing his work in a way that adds great content to the site and doesn't simply list every city in the South East. This strategy, in combination with his service area map, could work well, I believe.
-
That certainly solves the design problem, but it would not help someone in Abbeville, South Carolina find the business (and the business certainly won't have a unique landing page for such a small city). Decisions, decisions. Thanks for the suggestions.
-
While I can't say this would result in an actual penalty, as you say, it looks spammy, and anything like that is on shaky ground.
Have you considered making a service area map instead, showing all of the client's service states/cities?
If he services every city in every state of the South East, I simply cannot find a logical justification for listing them all. A map would send the same message, but in a logical, visual manner.
-
Good answers. They do some seriously technical work with broken engines. They only have one location, but because the niche is so specialized and there are so few competitors, they have clients all over the country who ship their engines to the business's Florida location for repairs.
It certainly looks spammy design-wise (and we'll find ways to rectify that with some jQuery drop-downs), but I'm more concerned with any potential penalty this might cause, if any.
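Something along these lines is what I have in mind for the drop-downs (the markup and class names are just illustrative, and it assumes jQuery is already loaded on the page):

```html
<h2 class="state">South Carolina</h2>
<ul class="cities">
  <li>Abbeville</li>
  <li>Aiken</li>
</ul>

<script>
// Hide each city list until its state heading is clicked.
jQuery(function ($) {
  $('.cities').hide();
  $('.state').on('click', function () {
    $(this).next('.cities').slideToggle();
  });
});
</script>
```

The full list would still be in the markup for crawlers; it's just collapsed for visitors. Whether engines weight hidden text the same way is its own debate, of course.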
-
Hi Christopher,
Yes, I'd say that would end up looking pretty spammy if they've got a list like this for every state in the South East on their contact page. For the same reason that an e-commerce website wouldn't list all 1000 items they carry on a single page, this is not something I'd recommend.
What's the business model? Virtual or Local? If local, a more natural approach to this would be to have unique pages for each of their physical offices. I very much doubt they have an office in every one of those cities in South Carolina, right? But, perhaps they have 10 offices throughout the South East and could have a unique page for each of them?
Maybe you could share a few more details about the type of business this is?
-
I would create a page called "Service Area" and use an unordered list (ul); it may look nicer, and is less spammy. Without knowing the product or service, I'm not sure if that will work for you.
Ex:
South Carolina
- Abbeville
- Aiken
- Allendale
- Anderson
- etc.
Georgia
- Atlanta
- Blah
- Clah
- Dlah
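If the site is templated, a list like that could be generated from a single data structure rather than hand-maintained. A rough sketch (the data and function name here are hypothetical, just to show the shape):

```javascript
// Hypothetical service-area data -- in practice the states and cities
// would come from the client's real service list.
const serviceArea = {
  "South Carolina": ["Abbeville", "Aiken", "Allendale", "Anderson"],
  "Georgia": ["Atlanta"]
};

// Build a heading plus unordered list (ul) for each state.
function serviceAreaHtml(area) {
  return Object.entries(area)
    .map(([state, cities]) => {
      const items = cities.map((city) => `  <li>${city}</li>`).join("\n");
      return `<h2>${state}</h2>\n<ul>\n${items}\n</ul>`;
    })
    .join("\n");
}

console.log(serviceAreaHtml(serviceArea));
```

That keeps everything on the one "Service Area" page and makes it easy to trim or reorder markets later.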
Most importantly - DO NOT post that list in the footer or sidebar of every page. That will significantly dilute its effectiveness. Keeping this information on a single page, and peppering the rest of the site with mentions of your larger markets, will likely be most effective for you.