Will a big list of the cities and areas a client services help or damage a page's SEO?
-
We inherited a client whose contact page has a flat text list of every city and county they service.
They service the entire Southeast, so the list looks ridiculous.
--------- Example: ----
South Carolina:
Abbeville, Aiken, Allendale, Anderson, Bamberg, Barnwell, Beaufort, Berkeley, Calhoun, Charleston, Cherokee, etc etc
------ end example ------
The question is, will this help or hinder their SEO for their very specific niche industry? Is this keyword spamming? It has an end-user purpose, so it technically isn't spam, but the engines may look at it otherwise. I couldn't find a definitive answer to the question; any help would be appreciated.
-
Right on! It worked for the tortoise.
-
Excellent suggestion. Slow and steady wins the race.
-
Scott,
Curious if the business in question has a blog? Could he blog about "an engine I fixed for a client in Abbeville, SC", and put a content strategy in place to start blogging about his projects in his major cities? Maybe just start with the top 10 cities from which he gets orders for engine repair? Craft writeups of each project he accomplishes for a unique client in each city and make it a blog post. Then, move on to the 10 next-most-important cities. So, maybe he would start with the capitals of South Carolina, Florida, Georgia, and Alabama, then move on to other busy cities.
Eventually, you could have a page on the site (or a menu area) designated Successful Project Showcase that would link permanently to these posts.
My goal here would be to find an authentic and natural approach for showcasing his work in a way that adds great content to the site and doesn't simply list every city in the Southeast. This strategy, in combination with his service area map, could work well, I believe.
-
That certainly solves the design problem, but it would not help someone in Abbeville, South Carolina find the business (and the business certainly won't have a unique landing page for such a small city). Decisions, decisions. Thanks for the suggestions.
-
While I can't say this would result in an actual penalty, as you say, it looks spammy, so anything like that is shaky ground.
Have you considered making a service area map instead, showing all of the client's service states/cities?
If he services every city in every state of the Southeast, I simply cannot find a logical justification for listing them all. A map would send the same message, but in a logical, visual manner.
-
Good answers. They do some seriously technical work on broken engines. They only have one location, but because the work is so niche and there are so few competitors, they have customers all over the country who ship their engines to the shop in Florida for repairs.
It certainly looks spammy design-wise (and we'll find ways to rectify that with some jQuery dropdowns), but I'm more concerned with any potential penalty this might cause, if any.
-
Hi Christopher,
Yes, I'd say that would end up looking pretty spammy if they've got a list like this for every state in the Southeast on their contact page. For the same reason that an e-commerce website wouldn't list all 1,000 items they carry on a single page, this is not something I'd recommend.
What's the business model? Virtual or local? If local, a more natural approach would be to have unique pages for each of their physical offices. I very much doubt they have an office in every one of those cities in South Carolina, right? But perhaps they have 10 offices throughout the Southeast and could have a unique page for each of them?
Maybe you could share a few more details about the type of business this is?
-
I would create a page called "Service Area" and use an unordered list (ul); it may look nicer and is less spammy. Without knowing the product or service, I'm not sure if that will work for you.
Ex:
South Carolina
- Abbeville
- Aiken
- Allendale
- Anderson
- etc.
Georgia
- Atlanta
- Blah
- Clah
- Dlah
Most importantly - DO NOT post that list in the footer or sidebar of every page. It will significantly dilute the effectiveness. Containing this information on a single page, and peppering the rest of the site with some of your larger markets, will likely be most effective for you.
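If you do build a single "Service Area" page, the nested state/city markup could be generated from a plain data object rather than hand-maintained, so cities can be added or removed in one place. A minimal sketch (the data and function name here are hypothetical, not from this thread):

```javascript
// Build "Service Area" markup from a { state: [cities] } object:
// one heading per state, followed by an unordered list of its cities.
function buildServiceAreaHtml(areas) {
  return Object.entries(areas)
    .map(([state, cities]) => {
      const items = cities.map((city) => `<li>${city}</li>`).join("");
      return `<h3>${state}</h3><ul>${items}</ul>`;
    })
    .join("");
}

const html = buildServiceAreaHtml({
  "South Carolina": ["Abbeville", "Aiken", "Allendale", "Anderson"],
  "Georgia": ["Atlanta"],
});
console.log(html);
```

The same data object could later feed the jQuery dropdowns mentioned above, collapsing each state's city list behind its heading.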