Large scale geo-targeting?
-
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc.
We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php
We created a 'where we cover' page and used a clickable map, rather than a huge list of text links which we felt would be spammy, to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php
Initially we gained a great deal of success from this method, with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but that can't explain why they don't rank at all, as some of the terms are relatively non-competitive.
A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with simply the city/town name being changed. Any ideas where we've gone wrong?
-
I'm from Burnley originally and I've worked in Blackburn and Manchester previously, but now I live and work in Dublin, Ireland. It's nice to see somebody local on here.
I would suggest social bookmarking the new pages that you have created - I think you'll be surprised at what something so simple can do. Have you updated your sitemap as well?
-
Thanks for the reply Glenn. I really can't see why we would have been penalised as everything we do is above board, although it does seem as if that might be the case. I certainly think that the QDF point you make is a valid one, although it could have been around the time of the latest Panda update too, so perhaps that might have flagged up something.
I think our next step might be to recreate the pages from scratch on entirely new URLs and see if that has any effect. We will certainly try and poach some of our competitors' links too!
-
It's possible that your site has been penalized, though I don't see too many reasons why it would be in reviewing your OSE report. From a cursory investigation, I'd say you've done a great job earning the links pointing to your site... though if any trickery was involved, you may be penalized, so you may want to investigate how to get out of that trap.
I suggest you investigate the link profiles of the competitors who rank for almost all of your targeted terms. If your on-page SEO is truly better than theirs, it's likely that their external link profile is earning them the rankings you desire. Learn from their strategy.
Your initial high rankings could have been related to QDF.
Related Questions
-
Canonicals for Splitting up large pagination pages
Hi there, Our dev team are looking at speeding up load times and making pages easier to browse by splitting up our pagination pages to 10 items per page rather than 1000s (exact number to be determined) - sounds like a great idea, but we're a little concerned about the canonicals on this one.
At the moment we rel canonical (self) plus prev and next, so b is rel b, prev a and next c - and so on through each letter. The new URL structure will be a1, a(n+), b1, b(n+), c1, c(n+). Should we keep the canonicals looping through the whole new structure, or should we loop each letter within itself? Either b1 rel b1, prev a(n+), next b2 - even though they're not strictly continuing the sequence. Or a1 rel a1, next a2; a2 rel a2, prev a1, next a3 | b1 rel b1, next b2; b2 rel b2, prev b1, next b3, etc.
Would love to hear your points of view - hope that all made sense 🙂 I'm leaning towards the first one even though it doesn't continue the letter sequence, because it keeps looping alphabetically, which is currently working for us already. This is an example of the page we're hoping to split up: https://www.world-airport-codes.com/alphabetical/airport-name/b.html
Intermediate & Advanced SEO | Fubra
-
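To make the within-letter scheme from the question above concrete (each letter's pages chained only to each other), here's a minimal sketch in Python that just generates the URLs; the exact URL pattern and page counts are assumptions based on the example link, not anything confirmed by the site:

```python
def pagination_links(letter, page, total_pages):
    """Build the self-canonical, rel=prev and rel=next URLs for one
    alphabetical section that loops only within itself.
    The a1/a2/b1/b2 URL pattern here is an assumption for illustration."""
    base = "https://www.world-airport-codes.com/alphabetical/airport-name"
    url = lambda p: f"{base}/{letter}{p}.html"
    links = {"canonical": url(page)}      # each page canonicalises to itself
    if page > 1:
        links["prev"] = url(page - 1)     # no rel=prev on a letter's first page
    if page < total_pages:
        links["next"] = url(page + 1)     # no rel=next on a letter's last page
    return links
```

So `pagination_links("b", 2, 5)` yields a self canonical of b2 with prev b1 and next b3, and the chain simply stops at each letter boundary rather than crossing into a or c.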
Keep ranking homepage for target keyword, or switch to another page?
Hi Moz Community! I've researched Moz to find the answer to this question but nothing fits my situation. I'm hoping some experienced SEOs can help me out. Here's the situation: I'm up against some fairly stiff competition for my main keyword - the front page is dominated by major manufacturers with high brand recognition and loads of money, whereas my client is a much smaller manufacturer trying to compete. However, their DA is only 37-53, so not impossible to outrank... just many links and a significant advantage. We've honed in on a keyword that still drives good traffic, that's a great term to drive paying customers, and that we can get competitive with. My strategy was to attempt to rank my client's homepage for this term, rather than a specific product page, as I knew they'd have many more links and social shares of their main site. (I've been successful with this strategy before.) We've risen 60+ positions for the keyword in the past 3 months, to position 12, but we seem to have plateaued for the past month. We're ranking in top 5 positions for a number of our other keywords, so I know we're trending well. However, I'm concerned that despite our quick rise to #12, I may have made a seemingly fatal decision in ranking their homepage for our target keyword term. After we had plateaued for a while, I did a more thorough side-by-side comparison and found that 8 out of 10 competitors on the front page have 2 main things we don't (and can't, because we're ranking the homepage)... 1- The keyword in the URL (they're ranking product pages, i.e. homepage.com/keyword-here/). 2- Their keyword comes first, or early, in the meta title. Ours is supposed to, but as you know, Google can do what it likes with your homepage title as it's your brand, so they've put our company name, then the keyword we added in the title, e.g.
Our Company | The Term We're Ranking For We've done a lot of work, and gained many reputable, high quality links, and we did see a significant rank increase across all our pages. My question is- did I shoot myself in the foot? Or is ranking the homepage still viable in this situation? If ultimately this is going to be impossible to get in the top #5 spots, what can I do to fix it? We've already gained a PA of 38 on the homepage from our work. Or would you let it go and just keep working at it, expecting that eventually we'll break onto the front page? Thanks in advance! Let me know if you need more info. I tried to be general with terms/site for my client's sake.
Intermediate & Advanced SEO | TheatreSolutionsInc
-
Advanced: SEO best practice for a large forum to minimise risk...?
Hi, Hope someone can offer some insight here. We have a site with an active forum. The transactional side of the site is about 300 pages total, and the forum is well over 100,000 (and growing daily), meaning the 'important' pages account for less than 0.5% of all pages on the site. Rankings are pretty good and we're ticking lots of boxes with the main site, with good natural links, logical architecture, and appropriate keyword targeting. I'm worried about the following: crawl budget, PR flow, and Panda. We actively moderate the forum for spam and generally the content is good (for a forum anyway), so I'm just looking for any best-practice tips for minimising risk. I've contemplated moving the forum to a subdomain so there's that separation, or even noindexing the forum completely, although it does pull in traffic. Has anyone been in a similar situation? Thanks!
Intermediate & Advanced SEO | iProspect_Manchester
-
Question regarding geo-targeting in Google Webmaster Tools.
I understand that it's possible to target both domains/subdomains and subfolders to different geographical regions in GWT. However, I was wondering about the effect of targeting the domain to a single country, say the UK, then targeting subfolders to other regions (say the US and France). e.g.
www.domain.com -> UK
www.domain.com/us -> US
www.domain.com/fr -> France
Would it be better to leave the main domain without a geographical target but set geo-targeting for the subfolders? Or would it be best to set geo-targeting for both the domain and the subfolders?
Intermediate & Advanced SEO | TranslateMediaLtd
-
Is it Worthwhile to have an HTML Site Map for a Large Site?
We are a large, enterprise site with many pages (some on our CMS and some old pages that exist outside our CMS). Every month we submit an XML sitemap. Some pages on our site can no longer be found by following links from one page to another (orphan pages). Some of those pages are important and some are not. Is it worth our while to create an HTML site map? Does anyone have any recent stats or blog posts to share showing how an HTML site map may have benefited a large site? Many thanks
Intermediate & Advanced SEO | CeeC-Blogger
-
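On the HTML site map question above: once you can export a flat list of URLs from the CMS (including the orphans), generating a crawlable page is trivial, and it gives those orphan pages at least one internal link. A minimal sketch, where the function and field names are illustrative rather than from any particular CMS:

```python
def build_html_sitemap(pages):
    """Render a minimal HTML site map from a flat list of (url, title) pairs.
    Linking orphan pages from a page like this restores a crawl path to them."""
    items = "\n".join(
        f'    <li><a href="{url}">{title}</a></li>' for url, title in pages
    )
    return (
        "<html><body>\n  <h1>Site Map</h1>\n  <ul>\n"
        f"{items}\n  </ul>\n</body></html>"
    )
```

For a very large site you'd likely split the output into several linked site-map pages rather than one enormous list, but the idea is the same.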
How would you target three synonymous phrases for the same product?
I have a site that I'm working on that sells waste oil heaters, and I'm beginning to run into an issue. As one would assume, our primary keyword phrase is "waste oil heaters", for which we're doing rather well. The issue is that there are two other phrases directly synonymous with our primary term that users are actively searching for (i.e. the product can accurately be called three different things). Phrases are listed below w/ phrase-match search volumes:
"waste oil heater" - 6600
"waste oil burner" - 2400
"waste oil furnace" - 1900
I'm not one who likes to engage in trying to "trick" anything, so I'm fairly opposed to listing all three of these in the title tag or something similar. This is being done by our competitors, but only one outranks us at this point for the primary phrase. My initial thoughts are that we should be targeting our home page and category page for "waste oil heater(s)", and then lightly pepper our content with these synonyms. From there we can focus on other term variations w/ our blog posts and try to vary up the anchor text coming into the site when we launch link building. What do you guys think? Have you been in a situation like this, with three phrases describing the same product? I appreciate any feedback or advice. Thanks guys!
Intermediate & Advanced SEO | CaddisInteractive
-
How do I geo-target continents & avoid duplicate content?
Hi everyone, We have a website which will have content tailored for a few locations:
USA: www.site.com
Europe EN: www.site.com/eu
Canada FR: www.site.com/fr-ca
Link hreflang and the GWT option are designed for countries. I expect a fair amount of duplicate content; the only differences will be in product selection and prices. What are my options to tell Google that it should serve www.site.com/eu in Europe instead of www.site.com? We are not targeting a particular country on that continent. Thanks!
Intermediate & Advanced SEO | AxialDev
-
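On the continent question above: hreflang values are a language code, optionally plus a country code - there is no continent code - so one common workaround is to annotate the Europe EN version by language alone ("en") and mark a fallback with x-default. A hedged sketch of the resulting link tags, generated in Python purely for illustration (assigning en-us and x-default to the main site is an assumption about this setup, not something stated in the question):

```python
# Map each variant URL to its hreflang values. A URL can carry more than
# one annotation: here the US site doubles as the x-default fallback,
# and the Europe EN version is annotated by language only ("en").
VARIANTS = {
    "https://www.site.com/": ["en-us", "x-default"],
    "https://www.site.com/eu": ["en"],
    "https://www.site.com/fr-ca": ["fr-ca"],
}

def hreflang_tags(variants):
    """Emit the <link rel="alternate"> tags that every variant page
    in the cluster should carry (each page lists the full set)."""
    return [
        f'<link rel="alternate" hreflang="{code}" href="{url}" />'
        for url, codes in variants.items()
        for code in codes
    ]
```

With this, a US searcher matches en-us, other English-language searchers (including Europe) match the bare en annotation on /eu, and everyone else falls back to x-default.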
Large volume of ning files in subdomain - hurting or helping?
I have a client that has 600 pages in their root domain, and a subdomain that contains 7,500 pages of un-SEO-able Ning pages, PLUS another 650 pages from Sched.com that are also contributing a large volume of errors. My question is - should I create a new domain for the Ning content, or am I better off with the volume of pages, even if they have loads of errors? Thanks!
Intermediate & Advanced SEO | robertdonnell