Large scale geo-targeting?
-
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc.
We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php
We created a 'where we cover' page and, rather than a huge long list of text links (which we felt would be spammy), used a clickable map to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php
Initially we gained a great deal of success from this method, with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but that can't explain why the pages don't rank at all, as some of the terms are relatively non-competitive.
A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with simply the city/town name being changed. Any ideas where we've gone wrong?
-
I'm from Burnley originally and I've worked in Blackburn and Manchester previously, but now I live and work in Dublin, Ireland. It's nice to see somebody local on here.
I would suggest social bookmarking the new pages that you have created - I think you'll be surprised at what happens from something so simple. Have you updated your sitemap as well?
-
Thanks for the reply Glenn. I really can't see why we would have been penalised as everything we do is above board, although it does seem as if that might be the case. I certainly think that the QDF point you make is a valid one, although it could have been around the time of the latest Panda update too, so perhaps that might have flagged up something.
I think our next step might be to recreate the pages from scratch on entirely new URLs and see if that has any effect. We will certainly try and poach some of our competitors' links too!
-
It's possible that your site has been penalized, though in reviewing your OSE report I don't see many reasons why it would be. From a cursory investigation, I'd say you've done a great job earning the links pointing to your site... though if any trickery was involved, you may have been penalized, so you may want to investigate how to get out of that trap.
I suggest you investigate the link profiles of the competitors who rank for almost all of your targeted terms. If your on-page SEO is truly better than theirs, it's likely that their external link profile is earning them the rankings you desire. Learn from their strategy.
Your initial high rankings could have been related to QDF (query deserves freshness).
Related Questions
-
Canonical tag on a large site
When would you recommend using a canonical tag on a large site?
Intermediate & Advanced SEO | Cristiana.Solinas
-
Totally lost ranking for a targeted page - and don't understand why
I am trying to rank the page https://www.vitari.no/regnskapssystem/visma-net/ for the keyword visma.net on google.no. At first it went really well, and I was on the first page, ranking at number 8. Then it fell to the second page. But then it totally disappeared. If I go to google.no now and search for visma.net, it's not on any of the pages - I have looked through them all. I have used Moz and other SEO tools to analyse and try to understand what has happened, but I simply can't work it out, and I can't get the rankings back. I've been fighting with this for a long time. However, other pages on the site are ranking for the same keyword. If anyone has an answer that leads to me solving this, they will be my hero of the day!
Intermediate & Advanced SEO | contenting
-
Will a GEO Localization site create thousands of duplicates?
Hi mozzers, We are about to launch a new site, and right now I am worried that it may create thousands of duplicate pages, which would harm all the SEO that has been done over the last few years. Here is the situation: you land on the example.com/Los-angeles page (geo-located), but if you modify the URI to example.com/chico then a pop-up appears and asks you for the location you want to be in (pop-up attached). When choosing Chico, the URI switches to example.com/chico?franchise=chico instead of /chico only. This site has over 40 different microsites, so my question is: are all these ?franchise=city arguments going to be indexed and create thousands of dups? Or are we safe because this geo-localisation happens thanks to JavaScript? Thanks!
Intermediate & Advanced SEO | Ideas-Money-Art
-
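Whether the ?franchise=city variants get indexed depends on whether crawlers can reach those URLs at all, not just on the pop-up being JavaScript-driven. A common safety net is to emit a rel=canonical on each variant that points at the parameter-free URL. A minimal sketch of that normalisation, using the parameter name from the question (the function name and everything else here are illustrative):

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Geo-selector parameters that should never create a distinct indexable URL.
GEO_PARAMS = {"franchise"}

def canonical_url(url: str) -> str:
    """Map every geo-parameter variant of a URL onto one canonical URL."""
    parts = urlsplit(url)
    # Keep every query parameter except the geo selector(s).
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in GEO_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

print(canonical_url("https://example.com/chico?franchise=chico"))
# -> https://example.com/chico
```

The resulting value would go into the variant page's rel=canonical link, so /chico and /chico?franchise=chico consolidate to a single page in the index.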
Large-Scale Penguin Cleanup - How to prioritize?
We are conducting a large-scale Penguin cleanup / link-cleaning exercise across 50+ properties, most of which have been on the market for 10+ years. There is a lot of link data to sift through and we are wondering how we should prioritize the effort. So far we have been collecting backlink data for all properties from Ahrefs, GWT, Majestic SEO and OSE, and consolidating the data using home-grown tools. As a next step we are obviously going through the link-cleaning process. We are interested in getting feedback on how we are planning to prioritize the link-removal work. Put another way, we want to vet whether the community agrees with what we consider the most harmful types of links for Penguin.
Priority 1: Clean up site-wide links with money words; if possible keep a single-page link
Priority 2: Clean up or rename all money-keyword links for money keywords in the top 10 anchor-text distribution
Priority 3: Clean up non-brand site-wide links; if possible keep a single-page link
Priority 4: Clean up low-quality links (other niche or no link juice)
Priority 5: Clean up multiple links from the same IP C-class
Does this sound like a sound approach? Would you prioritize this list differently? Thank you for any feedback /T
Intermediate & Advanced SEO | tomypro
-
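The five priorities read like a decision cascade, so one way to drive the cleanup from the consolidated link data is a small classifier. This is only a sketch - the field names, money-word list, and quality threshold are invented stand-ins for whatever your home-grown tools actually export:

```python
# Illustrative money keywords; in practice this comes from your anchor-text report.
MONEY_WORDS = {"payday loans", "cheap seo"}

def priority(link: dict) -> int:
    """Assign the cleanup priority from the post (1 = most harmful, 6 = leave alone)."""
    money = any(w in link["anchor"].lower() for w in MONEY_WORDS)
    if link["sitewide"] and money:
        return 1  # site-wide money-word links
    if money and link["top10_anchor"]:
        return 2  # money keywords in the top-10 anchor distribution
    if link["sitewide"] and not link["branded"]:
        return 3  # non-brand site-wide links
    if link["quality"] < 0.2:
        return 4  # low-quality links (threshold is arbitrary here)
    if link["shared_c_class"]:
        return 5  # multiple links from the same IP C-class
    return 6

# Sorting the consolidated list by priority gives the cleanup work queue:
# links.sort(key=priority)
```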
Silo This! Siloing issue with KW targets and multiple categories
I am having a difficult time determining how to silo the content for this website (Downpour). The issue is that, as I see it, there are several different top-level keyword targets to put at the top of the silos; however, due to the nature of the products, they fit in almost every one of the top-level categories. For instance, our main keyword term is "Audio Books" (and derivatives thereof), but we also want to target "Audiobook Downloads" and "Books on CD". Due to the nature of the products, almost every product would fit in all 3 categories. It gets even worse when you consider normal book taxonomy: the normal breakdown would be from audiobooks > Fiction (or Nonfiction). Now each product also belongs to one of these categories, as well as "Download", "CD", and "Audiobook". And still worse, our navigation menus link every page on the site back to all of these categories (except "Audiobooks", as we don't really have a landing page for that besides the home page, which is lacking in optimized content but is linked from every page on the site). So I am finding that siloing, or developing a cross-linking plan that makes sense, is very difficult. It's much easier at the lower levels, but at the top things become muddy. Throw in the idea that we may eventually get e-books as well, and it gets even muddier. I have some ideas of how to deal with some of this, such as putting the site navigation in an iframe, instituting basic breadcrumbs, and building landing pages, but I'm open to any advice or ideas that might help, especially with the top-level taxonomy structure. TIA!
Intermediate & Advanced SEO | DownPour
-
Geo-specific SERP Rank Tracker that is good for hyper local results?
We've had a lot of success using Raven Tools, as well as some other tools for SERP Rankings for our clients; however, most only go down to the country level. We're researching into some good hyper local trackers (down to the city/zip level). Does anyone have any suggestions?
Intermediate & Advanced SEO | BlastAM
-
Geo-tagging using cookie - Is it Good or Bad for Rankings
We have a fairly large site which does a cookie-based 302 redirect to the specific city page if someone types in the homepage URL. If the cookie is not available (first-time user), it goes to the homepage and asks the user to select their city, as our services are city-specific. Everything is working fine with this setup. However, our tech team now wants to display the contents of the city page on the homepage URL itself if the cookie is available, without 302-redirecting to a new URL; the no-cookie scenario remains unchanged. Technically, I think this change should work fine without any ranking issues, as first-time users still see the actual homepage, as does Googlebot. Please confirm any possible ranking issues with this change from your experience, since based on the city present in the cookie, the homepage will display different content.
Intermediate & Advanced SEO | Webmaster_SEO
-
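The proposed behaviour can be sketched as a single handler. The key observation is that Googlebot sends no cookies, so it consistently sees the same default city-selector page as any first-time visitor, and sending a `Vary: Cookie` header tells caches that the response differs per cookie. Everything below (cookie name, body strings, return shape) is illustrative, not your actual stack:

```python
def handle_homepage(cookies: dict) -> dict:
    """Serve the homepage: city content if a city cookie exists, selector otherwise."""
    city = cookies.get("city")
    if city is None:
        # First-time visitor - and Googlebot, which sends no cookies -
        # both get the generic city-selector homepage.
        return {"status": 200, "body": "select-your-city", "vary": "Cookie"}
    # Returning visitor: render the city page's content on the homepage URL
    # instead of 302-redirecting to the city page, per the proposed change.
    return {"status": 200, "body": "content-for-" + city, "vary": "Cookie"}
```

Because the crawler always lands on the cookie-less branch, only the selector version of the homepage is ever seen (and indexed) by search engines in this sketch.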
Which page to target? Home or /landing-page
I have optimized my home page for the keyword "computer repairs" - would I be better off targeting my links at this page or at an additional page (which already exists) called /repairs? It's possible to rename and 301 this page to /computer-repairs. The only advantage I can see from targeting /computer-repairs is that the keywords are in the target URL.
Intermediate & Advanced SEO | SEOKeith
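If you do go the rename route, the 301 is just a permanent-redirect mapping from the old path to the new one, so existing links to /repairs keep passing equity to /computer-repairs. A sketch of the idea (paths are the ones from the question; in practice this usually lives in the web-server config rather than application code):

```python
# Permanent-redirect map for the rename discussed above.
REDIRECTS = {"/repairs": "/computer-repairs"}

def respond(path: str) -> tuple:
    """Return (status, location-or-path) for an incoming request path."""
    if path in REDIRECTS:
        # 301 Moved Permanently: browsers and crawlers update to the new URL.
        return (301, REDIRECTS[path])
    return (200, path)
```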