Large scale geo-targeting?
-
Hi there. We are an internet marketing agency and recently did a fair amount of work trying to optimise for a number of different locations. Although we are based in Preston (UK), we would like to attract clients from Manchester, Liverpool, etc.
We created landing pages for each of the locations that we wanted to target and each of the services - so we had an SEO Manchester page and a Web Design Manchester page for example. These were all written individually by a copywriter in order to avoid duplicate content. An example of one of the first of these pages is here: http://www.piranha-internet.co.uk/places/seo-blackpool.php
We created a 'where we cover' page and used a clickable map, rather than a huge list of text links which we felt would look spammy, to link through to these pages. You can see this page here: http://www.piranha-internet.co.uk/where-we-cover.php
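For anyone building a similar 'where we cover' page, a clickable map is typically an HTML image map, and its `<area>` links are ordinary crawlable links. A hedged sketch (the Manchester URL and coordinates are invented to match the pattern in the question):

```html
<!-- Hypothetical image-map sketch: each <area href> is a normal,
     crawlable link to a location landing page; the alt text serves
     as its anchor text. -->
<img src="/images/coverage-map.png" usemap="#coverage"
     alt="Areas we cover in the North West" />
<map name="coverage">
  <area shape="circle" coords="120,80,15"
        href="/places/seo-blackpool.php" alt="SEO Blackpool" />
  <area shape="circle" coords="200,140,15"
        href="/places/seo-manchester.php" alt="SEO Manchester" />
</map>
```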
Initially we had a great deal of success with this method, with the above Blackpool page ranking #7 for "SEO Blackpool" within a week. However, these results quickly disappeared and now we don't rank at all, though the pages remain in the index. I'm aware that we don't have many external links pointing to these pages, but this cannot explain why these pages don't rank at all, as some of the terms are relatively non-competitive.
A number of our competitors rank for almost all of these terms, despite their pages being exact duplicates with simply the city/town name being changed. Any ideas where we've gone wrong?
-
I'm from Burnley originally and I've worked in Blackburn and Manchester previously, but now I live and work in Dublin, Ireland. It's nice to see somebody local on here.
I would suggest social bookmarking the new pages you have created - I think you'll be surprised at what happens, something so simple. Have you updated your sitemap as well?
-
Thanks for the reply Glenn. I really can't see why we would have been penalised as everything we do is above board, although it does seem as if that might be the case. I certainly think that the QDF point you make is a valid one, although it could have been around the time of the latest Panda update too, so perhaps that might have flagged up something.
I think our next step might be to recreate the pages from scratch on entirely new URLs and see if that has any effect. We will certainly try and poach some of our competitors' links too!
-
It's possible that your site has been penalized, though in reviewing your OSE report I don't see many reasons why it would be. From a cursory investigation, I'd say you've done a great job earning the links pointing to your site... though if any trickery was involved, you may have been penalized, so you may want to investigate how to get out of that trap.
I suggest you investigate the link profiles of the competitors who rank for almost all of your targeted terms. If your on-page SEO is truly better than theirs, it's likely that their external link profile is earning them the rankings you desire. Learn from their strategy.
Your initial high rankings could have been related to QDF.
Related Questions
-
How to compete against search terms that use geo-modifiers?
I should start by saying we are new to SEO. We are introducing new "cycling tours" in new destinations and are looking for a strategy to combat geo-modified keyword searches. When people search for "cycling tours" they will anchor their search with a geo-modifier such as "cycling tours France" or "cycling tours Italy". Based in Australia, we are keen to communicate to Australians searching for international cycling tours that there are new Australian options they may wish to consider. The geo-modifiers required to find our tours ("eyre peninsula" and "carnarvon gorge") are currently not on the cycling community's radar. For example, to find one of our new tours you need to search "cycling tours eyre peninsula" or "cycling tours carnarvon gorge". Currently the only way we have found to let people know about our new tours is word of mouth. Is there an SEO solution?
Intermediate & Advanced SEO | | Chook10 -
Best practice for deindexing large quantities of pages
We are trying to deindex a large quantity of pages on our site and want to know the best practice for doing so. For reference, the reason we are looking for methods to speed it up is that we have about 500,000 URLs we want deindexed because of mis-formatted HTML code, and Google indexed them much faster than it is taking to deindex them, unfortunately. We don't want to risk clogging up our limited crawl budget by submitting a sitemap of URLs that have "noindex" on them as a hack for deindexing. Although in theory that should work, we are looking for white-hat methods that are faster than "being patient and waiting it out", since that would likely take months if not years at Google's current crawl rate for our site.
Intermediate & Advanced SEO | | teddef0 -
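One detail worth noting alongside this question: noindex does not have to live in the HTML at all - it can be sent as an `X-Robots-Tag` HTTP header, which avoids editing 500,000 mis-formatted pages. A hypothetical Apache 2.4 fragment, assuming the affected URLs share a path prefix (`/bad-path/` is made up for illustration):

```apache
# Hypothetical: send a noindex directive as an HTTP response header
# for the whole affected section, instead of touching the HTML.
<If "%{REQUEST_URI} =~ m#^/bad-path/#">
    Header set X-Robots-Tag "noindex"
</If>
```

The pages still need to be recrawled before the directive takes effect, which is why approaches like this are usually paired with leaving the URLs reachable to crawlers in the meantime.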
International Targeting | Language > 'fa-ir' - no return tags
I see this error in Search Console: "International Targeting | Language > 'fa-ir' - no return tags. URLs for your site and alternate URLs in 'fa-ir' that do not have return tags." The count is steadily increasing, and I do not know what the problem is or what I have done wrong. An example row from the report: Originating URL: /abadan/%D8%A2%D8%A8%D8%A7%D8%AF%D8%A7%D9%86/browse/vehicles/?place=8,541&v01=0,1&saveLoc=1 | Crawl date: 11/16/16 | Alternate URL: http://divar.ir/
Intermediate & Advanced SEO | | divar0 -
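For context on what this error means: hreflang annotations must be reciprocal. If page A lists page B as an alternate, B must list A back, or Search Console reports "no return tags". A minimal sketch (the `/fa-page` URL is a hypothetical placeholder, not divar.ir's real structure):

```html
<!-- On the default page, e.g. http://divar.ir/ -->
<link rel="alternate" hreflang="x-default" href="http://divar.ir/" />
<link rel="alternate" hreflang="fa-ir" href="http://divar.ir/fa-page" />

<!-- On http://divar.ir/fa-page -- the "return tags" Search Console
     says are missing: this page must point back at every alternate,
     including itself. -->
<link rel="alternate" hreflang="x-default" href="http://divar.ir/" />
<link rel="alternate" hreflang="fa-ir" href="http://divar.ir/fa-page" />
```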
Setting up the right geo-targeting/language-targeting settings without breaking the SEO
Hello the great Moz Community! Gev here from BetConstruct, a leading gaming and betting software provider. Our company website is performing great in the SERPs. We have 20+ dedicated pages for our 20+ software products, an events section, and different landing pages for different purposes. We also run a blog section, a press section, and more. Our website's default language is EN. Four months ago we opened the /ru and /es versions of the website. I set the correct hreflang tags, redirects, etc., generated correct sitemaps, and the translated versions started to rank normally. Now our marketing team is requesting different things to be done on the website, and I would love to discuss this with you before implementing. For example: they have created a landing page under the URL betconstruct.com/usa-home and want me to set that page as the default website page (i.e. homepage) if the user visits our website from a US-based IP. This can be done in two different ways:
1. I can set the /usa-home page as the default in my CMS when the visitor is from the US, so the address stays betconstruct.com (without /usa-home). In this case the same URL (betconstruct.com) serves different content for the homepage only.
2. I can check the visitor's IP and, if they are from the US, redirect them to betconstruct.com/usa-home. In this case the user can click on the logo, go to betconstruct.com, and see the original homepage.
Both cases seem dangerous. In the first case I am not sure what Google will think when it sees a different homepage from different IPs, and in the second case I am not sure what that redirect should be - 301, 302, 303, etc. - because Google may think I don't have a homepage and that my homepage redirects to a secondary page like /usa-home. After digging a lot I realised my team is requesting a strange case: they want both language targeting (/es, /ru) and country targeting (which should ideally be /us), but instead of creating /us they want it to replace /en (only for the USA). Please let me know the best way to implement this. Should we create a separate version of our website for the USA under /us/* URLs? In that case, is it OK to have /en as a language version and /us for country targeting? What hreflangs should we use? I know this is a rare case and it may be difficult to understand, but any help will be much appreciated! Thank you! Best, Gev
Intermediate & Advanced SEO | | betconstruct0 -
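For what this question is reaching for, the conventional pattern is to keep the language versions and add a country-qualified alternate for the US page, with every page in the set carrying the full cluster. A sketch using the URLs mentioned in the question (whether /usa-home keeps that path, and the https scheme, are assumptions):

```html
<!-- Hypothetical hreflang cluster, repeated on every page in the set -->
<link rel="alternate" hreflang="en" href="https://betconstruct.com/" />
<link rel="alternate" hreflang="en-us" href="https://betconstruct.com/usa-home" />
<link rel="alternate" hreflang="ru" href="https://betconstruct.com/ru" />
<link rel="alternate" hreflang="es" href="https://betconstruct.com/es" />
<link rel="alternate" hreflang="x-default" href="https://betconstruct.com/" />
```

With annotations like these in place, US visitors can be offered (rather than force-redirected to) the /usa-home page, sidestepping the cloaking concern in option 1 and the redirect-status concern in option 2.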
One page ranking for all key words, when other targeted pages not ranking
Hi everyone, I am fairly new to SEO but have a basic understanding. I have a page with a lot of content on it (including brand names, product types, and relevant info) ranking for quite a few keywords. This is cool, except that I have pages dedicated to each specific keyword that are not ranking, even though the more specific pages have a lot of relevant text too. E.g. the TYRES page ranks first for "tyres" and ranks okay for many tyre keywords, including "truck tyres", while the TRUCK TYRES page is not ranking for "truck tyres". Further down, I then have pages not ranking at all well for more specific keywords when they should. E.g. the HONDA TRUCK TYRES page has a page full of product listings - no actual text - and is not ranking for "honda truck tyres", and the ABC HONDA TRUCK TYRE page is not ranking for the "abc honda truck tyre" keyword. These pages don't have a lot of content on them, as essentially every single tyre is the same except for the name, but they do have text. So sometimes these terms don't rank at all, and sometimes the first TYRES page ranks for them. I have done the basic on-page SEO for all these pages (hopefully properly), including meta descriptions, meta titles, H1s, H2s, using keywords in the text, alt text on images where possible, etc. According to Moz they are optimised in the 90% range. Link building is difficult as they are product listings, so other sites don't really link to these pages. Has anyone got ideas on why the top TYRES page might be so successful and outranking more specific pages? Any ideas on how I can get the other pages ranking higher, as they are more relevant to the search term? We are looking into a website redesign/overhaul, so any advice on how I can prevent this from happening on what will essentially be a new site would be great too. Thanks!
Intermediate & Advanced SEO | | JDadd0 -
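On the on-page side, one thing worth double-checking on the thin product pages is that the head elements genuinely differentiate them from the broad TYRES page rather than repeating it; a hypothetical sketch for the Honda truck tyres page (site name, URLs, and wording invented):

```html
<!-- Hypothetical head/body fragment for the HONDA TRUCK TYRES page -->
<title>Honda Truck Tyres - Sizes &amp; Prices | Example Tyres</title>
<meta name="description"
      content="Browse our full range of Honda truck tyres, with sizes, prices and fitting options." />

<h1>Honda Truck Tyres</h1>
```

When the specific page's title/H1 are near-duplicates of the category page's, search engines often pick the stronger category page for both queries.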
Large-Scale Penguin Cleanup - How to prioritize?
We are conducting a large-scale Penguin cleanup / link-cleaning exercise across 50+ properties, most of which have been on the market for 10+ years. There is a lot of link data to sift through and we are wondering how we should prioritize the effort. So far we have collected backlink data for all properties from Ahrefs, GWT, Majestic, and OSE and consolidated the data using home-grown tools. As a next step we are obviously going through the link-cleaning process. We are interested in feedback on how we are planning to prioritize the link-removal work. Put another way, we want to vet whether the community agrees with what we consider the most harmful types of links for Penguin:
Priority 1: Clean up site-wide links with money words; if possible keep a single-page link
Priority 2: Clean up or rename all money-keyword links in the top 10 of the anchor-text distribution
Priority 3: Clean up non-brand site-wide links; if possible keep a single-page link
Priority 4: Clean up low-quality links (other niche or no link juice)
Priority 5: Clean up multiple links from the same IP C-class
Does this sound like a sound approach? Would you prioritize this list differently? Thank you for any feedback /T
Intermediate & Advanced SEO | | tomypro1 -
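The five priorities above translate naturally into a scoring function over the consolidated export. A minimal Python sketch - the field names (`anchor`, `sitewide`, `low_quality`, `same_c_class`) and money terms are hypothetical placeholders for whatever the home-grown tools produce:

```python
# Sketch of the prioritisation described above: score each backlink
# row and sort so the riskiest links are cleaned first.
MONEY_WORDS = {"seo", "web design"}  # illustrative money terms

def penguin_priority(link):
    anchor = link.get("anchor", "").lower()
    money = any(term in anchor for term in MONEY_WORDS)
    if link.get("sitewide") and money:
        return 1  # site-wide money-anchor links first
    if money:
        return 2  # money-keyword anchors in the top distribution
    if link.get("sitewide"):
        return 3  # non-brand site-wide links
    if link.get("low_quality"):
        return 4  # off-niche / no-juice links
    if link.get("same_c_class"):
        return 5  # multiple links from one IP C-class
    return 6      # everything else: leave for a later pass

links = [
    {"anchor": "our brand name", "sitewide": False},
    {"anchor": "cheap seo services", "sitewide": True},
    {"anchor": "web design preston", "sitewide": False},
]
cleanup_queue = sorted(links, key=penguin_priority)
```

Sorting the full export this way makes the cleanup order explicit and reviewable before any removal requests go out.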
How Do I Generate a Sitemap for a Large Wordpress Site?
Hello Everyone! I am working with a WordPress site that is in Google News (i.e. every day we have about 30 new URLs to add to our sitemap). The site has years of articles, resulting in about 200,000 pages. Our strategy so far has been to use a sitemap plugin that only generates the last few months of posts; however, we want to improve our SEO and submit all the URLs on our site to search engines. The issue is that the plugins we've looked at generate the sitemap on the fly, i.e. when you request the sitemap, the plugin dynamically generates it. Our site is so large that even a single request for our sitemap.xml ties up tons of server resources and takes an extremely long time to generate (if the page doesn't time out in the process). Does anyone have a solution? Thanks, Aaron
Intermediate & Advanced SEO | | alloydigital0 -
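The usual fix for on-the-fly generation at this scale is to pre-generate static sitemap files on a schedule (e.g. a cron job), split at the 50,000-URL limit from the sitemaps.org protocol, so serving sitemap.xml becomes a static file read. A hedged Python sketch of the chunking and rendering, independent of any particular plugin (the URLs are invented):

```python
# Sketch: pre-generate static sitemap files offline instead of
# building them per request. Each file holds at most 50,000 URLs,
# the sitemaps.org limit; a sitemap index would then list the parts.
from xml.sax.saxutils import escape

SITEMAP_LIMIT = 50_000

def chunk(urls, size=SITEMAP_LIMIT):
    """Split the full URL list into sitemap-sized pieces."""
    return [urls[i:i + size] for i in range(0, len(urls), size)]

def render_sitemap(urls):
    """Render one sitemap file's XML for a chunk of URLs."""
    entries = "\n".join(
        f"  <url><loc>{escape(u)}</loc></url>" for u in urls
    )
    return (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
        f"{entries}\n</urlset>"
    )

urls = [f"https://example.com/post-{i}" for i in range(120_000)]
parts = chunk(urls)               # 120k URLs -> 3 files (50k, 50k, 20k)
xml = render_sitemap(parts[0])    # write each part to disk in real use
```

With ~30 new URLs a day, only the newest part file ever needs regenerating, which keeps the cron job cheap.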
Best way to geo redirect
Hi, I have a couple of ecommerce websites which have both a UK and a USA store. At the moment both the UK and the USA domains send me traffic from UK and USA search engines, which means that a number of users are clicking a Google result for the store that is not in their location, i.e. UK people are clicking on a .com listing and ending up on the USA website. What is the best way to automatically redirect people to the correct store for their region? If I use an IP-based auto-redirect system, would Google see some of the pages as doorway pages? Thanks
Intermediate & Advanced SEO | | Grumpy_Carl0
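On the redirect-status side of this question: because the destination depends on the visitor, the redirect is temporary by nature, so a 302 is the usual choice rather than a 301; and to reduce the doorway/cloaking risk, crawlers should generally be allowed through to whichever store they requested. A minimal Python sketch of that decision logic, assuming a GeoIP lookup has already produced a two-letter country code (domains are placeholders):

```python
# Sketch of IP-based store selection: return a (status, url) pair,
# issuing a temporary 302 only when the visitor is on the wrong store.
STORES = {"GB": "https://example.co.uk", "US": "https://example.com"}
DEFAULT_STORE = "https://example.com"

def pick_store(country_code, requested):
    """Decide whether to serve the page (200) or redirect (302)."""
    target = STORES.get(country_code, DEFAULT_STORE)
    if target == requested:
        return 200, requested   # already on the right store
    return 302, target          # temporary, visitor-dependent redirect

status, url = pick_store("GB", requested="https://example.com")
```

A banner offering the local store ("It looks like you're in the UK - switch to our UK site?") is a common lower-risk alternative to hard redirects.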