Geo-Targeting Content
-
I'm trying to get some ideas on restructuring existing content for geo-targeting.
Example: Botox Page
This is a hypothetical situation: a laser cosmetics clinic in Atlanta trying to rank for "Atlanta Botox." The existing content is general information about Botox procedures. The problem is editing that content to work "Atlanta" into the H1 tag and page copy.
Are there techniques to make these edits flow better? My idea is to add a geo-targeted page for each procedure, but I'm concerned this might interrupt or confuse users in the navigation funnel.
Your thoughts?
Thanks!
-
Thanks David.
I'm definitely trying to avoid crappy content. Now I'm trying to figure out how to write the anchor text for the internal links following this approach.
-
Generally, I would use something like the following:
"Botox Procedures in Atlanta, Georgia" for the H1 or Title tag, make sense? You just need to rewrite the content so that it is written from a geo-targeted area, does that make sense? You can't simpy find and replace to add in geo-targeting, unless you want crappy content.
Rewrite it with geo-targeting in mind. The biggest things are the title tag, the H1 tag, and a few internal links that use geo-targeted anchor text.
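For illustration, here is a minimal sketch of how those elements might fit together on a hypothetical geo-targeted procedure page (the URL, clinic name, and copy are all made up):

```html
<!-- Hypothetical geo-targeted procedure page: example.com/atlanta-botox/ -->
<head>
  <title>Botox Procedures in Atlanta, Georgia | Example Laser Clinic</title>
</head>
<body>
  <h1>Botox Procedures in Atlanta, Georgia</h1>
  <p>Our Atlanta clinic offers Botox treatments tailored to ...</p>
</body>

<!-- And elsewhere on the site, an internal link with geo-targeted anchor text: -->
<a href="/atlanta-botox/">Botox procedures in Atlanta</a>
```

The point is that the location shows up naturally in the copy itself, not just bolted onto the tags.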
Related Questions
-
Blog with all copied content, should it be rewritten?
Hi, I am auditing a blog whose goal is to get approved on ad networks, but the whole blog has content copied from different sources, so no ad network is approving them. Surprisingly (at least to me), the blog ranks really well for a few keywords (#1 positions and rich snippets), has a few hundred natural backlinks, a high DA, and has never been penalized (they have always used canonical tags pointing to the original content). Traffic is a few thousand sessions a month, about 85% from organic search; overall, Google likes the site enough to rank it highly. So now the owner wants to monetize it. I suggested that the best approach was to rewrite their most-visited articles and delete the rest with 301 redirects to the posts that stay. But I haven't worked on a similar project before and can't find precise information online, so I'm looking to hear from anyone with similar experience. A few of my questions:
1. If they rewrite most of the pages and delete the rest so there is no repeated/copied content, would ad networks (e.g. AdSense) approve them?
2. Assuming the new articles are at least as good quality as the current ones but with original content, is there a risk of losing DA, since it will pretty much look like a new site once they are done?
3. They have thousands of articles, but only about 200 get most of the visits, and those are the ones getting rewritten. Is it fine to redirect the deleted ones to the remaining posts?
Thanks for any suggestions and/or tips on this 🙂
Intermediate & Advanced SEO | | ArturoES0 -
Targeting KWDs
Hi, I'm looking into competitors for a high-volume keyword and reviewing their top-ranked page to see what else it ranks for in this category. How is it possible that one page of theirs ranks for over 500 key phrases? They only have a little bit of content at the bottom: http://www.homebase.co.uk/en/homebaseuk/homeware/storage-and-shelving/storage-boxes-and-drawers
Intermediate & Advanced SEO | | BeckyKey0 -
Duplicate content in external domains
Hi,
I have asked about this case before, but now my question is different.
We have a new school that offers courses and programs. Its website is quite new (just five months old). It is very common for these schools to publish their courses and programs on training portals, both to promote the courses and to increase their visibility. Because the website is so new, I found during the technical audit that when I googled a text snippet from the site, the new school's website was being omitted and the course portals were being shown instead. I know the best recommendation would be to create different content for each channel, but I would like to explore whether there are other options. Most of those portals don't allow a link back to the website in the content, let alone a canonical tag, and most of them are older than the new website and have higher authority. So, with this situation, I think the only solution is to create different content for the website and for the portals.
I was also thinking: if we create the content on the new website first, submit it for indexing, wait for Google to index it, and only then send the content to the portals, maybe we would have a better chance of not being omitted by Google in the search results. What do you think? Thank you!
Intermediate & Advanced SEO | teconsite
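(For context, the cross-domain canonical mentioned above, which most portals won't allow, would look something like this in the head of the portal's copy of a course page; the URLs are hypothetical.)

```html
<!-- On the training portal's copy of the course page -->
<link rel="canonical" href="https://new-school-example.com/courses/course-name/" />
```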
User generated content (Comments) - What impact do they have?
Hello Moz stars! I have a question regarding user comments on article pages. I know that user-generated content is good for SEO, but how much impact does it really have? For your information:
1 - All comments appear in the source code and are crawled by spiders.
2 - A visitor can comment on a page for up to 60 days.
3 - The number of comments depends on the topic; we usually get between 3 and 40 comments.
My questions:
1 - If we were to remove comments completely, what impact would it have from an SEO perspective? (I know you can't be certain, but please make an educated guess if possible.)
2 - If it has a negative and/or positive impact, please specify why! 🙂
If anything is unclear or you want more information, don't hesitate to ask and I'll try to specify. Best regards,
Danne
Intermediate & Advanced SEO | idg-sweden
No-index pages with duplicate content?
Hello, I have an e-commerce website selling about 20,000 different products. For the most popular of those products, I created unique, high-quality content, written by a professional player, describing how and why the products are useful, which is of huge interest to buyers. It would cost too much to write that high-quality content for 20,000 different products, but we still have to sell them. Our idea was therefore to noindex the products that only have the same copy-paste descriptions every other website has. Do you think it's better to do that, or to just let everything be indexed normally, since we might get some search traffic from those pages? Thanks a lot for your help!
Intermediate & Advanced SEO | | EndeR-0 -
Does Google bot read embedded content?
Is embedded content "really" on my page? There are many add-ons nowadays that are used via embed code and that pull in their text after the page has loaded (embedded surveys, for example). Are these read by Googlebot, or do they in fact act like iframes and are not physically on my page? Thanks
Intermediate & Advanced SEO | | BeytzNet0 -
Mobile Site - Same Content, Same subdomain, Different URL - Duplicate Content?
I'm trying to determine the best way to handle my mobile commerce site. I have a desktop version and a mobile version using a third-party product called CS-Cart. Let's say I have a product page. The URLs are:
mobile: store.domain.com/index.php?dispatch=categories.catalog#products.view&product_id=857
desktop: store.domain.com/two-toned-tee.html
I've been trying to find information on how to handle mobile sites with different URLs with regard to duplicate content. However, most of the results assume that the different URL means m.domain.com, rather than the same subdomain with a different address. I am leaning towards using a canonical URL, if possible, on the mobile store pages. I see quite a few people suggesting not to do this, but again, I believe that's because they assume we are just talking about m.domain.com vs. www.domain.com. Any additional thoughts on this would be great!
Intermediate & Advanced SEO | grayloon
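(A minimal sketch of the canonical approach being considered, placed in the head of the mobile product page; the URLs are taken from the question.)

```html
<!-- On the mobile URL (index.php?dispatch=categories.catalog#products.view&product_id=857) -->
<link rel="canonical" href="http://store.domain.com/two-toned-tee.html" />
```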
Subdomains - duplicate content - robots.txt
Our corporate site provides MLS data to users, with the end goal of generating leads. Each registered lead is assigned to an agent, essentially in round-robin fashion. However, we also give each agent a domain of their choosing that points to our corporate website. The domain can be whatever they want, but upon loading it is immediately redirected to a subdomain. For example, www.agentsmith.com would be redirected to agentsmith.corporatedomain.com. Finally, any leads generated from agentsmith.easystreetrealty-indy.com are always assigned to Agent Smith instead of the agent pool (by parsing the current host name). To avoid being penalized for duplicate content, any page viewed on one of the agent subdomains always has a canonical link pointing to the corporate host name (www.corporatedomain.com). The only content difference between our corporate site and an agent subdomain is the phone number and contact email address, where applicable. Two questions:
1. Can/should we use robots.txt or robots meta tags to tell crawlers to ignore these subdomains, but obviously not the corporate domain?
2. If question 1 is yes, would it be better for SEO to do that, or to leave it how it is?
Intermediate & Advanced SEO | | EasyStreet0