What is the most effective way of selecting a top keyword per page on a site?
-
We are creating fresh content for outdated sites, and I need to identify the most significant keyword per page for the content developers. What is the best way to do this?
-
Hello,
The answer is research. I primarily use Google's Keyword Tool, but there are other tools out there that I have seen many Mozers reference, like Majestic's keyword tool.
The first thing, if you're working with an existing page, is to know what the page is about; that should focus you on a main keyword or two. From there, do the research to find the most valuable keywords for the content that is going to be created or that is already there.
If you're looking at it from a high level, trying to ascertain which pages are relevant or need updating, I would suggest starting by brainstorming possible keywords. Just think: if you were a user, what keywords would you use to find this page? That should generate some keyword ideas; you can then take them over to a keyword tool and check their popularity, and thus their potential traffic.
The process is not usually quick, but it is effective, and I highly recommend documenting your results as you do the research. That way, if you ever have to go back and say which keywords you targeted, and why, you'll have your answer.
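For example, here is a minimal sketch of that documentation step (hypothetical pages, keywords, and volumes — not real data): log every candidate keyword per page with the volume you looked up, then pick a primary by search volume.

```python
import csv

# Hypothetical research results (not real data): each page's candidate
# keywords with the monthly volume looked up in a keyword tool, plus a
# note explaining the choice.
research = {
    "/services/roof-repair": [
        ("roof repair", 12000, "primary - matches page topic"),
        ("fix leaking roof", 2400, "long-tail variant"),
    ],
    "/services/gutters": [
        ("gutter installation", 5400, "primary"),
    ],
}

def top_keyword(candidates):
    """Pick the highest-volume candidate as the page's main keyword."""
    return max(candidates, key=lambda c: c[1])[0]

def write_log(research, path="keyword_log.csv"):
    """Document every candidate so the 'which and why' survives audits."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["page", "keyword", "monthly_volume", "note"])
        for page, candidates in research.items():
            for kw, vol, note in candidates:
                writer.writerow([page, kw, vol, note])

write_log(research)
for page, candidates in research.items():
    print(page, "->", top_keyword(candidates))
```

The CSV is the part that pays off later: it answers "which keywords did we target, and why" without re-doing the research.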
I hope this helps,
Don
Related Questions
-
Can Intercom support pages and redirect issues affect the SEO performance of our website?
I noticed that most of the redirect issues I have are coming from our Intercom support links. I want to ask: can Intercom support pages and these redirect issues affect the SEO performance of our website?
Reporting & Analytics | Envoke-Marketing
-
Deleted Rarely Visited Pages - Traffic Dropped (Big Time)
Hi folks: I'd appreciate any thoughts you might have on a problem I am having with organic traffic. One of our sites has about 500 pages/blog posts. We had about 200 pages that no one was visiting, or that only one to ten people had visited in an entire year. As a result, we decided to experiment and delete any page with fewer than 5 visits in a year. This resulted in the deletion of about 90 pages.

We did this on April 6 or 7 of this year. Two days later, we had a substantial drop in visits to the site. We had been getting about 300 sessions a day; now we are lucky to get that in a month. I know there was an algorithm update in late March, but our traffic dropped about two weeks after that, and a day or so after the deletion of the pages. There is a clear demarcation in analytics.

I gave it a month; the traffic did not recover, so we decided to restore the pages. Traffic has still not recovered, and it has been about 3 months now. Does anyone have any thoughts on why we might have experienced such a drastic drop, as well as what we might do to recover from it? Thanks very much
Reporting & Analytics | jnfere
-
Landing page URL appearing as keyword
Hi Mozers, I've recently experienced the URLs of my key landing pages coming up as keywords. This has been on the rise since early July (when it was relatively insignificant) to the current position (see image below) where they make up the majority of my top keywords. Drilling down into a bit more detail, this seems to be almost exclusively Desktop traffic but in terms of Technology there are no clear standouts (seems to be mostly Windows OS and Chrome). Has anyone else been experiencing this?
Reporting & Analytics | mopland
-
What is the best way to eliminate this specific image-based thin content?
The site in question is www.homeanddesign.com, where we are working on recovering from some big traffic loss. I have finally gotten the site's articles properly meta-titled and meta-described; now I'm working on removing thin content. The way their CMS is built, each image has its own page (every one that's clickable), which leads to a lot of thin content that I think needs to be removed from the index. Here is an example: http://www.homeanddesign.com/photodisplay.asp?id=3633

I'm considering the best way to remove it from the index without disturbing how users enjoy the site. What are my options? Here is what I'm thinking:

- Add Disallow: /photodisplay to the robots.txt file
- See if there is a way to make a lightbox instead of a whole new page for images (though this still leaves me with hundreds of pages with just an image on them, with backlinks, etc.)
- Add a noindex tag to the photodisplay pages
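One thing worth noting when weighing those options: a robots.txt Disallow blocks crawling, so Google may never revisit the pages to see a noindex tag; to get pages out of the index, they generally need to stay crawlable while serving noindex. As a rough sketch (hypothetical URL list; only the /photodisplay.asp path comes from the example above), here is how you might flag the thin image pages for noindex:

```python
from urllib.parse import urlparse

# Hypothetical URL list; in practice this would come from a crawl export.
urls = [
    "http://www.homeanddesign.com/photodisplay.asp?id=3633",
    "http://www.homeanddesign.com/articles/kitchen-trends",
]

NOINDEX_TAG = '<meta name="robots" content="noindex">'

def needs_noindex(url):
    """Flag thin image pages served by the CMS's photodisplay template."""
    return urlparse(url).path.lower() == "/photodisplay.asp"

for url in urls:
    if needs_noindex(url):
        print("add", NOINDEX_TAG, "to", url)
```

Since every photodisplay page is served by one template, the tag only needs to be added in one place in the CMS, whatever the actual template mechanism turns out to be.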
Reporting & Analytics | williammarlow
-
Any harm and why the differences - multiple versions of same site in WMT
In Google Webmaster Tools we have set up:

ourdomain.co.nz
ourdomain.co.uk
ourdomain.com
ourdomain.com.au
www.ourdomain.co.nz
www.ourdomain.co.uk
www.ourdomain.com
www.ourdomain.com.au
https://www.ourdomain.co.nz
https://www.ourdomain.co.uk
https://www.ourdomain.com
https://www.ourdomain.com.au

As you can imagine, this gets confusing and hard to manage. We are wondering whether having all these domains set up in WMT could be doing any damage. Here http://support.google.com/webmasters/bin/answer.py?hl=en&answer=44231 it says: "If you see a message that your site is not indexed, it may be because it is indexed under a different domain. For example, if you receive a message that http://example.com is not indexed, make sure that you've also added http://www.example.com to your account (or vice versa), and check the data for that site."

The above quote suggests that there is no harm in having several versions of a site set up in WMT. However, the article then goes on to say: "Once you tell us your preferred domain name, we use that information for all future crawls of your site and indexing refreshes. For instance, if you specify your preferred domain as http://www.example.com and we find a link to your site that is formatted as http://example.com, we follow that link as http://www.example.com instead."

This suggests that having multiple versions of the site loaded in WMT may cause Google to continue crawling multiple versions instead of only the desired ones (https://www.ourdomain.com plus the .co.nz, .co.uk, and .com.au equivalents). However, even if Google does crawl any URLs on the non-https versions of the site (i.e. ourdomain.com or www.ourdomain.com), these 301 to https://www.ourdomain.com anyway... so shouldn't that mean that Google effectively cannot crawl any non-https://www versions (if it tries to, they redirect)?

If that were the case, you'd expect the ourdomain.com and www.ourdomain.com versions to show no pages indexed in WMT; however, the opposite is true. The ourdomain.com and www.ourdomain.com versions have plenty of pages indexed, but the https versions have no data under the Index Status section of WMT, showing this message instead: "Data for https://www.ourdomain.com/ is not available. Please try a site with http:// protocol: http://www.ourdomain.com/." This is a problem, as it means we can't delete these profiles from our WMT account.

Any thoughts on the above would be welcome. As an aside, it seems like WMT is picking up on the 301 redirects from the ourdomain.com and www.ourdomain.com domains, at least with links: no ourdomain.com or www.ourdomain.com URLs are registering any links in WMT, suggesting that Google is seeing all links pointing to those domains as 301ing to https://www.ourdomain.com... which is good, but again means we now can't delete https://www.ourdomain.com either, so we are stuck with 12 profiles in WMT... what a pain. Thanks for taking the time to read the above, quite complicated, sorry!! Would love any thoughts.
Reporting & Analytics | zingseo
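The 301 rule described in that question (every protocol/host variant redirecting to the preferred https://www version) can be expressed as a small sketch — hypothetical code, reusing the ourdomain placeholders from the question:

```python
from urllib.parse import urlsplit, urlunsplit

def canonicalize(url):
    """Map any protocol/host variant to the preferred https://www
    version, mirroring the site's 301 redirect rule."""
    scheme, host, path, query, frag = urlsplit(url)
    if not host.startswith("www."):
        host = "www." + host
    return urlunsplit(("https", host, path, query, frag))

# The profile variants from the question collapse to four preferred hosts.
variants = [
    scheme + www + "ourdomain." + tld
    for scheme in ("http://", "https://")
    for www in ("", "www.")
    for tld in ("co.nz", "co.uk", "com", "com.au")
]
preferred = sorted({canonicalize(v) for v in variants})
print(preferred)
```

The collapse from many variants to four preferred hosts is the behavior the question's redirects already implement; the extra WMT profiles only observe it, they don't change it.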
Analytics tagging parameters effect on site SEO
One of the effective techniques used in analytics tagging is appending parameters that start with '?' or '#'. Example of on-site tagging:

Main link: www.domainname.com/category/sub-category/
www.domainname.com/category/sub-category/?lid=topnav
www.domainname.com/category/sub-category/?lid=sidenav

All three links point to the same landing page, just with an extra parameter. Using email or campaign tagging:

www.domainname.com/category/sub-category/
www.domainname.com/category/sub-category/?utm_source=launch&utm_medium=email&utm_term=html&utm_content=getscoop&utm_campaign=hwdyrwm2012

With that, we create many tagged links based on the campaign's internal strategy. How do these affect indexing and link juice? How do they affect SEO in general?
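One common safeguard here (a sketch under the assumption that `lid` and the `utm_*` parameters are purely analytics tags) is to make sure every tagged variant maps back to a single canonical URL — which is what a rel="canonical" tag communicates to search engines:

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

TRACKING_PREFIXES = ("utm_",)   # assumed analytics-only prefixes
TRACKING_PARAMS = {"lid"}       # assumed analytics-only parameters

def strip_tracking(url):
    """Remove analytics-only parameters so tagged variants collapse
    to a single canonical URL."""
    scheme, host, path, query, frag = urlsplit(url)
    kept = [
        (k, v) for k, v in parse_qsl(query, keep_blank_values=True)
        if k not in TRACKING_PARAMS
        and not k.startswith(TRACKING_PREFIXES)
    ]
    return urlunsplit((scheme, host, path, urlencode(kept), frag))

tagged = [
    "https://www.domainname.com/category/sub-category/?lid=topnav",
    "https://www.domainname.com/category/sub-category/?utm_source=launch&utm_medium=email",
    "https://www.domainname.com/category/sub-category/",
]
canonical = {strip_tracking(u) for u in tagged}
print(canonical)
```

If all tagged variants reduce to the same canonical URL, a rel="canonical" pointing at that URL consolidates any indexing and link signals, so the tagging is largely neutral for SEO.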
Reporting & Analytics | RAPPLA
-
Duplicate Content From My Own Site?!
When I ran the SEOmoz report, it said I have a ton of duplicate content. The first page I looked at was my home page:

http://www.kisswedding.com/
http://www.kisswedding.com/index.html
http://kisswedding.com/index.html

All three of the above have varying internal links, page authority, and linking root domains. Only the first has any external links. Each of the others only seems to have one other duplicate page; it's the difference between the www and the non-www version. I have a verified account for www.kisswedding.com in Google Webmaster Tools. The non-www version is in there too but has not been verified. Under settings for the verified account (www.kisswedding.com), "Don't set a preferred domain" is checked. Is that my mistake? And if so, which should I select: the www version or the non-www version? Thanks!
Reporting & Analytics | annasus