H2s are already ranking well. Should I rock the boat?
-
I recently began work for a company and discovered that they are not using H1s (only H2s), yet they rank in the top 5 for ~90% of their keywords.
The site is one of the original players in its industry, has massive domain authority, and has tens of thousands of linking root domains. However, it is currently being beaten on some of its top keywords by a few younger competitors.
Moving the current H2 text into H1 tags could be helpful, but to what extent? Since they already rank well for so many competitive keywords, is it worth rocking the boat by moving the H2 text into H1 tags and risking their current rankings?
-
Thanks for taking the time to answer my question, Claudio.
-
Thanks for taking the time to detail your explanation, Nakul. Your method is a good one for testing. Cheers!
-
Thanks for your feedback and experience, Mark. I appreciate it.
-
I really appreciate your encouragement, Brad. Your experience provides me with some hope of further boosting organic rankings and besting the competition. Thanks for sharing.
-
I've been doing SEO for a decent amount of time. You would think I would remember "when in doubt, test it out."
It's funny how, when you're in the thick of things, the most obvious answers can elude you. This is why I love the Moz community. Thanks for the reminder, Ade!
-
I agree with all of the others, especially about testing on a few keywords that are not mission-critical.
In my experience, the sites I have optimized were not affected much by H2 tags. The factors that really made a difference were the title tags, the H1, and page interlinking using anchor text (not over-optimized; vary it).
There are 200+ ranking factors (by the last count I heard). Obviously, having all of them work together in synergy is best.
The good news is that you already have the domain authority. I would think the change definitely won't hurt and may even help slightly.
-
Hi Collin,
I recently did this for a client site of mine (alas, it didn't have a huge amount of domain authority). The results were very pleasing: we noticed a jump in keyword rankings simply from switching from H2 to H1.
My advice - definitely go ahead with testing like the others have suggested. You've got nothing to lose!
Thanks,
Brad
-
Dear Collin,
I agree 100% with my mates, but I want to add some of my own experience. For years I have done the following:
1. The exact content of your title tag (I assume you're using your most important keywords) should also appear in the H1 tag, and that H1 tag should be as close to the top of the page as possible (prominence). Use only one H1 tag per page.
2. The H2 tag is optional, and you can use it two or more times without worry; I usually use two H2 tags with keywords related to my H1 (main keyword).
In recent months I have felt that the H2 is not as important as it used to be, and Google now frowns on "over-optimization", meaning pages with perfect keyword distribution, prominence, and density.
So use your primary keyword in the title tag and H1 tag, and optionally in H2 tags; the key is original, useful content for visitors.
Hope this helps,
Claudio
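Claudio's checklist (exactly one H1, carrying the same keywords as the title tag) is easy to sanity-check before and after a change. Here is a minimal sketch using only Python's standard-library HTML parser; the sample page is invented for illustration and mirrors the situation in the question, where the title keyword sits in an H2 and no H1 exists:

```python
from html.parser import HTMLParser

class HeadingAudit(HTMLParser):
    """Collects the <title> text and the text of every <h1> and <h2>."""
    def __init__(self):
        super().__init__()
        self._stack = []
        self.title = ""
        self.h1s, self.h2s = [], []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1", "h2"):
            self._stack.append(tag)

    def handle_endtag(self, tag):
        if self._stack and self._stack[-1] == tag:
            self._stack.pop()

    def handle_data(self, data):
        if not self._stack:
            return
        tag = self._stack[-1]
        if tag == "title":
            self.title += data
        elif tag == "h1":
            self.h1s.append(data.strip())
        elif tag == "h2":
            self.h2s.append(data.strip())

# Hypothetical page: title keyword lives in an H2, no H1 at all.
page = """<html><head><title>Blue Widgets</title></head>
<body><h2>Blue Widgets</h2><h2>Widget accessories</h2></body></html>"""

audit = HeadingAudit()
audit.feed(page)
# Flags pages that should be candidates for the H2-to-H1 switch.
needs_h1 = len(audit.h1s) == 0 and audit.title.strip() in audit.h2s
```

Run against each test page before and after the switch, the same audit confirms the change actually shipped (one H1, title keywords present) without manually viewing source on every URL.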
-
I agree with Ade. Just test it on a small scale and see what the results look like. I would suggest you try a few different options:
1. Keywords Currently in Positions 2-5
2. Keywords Currently in Positions 6-10
3. Keywords Currently on Page 2
Find 5 keywords in each category, make the changes, and watch to see where the impact is greatest. Make sure you are not skewing the test with any other changes (as much as you can control) so you get an accurate read.
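The three-bucket test above can be tallied with a short script. The keywords and positions below are invented placeholders; in practice the before/after ranks would come from whatever rank tracker you already use:

```python
from statistics import mean

# Hypothetical before/after ranks for the three test buckets:
# (keyword, rank before the H1 change, rank after).
observations = {
    "positions 2-5":  [("kw_a", 3, 2), ("kw_b", 4, 4), ("kw_c", 5, 3)],
    "positions 6-10": [("kw_d", 7, 5), ("kw_e", 9, 8), ("kw_f", 10, 9)],
    "page 2":         [("kw_g", 14, 11), ("kw_h", 18, 17), ("kw_i", 20, 19)],
}

def average_movement(rows):
    """Positive = moved up (rank number decreased) after the change."""
    return mean(before - after for _, before, after in rows)

# Average positions gained per bucket, to see where the change helped most.
report = {bucket: average_movement(rows) for bucket, rows in observations.items()}
```

Comparing the per-bucket averages shows whether the switch helps more for already-strong pages or for page-2 stragglers, which in turn tells you how aggressively to roll it out site-wide.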
-
Hi Collin,
Why don't you take it slowly and carry out some testing?
I would choose a few pages that rank well for some less important keywords, switch those headings to H1s, leave it for a few weeks, and see how it goes.