Is This Worth Fixing?
-
Hi,
I'm working on a site that was last optimized some years ago. It has a fair number of pages where the URL, H1, title tag, and image alt text all exact-match the same keyword. Although this comes back as an A+ in Moz's On-Page Grader, it seems a bit much.
What do you think, is all this too heavy an SEO fingerprint for Google?
-
Hi Erica,
It's not on the page too many times, it's just relentlessly exact-matched between the H1, URL, title tag, and image alt, but I guess that's okay.
Thanks
-
No. If you use the keyword too many times, the On-Page Grader will give you a bad grade.
-
Hi Erica,
Thanks for the message. Whaddya mean... the page grader gives it an "A" for the keyword it hammered?
Thanks...
-
Very true.
-
The page grader will give you a bad grade if you keyword stuff.
-
Sounds like it should be fine. Unless Google slaps you for keyword stuffing, you haven't done anything wrong... hell, it could even be argued you're following a best practice. As long as the URL, H1, title tag, and image are relevant and accurately describe the keyword, you're golden.
If anything, you may want to strengthen your keyword strategy and work some synonyms in there... catch a few more eyes and rank for additional keywords. This can also help to water down the super-optimized pages and make things feel a bit safer.
Hope that helps,
Christopher
Related Questions
-
How to fix issues from 301s
Case: We are currently in the middle of a site migration from .asp to .NET and Endeca PageBuilder, and from a homebrewed search provider to Endeca Search. We have migrated most of our primary landing pages and our entire e-commerce site to the new platforms. During the transition, approximately 100 of our primary landing pages were inadvertently 302ed to the new versions. Once this was caught they were immediately changed to 301s and submitted to Google's index through Webmaster Tools. We initially saw increases in visits to the new pages, but currently (approximately 3 weeks after the change from 302 to 301) we are experiencing a significant decline in visits.
Issue: My assumption is that many of the internal links to these primary landing pages (from pages which are now 301ed as well) still point to the old versions in Google's cache, and thus have not passed their importance and internal juice to the new versions. There are no navigational links or entry points to the old supporting pages left, and I believe this is what is driving the decline.
Proposed resolution: I intend to create a series of HTML sitemaps of the old (.asp) versions of all pages which have recently been 301ed. I will then submit these pages to Google's index (not as sitemaps, just as normal pages) with the option to index all linked pages. My intention is to force Google to pick up all of the 301s, thus reinforcing the authority channels we have set up.
Question 1: Is the assumption that the decline could be caused by missed authority signals reasonable?
Question 2: Could the proposed solution be harmful?
Question 3: Will the proposed solution be adequate to resolve the issue?
Any help would be sincerely appreciated. Thank you in advance, David
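Before resubmitting anything to Google, it's worth confirming that every old URL really does return a 301 now rather than a lingering 302. A minimal sketch of that audit (the function names are hypothetical; in practice `fetch_status` would be an HTTP HEAD request with redirects disabled, e.g. via `requests.head(url, allow_redirects=False)` - a stub is used here instead of live requests):

```python
def audit_redirects(urls, fetch_status):
    """Return a dict mapping each URL to 'ok', 'temporary', or 'broken'."""
    results = {}
    for url in urls:
        status = fetch_status(url)
        if status == 301:
            results[url] = "ok"          # permanent redirect: passes authority
        elif status in (302, 307):
            results[url] = "temporary"   # should be changed to a 301
        else:
            results[url] = "broken"      # missing page or wrong response
    return results

# Example with stubbed responses instead of live requests:
stub = {"/old-page.asp": 301, "/old-landing.asp": 302}.get
print(audit_redirects(["/old-page.asp", "/old-landing.asp"], stub))
# → {'/old-page.asp': 'ok', '/old-landing.asp': 'temporary'}
```

Running a check like this over the full list of migrated URLs would catch any page that silently reverted to a 302 before you ask Google to recrawl.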
Intermediate & Advanced SEO | FireMountainGems
Understanding how to fix a 403 issue with my website
Hi guys, I hope you can help solve a mystery for me! My site FranceForFamilies.com has been around for 9 years and has always ranked well - at least until I launched a new WordPress version earlier this year. The purpose of the relaunch was to improve the look of the site, so I kept the content and meta titles the same but created a new design. However, from the day of the relaunch the search engine rankings have plummeted, to the point where most seem to have disappeared altogether. I have found that when Moz crawls the site, it only crawls one page. I asked the Moz team about this and they said that the site is returning a 403. They also tested this using curl and received a 406 response:
curl -I www.franceforfamilies.com/
HTTP/1.1 406 Not Acceptable
However, when I check our Google Webmaster Tools I can't recreate the issue. I don't really know what is going on, and I don't have the technical knowledge to solve this - can you help? Thanks, Daniel
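A 403/406 that only some clients see usually means the server (or a security plugin such as mod_security, common on WordPress hosts) is filtering on request headers like User-Agent, which would explain why curl and Moz's crawler are blocked while Google's fetcher is not. A hypothetical sketch of that behaviour - the blocked-agent list here is invented purely for illustration:

```python
# Mimic a server-side rule that rejects requests from unrecognised
# user agents, returning 406 instead of serving the page.
BLOCKED_AGENT_PREFIXES = ("curl/", "rogerbot", "dotbot")

def simulated_response(user_agent):
    """Return the HTTP status such a filter would send for this agent."""
    if user_agent.startswith(BLOCKED_AGENT_PREFIXES):
        return 406  # Not Acceptable, as in the curl test above
    return 200

print(simulated_response("curl/7.68.0"))                    # 406
print(simulated_response("Mozilla/5.0 (Windows NT 10.0)"))  # 200
```

A quick way to test this theory on the live site is to re-run curl with a browser-like User-Agent (`curl -I -A "Mozilla/5.0" ...`); if that returns a 200, the fix lies in the server or security-plugin configuration rather than in the WordPress theme or content.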
Intermediate & Advanced SEO | LeDanJohnson
Worth removing keywords...?
I was just going over a site I manage and noticed it has a load of meta keywords on it - probably 10-15 keywords per page. Do you think this is harming the site? Is it worth removing them?
Intermediate & Advanced SEO | JohnW-UK
Penalised for duplicate content, time to fix?
Ok, I accept this one is my fault, but I'm wondering about timescales to fix...
I have a website and I put an affiliate store on it, using merchant datafeeds in a bid to get revenue from the site. This was all good; however, I forgot to put noindex on the datafeed/duplicate-content pages, and over a period of a couple of weeks the traffic to the site died. I have since nofollowed or removed the products, but some 3 months later my site still will not rank for the keywords it was ranking for previously. It will not even rank if I type in the site's name (Bright Tights). I have searched for the name using bright tights, "bright tights" and brighttights, but none of them return the site anywhere. I am guessing that I have been hit with a drop-x-places penalty by Google for the duplicate content.
What is the easiest way around this? I have no warnings about bad links or the such. Is it worth battling on trying to get the domain back, or should I write off the domain, buy a new one and start again, minus the duplicate content?
The goal of having the duplicate-content store on the site was to be able to rank the store's category pages, which had unique content on them, so I could foresee no problems with those. Like Amazon et al, the categories would have lists of products (amongst other content) and you would click through to the individual product description - the duplicate page. Thanks for reading
Intermediate & Advanced SEO | Grumpy_Carl
Is it worth submitting a blog's RSS feed...
to as many RSS feed directories as possible? Or would this have a similar negative impact that you'd get from submitting a site to loads to "potentially spammy" site directories?
Intermediate & Advanced SEO | PeterAlexLeigh
Fixing Duplicate Content Errors
SEOMOZ Pro is showing some duplicate content errors and I wondered about the best way to fix them other than rewriting the content. Should I just remove the pages it found, or should I set up permanent redirects through to the home page in case there is any link value or there are visitors on these duplicate pages? Thanks.
Intermediate & Advanced SEO | benners
URL formatting: is it worth changing?
One of my client's sites has almost-OK URLs, set up something like the following: keyword2_keyword3_keyword1. Ideally the URLs would be more like this: keyword1-keyword2-keyword3. My question is: is there any point in changing them and 301-redirecting the old ones, just to get the target keywords in a better order and change the _ to a -? Has anyone tried this, and did it work? I don't want to throw the baby out with the bathwater. Justin
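If you do decide the change is worth it, the old-to-new mapping is mechanical, so the 301 rules can be generated rather than written by hand. A minimal sketch under stated assumptions: the slug and keyword order below are invented examples, and the reordering step assumes you know the desired keyword order for each page.

```python
def new_slug(old_slug, order=None):
    """Convert an underscore slug to a hyphenated one.

    `order` lists the desired position of each original part,
    e.g. [2, 0, 1] moves the last keyword to the front. If it is
    omitted, only underscores are replaced with hyphens.
    """
    parts = old_slug.split("_")
    if order is not None:
        parts = [parts[i] for i in order]
    return "-".join(parts)

# Emit one redirect rule per page (Apache-style, as an example):
old = "blue_widgets_cheap"
print(f"Redirect 301 /{old} /{new_slug(old, order=[2, 0, 1])}")
# → Redirect 301 /blue_widgets_cheap /cheap-blue-widgets
```

Generating the rules this way keeps the mapping one-to-one, which matters: each old URL should 301 to its single new equivalent, not to the home page.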
Intermediate & Advanced SEO | GrouchyKids
Fixing Google Places once Banned
I have a lot of clients who have somehow botched up their Google Places listing and are now not showing up in local search results. In one particular case, they were using two different Gmail accounts and submitted their listing twice by accident. It appears Google has banned them from local search results. How does one undo steps like this and start fresh? Thanks!
Intermediate & Advanced SEO | ocsearch