Over Optimization - What exactly is it?
-
I keep hearing the term "over optimization," and I see people assuming they have received an over optimization penalty. I also hear people say, "You need to optimize every page on your site."
So at what point does a site become "over optimized?" Are there any good reads around here that explain this better?
-
This is what I thought it was; I just wanted to make sure my understanding was correct.
-
Over optimization typically refers to the "Penguin" Google update, which was originally called the over optimization update. In this context, over optimization refers to having too many links with keywords as the anchor text. A natural link profile contains a mix of anchor texts such as your brand name, URL, and words like "click here". If you have too many links with just keywords, you will drop in rankings for that term, or your entire site could be penalized.
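One practical way to spot a skewed link profile is to tally how often each anchor text appears in a backlink export and flag anything that dominates. This is a rough sketch: the anchor list, the 25% threshold, and the brand name are all made up for illustration, and real data would come from a backlink CSV rather than a hard-coded list.

```python
from collections import Counter

# Hypothetical inbound-link anchor texts; in practice, load these from
# a backlink export (e.g. a CSV from a link-research tool).
anchors = [
    "Acme Widgets", "acmewidgets.com", "click here", "buy blue widgets",
    "buy blue widgets", "buy blue widgets", "Acme Widgets", "this site",
    "https://acmewidgets.com", "blue widgets",
]

counts = Counter(a.lower() for a in anchors)
total = len(anchors)

# Print each anchor's share; flag any single anchor that dominates.
# The 25% cutoff is an arbitrary illustrative threshold, not a Google rule.
for anchor, n in counts.most_common():
    share = n / total
    flag = "  <-- heavy" if share > 0.25 else ""
    print(f"{anchor:30s} {n:3d}  {share:5.1%}{flag}")
```

In this toy sample, the exact-match anchor "buy blue widgets" accounts for 30% of links and gets flagged, while branded, URL, and generic anchors make up the rest, which is the kind of mixed profile the answer above describes as natural.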
Over optimization could also refer to overusing your keyword on your site to the point of keyword stuffing. There are no hard-and-fast rules here, but if it looks spammy to you, it's probably going to look spammy to Google too. Make sure you use your keywords at least a few times in your content, but don't go overboard.
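There is no official density number that counts as "stuffing," but as a sanity check you could measure what fraction of a page's copy a single phrase occupies. The helper below is an illustrative sketch (the function name and any threshold you'd apply to its result are assumptions, not Google guidance):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Fraction of the words in `text` accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count overlapping occurrences of the phrase as a word sequence.
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return hits * len(phrase_words) / len(words)
```

Run against an obviously stuffed sentence like "blue widgets are great blue widgets blue widgets", it reports 0.75, meaning three quarters of the copy is the target phrase, which matches the answer's rule of thumb: if it looks spammy to you, the numbers will usually agree.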