Does Google play fair? Are 'relevant content' and 'usability' enough?
-
It seems there are two opposing views, and as a newbie I find this very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, it may never rank well if you don't play by those rules.
Which is closer to the truth? No one wants a great website that won't rank because Google wasn't sophisticated enough to see that it was playing fair.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' for the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions and links to 20 types of restaurants, so the user has a choice.

Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by letting the user choose another type OR another city region.

IOW, there may be a 'mega-menu' at the top of the page that is duplicated on every page of the site but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page. That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site, even though this is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
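To make the structure concrete, here is a rough sketch of what I mean (all URLs and names are made up):

```python
# Hypothetical sketch of the site structure described above; all URLs/names invented.
REGIONS = [f"region-{i}" for i in range(1, 31)]   # 30 city regions
TYPES = [f"type-{i}" for i in range(1, 21)]       # 20 restaurant types

def mega_menu():
    """The same navigation block is rendered on every page of the site."""
    return ([f"/restaurants/region/{r}" for r in REGIONS] +
            [f"/restaurants/type/{t}" for t in TYPES])

def render_page(unique_listing: str) -> str:
    """Every page = the shared mega-menu plus its own unique main content."""
    nav = "\n".join(mega_menu())
    return f"<nav>\n{nav}\n</nav>\n<main>\n{unique_listing}\n</main>"

# e.g. the page for one region repeats all 50 menu links,
# but its <main> listing is unique to that region:
print(render_page("Restaurants in region-5: ..."))
```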
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count them toward duplicate content? I just want to be sure, since my menus are fairly substantial once dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think Google should take into account for search ranking, but I've never heard that they do. [...] Are those significant factors used by Google?
In my opinion, google has every ability to measure visitor actions. They own the Chrome browser and could measure the engagement of visitors with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to rely on links because they can read when people mention your site in a forum; they know if people navigate to your site by typing its name into search... I believe that all of these things are important for rankings, but how important I can't say.
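To illustrate the kind of signal being described — purely a hypothetical sketch, since Google's actual implementation is not public — here is how a search engine could estimate 'dwell time' and spot pogo-sticking from its own click logs:

```python
from datetime import datetime

# Hypothetical click log a search engine could keep:
# (visitor, url, clicked_result_at, returned_to_serp_at or None)
clicks = [
    ("v1", "example.com/page", datetime(2015, 6, 1, 12, 0, 0),
     datetime(2015, 6, 1, 12, 0, 8)),   # bounced back after 8 seconds
    ("v2", "example.com/page", datetime(2015, 6, 1, 13, 0, 0),
     None),                             # never reappeared in the SERPs
]

def dwell_seconds(clicked_at, returned_at):
    """Short dwell plus a quick return to the results page ('pogo-sticking')
    would suggest the page did not satisfy the visitor."""
    if returned_at is None:
        return float("inf")  # visitor stayed, or left search satisfied
    return (returned_at - clicked_at).total_seconds()

for visitor, url, t0, t1 in clicks:
    print(visitor, url, dwell_seconds(t0, t1))
```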
I have lots of really good content that ranked at #150 or deeper in the SERPs when it was first published. Then I built zero links and did zero promotion, and slowly that page rose in the SERPs; over a year later, it is now in the top three. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% effort producing assets for your website. That is what I have done since about 2006. Virtually zero linkbuilding. My visitors are my linkbuilders.
-
EGOL, thanks very much. Being a one-person biz, I am very interested in the idea of ranking by popularity: my goal is to have the best site out there, but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think Google should take into account for search ranking, but I've never heard that they do. After all, usage and return usage is what it is all about! Are those significant factors used by Google? If so, maybe there is hope... :)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, it may never rank well if you don't play by those rules.
Which is closer to the truth?
They are both a small piece of the truth. To rank on google, your PAGE must:

1. be relevant to the search term and be presented to google with a proper title, crawlability, and text visibility (a rough check of these basics is sketched after this list)
2. have substantive content about the search term
3. be validated by other websites, by being linked from them or mentioned by them (these are just a few validations)
4. be validated by visitors, because they have queried it by name, stayed on it, bookmarked it, or mentioned it by name in web-readable content (these are just a few validations)
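As a minimal sketch of the kind of basic check #1 describes — standard-library Python only, with a placeholder URL — you could ask whether a page serves a title, a robots directive, and readable text to a crawler:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class BasicsParser(HTMLParser):
    """Collects the <title>, any robots meta tag, and a rough count of visible text."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.robots = None
        self.text_chars = 0

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.robots = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data
        else:
            self.text_chars += len(data.strip())

# Placeholder URL; substitute the page you want to check.
html = urlopen("https://example.com/").read().decode("utf-8", "replace")
p = BasicsParser()
p.feed(html)
print("title:", p.title, "| robots:", p.robots, "| visible chars:", p.text_chars)
```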
Any idiot can do #1. A good author can do #2. But #3 and #4 are really difficult to accomplish, because they must come from people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of google without investing $xxx,xxx or more in website assets and promotion... AND... having a plan in place to present the site in a way that google will be able to read and interpret it so as to maximize the #3 and #4 assets.
-
A meganav is not considered duplicate content. Duplicate content means product description pages that are identical, the same article appearing in multiple places on your site, etc.
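To make that distinction concrete, here is a hedged sketch (hypothetical pages, standard-library Python): a duplicate-content check compares the main body of pages after stripping shared chrome like a meganav, so two pages that differ only in their listings are not duplicates even though the menu repeats:

```python
from difflib import SequenceMatcher

# Hypothetical shared navigation block that appears on every page.
MEGA_NAV = "Home | Regions: North, South, East, West | Types: Thai, Pizza, Sushi"

def main_content(page_text: str) -> str:
    """Strip the shared navigation block before comparing pages."""
    return page_text.replace(MEGA_NAV, "").strip()

page_a = MEGA_NAV + "\nBest pizza places downtown: Gino's, Slice House..."
page_b = MEGA_NAV + "\nTop sushi spots on the east side: Kame, Umi..."
page_c = MEGA_NAV + "\nBest pizza places downtown: Gino's, Slice House..."

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, main_content(a), main_content(b)).ratio()

print(similarity(page_a, page_b))  # low: same nav, different main content -> fine
print(similarity(page_a, page_c))  # 1.0: identical main content -> true duplicate
```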
To the main part of your question: Google does not want it to be easy for people in the SEO world. They give guidelines, but following them guarantees nothing. What Google considers an acceptable tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing everything overnight if Google catches on).
The idea that an easy-to-use site and relevant content alone will make Google rank you fairly is a joke. And though only one has said it publicly, there are many top minds in the SEO world who will tell you the same in private.