Does Google play fair? Are 'relevant content' and 'usability' enough?
-
It seems there are two opposing views, and as a newbie I find this very confusing.
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth? No one wants to have a great website that won't rank simply because Google isn't sophisticated enough to see that they weren't playing unfairly.
Here's an example to illustrate one related concern I have:
I've read that Google doesn't like duplicated content. But here are two cases in which it is more 'relevant' and 'usable' to the user to have duplicate content:
Say a website helps you find restaurants in a city. Restaurants may be listed by city region and by type of restaurant. The home page may have links to 30 city regions, and it may also have links to 20 types of restaurants, so the user has a choice.

Say the user chooses a region. The resulting page may still be relevant and usable by listing ALL 30 regions, because the user may want to choose a different region. Alternatively, say the user chooses a restaurant type for the whole city. The resulting page may still be relevant and usable by giving the user the ability to choose another type OR another city region. In other words, there may be a 'mega-menu' at the top of the page that is duplicated on every page of the site but is very helpful: instead of requiring the user to go back to the home page to click a new region or a new type, the user can do it on any page.

That's duplicate content in the form of a mega-menu, but it is very relevant and usable. YET, my sense is that Google MAY penalize the site, even though this is arguably the most relevant and usable approach for someone who may or may not have a specific region or restaurant type in mind.
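To make the worry concrete, here is a toy sketch (with invented page text) of how a shared mega-menu can make two otherwise different pages look similar when you compare their full text, but not once the menu is stripped out. I'm not suggesting this is how Google actually measures duplication; it just illustrates why boilerplate like navigation is usually treated separately from the 'real' content:

```python
# Toy sketch: a shared mega-menu inflates whole-page similarity, but once
# the boilerplate is removed the pages are clearly distinct.
# All strings here are invented examples, not real site content.

def jaccard(a: str, b: str) -> float:
    """Jaccard similarity between the word sets of two texts."""
    wa, wb = set(a.lower().split()), set(b.lower().split())
    return len(wa & wb) / len(wa | wb)

MENU = "home regions downtown uptown harbor types italian sushi vegan contact"

page_region = MENU + " restaurants in the downtown region cozy bistros and late night diners"
page_type = MENU + " italian restaurants across the city pasta pizza and wine bars"

with_menu = jaccard(page_region, page_type)
without_menu = jaccard(page_region.replace(MENU, ""), page_type.replace(MENU, ""))

print(f"similarity with menu:    {with_menu:.2f}")     # 0.50
print(f"similarity without menu: {without_menu:.2f}")  # 0.17
```

The point is only that the same pair of pages can look "duplicated" or "distinct" depending on whether the shared navigation is counted.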
Thoughts?
-
Hi David,
Sorry for such a delayed response, but I keep wondering about your point on the meganav. Is it known that Google is able to figure out menus and won't count those toward duplicate content? I'd just like to be sure, since my menus are fairly substantial when dropdowns are included.
-
You are giving me SOME hope for a site I've been working on for about 5 years and am getting ready to launch. Thanks very much.
-
Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search page ranking, but I've never heard before that they do. [...] Are those significant factors used by Google?
In my opinion, Google has every ability to measure visitor actions. They own the Chrome browser and could measure the engagement of visitors with a page; they have access to what gets bookmarked in Chrome; they know when a visitor clicks in the SERPs and when that same visitor reappears in the SERPs; they don't have to rely on links because they can read when people mention your site in a forum; they know if people navigate to your site by typing its name into search... I believe that all of these things are important for rankings, but how important I can't say.
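Nobody outside Google knows how, or even whether, these signals are weighted. But the "clicks in the SERPs and reappears" idea can be sketched in a few lines; the 30-second threshold and the labels below are invented for illustration only:

```python
# Toy illustration of the click-and-return idea: a visitor who bounces
# back to the results quickly (a "short click") looks less satisfied
# than one who stays a long time. The cutoff and labels are invented;
# nothing here reflects Google's actual logic.
from typing import Optional

SHORT_CLICK_SECONDS = 30  # invented cutoff for this sketch

def classify_click(click_ts: float, return_ts: Optional[float]) -> str:
    """Label a SERP click by how long the visitor stayed on the page.
    return_ts is None when the visitor never came back to the results."""
    if return_ts is None:
        return "no-return (possibly satisfied)"
    dwell = return_ts - click_ts
    return "long click" if dwell >= SHORT_CLICK_SECONDS else "short click"

print(classify_click(0.0, 5.0))    # bounced back fast
print(classify_click(0.0, 180.0))  # stayed three minutes
print(classify_click(0.0, None))   # never returned to the SERP
```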
I have lots of really good content that ranked at #150 or deeper in the SERPs when I first published it. Then I built zero links and did zero promotion, and slowly the page rose in the SERPs; over a year later it is now in the top three. I have hundreds of pages that have done that. You gotta have a LOT of patience to do things that way, but you spend zero effort on promotion and 100% effort producing assets for your website. That is what I have done since about 2006. Virtually zero link building. My visitors are my link builders.
-
EGOL, thanks very much. Being a one-person biz, I am very interested in the idea of ranking by popularity, as my goal is to have the best site out there but I have limited funds to promote it. Your comment in #4 about time on page and bookmarking is something I think should be taken into account by Google for search ranking, but I've never heard before that they do. After all, usage and return usage is what it's all about! Are those significant factors used by Google? If so, maybe there is hope! :)
-
Egol has this summed up perfectly!
-Andy
-
One view is that as long as your site pages have relevant content and are easy for the user, Google will rank you fairly.
The other view is that Google has 'rules' you must follow, and even if the site is relevant and user-friendly, if you don't play by the rules your site may never rank well.
Which is closer to the truth?
They are both a small piece of the truth. To rank on Google your PAGE must:

1. be relevant to the search term and presented to Google with a proper title, crawlability, and text visibility
2. have substantive content about the search term
3. be validated by other websites, by being linked from them or mentioned by them (these are just a few validations)
4. be validated by visitors, because they have queried it by name, stayed on it, bookmarked it, or mentioned it by name in web-readable content (these are just a few validations)
Any idiot can do #1. A good author can do #2. But #3 and #4 are really difficult, because the validation has to come from people who are not related to you or paid by you.
In low competition, #1 and #2 can be enough to get you ranked. The higher the competition for a query, the more you need #3 and #4 to rank. For some queries it can be almost impossible for a newcomer to rank on the first page of Google without investing $xxx,xxx or more in website assets and promotion... AND... having a plan in place to present the site in a way that Google will be able to read and interpret so as to maximize the #3 and #4 assets.
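If it helps, the #1 basics (a proper title, indexability) are the one part you can sanity-check mechanically. Here is a rough sketch using only the Python standard library; the HTML is a made-up example, and a real audit checks far more:

```python
# Rough sketch of checking the #1 basics for a raw HTML string:
# does the page have a title, and is it blocked by a robots noindex?
# The HTML below is an invented example.
from html.parser import HTMLParser

class BasicsCheck(HTMLParser):
    def __init__(self):
        super().__init__()
        self.title = ""
        self.noindex = False
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            self.noindex = "noindex" in attrs.get("content", "").lower()

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

html = ("<html><head><title>Downtown Restaurants</title>"
        "<meta name='robots' content='index,follow'></head>"
        "<body><p>Our favorite bistros...</p></body></html>")

check = BasicsCheck()
check.feed(html)
print("title:", check.title)
print("noindex:", check.noindex)
```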
-
-
A meganav is not considered duplicate content. Duplicate content means product description pages that are identical, the same articles appearing in multiple places on your site, etc.
To the main part of your question: Google does not want it to be easy for people in the SEO world. They give guidelines, but following them guarantees nothing. What Google considers an OK tactic one year becomes an unacceptable tactic the next (see guest blogging). There are many ways to succeed in ranking. Some follow Google's rules and wait for rankings to come; others use tons of spammy tactics and rank instantly (though they always risk losing it overnight if Google catches on).
The idea that an easy-to-use site and relevant content will make Google rank you fairly is a joke. And though only one has said it publicly, there are many top minds in the SEO world who will tell you that in private.