We are ignored by Google - what should we do?
-
Hi,
We believe that our website - https://en.greatfire.org - is being all but ignored by Google Search. The following two examples illustrate our case.
1. Searching for “China listening in on Skype - Microsoft assumes you approve”. This is the title of a blog post we wrote, which received some 50,000 visits. On Yahoo and Bing search, we rank first for this search. On Google, however, we rank 7th. Each of the six pages ranking higher than us quotes and links to our story.
2. Searching for “Online Censorship In China”. This is the title of our front page. Yahoo and Bing both rank us third for this search. On Google, however, we are not even among the first 300 results. Two of the pages among the first 10 results link to us.
Our website has an average of around 1000 visits per day. We are quoted in and linked from virtually all Western mainstream media (see https://en.greatfire.org/press). Yet to this day we are receiving almost no traffic from Google Search.
Our mission is to bring transparency to online censorship in China. If people could find us in Google, it would greatly help to spread awareness of the extent of Internet restrictions here. If you could indicate to us what the cause of our poor rankings could be, we would be very grateful. Thank you for your time and consideration.
-
Hi Matt,
Thanks for your reply. I think the reason we gained a lot of backlinks and then lost them was our widely quoted and linked story in December (the Skype story, used as an example in our first post). Many websites put links to us on their front pages. Inevitably, these only stay until newer stories push them down and off the page.
We have not created fake links anywhere. According to Google Analytics, visitors have entered our site through links on 904 websites since Dec 1. The top referrers are Reddit, YCombinator, Twitter, habrahabr.ru, Facebook, TheNextWeb and Wikipedia. All very legitimate links, as far as I can tell.
What do you think we should do? Why does https prevent using a link profile tool?
-
Great post Matt. You nailed it.
Best regards,
Devanur Rafi.
-
http://dejanseo.com.au/hijacked/
This is a recent test, and one that may apply (though I still maintain it's your link profile).
-
Actually, I'm pretty sure your problem is in your link profile.
http://www.highonseo.com/examples/ahrefs1.jpg
The first image shows your ahrefs backlink profile. You gained a couple thousand backlinks almost instantly, then lost a large chunk just as quickly.
So my next question was "are these legit?"
Now look at image 2.
http://www.highonseo.com/examples/ahrefs2.jpg
Out of 92,293 backlinks, you have over 90,000 dofollow links, including over 80,000 sitewide links, plus 1,600 .gov links, which is nearly as many as your nofollow links.
My brain can't process a link profile that looks like this. I would love to pull it into a link profile tool to check the DA of your backlinks, but because your site is HTTPS, I can't.
Just speculation on my part, but if someone told me they had over 97% dofollow links, nearly as many .gov links as nofollow links, and a huge gain followed by those links falling off, I'd quickly believe something was wrong. I always assume Google is two steps ahead of me. So if I think this backlink profile looks wonky, they must think it's worse.
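For what it's worth, the sanity check described above is easy to script. Here's a minimal sketch that flags a profile whose dofollow share looks implausibly high, given rows from a backlink export (the `url`/`rel` column names and the 95% threshold are assumptions for illustration; check your tool's actual export format):

```python
import csv

def profile_summary(rows):
    """rows: iterable of dicts with 'url' and 'rel' keys."""
    total = dofollow = 0
    for row in rows:
        total += 1
        # Anything not explicitly marked nofollow counts as dofollow.
        if row.get("rel", "").lower() != "nofollow":
            dofollow += 1
    share = dofollow / total if total else 0.0
    return {"total": total, "dofollow": dofollow, "dofollow_share": share}

def looks_wonky(summary, threshold=0.95):
    # A 97% dofollow share, as in the profile above, trips this check.
    return summary["dofollow_share"] > threshold

# Usage against a CSV export:
# with open("backlinks.csv", newline="") as f:
#     summary = profile_summary(csv.DictReader(f))
#     print(summary, looks_wonky(summary))
```

A natural profile usually has a healthy mix of nofollow links (blog comments, social sites, forums), so a near-total absence of them is itself a signal worth investigating.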
-
I heard they will give the ranking for the content to the more powerful site? Not sure if that's correct. If they thought you had copied it, then perhaps no ranking at all?
-
Yes. But shouldn't Google be good at determining that? For one, they all (or almost all) link back to our original story, not the other way around. For another, our story is always published before theirs, and Google should be able to detect that.
If this is the case, it doesn't explain why we have no ranking at all on the title of our front page.
-
Could it be that the big sites quoting some of your text are seen as the original source, as they are very high domain authority websites?
-
No problem my friend. You are most welcome. If you went for HTTPS intentionally, then it is OK. However, it seems Google does not treat HTTPS the way it should as of now. This may change at some point, and who knows, they may have already rolled out a fix that is still under way. Bigger changes like this take time to propagate fully throughout. Until then, all we can do is sit tight and keep our fingers crossed.
Best regards,
Devanur Rafi.
-
Thanks Devanur. Very interesting idea. However, we do want to keep our whole website on HTTPS, both to make it more difficult to track what our users do on it and to encourage other websites to move to HTTPS as well. The more the better. For example, all of GitHub is already HTTPS-only. If HTTPS is indeed the reason, it's quite a scandal that Google can't deal with it properly.
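As an aside, an HTTPS-only setup of the kind mentioned (GitHub's, for example) has two observable parts: plain-HTTP requests get redirected to HTTPS, and the HTTPS response carries an HSTS header so browsers stop trying HTTP at all. A minimal sketch of that check, written as a pure function over a response's status and headers so any HTTP client can drive it (the header names are the standard ones; treating only these as sufficient is an assumption for illustration):

```python
def is_https_only(http_status, http_headers, https_headers):
    """Return True if a site appears to be served over HTTPS only.

    http_status/http_headers: response to a plain-HTTP request
    (with automatic redirect-following disabled).
    https_headers: response headers from the HTTPS version.
    """
    # The plain-HTTP request should be redirected (301/302/307/308)
    # to an https:// URL.
    redirected = (
        http_status in (301, 302, 307, 308)
        and http_headers.get("Location", "").startswith("https://")
    )
    # The HTTPS response should opt in to HSTS via a max-age directive.
    hsts = "max-age" in https_headers.get("Strict-Transport-Security", "")
    return redirected and hsts
```

Driving this with `urllib.request` (with redirect-following disabled) against the live site would complete the check.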
-
Hi there,
Though as per Google it is OK (http://www.youtube.com/watch?v=xeFo4ytOk8M) to go for HTTPS for your entire website, I have personally seen many instances where HTTPS URLs find it very difficult to compete with HTTP URLs in Google.
Normally, I do not see a need for HTTPS on plain pages that do not have to be served securely. Only pages that require a login need to be served over HTTPS. Hope our friends over here will jump in with their views.
Let me conclude by saying that I would go with HTTP for all the pages I want to rank high in Google; this view is based solely on my personal experience.
Hope it helps.
Best regards,
Devanur Rafi.