Big rise in "Keyword not defined"
-
Hi, all.
Has anyone else seen a massive increase in "(not provided)" keywords in their analytics over the past couple of weeks? It's probably related to this (source: http://searchengineland.com/post-prism-google-secure-searches-172487): _In the past month, Google quietly made a change aimed at encrypting all search activity — except for clicks on ads. Google says this has been done to provide "extra protection" for searchers, and the company may be aiming to block NSA spying activity._
Other than the unreliable stats from WMT, there don't seem to be many ways left for us to find out what is sending traffic to our sites!
-
Can anyone confirm whether this will have an impact on the traffic data shown in Moz? I'm assuming the data comes from Google Analytics and will therefore be affected in the same way?
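For what it's worth, here's a minimal sketch (not Moz's actual code) of how a tool might pull keyword data out of Google Analytics via the legacy Core Reporting API (v3) using the google-api-python-client library. The profile ID, dates, and auth setup are placeholders. The point: any tool reading GA this way inherits the same "(not provided)" rows, so the gap carries straight through to downstream reports.

```python
# Minimal sketch, assuming an authorised Analytics v3 `service` object built
# with google-api-python-client. PROFILE_ID and the date range are placeholders.
from googleapiclient.discovery import build  # OAuth/http setup omitted


def organic_keywords(service, profile_id, start_date, end_date):
    """Return the top organic keyword rows (keyword, sessions) for a GA view."""
    return service.data().ga().get(
        ids='ga:' + profile_id,
        start_date=start_date,
        end_date=end_date,
        metrics='ga:sessions',
        dimensions='ga:keyword',
        filters='ga:medium==organic',
        sort='-ga:sessions',
        max_results=25,
    ).execute()

# service = build('analytics', 'v3', http=authorized_http)  # auth omitted
# report = organic_keywords(service, PROFILE_ID, '2013-09-01', '2013-09-30')
# Post-encryption, the top row is typically ['(not provided)', <most sessions>].
```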
-
I hope Bing steps up its game and, in response to the loss of keywords, offers everyone a free analytics suite more on par with GA. I've never been enamored with Bing, but they have been looking for a way to pull people away from Google. They could even market it as Microsoft saving the little guy/small business while Google hoards information.
-
Anyone think Google is going to come up with a way to charge businesses/SEO companies to view keyword data?
-
Google has turned into a black box.
-
Absolutely agree with you, Grumpy Carl! I can see that this change is just going to increase the need to check rankings in order to find out which page is ranking for a keyword.
Why we can't get this keyword-to-page link in the Google Webmaster Tools data I just don't know (but I know it's nothing to do with privacy!). I just want to know which pages the keywords are sending traffic to... grr.
I just get the feeling it's all going to get messy and I'm going to be spending a lot more time in front of spreadsheets, doing something along the lines of the sketch below.
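For anyone else facing the same spreadsheet grind, here is a rough, hypothetical sketch of the join: a rank-tracker export (keyword, ranking URL, position) merged against a GA landing-page report to approximate which keywords feed which pages. The file names and column headers are assumptions, not any standard export format.

```python
# A rough sketch of the manual workaround: join rank-tracking data to GA
# landing-page traffic. Column names and files below are assumed, not standard.
import pandas as pd

ranks = pd.read_csv('rank_tracker_export.csv')   # columns: keyword, ranking_url, position
landing = pd.read_csv('ga_landing_pages.csv')    # columns: landing_page, organic_sessions


def to_path(url):
    """Normalise a full URL or GA landing-page path to a bare path for joining."""
    path = url.split('//', 1)[-1]
    path = '/' + path.split('/', 1)[-1] if '/' in path else '/'
    return path.rstrip('/') or '/'

ranks['path'] = ranks['ranking_url'].apply(to_path)
landing['path'] = landing['landing_page'].apply(to_path)

# One row per keyword, annotated with the organic traffic of the page it ranks with.
inferred = ranks.merge(landing[['path', 'organic_sessions']], on='path', how='left')
print(inferred.sort_values('organic_sessions', ascending=False).head(20))
```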
-
It is. I don't think there is any coming back from this either. It will be interesting to see how it changes SEO.
-
I would agree, in part. Even if you don't know which keyword is sending you traffic, if anything this makes ranking reports more important. If we see traffic going up but cannot directly see which keyword is sending it, we can still draw a link (however tenuous) between the rise in rankings and the rise in traffic.
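As a toy illustration of that "tenuous link" (the numbers below are made up, not real data): correlate weekly average rank with weekly organic sessions and see whether better rankings line up with more traffic.

```python
# Toy example: does weekly average rank (lower is better) move with weekly
# organic sessions? Sample data is illustrative only.
from scipy.stats import spearmanr

avg_rank = [8.4, 7.9, 7.1, 6.8, 6.2, 5.9, 5.5, 5.6]          # weekly average position
sessions = [1210, 1250, 1380, 1420, 1530, 1600, 1720, 1690]  # weekly organic sessions

rho, p_value = spearmanr(avg_rank, sessions)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A strongly negative rho (rank improving while traffic rises) supports the
# inference, but correlation alone can't attribute the traffic to specific keywords.
```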
-
Scary how the 100% date in the chart has moved up to this December. It was scary enough when it was 2017!
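Roughly how a "100% date" projection like that works, I assume: fit a straight line to the observed (not provided) percentage and solve for when it crosses 100%. The data points and start date below are purely illustrative, not the site's actual numbers.

```python
# Illustrative projection of when the (not provided) share hits 100%.
# Sample percentages and the start date are assumptions for the sketch.
import numpy as np
from datetime import date, timedelta

days = np.array([0, 30, 60, 90, 120])            # days since first observation
pct = np.array([50.0, 56.0, 63.0, 71.0, 80.0])   # % of organic visits that are (not provided)

slope, intercept = np.polyfit(days, pct, 1)      # linear trend
days_to_100 = (100.0 - intercept) / slope
start = date(2013, 5, 1)                          # assumed start of the series
print("Projected 100% date:", start + timedelta(days=int(round(days_to_100))))
```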
-
I think Google is on a covert mission to napalm the SEO industry....
-
Apparently things have changed again today. Take a look at this: http://www.notprovidedcount.com/ and this: http://www.searchenginejournal.com/google-gone-100-provided-secure-search/70799/