Did a January 2013 Google update affect my projects?
-
I am running 400+ projects, and keyword rankings for almost all of them have been affected recently. Is there any new update from Google between 10 and 19 January 2013?
-
This looks like a link-algorithm update, mixed with something targeting keyword-rich URLs.
We also track around 20,000 keywords on a weekly basis and can see a big flux. It seems like Google is devaluing some backlinks, or entire sites en masse, which is causing this.
There is no specific pattern that we can see. It seems to be small-business websites getting hit more than the big brands.
But again, it is too early to single out specific reasons.
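Several posters in this thread quantify the shake-up by tracking thousands of keywords week over week. None of them describe their exact calculation, but a simple "flux" metric of the kind they imply could be sketched as follows (the function name and inputs are assumptions, not anything described in the thread):

```python
def serp_flux(prev_ranks, curr_ranks):
    """Average absolute rank change across tracked keywords between two
    weekly snapshots (dicts mapping keyword -> rank position). Keywords
    missing from either snapshot are ignored."""
    moves = [abs(curr_ranks[k] - prev_ranks[k])
             for k in prev_ranks if k in curr_ranks]
    return sum(moves) / len(moves) if moves else 0.0
```

Comparing this week's value against a trailing baseline is one way to say "we saw 6x the normal movement" with a number behind it.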
-
I agree, Marie, that it's the wrong date, and from what I have seen it has impacted far more sites in the UK (8% or more, rather than the 1.8% mentioned in the tweet). Nearly all the verticals and niches we track have had changes in rankings, and some very odd results. A search like "UK VPN" in google.co.uk has seen a number of service providers replaced with details of UK university VPN services, and even the University of Kentucky in the top 20 results. I can't for the life of me see how those results would match the broad intent of the search.
-
Google did refresh Panda today (the 22nd of January), but this would not have caused the traffic drop between Jan 10th and 19th that the OP saw.
-
Google has confirmed it as Panda. More details here: http://searchengineland.com/google-panda-update-version-24-1-2-of-search-queries-impacted-146149
-
Regardless of what Google officially says, there is (or was) an update around 17 January 2013. We track 50,000+ keywords on a weekly basis, and we saw six times the SERP movement we see in a normal week.
For us, this was bigger than Penguin or Panda. It will take some time for rankings to stabilise, after which there can be some consensus as to what happened.
Marie, thanks for sharing the link. I broadly agree with you that Google will use the disavow tool data, but knowing Google, I don't think they will do manual checking for more than 1,000 sites. They could simply calculate the top 1,000 domains disavowed across all disavow requests and then use that data. But I think that update will come in the future. This one looks like a version of Penguin.
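The "top 1,000 disavowed domains" idea above is pure speculation, but the aggregation it describes is simple to picture. A purely illustrative sketch (nothing here reflects how Google actually processes disavow data):

```python
from collections import Counter

def top_disavowed_domains(disavow_files, n=1000):
    """Hypothetical aggregation: count how often each domain appears across
    all submitted disavow files (each file is a list of lines) and return
    the n most frequently disavowed domains with their counts."""
    counts = Counter()
    for lines in disavow_files:
        for line in lines:
            line = line.strip()
            # disavow files use "domain:example.com" lines; "#" lines are comments
            if line.lower().startswith("domain:"):
                counts[line.split(":", 1)[1].strip().lower()] += 1
    return counts.most_common(n)
```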
-
There has been no official word from Google about an update. A lot of people have been grumbling in the forums, however, about something going on. When Barry from SERoundtable commented on this, Google stated that there was no major update, but that the algorithm is always changing.
There is also speculation from the team at Branded3 (see this post: http://www.branded3.com/blogs/google-moves-towards-continual-link-devaluation/) that Google may be changing how they detect bad links. If I understand it right, the idea is that instead of devaluing bad links in bunches every time Penguin refreshes, Google is devaluing bad links as it crawls.
I have another theory. I am wondering if Google is starting to put into use the information it is getting from the disavow tool. So, let's say that a whole pile of websites have included spammyarticles.com in their disavow.txt file. Google evaluates the site, decides that it only exists to provide spammy backlinks, and as such devalues all links coming from it. I have no proof of this, but it's a possibility.
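For context, a disavow file is just a plain-text list uploaded to Google. An entry covering the hypothetical spammyarticles.com domain from the theory above would look like this (the comment text is illustrative):

```text
# Links from this domain appear to be paid article spam
domain:spammyarticles.com
```

Lines starting with `#` are comments, `domain:` entries disavow every link from that domain, and bare URLs disavow individual pages.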
-
Hi Deepak,
Here is the post, "Updated: Stronger Reports Of A Google Update": http://www.seroundtable.com/google-update-january-16230.html
-
I noticed some changes on Monday, and the Google dance seems to have been going on ever since. Some keywords are changing positions by the hour. I agree with Ask Hopper that it will be a few weeks before it settles, but it looks like a link-based issue.
-
Have a watch of this video from Barry Schwartz and you will see that many have noticed this, but nothing has been announced.
https://www.youtube.com/watch?feature=player_embedded&v=xNplIqrs-Os
Andy
-
Hi, yes, updates to the algorithm are in progress right now and ongoing. I would suspect that it is going to take a few weeks to settle down before you get any real information on the actual positions for your keywords and phrases.
Related Questions
-
Blocking Google from telemetry requests
At Magnet.me we track the items people are viewing in order to optimize our recommendations. As such, we fire POST requests back to our backends every few seconds, once enough user-initiated actions have happened (think of scrolling, for example). In order to stop bots from distorting the statistics, we ignore their values server-side. Based on some internal logging, we see that Googlebot is also performing these POST requests during its JavaScript crawling; over a 7-day period, that amounts to around 800k POST requests. As we are ignoring that data anyway, and it is quite a large number, we considered reducing this for bots. We have several questions about this:
1. Do these requests count towards crawl budget?
2. If they do, and we want to prevent this from happening, what would be the preferred option: preventing the request in the front-end code, or blocking the request using a robots.txt line? We ask because an in-app block for the request could lead to different behaviour for users and bots, and maybe Google could penalize that as cloaking; it is also slightly less convenient from a development perspective, as the relevant logic is spread throughout the application. I'm aware one should not cloak, or make pages appear differently to search engine crawlers; however, these requests do not change anything in the page's behaviour, and purely send some anonymous data so we can improve future recommendations.
Technical SEO | rogier_slag
-
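If the robots.txt route were chosen, a Disallow rule scoped to just the telemetry endpoint would be enough to stop compliant crawlers from firing these POSTs while leaving the rest of the site crawlable. The path below is a placeholder, since the actual endpoint isn't given in the question:

```text
User-agent: Googlebot
Disallow: /telemetry/
```

Because robots.txt only affects crawlers and not real users, it sidesteps the cloaking concern: the page itself is served identically to everyone.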
Google News problem
Hello to all. The latest Google algorithm changes have had a big impact on the way Google News features stories, at least in my country. I was featured heavily in Google News until about the 6th of October, when the changes had their biggest impact, but since then I haven't been featured at all. Prior to this, I would be featured for keywords on almost any article, not necessarily in the 1st position, but I was almost always there. Posts still show up in the dedicated News category, but not in the main search pages. I've seen a lot of websites being impacted, but some with lower ranks than mine still show up there.
I hadn't made any changes prior to the 6th of October, and I haven't done any link-building campaigns, just received links from higher-ranking news sites in my country for articles I wrote. What I'd like to know is whether there were any major changes to Google News that I'm not complying with, or whether I can check for any other problems. I don't have any penalties disclosed by Google, and no new errors in the Webmasters console; I'm just baffled that overnight the website was completely cut off from being featured in Google News. And one other strange thing: I'm now ranking better for searches that are kind of opposite to my website's main theme. Think of mainly writing about BMW, and less about Audi, but ranking a lot better for the latter and a lot less for the former. Thank you.
Technical SEO | thefrost
-
How do I avoid this issue of duplicate content with Google?
I have an ecommerce website which sells a product that has many different variations based on a vehicle's make, model, and year. Currently, we sell this product on one page, www.cargoliner.com/products.php?did=10001, and we show a modal to sort through each make, model, and year. This is important because, based on the make, model, and year, we have different prices/configurations. For example, for the Jeep Wrangler and Jeep Cherokee, we might have different products:
Ultimate Pet Liner - Jeep Wrangler 2011-2013 - $350
Ultimate Pet Liner - Jeep Wrangler 2014-2015 - $350
Ultimate Pet Liner - Jeep Cherokee 2011-2015 - $400
Although the typical consumer might think we have one product (the Ultimate Pet Liner), we look at these as many different products, each with a different configuration and different variants. We do NOT have unique content for each make, model, and year; we have the same content and images for each. When the customer selects their make, model, and year, we just search and replace the text. For example, when a customer selects "2015 Jeep Wrangler" from the modal, the page keeps the same URL (www.cargoliner.com/products.php?did=10001) but the product title will say "2015 Jeep Wrangler".
Here's my problem: we want all of these individual products to have their own unique URLs (e.g. cargoliner.com/products/2015-jeep-wrangler) so we can reference them in emails to customers, and ideally we would start creating unique content for them. Our only problem is that there will be hundreds of them, and they don't have unique content other than the swapped-in product title and the change of variants. Also, we don't want our URL www.cargoliner.com/products.php?did=10001 to lose its link juice.
Here are my questions: my assumption is that I should keep my URL www.cargoliner.com/products.php?did=10001 and let people sort through the products on that page. Then I should go ahead and make individual URLs for each of these products (i.e. cargoliner.com/products/2015-jeep-wrangler), but add a "noindex, nofollow" to each page. Is this what I should do? How secure is a "noindex, nofollow" on a webpage? Does Google still index it? Am I at risk of duplicate content penalties? Thanks!
Technical SEO | kirbyfike
-
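For reference, the "noindex" being described is a single meta tag in each variant page's head. A common alternative, not suggested anywhere in this thread, is to pair "noindex, follow" with a canonical pointing at the main product page so that any link equity the variant URLs pick up is consolidated. A sketch using the URLs from the question:

```html
<!-- On cargoliner.com/products/2015-jeep-wrangler -->
<meta name="robots" content="noindex, follow">
<link rel="canonical" href="http://www.cargoliner.com/products.php?did=10001">
```

The variant URLs still work in emails; they just stay out of the index while the main page keeps its rankings.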
Google truncating or altering meta title - affect rankings?
I have a site where the title tag is too long, and Google rewrites the displayed title to simply the name of the site (I think they get it from the ODP; I'm not sure). Either way, the rankings for the home page have dropped quite a bit. I'm wondering if the change that Google makes affects rankings (i.e. the displayed site name doesn't have all the keywords).
Technical SEO | santiago23
-
Google Indexing
Hi everybody, I am having an issue with the results Google is showing for my site. I have a multilingual site whose main language is Catalan, but if I am looking at results in Spanish (google.es) or in English (google.com), I want Google to show the results with the proper URL, title, and description. My brand is "Vallnord", so if you type this into Google you will be shown the result in Catalan (which is not optimized at all yet); only if you search "vallnord.com/es" will you be shown the result in Spanish. What do I have to do in order for Google to read this the way I want? Regards, Guido.
Technical SEO | SilbertAd
-
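The thread doesn't answer this one, but the standard mechanism for telling Google which language version of a page to show is a set of rel="alternate" hreflang annotations, repeated in the head of every language version. A sketch, with the URL structure assumed from the question (the /en/ path in particular is a guess):

```html
<!-- listed on every language version of the page -->
<link rel="alternate" hreflang="ca" href="http://www.vallnord.com/">
<link rel="alternate" hreflang="es" href="http://www.vallnord.com/es/">
<link rel="alternate" hreflang="en" href="http://www.vallnord.com/en/">
```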
How can I tell Google, that a page has not changed?
Hello, we have a website with many thousands of pages. Some of them change frequently, some never. Our problem is that Googlebot is generating way too much traffic; half of our page views are generated by Googlebot. We would like to tell Googlebot to stop crawling pages that never change. This one, for instance: http://www.prinz.de/party/partybilder/bilder-party-pics,412598,9545978-1,VnPartypics.html As you can see, there is almost no content on the page and the picture will never change, so I am wondering if it makes sense to tell Google that there is no need to come back. The following header fields might be relevant. Currently our webserver answers with these headers:
Cache-Control: no-cache, must-revalidate, post-check=0, pre-check=0, public
Pragma: no-cache
Expires: Thu, 19 Nov 1981 08:52:00 GMT
Does Google honor these fields? Should we remove no-cache, must-revalidate, and Pragma: no-cache, and set Expires to, e.g., 30 days in the future? I also read that a webpage that has not changed should answer with 304 instead of 200. Does it make sense to implement that? Unfortunately, that would be quite hard for us. Maybe Google would then spend more time on pages that actually changed, instead of wasting it on unchanged pages. Do you have any other suggestions for how we can reduce Googlebot's traffic on irrelevant pages? Thanks for your help. Cord
Technical SEO | bimp
-
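The 304 mechanism mentioned in the question works via conditional GETs: the client (or crawler) sends an If-Modified-Since header, and the server answers 304 with no body when the page hasn't changed since that date. A minimal, framework-agnostic sketch of that decision, not tied to the poster's actual stack:

```python
from datetime import datetime, timezone
from email.utils import format_datetime, parsedate_to_datetime
from typing import Dict, Optional, Tuple

def conditional_response(if_modified_since: Optional[str],
                         last_modified: datetime) -> Tuple[int, Dict[str, str]]:
    """Return (status, headers): 304 with no extra headers when the client's
    cached copy is still current, else 200 with a Last-Modified header so the
    client can ask conditionally next time. last_modified must be tz-aware."""
    if if_modified_since:
        try:
            cached = parsedate_to_datetime(if_modified_since)
            if last_modified <= cached:
                return 304, {}
        except (TypeError, ValueError):
            pass  # malformed header: fall through to a normal 200
    return 200, {"Last-Modified": format_datetime(last_modified, usegmt=True)}
```

A 304 saves the response body (bandwidth), though the crawler still makes the request, so whether it helps with crawl budget is a separate question.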
Google Webmaster tools error?
So I am trying to set the URL preference in Google Webmaster Tools for my site. However, when I try to save it, it tells me to verify that I own the site. I have already done this, so where can I go to verify that I own the site? Maybe I am wrong and I have not done this already, but even on the homepage of Webmaster Tools I don't see an option to "verify".
Technical SEO | ENSO