Did a January 2013 Google update affect my projects?
-
I am running 400+ projects, and the keyword rankings of nearly all of them have been affected recently. Is there any new update from Google between 10-19 January 2013?
-
This looks like a link-algorithm update mixed with something targeting keyword-rich URLs.
We also track around 20,000 keywords on a weekly basis and can see a big flux. It looks like Google is devaluing some backlinks or sites in bulk, which is causing this.
There is no specific pattern we can see, beyond small-business websites getting hit more than brands.
But again, it's too early to single out specific reasons.
-
I agree, Marie, it's the wrong date, and from what I have seen it has impacted far more sites in the UK than the 1.8% mentioned in the tweet. Nearly all the verticals and niches we track have had ranking changes, and some very odd results. A search like "UK VPN" in google.co.uk has seen a number of service providers replaced with details of UK university VPN services, and even the University of Kentucky appears in the top 20 results. I can't for the life of me see how those results match the broad intent of the search.
-
Google did refresh Panda today (the 22nd of Jan), but this would not have caused the traffic drop between Jan 10th and 19th that the OP saw.
-
Google confirms it as Panda. More details here: http://searchengineland.com/google-panda-update-version-24-1-2-of-search-queries-impacted-146149
-
Regardless of what Google officially says, there is/was an update around 17 Jan 2013. We track 50,000+ keywords on a weekly basis and we saw 6x the SERP movement we see in a normal week.
For us, this was bigger than Penguin or Panda. It will take some time for rankings to stabilise, after which there can be some consensus as to what happened.
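To make the "6x movement" claim concrete, here is a minimal sketch of the kind of weekly SERP-flux metric a rank tracker could compute: the average absolute rank change across all tracked keywords, week over week. The keywords, rank values, and the `drop_rank` fallback are all invented for illustration; this is just one plausible way to quantify the volatility posters here are describing.

```python
def serp_flux(prev_ranks, curr_ranks, drop_rank=101):
    """Average absolute rank movement across tracked keywords.

    A keyword missing from a week's data is treated as ranking at
    `drop_rank`, i.e. fallen out of the tracked result set.
    """
    keywords = set(prev_ranks) | set(curr_ranks)
    total = sum(
        abs(curr_ranks.get(kw, drop_rank) - prev_ranks.get(kw, drop_rank))
        for kw in keywords
    )
    return total / len(keywords)

# Hypothetical rank snapshots for three tracked keywords:
last_week = {"uk vpn": 3, "cheap hosting": 7, "seo tools": 12}
this_week = {"uk vpn": 18, "cheap hosting": 9, "seo tools": 41}

flux = serp_flux(last_week, this_week)
print(flux)  # (15 + 2 + 29) / 3 ≈ 15.33
```

A week where this number comes out several times higher than the running average would be the sort of signal people in this thread are calling "6x movement".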
Marie - thanks for sharing the link. I mostly agree with you that Google will use the disavow tool data. But knowing Google, I don't think they will do manual checking for more than 1,000 sites. They could simply tally the top 1,000 domains disavowed across all disavow requests and then use that data. But I think that update will come in the future. This looks like a version of Penguin.
-
There has been no official word from Google about an update. A lot of people have been grumbling in the forums however about something going on. When Barry from SERoundtable commented on this Google stated that there was no major update but that the algorithm is always changing.
There is also speculation from the team at Branded3 (see this post - http://www.branded3.com/blogs/google-moves-towards-continual-link-devaluation/) that Google may be changing how they detect bad links. If I understand it right, the idea is that instead of devaluing bad links in bunches every time Penguin refreshes, Google is devaluing bad links as they crawl.
I have another theory. I am wondering if Google is starting to put into use the information they are getting from the disavow tool. So, let's say that a whole pile of websites have included spammyarticles.com in their disavow.txt file. Google evaluates the site and decides that it only exists to provide spammy backlinks and as such devalues all links that are coming from this site. I have no proof for this, but it's a possibility.
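The aggregation idea floated above can be sketched very simply: count how many distinct disavow files list each domain, then surface the most frequently disavowed ones. Everything here is speculative illustration of what Google *could* do with disavow data, not a description of any confirmed process; the file contents and domain names are invented.

```python
from collections import Counter

def most_disavowed(disavow_files, top_n=1000):
    """Rank domains by how many distinct disavow files list them."""
    counts = Counter()
    for lines in disavow_files:
        domains = set()
        for line in lines:
            line = line.strip()
            if line.startswith("domain:"):
                domains.add(line[len("domain:"):].strip())
        counts.update(domains)  # one vote per file, even if listed twice
    return counts.most_common(top_n)

# Three hypothetical disavow.txt submissions:
files = [
    ["# links I don't trust", "domain:spammyarticles.com", "domain:linkfarm.net"],
    ["domain:spammyarticles.com"],
    ["domain:spammyarticles.com", "domain:linkfarm.net"],
]
print(most_disavowed(files, top_n=2))
# [('spammyarticles.com', 3), ('linkfarm.net', 2)]
```

A domain disavowed independently by many unrelated webmasters would be a strong candidate for the kind of site-wide link devaluation the theory describes.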
-
Hi Deepak
Here is the post on "Updated: Stronger Reports Of A Google Update" : http://www.seroundtable.com/google-update-january-16230.html
-
I noticed some changes on Monday, and the Google dance seems to have been going on ever since. Some keywords are changing positions by the hour. I agree with Ash Hopper that it will be a few weeks before it settles, but it looks like a link-based issue.
-
Have a watch of this video from Barry Schwartz and you will see many have noticed this, but nothing has been announced.
https://www.youtube.com/watch?feature=player_embedded&v=xNplIqrs-Os
Andy
-
Hi. Yes, updates to the algorithm are in progress right now and ongoing. I would suspect it is going to take a few weeks to settle down before you get any reliable picture of your actual page positions for your keywords and phrases.