How about a discussion on Penguin 2.0?
-
Penguin 2.0 was officially released today. I'm sure we've all seen Matt's video.
http://searchengineland.com/penguin-4-with-penguin-2-0-generation-spam-fighting-is-now-live-160544
Ideas for building shareable, linkable content? New strategies? What to avoid, what not to do, etc.?
Let's get a discussion going!
-
Amazing... this is a huge update (from what I've read on other sites and from listening to Matt), and yet there's almost nothing here on SEOmoz. What is up with that?
-
Good point, Maria. Google will be pretty unstable for the first little while. It's better to keep calm and carry on with good content.
-
Daniel,
To your first point -- my main keywords (which, according to Google, have been commercial in intent for the last few years -- i.e., the queries bring up only retail options) are now showing reviews from authoritative/journalistic sites. So, for instance, "Men's Jeans" used to bring up all retail stores; now it might bring up 8 stores (plus ads) and 1 or 2 "Men's Jeans" reviews from GQ or something to that effect.
But having said that, I started noticing this about a week ago...
-
Google Penguin 2.0 rolled out today!
I saw some changes on Google.
-
A few observations from the many keywords and sites I monitor:
1. A lot of what I track is in reviews of various products. I've seen a big jump for Consumer Reports. In many instances they now have two listings, and both rank higher than their single listing did before. Anyone else seeing this?
2. I'm seeing a lot of new sites I haven't seen before, especially after the first page. In the top 20 of one of my main keywords there are 4 new domains that have never been there before...nor should they be. Horrible sites. One of them is [extremelylongkeyword2013] (dot) blogspot.com. It's pretty nuts to see that.
3. In every search, whether searching for the product or for reviews of the product, Sears, Amazon, and Walmart are almost always the top three in some order. I'm even seeing Wikipedia higher than ever. This is a shuffle from how things used to be.
4. I'm seeing a lot more about.com results than I did before.
5. I have noticed News results on almost every search, many of them unwarranted. Often the news results appear after the top 10.
6. No major shakeups on any of my own sites. A little bit of shuffling, some up, some down, but nothing major. The biggest winners I am seeing are Consumer Reports, Sears, Amazon, and Walmart.
-
I find that in the first few weeks after an update, the SEO world is awash with dangerous speculation. I'm excited to see some real data on sites that have recovered. Until then I'm going to try to avoid speculating.
-
To me it looks like larger authority sites are ranking better. I had a few top positions drop to below positions 5-10 (I deal mainly with local niches), and I now see larger websites -- although not as relevant as mine -- doing better, in my opinion.
-
I'm with Jesse -- it will take a bit before we can tell anything conclusive, but I'll watch my site, and if I notice anything I will report back and add to the discussion. I don't think it will be anything unexpected if you have been following what Google says and watching the trends within your markets for the past few weeks/months.
But we will see...
-
It will probably be a while before we know anything concrete, but my guess would be that the same strategies from 1.0 apply to 2.0. I'd imagine they just widened the umbrella, so to speak.
We'll see! I've got a close eye on a few sites -- I'm curious to see what will happen to them...
Related Questions
-
Google only indexing the top 2/3 of my page?
Hi, I have a page that is about 5,000 lines of code in total. I was having difficulty figuring out why adding a lot of targeted, quality content to the bottom of the page was not helping with rankings. Then, when fetching as Google, I noticed that only about 3,300 lines were getting indexed for some reason. So naturally, that content isn't going to have any effect if Google is not seeing it.
Has anyone seen this before? Thoughts on what may be happening? I'm not seeing any errors being thrown by the page, and I'm not aware of a limit on the lines of code Google will crawl. The page loads in under 5 seconds, so loading speed shouldn't be the issue.
Thanks, Kevin
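For anyone debugging this kind of partial-indexing problem, one quick sanity check is to confirm that the bottom-of-page content actually exists in the raw HTML response, rather than being injected by JavaScript after the page loads (Fetch as Google works from the fetched source, so content added client-side may simply not be there). Below is a minimal sketch of that check, assuming the `requests` library is installed; the URL and the marker phrase are hypothetical placeholders, not the asker's actual page.

```python
# Minimal sketch: verify that content from the bottom of the page is present
# in the raw HTML a crawler receives (i.e. not added later by JavaScript).
# URL and MARKER are hypothetical placeholders.
import requests

URL = "https://www.example.com/long-page"
MARKER = "a phrase that only appears near the bottom of the page"

resp = requests.get(URL, timeout=30, headers={"User-Agent": "Mozilla/5.0"})

print("HTTP status:", resp.status_code)
print("Raw HTML size:", len(resp.content), "bytes")
print("Marker found in raw HTML:", MARKER.lower() in resp.text.lower())
```

If the marker is missing from the raw response, the issue is in how the page is built rather than in any crawl limit.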
Intermediate & Advanced SEO | yandl1
-
Trying to pinpoint why 1 keyword moved down 100 positions in 2 weeks. Help me speculate?
Hi there,
One of my client's sites, a very large and successful ecommerce website with strong SEO performance, has seen a significant drop in rankings in the past 2 weeks. The rankings have begun to somewhat stabilize today, except for one particular keyword with a search volume of 74k, which has gone from position 1 to 100.
Here is what has taken place in 2 weeks, sitewide:
1. I revised and improved the title tags and meta descriptions to make them more user-friendly and include more optimized terms, following all of Google's best practices, as always. Google still appears to be indexing these changes (has anyone seen an initial drop in rankings while this takes place?).
2. The site has seen a very significant increase in 404 errors due to one feature of the site breaking. We got a message about it in Webmaster Tools, and this appears to coincide with when overall rankings dropped. The development team is working quickly to get this resolved.
3. As of today, I am seeing the highest page-load time of any day in 2015.
With regard to the particular page/keyword in question:
1. The keyword is no longer an exact match at the beginning of the title tag, but rather broken up throughout the title tag so the whole title reads better for users. Have you found that this type of change is enough to move a keyword down ~100 positions? (Either way, I have asked the client to revise the title to start with the exact-match keyword once again.)
2. Google indexed the page 2 days ago but is still displaying the old title tag in search results.
3. I have not found any instances of internal or external links to this page being removed.
With all this information, does anyone see anything that could have reasonably caused such a huge tank in rankings? Is this a blip in time? Is there anything I am not considering? Should I just be patient?
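One small, concrete check that is easy to run while waiting for Google to catch up: confirm exactly what title tag the page serves right now, independent of the (possibly stale) SERP snippet. A rough sketch, assuming `requests` and `beautifulsoup4` are installed; the URL and keyword are placeholders rather than the client's real ones.

```python
# Minimal sketch: report the live title tag and whether the target keyword
# appears in it (and at the start). URL and KEYWORD are placeholders.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/category-page"
KEYWORD = "target keyword phrase"

resp = requests.get(URL, timeout=30)
soup = BeautifulSoup(resp.text, "html.parser")
title = soup.title.get_text(strip=True) if soup.title else ""

print("HTTP status:", resp.status_code)
print("Live title tag:", title)
print("Keyword present:", KEYWORD.lower() in title.lower())
print("Keyword at start:", title.lower().startswith(KEYWORD.lower()))
```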
Intermediate & Advanced SEO | FPD_NYC0
-
2.3 million 404s in GWT - learn to live with 'em?
So I'm working on optimizing a directory site. Total size: 12.5 million pages in the XML sitemap. This is orders of magnitude larger than any site I've ever worked on -- heck, every other site I've ever worked on combined would be a rounding error compared to this.
Before I was hired, the company brought in an outside consultant to iron out some of the technical issues on the site. To his credit, he was worth the money: indexation and organic Google traffic have steadily increased over the last six months. However, some issues remain.
The company has access to a quality (i.e. paid) source of data for the directory listing pages, but the last time the data was refreshed some months back, it threw 1.8 million 404s in GWT. That number has since grown progressively higher; we now have 2.3 million 404s in GWT. Based on what I've been able to determine, links on this particular site break relative to the data feed for one of two reasons: the page just doesn't exist anymore (i.e. it wasn't found in the data refresh, so the page was simply deleted), or the URL had to change due to some technical issue (the page still exists, just now under a different link).
With other sites I've worked on, 404s aren't that big a deal: set up a 301 redirect in htaccess and problem solved. In this instance, setting up that many 301 redirects, even if it could somehow be automated, just isn't an option due to the potential bloat in the htaccess file. Based on what I've read here and here, 404s in and of themselves don't really hurt the site's indexation or ranking. And the more I consider it, the really big sites -- the Amazons and eBays of the world -- have to contend with broken links all the time due to product pages coming and going. Bottom line, it looks like if we really want to refresh the data on the site on a regular basis -- and I believe that is priority one if we want the bot to come back more frequently -- we'll just have to put up with broken links on the site on a more regular basis.
So here's where my thought process is leading:
1. Go ahead and refresh the data. Make sure the XML sitemaps are refreshed as well -- hopefully this will help the site stay current in the index.
2. Keep an eye on broken links in GWT. Implement 301s for really important pages (i.e. content-rich stuff that is really mission-critical). Otherwise, just learn to live with a certain number of 404s being reported in GWT on a more or less ongoing basis.
3. Watch the overall trend of 404s in GWT. At least make sure they don't increase. Hopefully, if we can make sure that the sitemap is updated when we refresh the data, the 404s reported will decrease over time.
We do have an issue with the site creating some weird pages with content that lives within tabs on specific pages. Once we can clamp down on those and a few other technical issues, I think keeping the data refreshed should help with our indexation and crawl rates.
Thoughts? If you think I'm off base, please set me straight. 🙂
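For the "URL changed but the page still exists" bucket, one option worth knowing about (not necessarily the right call for this site, just a sketch under assumptions): Apache's RewriteMap directive lets a single rewrite rule consult an external lookup file or DBM hash, configured in the server/vhost config rather than in .htaccess, so per-URL redirects don't have to bloat the .htaccess file, and the map can be regenerated from the data feed on each refresh. Below is a rough Python sketch of generating such a map from a hypothetical CSV of old/new paths; the file names are placeholders and the CSV is assumed to have no header row.

```python
# Minimal sketch: turn a CSV of "old_path,new_path" rows into an Apache
# RewriteMap text file, so one mod_rewrite rule can serve all the 301s
# instead of millions of individual .htaccess lines.
# File names are hypothetical placeholders.
import csv

IN_CSV = "url_changes.csv"      # assumed export from the data refresh
OUT_MAP = "redirect_map.txt"    # referenced by a RewriteMap directive

with open(IN_CSV, newline="") as src, open(OUT_MAP, "w") as dst:
    written = 0
    for row in csv.reader(src):
        if len(row) != 2:
            continue                      # skip malformed rows
        old_path, new_path = (cell.strip() for cell in row)
        if old_path and new_path and old_path != new_path:
            dst.write(f"{old_path} {new_path}\n")
            written += 1

print(f"Wrote {written} redirect pairs to {OUT_MAP}")
```

For the "page no longer exists" bucket, a plain 404 (or 410) is generally the honest response, which lines up with the "learn to live with them" plan above.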
Intermediate & Advanced SEO | ufmedia0
-
Product Pages & Panda 4.0
Greetings Moz Community:
I operate a real estate web site in New York City (www.nyc-officespace-leader.com). Of its 600 pages, about 350 of the URLs are product pages written about specific listings. The content on these pages is quite short, sometimes only 20 words. My ranking has dropped very much since mid-May, around the time of the new Panda update. I suspect it has something to do with the very short product pages, the 350 listing pages.
What is the best way to deal with these pages so as to recover ranking? I am considering these options:
1. Setting them to "noindex". But I am concerned that removing product pages is sending the wrong message to Google.
2. Enhancing the content and making certain that each page has at least 150-200 words. Re-writing 350 listings would be a real project, but if necessary to recover I will bite the bullet.
What is the best way to address this issue? I am very surprised that Google does not understand that product URLs can be very brief and yet have useful content. Information about a potential office rental that lists location, size, and price per square foot is valuable to the visitor but can be very brief, especially for listings that change frequently. So I am surprised by the penalty.
Would I be better off not having separate URLs for the listings and, for instance, adding them as posts within building pages? Is having separate URLs for product pages with minimal content a bad idea from an SEO perspective? Does anyone have any suggestions as to how I can recover from this latest Panda penalty?
Thanks, Alan
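If option 2 (beefing up the content) is on the table, it may help to measure which of the 350 listings are actually thin before committing to rewriting all of them. Below is a rough sketch, assuming `requests` and `beautifulsoup4` are installed; the URLs and the 150-word threshold are hypothetical placeholders for illustration.

```python
# Minimal sketch: flag listing pages whose visible text falls below a
# word-count threshold, to prioritize rewriting. URLs and THRESHOLD are
# hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

LISTING_URLS = [
    "https://www.example.com/listings/office-1",
    "https://www.example.com/listings/office-2",
]
THRESHOLD = 150  # words

for url in LISTING_URLS:
    resp = requests.get(url, timeout=30)
    soup = BeautifulSoup(resp.text, "html.parser")
    for tag in soup(["script", "style", "noscript"]):
        tag.decompose()                      # drop non-visible content
    word_count = len(soup.get_text(separator=" ").split())
    if word_count < THRESHOLD:
        print(f"THIN ({word_count} words): {url}")
```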
Intermediate & Advanced SEO | Kingalan10
-
Duplicating a site on 2 different ccTLDs and using canonical
Hello,
We have a site that sells a certain product on www.example.com. This site contains thousands of pages, including a whole section of well-written content that we invested a lot of money in creating. The site ranks for many keywords, both brand and non-brand related, and the SERPs include the homepage and many of the articles mentioned. We receive traffic and clients to this site from around the world, BUT our main geo-targeting is the UK.
Due to a lack of resources and some legal needs, we now have to create a new site -- www.example.co.uk -- and all UK traffic will only be able to purchase the product from this site, not from the .com site anymore. We have no resources to create new content for the new .co.uk site, and that is the reason we want to duplicate the site on both domains and use a canonical tag to point to the .co.uk site as the primary site.
Does anyone have experience with this kind of setup? Will it work across the whole site? We need a fast solution here, as we do not have much time to wait because of the legal issue I mentioned.
What are the best solutions you can offer so that we do not lose important SERPs? On the one hand, since our main market is the UK, we assume the main site to promote will be www.example.co.uk, but as said earlier, we still have users from other parts of the world as well. Is there any risk that we are missing here?
Thanks, James
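If the cross-domain canonical route is taken, it's worth verifying the tags programmatically on a sample of URLs once they're live, since a templating slip on a site with thousands of pages is easy to miss. A rough sketch, assuming `requests` and `beautifulsoup4`, with placeholder domains and paths (and assuming, per the question, that the .com pages are the ones meant to defer to the .co.uk versions):

```python
# Minimal sketch: check that each sampled page on the duplicate domain has a
# rel=canonical pointing at its counterpart on the designated primary domain.
# Domains and paths are hypothetical placeholders.
import requests
from bs4 import BeautifulSoup

DUPLICATE_ORIGIN = "https://www.example.com"    # domain that should defer
PRIMARY_ORIGIN = "https://www.example.co.uk"    # domain chosen as primary
SAMPLE_PATHS = ["/", "/articles/guide-1", "/products/item-1"]

for path in SAMPLE_PATHS:
    resp = requests.get(DUPLICATE_ORIGIN + path, timeout=30)
    soup = BeautifulSoup(resp.text, "html.parser")
    link = soup.find("link", rel="canonical")
    canonical = link.get("href") if link else None
    expected = PRIMARY_ORIGIN + path
    status = "OK" if canonical == expected else "CHECK"
    print(f"{path}: canonical={canonical!r} [{status}]")
```

Note that Google treats cross-domain canonical as a hint rather than a directive, so it's also worth reviewing the geo-targeting settings in Webmaster Tools for the .co.uk site.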
Intermediate & Advanced SEO | Tit0
-
Has My Site Been Hit by Panda 4.0?
I operate a New York City commercial real estate web site (www.nyc-officespace-leader.com). Ranking and traffic have dropped steeply since early June. Around May 20th a new Panda update was launched by Google, and I wonder if that could partially explain the drop.
My site contains the following:
- 300 listing pages. These are product pages and often contain less than 100 words. Many have not been changed in two years.
- 150 building pages. These contain less than 220 words. Many have not been changed in two years.
- 40 blog pages. We have been adding 1 or 2 per month.
- 50 or 60 neighborhood and type-of-space pages. These contain 200-600 words.
Could our drop in traffic be due to Panda? I might add that an upgraded version of the site with new forms, a modified right rail, and a new header was launched on June 6th. Also, we submitted a disavow file with Google on April 20th for about 100 toxic domains, one third of the 300 domains that link to us.
In order to take remedial action we need to understand what has happened. Any ideas???
Thanks, Alan
Intermediate & Advanced SEO | Kingalan10
-
Sitemap Issue - vol 2
Hello everyone! I validated the sitemap with different tools (w3Schools, and so on) and no errors were found. So I uploaded it to my site and tested it through GWT, and BANG! All of a sudden there is a parsing error, which corresponds to the last -- and I mean the last -- piece of code out of thousands of lines. I don't know why it isn't reading the code and is giving me this error, as there are no other errors, and I haven't got a clue what to do in order to fix it! Thanks
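For anyone hitting the same wall: parsing the file locally usually pinpoints the exact line and column of the first XML problem (a stray control character or an unescaped & near the end of the file is a common culprit), which is often more useful than the GWT message. Below is a small sketch using only Python's standard library; the file name is a placeholder for a local copy of the uploaded sitemap.

```python
# Minimal sketch: parse a local copy of the sitemap and report the exact
# position of the first XML error, if any. SITEMAP is a placeholder path.
import xml.etree.ElementTree as ET

SITEMAP = "sitemap.xml"

try:
    root = ET.parse(SITEMAP).getroot()
    print("Parsed OK,", len(root), "top-level entries")
except ET.ParseError as err:
    line, column = err.position
    print(f"Parse error at line {line}, column {column}: {err}")
```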
Intermediate & Advanced SEO | PremioOscar0
-
PR dropped from 3 to 0. Why?
Hi,
I have an important question and I hope you'll be able to help me find the answer. My site http://www.pokeronlineitalia.com had a PR of 3 (three). Then, I had it restructured to make it look more appealing in order to increase the conversion rate. The problem is that after it was redesigned, the PR suddenly dropped from 3 to 0. This is really bad, as it took me over three years to reach that point.
Could you please analyze the site and find out what happened? I used SEOmoz's research tools to try to understand and noticed the following message:
"Accessible to Engines -- Easy fix
Crawl status: Status Code: 200; meta-robots: noindex, follow; meta-refresh: None; X-Robots: None
Explanation: Pages that can't be crawled or indexed have no opportunity to rank in the results. Before tweaking keyword targeting or leveraging other optimization techniques, it's essential to make sure this page is accessible.
Recommendation: Ensure the URL returns the HTTP code 200 and is not blocked with robots.txt, meta robots or x-robots protocol (and does not meta refresh to another URL)."
Basically, the message said that the search engines cannot access the homepage (http://www.pokeronlineitalia.com). May this be the reason why the PR dropped? What do I have to do to solve this problem? Is there a chance I can reach a PR of 3 again?
Thank you very much for your help. It'd be great if you could help my site regain its SEO strength.
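Worth noting: the crawl report quoted above already shows meta-robots set to "noindex, follow" on the homepage, and a noindex tag tells Google to drop the page from its index; a noindex carried over from a staging or development template is a common side effect of a redesign. A quick way to confirm what the page is serving right now, as a rough sketch with a placeholder URL (assuming `requests` and `beautifulsoup4` are installed):

```python
# Minimal sketch: check the signals mentioned in the crawl report --
# HTTP status code, X-Robots-Tag header, and the robots meta tag.
# URL is a hypothetical placeholder; substitute the real homepage.
import requests
from bs4 import BeautifulSoup

URL = "https://www.example.com/"

resp = requests.get(URL, timeout=30, allow_redirects=True)
soup = BeautifulSoup(resp.text, "html.parser")
meta = soup.find("meta", attrs={"name": "robots"})

print("HTTP status:", resp.status_code)
print("X-Robots-Tag header:", resp.headers.get("X-Robots-Tag"))
print("Robots meta tag:", meta.get("content") if meta else None)
# Any value containing "noindex" means the page is asking to be removed
# from the index; removing that tag is the first step toward recovery.
```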
Intermediate & Advanced SEO | salvyy0