Is it possible to recover from a G algo update (Penguin/Panda)?
-
...if yes, how? Can you share resources / blogs / etc.? I want to recover my site's rankings.
Here's the gist of it:
- I recently purchased a website that has 600+ pieces of aged content on it.
- Domain was ranking great about 10 years ago (1M uniques a year)
- It apparently got hit by a G algo update in 2012/2013 (Penguin or Panda?), because traffic has tanked (down to ~10 hits a day)
- In the past two years, the previous owner published 100+ off-topic blog posts and appears to have been using the site as a PBN. The UX sucks and there's a ton of 404s. (NOTE: I am in the process of removing that content and have cleaned up the 404s.)
Domain stats: 20+ years old (1998) and DA 32, linking domains 850+, inbound links of 16k+
What I've done:
- disavow (550 domains),
- fix all the 404s
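For anyone following along, a disavow file is just a plain-text list uploaded through Google's Disavow Tool: one `domain:` entry or full URL per line, with `#` comments. A minimal sketch of generating one (domain names here are made-up placeholders, not from my actual disavow list):

```python
def build_disavow_file(domains, urls):
    """Build the contents of a Google disavow file.

    domains: iterable of bare domains to disavow entirely (written as "domain:...")
    urls:    iterable of individual page URLs to disavow (written as-is)
    """
    lines = ["# Disavow file - cleanup of PBN-era backlinks"]
    lines += [f"domain:{d}" for d in domains]   # disavow every link from these domains
    lines += list(urls)                          # disavow only these specific pages
    return "\n".join(lines) + "\n"

content = build_disavow_file(
    ["spam-example-1.com", "spam-example-2.net"],
    ["https://spam-example-3.org/bad-link-page.html"],
)
```

The resulting text is what you'd save as `disavow.txt` and upload in Search Console's Disavow Tool.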
What I'm doing / about to do:
- remove spammy content
- write new/fresh on-topic content
- update the site UX
- start a backlink building campaign
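On the 404 cleanup step above: the usual approach is to 301-redirect each dead URL to its closest live equivalent so the old pages' link equity isn't wasted. A minimal sketch of that logic (the paths are hypothetical examples, not from the actual site):

```python
# Map of dead URLs to their closest live replacements (hypothetical paths)
REDIRECTS = {
    "/old-article-2009": "/blog/updated-article",
    "/category/widgets": "/products/widgets",
}

def resolve(path):
    """Return (HTTP status, location) for an incoming request path."""
    if path in REDIRECTS:
        return 301, REDIRECTS[path]   # permanent redirect to the live page
    return 200, path                   # serve normally; truly dead paths would 404
```

In practice you'd express the same map in your server config (e.g. Apache `Redirect 301` rules or nginx `return 301`) rather than application code.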
My questions:
- is it common to bounce back from a G algo update? is it hard / am I in over my head / am I a sucker for trying to bring this site back to life?
- are there articles about bouncing back that you can share so I can learn more about this process? Or agencies / consultants, etc that you recommend?
- what other recommendations / suggestions do you have that would help reverse this 8-year-old penalty?
-
Agreed... the SEO landscape has changed greatly since 2012. I'll definitely have to re-invest in new, fresh content and start building backlinks.
But what I'm looking for specifically is a playbook for bouncing back from an algo penalty. I need a strategy. I was hoping others have been able to bounce back and have shared their experience.
ps - it's an algo penalty, not manual, so no messages or warnings in GSC unfortunately.
-
Hi there,
It is doable, but it will not be an easy task, since you now have more competitors than back in 2012. I would say it depends on your specific vertical, your budget, and your end goal for this site. You need to see who you're up against, because you will need to consistently invest money in content and links to beat your competitors. If there is still a manual penalty listed in GSC, you should submit a reconsideration request noting that you are the new owner of the site.
Ross