Google penalty removal expert questions
-
We have searched online for a Google penalty “expert” (individual or company) and have found two candidates that appear to qualify. Please provide feedback on the following two individuals/companies that offer penalty removal help.
Have you or one of your clients used either of the “experts” below?
What were the results?
How many disavows and reconsideration requests did you/they have to make?
1. www.penaltypros.com. To produce a quote and assess your links, they use link data from Google Webmaster Tools only.
Penaltypros.com disavows first and then removes bad links second. This is the opposite of what Google and most SEOs recommend, but penaltypros.com claims 100% success with this non-traditional approach. See the imgur.com link for a screenshot.
2. http://www.hiswebmarketing.com/ To produce a quote and assess your links, they use link data from https://ahrefs.com/ only.
Please provide any and all feedback on the above two “experts”, and also post the websites, individual names, and company names of anyone you consider a Google penalty removal “expert” so that we may obtain a quote from them.
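For anyone weighing the disavow-first approach in point 1, it may help to know what a disavow file actually is: a plain text list submitted through Google's Disavow Links tool, with one URL or domain: entry per line and # for comments. A minimal sketch in Python, using made-up URLs for illustration:

```python
# Minimal sketch of building a disavow file in the plain-text format
# Google's Disavow Links tool accepts. The bad_links list is invented
# for illustration; a real cleanup would use exported link data.
from urllib.parse import urlparse

bad_links = [
    "http://spammy-directory.example/links/page1.html",
    "http://spammy-directory.example/links/page2.html",
    "http://scraper.example/article-copy",
]

# A "domain:" entry covers every URL on that host, so deduplicate
# down to the set of linking domains.
domains = sorted({urlparse(url).netloc for url in bad_links})

with open("disavow.txt", "w") as f:
    f.write("# Links we could not get removed manually\n")
    for domain in domains:
        f.write(f"domain:{domain}\n")
```

Whether you submit this file before or after manual removal attempts is exactly the question raised above; the file format is the same either way.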
-
Thanks for considering Penalty Pros to assist you with your Google penalty.
Whilst I obviously believe strongly in our service, we can equally vouch for Marie and her team - they do great work (and I can't say that about many...)
All the best!
-
We received an email from Google WMT on April 7th, 2013 notifying us of a manual action taken against our site due to unnatural inbound links. Our SEO company at the time (a very well-known SEO company) didn't respond to any of our emails or phone calls about the WMT notice for 3 days, and then finally told us to "just ignore it". Our rankings plummeted, and after much searching we found Marie. We hired her on May 29th to handle the penalty, which was finally revoked on October 4th. Unfortunately, we were hit by the Penguin update on May 26th and didn't get our link profile cleaned up in time for the October 4th Penguin refresh, so while the manual penalty is gone, we're still penalized algorithmically.
We had to submit about 5 reconsideration requests in all. When Google rejected our 4th reconsideration request they gave us an example URL that wasn't reported in WMT, or in any of the other link reporting tools (sneaky Google). But Marie used the link as a clue and found 5 other URLs with the same scraped content. We quickly got those links removed, submitted another reconsideration request, and voila, the penalty was revoked. It definitely took a lot of work.
Marie is very knowledgeable as a link analysis and penalty expert and is deeply committed to her customers. The nice thing about working with her is that you pay her once, and she will work until your penalty is revoked, no matter how many reconsideration requests she has to file.
I highly recommend Marie and hiswebmarketing.com.
Wishing you a speedy recovery.
-
Thanks for the kind words Michael.
Your site was one of the most difficult ones we have dealt with. For some reason Google was extremely picky. But I think it was important that Google made us find close to every possible unnatural link. We're already seeing improvements in your traffic, and I expect that when Penguin refreshes you'll see even more.
-
Thanks for including us in your search! I just wanted to clarify one point. Although we base our quotes on the number of linking domains listed in ahrefs.com, we use links from every source we can find, including WMT, Majestic SEO, Open Site Explorer, and our own tool that finds links that are in the Google index but not in the backlink checkers mentioned above.
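For readers curious what "links from every source" amounts to in practice: each tool sees a different slice of the link graph, so the useful set is the union of all the exports, deduplicated by linking domain. A toy sketch in Python (the tool names are real, the URLs are invented):

```python
# Toy sketch of merging backlink exports from several tools into one
# deduplicated set of linking domains. The exported URLs are invented.
from urllib.parse import urlparse

exports = {
    "wmt":      ["http://a.example/page", "http://b.example/post"],
    "ahrefs":   ["http://b.example/post", "http://c.example/dir/1"],
    "majestic": ["http://c.example/dir/2", "http://d.example/"],
}

linking_domains = set()
for tool, urls in exports.items():
    for url in urls:
        linking_domains.add(urlparse(url).netloc.lower())

print(sorted(linking_domains))
# → ['a.example', 'b.example', 'c.example', 'd.example']
```

The overlap between the tools is why a quote based on a single source (ahrefs here, WMT for penaltypros) can still undercount the cleanup work involved.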
I'm happy to answer any other questions you may have. We do honest work and we are very good at removing penalties.
Marie
-
I inherited a nasty case of bad link building and 3-way link schemes that I had to deal with midway through 2013. I carefully vetted dozens of "experts", went with Marie and her team at hiswebmarketing.com, and have no regrets.
The manual process they use is strictly by the book. You'll also have an opportunity to review your bad links to make sure you don't lose any good juice. Considering all of the manual labor and documentation required, I think their price was more than fair compared to the competition. I thought the penalty was a death sentence but it's been revoked and I'm back on page 1-3 (depending on market) for the main keyword I was penalized for.
I'm pretty confident that if you go with hiswebmarketing.com your penalty will get revoked.