Recommendation for a penalty-removal company?
-
We've received a manual penalty (not sitewide) that we've been trying to get removed, but our reconsideration requests keep getting denied. We also no longer have the manpower to manually check backlinks, contact domain owners, etc. Does anyone have a recommendation for a company to use?
-
Any SEO agency should be able to do this for you. We do a one-off removal for clients at a set fee: they pay once, and after that we do all the work until the penalty is lifted. If the request is denied, we resubmit and see it through. That's what you should look for: make sure the agency doesn't charge per attempt, because if they don't get everything the first time, you'll be paying again.
(You don't want someone who is overly aggressive with the disavow, either, or you'll lose good links along with the bad.)
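For reference, the disavow file itself is just a plain-text list that you upload through Google's disavow tool: `#` lines are comments, `domain:` lines disavow every link from a domain, and bare URLs disavow a single page. A minimal sketch (the domains below are placeholders, not real sites):

```
# Paid links identified in our backlink audit
# Disavow the whole domain where the site is clearly a link network
domain:spammy-links.example
domain:paid-directory.example
# Disavow a single URL where the rest of the domain looks legitimate
http://blog.example/paid-post-about-widgets
```

Being selective here is the point: `domain:` lines are blunt instruments, so reserve them for sites with nothing worth keeping.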
-
1. Does the reconsideration request give a specific reason why it keeps getting denied?
2. What was the original manual action for?
-
1. Unnatural links. We keep finding obviously paid-for links, asking for them to be removed, and disavowing them.
2. Unnatural links.
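If manpower is the bottleneck, part of the audit can be scripted. The sketch below is a hypothetical example (not Moz's or Google's tooling) that collapses a backlink export into unique `domain:` disavow lines; it assumes a CSV with a `source_url` column, so adjust the column name to match whatever your backlink tool exports.

```python
import csv
import io
from urllib.parse import urlparse

def build_disavow(csv_text, bad_domains):
    """Return disavow-file lines for links whose source domain is flagged."""
    reader = csv.DictReader(io.StringIO(csv_text))
    flagged = set()
    for row in reader:
        host = urlparse(row["source_url"]).netloc.lower()
        # Strip a leading "www." so www/non-www variants collapse together.
        root = host[4:] if host.startswith("www.") else host
        if root in bad_domains:
            flagged.add(root)
    # One "domain:" line per flagged domain disavows every link from it.
    return [f"domain:{d}" for d in sorted(flagged)]

# Tiny inline export for illustration; real exports have many more rows.
example_csv = (
    "source_url,anchor\n"
    "http://www.spammy-links.example/page1,cheap widgets\n"
    "http://spammy-links.example/page2,buy widgets\n"
    "http://legit-blog.example/review,widget review\n"
)
lines = build_disavow(example_csv, {"spammy-links.example"})
print("\n".join(lines))  # prints "domain:spammy-links.example"
```

This only automates the grunt work of deduplication; deciding which domains belong in `bad_domains` still takes a human review, which is exactly what an agency would be doing for you.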