Google Recon Request 4 Failed - This is crazy. HELP!
-
We run a niche website selling sunglasses at www.aluminumeyewear.com.
I've been trying to resolve a 'Failed Quality Guidelines' message since May. My 4th reconsideration request has just failed and I've exhausted all the changes I believe I need to make. I rely on this site to pay my bills, so obviously I really need to get this resolved. I would be grateful if someone from Google could actually point out what's wrong instead of sending an unhelpful auto response.
Steps taken:
1. Rewrote content as it was a bit thin. Recon failed.
2. Removed old products that couldn't be reached from every page. Recon failed.
3. Submitted a backlink audit and added a 'sitemap' link to the footer. Recon failed.
4. Removed 40+ old URLs left over from our old Yahoo! store (didn't realize they still existed). Recon failed.
I felt sure #4 would resolve the issue, so I'm feeling pretty low right now that it didn't. That said, doing a site:aluminumeyewear.com search it looks like I missed one of them, http://www.aluminumeyewear.com/demora/black/, but it just returns a 404, which would seem harsh to penalize me for.
The only other pages I can think of are some dynamic pages the store uses to create reviews, such as:
www.aluminumeyewear.com/product-reviews-add.aspx?product=2
www.aluminumeyewear.com/resize.aspx
I'm pretty sure the reviews page is blocked via robots.txt. The resize.aspx is a blank page with JavaScript that the PowerReviews Express system needs to work, and many, many merchants use that platform, so it's hard to think it's that.
Thanks in advance.
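In case it's useful to anyone reviewing this, here's how I sanity-checked that the reviews page is blocked, using Python's built-in robotparser. The Disallow line below is my paraphrase of what I believe our robots.txt contains, not a verbatim copy:

```python
from urllib.robotparser import RobotFileParser

# My paraphrase of the relevant rule -- substitute the site's real robots.txt
robots_txt = """User-agent: *
Disallow: /product-reviews-add.aspx
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The dynamic review URL should be blocked for Googlebot...
blocked = not rp.can_fetch(
    "Googlebot", "http://www.aluminumeyewear.com/product-reviews-add.aspx?product=2"
)
# ...while normal pages stay crawlable
home_ok = rp.can_fetch("Googlebot", "http://www.aluminumeyewear.com/")
print(blocked, home_ok)
```

If both come back True, the rule matches the query-string version of the URL as well, since the disallow is a prefix match.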
-
Thanks Ryan. That was my thought - address the links and resubmit. Then, if there is still an issue go after the duplication issue.
-
I have worked with dozens of sites penalized for link issues and I have never once seen a notice state "some or all of your pages" violate the quality guidelines. That notice is sent when there are on site issues.
I would welcome the experiences from the community to see if anyone else has experienced anything differently.
You can always address the link issue first and then submit a Reconsideration Request, then proceed from that point.
-
Hi Ryan,
I was wondering how we can be certain that "this notice clearly involves a second issue not related to links"?
Is it not possible that the reason that "some or all of your pages" violate the quality guidelines is because of the bad links pointing to them? Do you know for certain that it has to be an on page issue?
-
Ideally you want to register a copyright for your home page and a few of your main product pages. The process takes time but is relatively cheap. This process will not stop people from copying your content, but it will make it easier for you to prove you are the original content owner when issues like this arise.
Generally speaking using Google Authorship can help but often companies will not want to identify an author for product pages.
Steps to take:
1. Contact the site which posted the content. Inform them you are the content owners and require them to take it down immediately.
2. If the above request does not yield immediate results, file a DMCA takedown notice with the site's host.
3. Inform Google of the issue: http://support.google.com/bin/static.py?hl=en&ts=1114905&page=ts.cs. When asked to specify the nature of your request, select "I have a legal issue that is not mentioned above".
The only risk in the above process is if after step 1 the site owner spins the content a bit more but tries to keep it. If you are concerned about that possibility, I would contact both the site host and Google immediately after contacting the site owner. The process is supposed to be to use Google as a last resort, but when dealing with unscrupulous site owners you need to protect your business.
Of course hiring an attorney may be a good step but they can get very expensive fast.
-
Wow - what is it with people? Yes - they've copied that text from our site. They probably think we are Gatorz and hence copied it. Thankfully they have left our product descriptions alone.
What's the best way to deal with this?
-
The images you shared are perfect. Thank you.
Based on those images, it appears your site has been hit with two manual penalties. I am sorry to share the bad news.
The first manipulative links warning came on April 18th which was before Penguin and before Google was handing out "warnings" which were not penalties.
Your reconsideration request specifically states "some or all of your pages" violate Google's quality guidelines. This notice clearly involves a second issue not related to links. Both of these issues need to be resolved. I would recommend approaching each separately.
I cannot review your site's content fully but even a quick search indicated your site is duplicating / "spinning" content from another site. It could be the other site has taken your content but whatever the issue, it needs to be worked out.
Which of the two sites below has the original content, and who copied / spun whom?
-
I've spoken to Marie privately and she does feel it's a backlinks issue; she's going to take a proper look tomorrow and get back to me.
Very frustrating because some of the issues have been created by the firms that were hired to help me.
Attached is some of the info from GWT.
-
Marie shared some helpful advice. We are frankly flying blind here and we need to understand exactly what penalty we are dealing with. Can you please post the exact Google response from your last reconsideration request? It would also be helpful if you can post the very first Google notice which mentioned a problem with your site along with the date it was received.
-
Yes, that's a good idea (the spreadsheet plus the links to the Google Doc that documents every single email.)
I really think it's a moot point whether it is manual or algorithmic because you got a message in your webmaster tools and then you got a response from your reconsideration request saying your site still doesn't pass. Now it's possible that you could have both but nothing is going to change until you get past the reconsideration. Then, if you do have an algorithmic penalty, the changes that you make for your reconsideration request should be enough for the algorithmic penalty to be gone the next time Penguin/Panda refreshes.
-
Yes, it would seem that way. I've been in ecommerce since 2005 but never had hassle like this before.
What's the best way to document everything? I'm guessing I can put a spreadsheet together stating success or failure in getting each link removed, but I'm not sure about the emails. Perhaps put them in a Word doc, post them all on Google Docs, and include links to them in the recon request?
Lastly, how do I know it's a manual penalty and not an algorithmic one?
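To make the spreadsheet part concrete, this is roughly what I had in mind. Just a sketch: the columns, URLs, and contacts below are made up for illustration, not our real data:

```python
import csv

# Hypothetical link-removal log entries (made up for illustration)
rows = [
    ["http://sms-jok.com/spun-article", "webmaster@sms-jok.com",
     "2012-08-01", "emailed removal request", "removed"],
    ["http://example-directory.com/article-copy", "none found",
     "2012-08-03", "checked site and WHOIS, no contact info", "still live"],
]

# Write the log as a CSV so it can be shared as a Google Docs spreadsheet
with open("link_removal_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["link_url", "contact", "date_contacted", "action_taken", "status"])
    writer.writerows(rows)
```

One row per link, with the contact attempt and outcome spelled out, seems like the clearest way to show Google a good-faith effort.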
-
Oh man...I really feel for you. You're trusting SEO companies and getting deeper and deeper in a hole!
[quote]we used the services of another SEO whose advice was to dilute the crap links with good links and swing the link ratio.[/quote]
This is something that may work for Penguin recovery. The reason for this is that Penguin is algorithmic, so the belief is that if there is a higher percentage of good links than crappy links, then the next time Penguin rolls around you may escape the penalty.
But, if you've got a manual warning, there is no dilution that is going to work.
As mentioned in the other thread, if you've got a manual warning then you've got to fess up to EVERY single link that you (or your SEO company) have built with the intention of helping your site (as opposed to natural links).
The problem is that it is sometimes hard to know what is a good link and what isn't. The BBB links may have been good. Hopefully, when you had them change your business category (from internet business) you managed to still keep a link from them.
Most likely you will do well on reconsideration if you address every single backlink pointing to your site and what you have done to deal with it.
For the article sites that have republished your spun article, you still need to attempt to contact them to get the link removed. If they don't respond, or you can't find an email address (either on the site, in the WHOIS or no contact form) then you need to document your attempts (including a copy of the emails you have sent.)
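If it helps, one rough way to verify in bulk that a link has actually come down is to scan each page's HTML for your domain. A minimal sketch using only the Python standard library (the HTTP fetch is left out since any client will do, and the sample HTML below is made up):

```python
from html.parser import HTMLParser

class LinkFinder(HTMLParser):
    """Collects href values from <a> tags so we can look for our domain."""
    def __init__(self):
        super().__init__()
        self.hrefs = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.hrefs.append(value)

# Made-up page HTML standing in for a fetched article page
page_html = '<p>Great shades from <a href="http://www.aluminumeyewear.com/">here</a></p>'

finder = LinkFinder()
finder.feed(page_html)

# True means the backlink is still present and the removal isn't done
still_linked = any("aluminumeyewear.com" in href for href in finder.hrefs)
print(still_linked)
```

Run something like this over the audit list before each reconsideration request so the spreadsheet's "removed" column is accurate on the day you submit.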
It is also a good idea to mention in your reconsideration request that you may have fallen victim to a bad SEO company and mention the SEO company's name.
-
Hey Marie,
I saw some of your other posts and was hoping you would reply!
Yes, we did receive two of those notices but firstly a bit of history.
We never did any actual link building ourselves; however, at the start of 2010 an 'SEO' wrote an article for us (back when that seemed OK) and submitted it. The article had exact-match anchor text in it and it got spun to several article directories. Things seemed fine so we left it at that. Then in mid-April this year we got whacked with the unnatural links notice.
At the time GWT reported that our biggest source of inbound links was the Better Business Bureau, around 90. This is because we are listed in their business directory as an internet business, and for internet businesses they list you in every city in your state with a link to your primary domain. Thinking these were the links they were referring to, we got in touch with the BBB and got it amended. We then contacted the guy who had written the article and requested he get it removed. He replied saying he had gotten it removed from the directories he had submitted it to, but couldn't do anything about the other sites that had spun the article.
Not long after all this we started getting the other message from Google re quality guidelines, so naturally we started thinking it was something to do with the content on the actual site.
Fast forward a couple of months and we used the services of another SEO whose advice was to dilute the crap links with good links and swing the link ratio. So we hired them to do that. In July we got another unnatural links notice so we had an audit done by the same company. Their audit recommended getting rid of links from sms-jok.com as there were over 500, but here's the kicker, those links were from an article THEY HAD POSTED BY MISTAKE! arrghh - why can't people just do their jobs?
So they got rid of the article from that site and it seems the site has now been de-indexed. We submitted the rest of the audit along with a recon request to Google, but it still failed.
So now I'm in a situation where paying the mortgage is getting very difficult because of the lost revenue we are experiencing.
I also don't know who to trust.
-
Did your message in webmaster tools mention anything about unnatural links? A quick look at your link profile shows some sites that look like spam. One site I tried to go to in your backlink profile gave me a virus warning.
Part of the quality guidelines has to do with your backlinks too. Any links that were made solely to increase your rankings are considered unnatural.
This question has a good discussion on how to deal with backlinks when dealing with a reconsideration request.