Reconsideration request ignored for 1 month
-
Oct 9 (1 day ago)
See the question I've posted here: https://productforums.google.com/forum/#!msg/webmasters/Wz_pAz7_lk8/jR8DvSyn5T4JHi
We've submitted two reconsideration requests in a month and Google is not replying to us. They've caused a huge loss in business for us over links that are now against Google's guidelines. I checked our links with the tools provided in Webmaster Tools and I can see some that are against Google's guidelines and have recently been spidered; however, these links were built in 2008-09, long before the Panda updates under which this type of link would be classed as spammy. Is there anyone I can contact to speak with, as clearly Google is too big to care anymore or to respond to such requests? The response I've got is:
On Tuesday, October 9, 2012 10:41:33 AM UTC-3, cyberhostpro wrote:
however these links were built in 2008-09, long before the panda updates
When we all rode horses to work there was no need for speed limits on the highways and byways... but then things change. In 2008-09 those links may have been worth something... why do you believe they deserve that value today? Get rid of all those 2008-09 links and you should be OK!

If you could offer any help or advice with this it would be most appreciated.
Regards, Daniel
-
Oh, I think it goes back to the root of all evil: cash. We are a corporation and have several sister companies under our umbrella that, collectively, have good-sized budgets. And, in fairness, we may have gotten lucky with our assigned rep.
I'm not proud of it; I think it's a huge, huge miss on Google's behalf and a ding to their reputation if they show preference for advertisers (ahem, keyword unavailable). I don't envy your position because this can be so frustrating - thank goodness for groups like this where you can either find answers or commiserate!
-
Andrea,
I'm glad to hear someone had a better experience than I did, smh. Even after getting enrolled in their 'Engage' program, getting an account rep, etc., they did ABSOLUTELY NOTHING and provided me ZERO INSIGHT into how to get a rejected site approved.
My question to you is: what Jedi Mind Tricks did you employ to get that 'better experience'?
-
In my experience, in order to get a timely reply - or a reply from a non-automated source - you have to be an advertiser. It's disappointing, but once I had an account rep, my experience changed for the better.
-
Sometimes, I'm afraid Google just lags on getting back to you. I had several publications that had taken several months to get indexed, had written letters to Google, made sure my site was OK, etc. and they still turned my submission down.
Finally, after about 2 months of trying and trying to get added to their indices (not to mention pulling my hair out and preparing to give up), I wrote them one last, long letter that included documentation of what the auto-reply message said I was in violation of (as well as a description of how I was meeting their guidelines), and they finally relented.
To summarize, it's quite apparent that they're not reviewing website submissions with human eyes, at least initially. Just keep trying to get indexed and just keep hounding them, eventually they should accept your submission.
Related Questions
-
Why is my domain authority still 1?
I changed the domain of my website from www.vanillacrush.co.uk to www.carissamay.co.uk at the end of December, and yet my DA for carissamay is still 1. As advised, I set up a 301 redirect from VC to CM, which seems to be working fine. However, when I check on Redirect Detective, it tells me I also have a 302 set up. Could this be confusing things? http://www.vanillacrush.co.uk http://www.vanillacrush.co.uk/ http://www.carissamay.co.uk Any help would be greatly appreciated! Many thanks
Intermediate & Advanced SEO | | Carissamay0 -
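On the 301/302 mix: a temporary (302) hop in a redirect chain can delay or prevent authority consolidating on the new domain, so it is worth flagging. Below is a minimal sketch (the helper name is hypothetical; the hop list would come from a crawler or a tool like Redirect Detective) that flags any non-permanent hop in a recorded chain:

```python
# Flag redirect hops that are not permanent (301/308) redirects.
# Each hop is (url, http_status) as recorded by a crawler or `curl -IL`.

PERMANENT = {301, 308}

def non_permanent_hops(chain):
    """Return the hops in a redirect chain that are temporary redirects.

    `chain` is an ordered list of (url, status) tuples; the final entry
    is usually the destination page with a 200 status, which is ignored.
    """
    return [(url, status) for url, status in chain
            if 300 <= status < 400 and status not in PERMANENT]

# Example chain: the old domain 301s, but an intermediate hop
# (e.g. a trailing-slash rewrite) is configured as a 302.
chain = [
    ("http://www.vanillacrush.co.uk", 301),
    ("http://www.vanillacrush.co.uk/", 302),   # temporary: weakens the chain
    ("http://www.carissamay.co.uk", 200),
]

print(non_permanent_hops(chain))
# -> [('http://www.vanillacrush.co.uk/', 302)]
```

Any hop this returns should be reconfigured as a 301 on the server that issues it.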
Erratic Rankings Earlier This Month - UK
Can anybody tell me what happened here? http://5lx.co.uk/images/searchconsole2015-12-22 13.37.16.png That is the ranking pattern of my UK site on Google UK for my main keyword. It reached number 1 on 7 December but three or four days later, dropped like a stone. It has been very erratic ever since, slumping as low as the mid-30s. I checked the search console but there were no manual penalties. There is an update signalled on the chart but I can't find any info about it and anyway it seems to have happened after the drop began. I barely touched the site between its top ranking and the slump and certainly didn't do anything to deserve that roller coaster (as far as I know). Does anybody have any ideas? Thanks
Intermediate & Advanced SEO | | neilmac0 -
Our client's web property recently switched over to secure pages (https); however, their non-secure pages (http) are still being indexed in Google. Should we request in GWMT to have the non-secure pages deindexed?
Our client recently switched over to https via a new SSL certificate. They have also implemented rel canonicals for most of their internal webpages (pointing to the https versions). However, many of their non-secure pages are still being indexed by Google. We have access to their GWMT for both the secure and non-secure pages.
Intermediate & Advanced SEO | | RosemaryB
Should we just let Google figure out what to do with the non-secure pages? We would like to set up 301 redirects from the old non-secure pages to the new secure pages, but we're not sure if this is going to happen. We thought about requesting in GWMT for Google to remove the non-secure pages, but we felt this was pretty drastic. Any recommendations would be much appreciated.0 -
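For a migration like the one described, the conventional approach is a 1:1 set of 301 redirects from each http URL to its https twin; a GWMT removal request is rarely needed, since removals can affect both protocol versions of a URL. A minimal sketch (hypothetical, not the client's actual setup) of deriving that redirect map from a list of indexed URLs:

```python
from urllib.parse import urlsplit, urlunsplit

def https_redirect_map(urls):
    """Map each indexed http:// URL to its https:// equivalent,
    i.e. the 1:1 targets for 301 redirect rules."""
    mapping = {}
    for url in urls:
        parts = urlsplit(url)
        if parts.scheme == "http":
            mapping[url] = urlunsplit(("https",) + tuple(parts)[1:])
    return mapping

indexed = [
    "http://example.com/products",
    "https://example.com/about",   # already secure: no redirect needed
]
print(https_redirect_map(indexed))
# -> {'http://example.com/products': 'https://example.com/products'}
```

The actual redirects would then be implemented server-side (e.g. a blanket protocol rewrite), with the map above useful mainly for verifying coverage of what Google has indexed.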
Manual reconsideration request not going away.
Hello fellow Mozzers. I am in need of support from a knowledgeable community; my brain is hurting over this query and is not providing any answers! So I have got my fingers crossed that someone can spot the issue of why a website I am responsible for has been bumped out of the Google search results. In March this website (www.message me for details.com) lost all keyword rankings and also all brand terms. Action was taken to remove unnatural links and, as you can see from the timeline below, these links have been removed. The manual review request has come back from Google and now seems to indicate that the reason for the SERPs removal is that 'some or all of your pages still violate our quality guidelines', which makes me think it's the website itself as well as the links that were causing the issue. So what has happened so far?

9 March - Google Webmaster Tools notice of detected unnatural links to www.message me for details.com
20 May - All 'unnatural' links that could not be removed by contacting website owners were compiled and added to the disavow tool.
29 May - Manual resubmission request submitted.
6 June - The following message received (see end of post).
18 June - Updated disavow file submitted - roughly 40-50% of links removed.
27 June - Manual review requested.
2 July - The following message received (see end of post).

So, after reviewing thousands of links, removing any poor-quality links by contacting webmasters, and adding anything I was not able to remove manually to a disavow list, I am pretty confident that anything that could be considered spammy has been removed or disavowed, although there is a chance I have missed something in the link reviews. I have also used the tech crawl tool and there are no issues showing up there. I am at a loss as to what is causing this issue and need some advice on what steps to take next. Regards, Colin

Google message: Dear site owner or webmaster of www.message me for details.com,
Intermediate & Advanced SEO | | TeamSEO
We received a request from a site owner to reconsider http://www.havenpower.com/ for compliance with Google's Webmaster Guidelines.
We've reviewed your site and we believe that some or all of your pages still violate our quality guidelines. In order to preserve the quality of our search engine, pages from http://www.havenpower.com/ may not appear or may not rank as highly in Google's search results, or may otherwise be considered to be less trustworthy than sites which follow the quality guidelines. If you wish to be reconsidered again, please correct or remove all pages that are outside our quality guidelines. When such changes have been made, please visit https://www.google.com/webmasters/tools/reconsideration?hl=en and resubmit your site for reconsideration. If you have additional questions about how to resolve this issue, please see our Webmaster Help Forum for support.0 -
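The disavow step in the timeline above relies on a plain-text file in the format Google's disavow tool accepts: one entry per line, `domain:` prefixes for whole domains, bare URLs for individual pages, and `#` comment lines. A minimal sketch (hypothetical helper and example domains) that assembles such a file:

```python
def build_disavow_file(domains, urls=()):
    """Render a disavow file in the format Google's tool accepts:
    '#' comments, 'domain:' entries for whole domains, bare URLs otherwise."""
    lines = ["# Disavow file generated after manual link-removal outreach"]
    lines += [f"domain:{d}" for d in sorted(domains)]
    lines += list(urls)
    return "\n".join(lines) + "\n"

text = build_disavow_file(
    {"spammy-blog.example"},
    ["http://old-seo.example/post-1"],
)
print(text)
```

Preferring `domain:` entries over individual URLs avoids the common failure mode of disavowing one URL variant while others (www/non-www, subdomains) keep passing the bad signal.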
Our Robots.txt and Reconsideration Request Journey and Success
We have asked a few questions related to this process on Moz and wanted to give a breakdown of our journey, as it will likely be helpful to others!

A couple of months ago, we updated our robots.txt file to block several pages that we did not want indexed. At the time, we weren't checking WMT as regularly as we should have been, and in a few weeks we found that one of the paths we were blocking was apparently dynamic, which led to the blocking of over 950,000 of our pages according to Webmaster Tools. Which page was causing this is still a mystery, but we quickly removed all of the entries.

From research, most people say that things normalize in a few weeks, so we waited. A few weeks passed and things did not normalize. We searched, we asked, and the number of "blocked" pages in WMT, which had increased at a rate of a few hundred thousand a week, was decreasing at a rate of a thousand a week. At this rate it would be a year or more before the pages were unblocked. This did not change: two months later we were still at 840,000 pages blocked. We posted on the Google Webmaster Forum and one of the mods there said that it would just take a long time to normalize. Very frustrating indeed, considering how quickly the pages had been blocked.

We found a few places on the interwebs suggesting that if you have an issue/mistake with robots.txt, you can submit a reconsideration request. This seemed to be our only hope, so we put together a detailed reconsideration request asking for help with our blocked-pages issue. A few days later, to our horror, we did not get a message offering help with our robots.txt problem. Instead, we received a message saying that we had received a penalty for inbound links that violate Google's terms of use. Major backfire. We used an SEO company years ago that posted a hundred or so blog posts for us. To our knowledge, the links didn't even exist anymore. They did.... So, we signed up for an account with removeem.com.
We quickly found many of the links posted by the SEO firm, as they were easily recognizable via the anchor text. We began the process of using removem to contact the owners of the blogs. To our surprise, we got a number of removals right away! Others we had to contact a second time, and many did not respond at all. Where we could not find an email address, we tried posting comments on the blog. Once we felt we had removed as many as possible, we added the rest to a disavow list and uploaded it using the disavow tool in WMT. Then we waited...

A few days later, we already had a response: DENIED. In our request, we had specifically asked that, if it were denied, Google provide some example links. When they denied our request, they sent us an email including a sample link, and it was an interesting example. We actually already had this blog in removem. The issue in this case was that our version was a domain name, i.e. www.domainname.com, and the version Google had was a WordPress subdomain, i.e. www.subdomain.wordpress.com.

So, we went back to the drawing board. This time we signed up for Majestic SEO and tied it in with removem, which added a few more links. We also had records from the old SEO company that we were able to go through to locate a number of new links. We repeated the previous process, contacting site owners and keeping track of our progress. We also went through the "sample links" in WMT as best we could (we have a lot of them) to try to pinpoint any other potential offenders. We removed what we could and, again, disavowed the rest.

A few days later, we had a message in WMT: DENIED AGAIN! This time it was very discouraging, as it just didn't seem there were any more links to remove. The difference this time was that there was NOT an email from Google, only a message in WMT. So, while we didn't know if we would receive a response, we responded to the original email asking for more example links, so we could better understand what the issue was.
Several days passed, and then we received an email back saying that THE PENALTY HAD BEEN LIFTED! This was of course very good news, and it appeared that our email to Google had been reviewed and received well.

So, the final hurdle was the reason we originally contacted Google: our robots.txt issue. We did not receive any information from Google about the robots.txt issue we originally filed the reconsideration request for. We didn't know if it had just been ignored, or if there was something that might be done about it. So, as a last-ditch final effort, we responded to the email once again and requested help with the robots.txt issue, as we had the other times. The weekend passed, and on Monday we checked WMT again. The number of blocked pages had dropped over the weekend from 840,000 to 440,000! Success! We are still waiting and hoping that number will continue downward back to zero.

So, some thoughts:

1. Was our site manually penalized from the beginning, yet without a message in WMT? Or, when we filed the reconsideration request, did the reviewer take a closer look at our site, see the old paid links and add the penalty at that time? If the latter is the case then...
2. Did our reconsideration request backfire? Or was it ultimately for the best?
3. When asking for reconsideration, make your requests known. If you want example links, ask for them. It never hurts to ask! If you want to be connected with Google via email, ask to be!
4. If you receive an email from Google, don't be afraid to respond to it. I wouldn't overdo this or spam them. Keep it to the bare minimum and don't pester them, but if you have something pertinent to say that you have not already said, then don't be afraid to ask.

Hopefully our journey might help others who have similar issues, and feel free to ask any further questions. Thanks for reading! TheCraig
Intermediate & Advanced SEO | | TheCraig5 -
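For anyone wanting to avoid the same robots.txt mistake: a draft robots.txt can be tested against sample URLs offline before deploying it. A small sketch using Python's standard-library parser (the overly broad rule here is hypothetical, not the poster's actual file) shows how a short Disallow prefix can block far more paths than intended:

```python
from urllib.robotparser import RobotFileParser

# A seemingly narrow rule: because robots.txt Disallow matches by URL
# prefix, "Disallow: /s" blocks every path starting with /s, not just /s.
robots_txt = """\
User-agent: *
Disallow: /s
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for path in ["/shop/", "/sale/item-1", "/about/"]:
    print(path, rp.can_fetch("Googlebot", "http://example.com" + path))
# /shop/ and /sale/item-1 come back False (blocked); /about/ is True
```

Running every URL pattern from the sitemap through a check like this before pushing a robots.txt change would have caught the 950,000-page block immediately.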
How do I best optimise a page with 3 keywords that all contain 1 common word?
I am new to this so still getting to grips with a few things. I have a page here that I want to optimise for 3 keyword phrases: Towels, Egyptian Cotton Towels, Personalised Towels. http://www.towelsrus.co.uk/towels/catlist_fnct561.htm SEOmoz reports a huge number of instances of the word "towels" overall:

Title: 3
URL: 1
Meta Desc: 5
H1: 3
H2-H4: 1
Body: 83
B/Strong: 1
IMG ALT: 11
Total keyword usage for this page = 108

How could I restructure meta tags and descriptions to still rank for these terms but reduce the level of the word "towels"? Similarly, as this is a category page, how can I reduce the term "towels" in other on-page links? Also, any extra advice regarding on-page optimisation would be greatly appreciated to help our efforts.
Intermediate & Advanced SEO | | Towelsrus0 -
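As a rough way to sanity-check counts like the ones quoted above while rewriting the page, per-element keyword usage can be tallied with a few lines of Python. This is a naive whole-word count over hypothetical element text, not how SEOmoz computes its report:

```python
import re
from collections import Counter

def keyword_count(term, elements):
    """Count whole-word occurrences of `term` in each named page element."""
    pattern = re.compile(rf"\b{re.escape(term)}\b", re.IGNORECASE)
    return Counter({name: len(pattern.findall(text))
                    for name, text in elements.items()})

page = {
    "title": "Towels | Egyptian Cotton Towels | Personalised Towels",
    "h1": "Towels",
    "body": "Our towels range includes Egyptian cotton towels...",
}
counts = keyword_count("towels", page)
print(counts, sum(counts.values()))
# title: 3, h1: 1, body: 2 -- total 6
```

Re-running a check like this after each edit makes it easy to see which elements (here the body, at 83 in the real report) contribute most to the total.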
Pages that take more than 1.5 seconds to load penalized?
Hi all, I just read an article (print) about the importance of having a fast website. The author claims that all pages taking longer than 1.5 seconds to load are getting penalized in the SERPs. Speed is of course a ranking factor, but I have never heard a statement like this before. Is 1.5 seconds a guideline from Google? Can anyone say where this number comes from? Is there maybe another guideline to be followed? Thanks in advance for your comments / answers 🙂 Best regards, Kenneth Karl Nielsen
Intermediate & Advanced SEO | | KennethK0 -
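For what it's worth, Google has never published a 1.5-second penalty threshold; speed is a confirmed ranking factor, but no fixed cutoff is documented, so that number appears to be the author's own. Rather than rely on a quoted figure, you can measure your own pages. A minimal timing sketch (shown with a stand-in fetch function so it runs offline; a real audit would time actual HTTP requests or use PageSpeed-style tooling):

```python
import time

def timed(fn, *args, **kwargs):
    """Run fn and return (result, elapsed_seconds) -- a crude stand-in
    for full page-load metrics, which also include rendering time."""
    start = time.perf_counter()
    result = fn(*args, **kwargs)
    return result, time.perf_counter() - start

# Offline demo: time a stand-in for an HTTP fetch.
def fake_fetch():
    time.sleep(0.05)          # pretend the server took 50 ms
    return "<html>...</html>"

body, elapsed = timed(fake_fetch)
print(f"fetched {len(body)} bytes in {elapsed:.3f}s")
```

Server response time measured this way is only one component of what users (and Google) experience; full-page metrics need browser-based measurement.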
Should we block URLs like this - domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 - within the robots.txt?
I've recently added a campaign within the SEOmoz interface and received an alarming number of errors - roughly 9,000 - on our eCommerce website. This site was built in Magento, and we are using search-friendly URLs; however, most of our errors were duplicate content / titles due to URLs like: domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=1 and domainname/shop/leather-chairs.html?brand=244&cat=16&dir=asc&order=price&price=4. Is this hurting us in the search engines? Is rogerbot too good? What can we do to cut off bots after the ".html?" ? Any help would be much appreciated 🙂
Intermediate & Advanced SEO | | MonsterWeb280
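On cutting bots off after the ".html?": Googlebot supports `*` and `$` wildcards in robots.txt (an extension beyond the original robots.txt standard), so a rule like `Disallow: /*?` - or, more narrowly, `Disallow: /*?brand=` - blocks the parameterized variants while leaving the clean .html pages crawlable. A small sketch of that wildcard matching logic (simplified; not Google's actual implementation):

```python
import re

def robots_match(pattern, path):
    """Googlebot-style robots.txt path matching: '*' matches any run of
    characters, '$' anchors the end; otherwise it's a prefix match."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

# "Disallow: /*?" -- block every URL containing a query string.
print(robots_match("/*?", "/shop/leather-chairs.html?brand=244&cat=16"))  # True
print(robots_match("/*?", "/shop/leather-chairs.html"))                   # False
```

Note that blocked URLs can still appear in the index if linked externally; for duplicate-content cleanup specifically, rel=canonical on the parameterized pages is often preferred over robots.txt, since it consolidates signals rather than just hiding the duplicates from the crawler.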