Is there a way to contact Google besides the Google product forum?
-
Our traffic from Google has dropped more than 35% and continues to fall. We have been on this forum and Google's webmaster forum trying to get help. We received great advice and have waited months, but instead of our traffic improving, it has worsened. We are being penalized by Google for many keywords, such as "trophies," "trophies and awards," and countless others - we were on page one previously.
We filed two reconsideration requests and were told both times that there were no manual penalties.
Some of our pages continue to rank well, so it is not across the board (but all of our listings went down a bit).
We have made countless changes (please see below).
Our busy season was from March to May and we got clobbered. Google, as most people know, is a monopoly when it comes to traffic, so we are getting killed.
At first we thought it was Penguin, but it looks like we started getting hit late last year. Lots of unusual things happened - we had a large spike in traffic for two days, then lost our branded keywords, then our main keywords. Our branded keywords came back pretty quickly, but nothing else did.
We have received wonderful advice and made most of the changes. We are a very reputable company and have a feeling we are being penalized for something other than spamming. For example, we added a mobile site late last year, and a wholesale system was added around the same time. Since the timing does not coincide with Penguin, we think there is some major technical driver, but we have no idea what to do at this point.
The webmasters have all been helpful, but nothing is working. We are trying to find out what one does in a situation like this, as we are trying to avoid closing our business.
Thank you!
Changes Made:
1. We had many crawl errors, so we reduced them significantly.
2. We had introduced a mobile website in January, which we thought may have been the cause (splitting traffic, duplicate content, etc.), so we had our mobile provider block the site in their robots.txt file.
3. We were told by a webmaster that there were too many links from our search provider, so we had them block the search pages in a robots.txt file.
4. We were told that we had too much duplicate content. This was / is true, as we have hundreds of legitimate products that are similar: for example, trophies and certificates that are virtually the same but are for different sports or come in different colors and sizes. Still, we added more content and added noindex tags to many products. We compared our percentage of duplicates to competitors' and it is far less.
5. At the recommendation of another webmaster, we changed many pages that might have been splitting traffic.
6. Another webmaster told us that too many people were linking into our site with the same anchor text, namely "Trophy Central," and that it might have appeared we were trying to game the system somehow. We have never bought links and don't even have a webmaster, although over the last 10 years we have worked with programmers and SEO companies (but we don't think any have done anything unusual).
7. At the suggestion of another webmaster, we have tried to improve our link profile. For example, we found Yahoo was not linking to our URL.
8. We were told to set up a 404 page, so we did.
9. We were told to ensure that all of the similar domains were pointing to www.trophycentral.com/, so we set up redirects.
10. We were told that one site was linking to us from too many places, so we reduced it to a single link.
Our key pages have A rankings from SEOMOZ for the selected keywords.
We have made countless other changes recommended by experts but have seen no improvements (things actually got worse). I am the president of the company and have made most of the above recent changes myself.
Our website is trophycentral.com
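For anyone following along, the robots.txt changes described in items 2 and 3 above generally look something like the sketch below. The hostnames and paths here are hypothetical, not taken from the actual site:

```
# robots.txt served on the mobile subdomain (hypothetical: m.trophycentral.com)
# -- blocks crawling of the entire mobile site
User-agent: *
Disallow: /

# robots.txt on the main site -- blocks the search provider's result pages
# (hypothetical path)
User-agent: *
Disallow: /search/
```

Worth noting: Disallow prevents crawling but not necessarily indexing of already-known URLs, so for duplicate-content concerns a noindex tag or rel=canonical is often the safer tool.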
-
Hi, thanks for your comments. Yes, I posted many questions on the Google forum and got some good suggestions, but so far no luck. I'll take a look at the keywords...
Thanks.
-
Hi, it does sound like a problem similar to one a client of mine had.
We submitted all their dodgy backlink sites to Google webmaster tools but there's no way of contacting Google to see when these sites will be de-indexed.
Have you tried posting within the webmaster tools forum?
I also noticed you're still using meta keywords; although this isn't harming your site, it could be perceived as keyword stuffing.
Related Questions
-
What's up with the last Google update?
I have numerous clients who were at the top of the page in the top 3 spots. They all dropped to page 2, 3, or 4, and now they are number 1 in maps or in the top 3 there. Content is great on all these sites. Backlinks are high quality, and we do not build in high quantity; we always focus on quality. The sites have authorship information and trust. We have excellent content written by professionals in the industry for each of the websites. The sites load super fast, they are very mobile friendly, and we have a CDN installed. Content is organized per topic. All of our citations are set up properly, with no duplicate or missing citations. Code is good on the websites. We do not have anchor text links pointing to the site from guest posts or whatever. We have plenty of content. Our DA/PA is great. Audits of the websites are great. I've been doing this a long time and I've never been so dumbfounded as to what Google has done this time - or better yet, what exactly is wrong with our clients' websites today that was working perfectly for the last 5 years. I really am getting frustrated. I'm comparing my sites to competitors and everything's better. Please someone guide me here and tell me what I'm missing, or tell me what you have done to recover from this nonsense.
Intermediate & Advanced SEO | | waqid0 -
Best way to link to 1000 city landing pages from the index page in a way that Google follows/crawls these links (without building country pages)?
Currently we have direct links to the top 100 country and city landing pages on the index page of our root domain. I would like to add, on the index page for each country, a "more cities" link which then dynamically loads (without reloading the page and without redirecting to another page) a list of links to all cities in that country. I do not want to dilute "link juice" away from my top 100 country and city landing pages on the index page, but I would still like Google to be able to crawl and follow these links to the cities that I load dynamically later. In this particular case the typical site hierarchy of country pages with links to all cities is not an option. Any recommendations on how best to implement this?
Intermediate & Advanced SEO | | lcourse0
-
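For the "more cities" question above, one pattern worth considering is to ship all city links in the initial HTML but keep them collapsed behind the "more cities" control: the user sees no reload or redirect, while crawlers still find plain anchor links in the source. A minimal sketch with hypothetical URLs and ids (note this does not resolve the link-equity concern, since links present in the HTML are still links):

```html
<!-- All city links exist in the initial HTML, so Googlebot can crawl them;
     JavaScript only toggles their visibility. Hypothetical URLs. -->
<div id="cities-germany" hidden>
  <a href="/germany/berlin">Berlin</a>
  <a href="/germany/munich">Munich</a>
  <!-- ...remaining cities... -->
</div>
<button type="button"
        onclick="document.getElementById('cities-germany').removeAttribute('hidden')">
  more cities
</button>
```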
Is it possible to submit an XML sitemap to Google without using Google Search Console?
We have a client that will not grant us access to their Google Search Console (don't ask us why). Is there any way to submit an XML sitemap to Google without using GSC? Thanks
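One method that doesn't require Search Console at all is the Sitemap: directive in robots.txt, which any crawler that reads the file (Google included) uses to discover the sitemap. Example with a placeholder domain:

```
# robots.txt -- the Sitemap directive may appear anywhere in the file
# and must use the sitemap's full URL
Sitemap: https://www.example.com/sitemap.xml
```

Google has also historically accepted an HTTP ping of the form http://www.google.com/ping?sitemap=<sitemap-url>, though the robots.txt route is the more durable of the two.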
Intermediate & Advanced SEO | | RosemaryB0 -
With the new Panda update supposedly only weeks away, is it wise to noindex the products I have not had time to rewrite the product descriptions for?
Hi Mozzers, I read on SEJ yesterday that the Panda update was apparently due in 2-4 weeks. I still have a large number of products that I have not gotten around to writing unique product descriptions for. I know these product descriptions are duplicated on other affiliate sites, so in light of the coming Panda update, would it be wise to put a noindex meta tag on these product pages until I get around to rewriting the descriptions? That way I may not get hit by Panda, and it will buy me a bit more time. Just an idea, but I thought I'd run it by you. Thanks, Pete
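If it helps, the tag in question is a one-liner in each affected product page's head; the "follow" variant keeps the page's links crawlable while the page itself sits out of the index:

```html
<!-- Keep this page out of Google's index, but still follow its links -->
<meta name="robots" content="noindex, follow">
```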
Intermediate & Advanced SEO | | PeteC120 -
Google indexing wrong pages
We have a variety of issues at the moment and need some advice. First off, we have a HUGE indexing issue across our entire website. Website in question: http://www.localsearch.com.au/
Firstly: in Google.com.au, if you search for 'plumbers gosford' (https://www.google.com.au/#q=plumbers+gosford), the wrong page appears - in this instance, the page that should be ranking is http://www.localsearch.com.au/Gosford,NSW/Plumbers. I can see this across the board, across multiple locations.
Secondly: recently I've seen Google reporting URLs such as the following in 'Crawl Errors' in Webmaster Tools:
http://www.localsearch.com.au/Saunders-Beach,QLD/Electronic-Equipment-Sales-Repairs&Sa=U&Ei=xs-XVJzAA9T_YQSMgIHQCw&Ved=0CIMBEBYwEg&Usg=AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA
This is an invalid URL, and more specifically, those query strings seem to be referrer queries from Google themselves: &Sa=U&Ei=xs-XVJzAA9T_YQSMgIHQCw&Ved=0CIMBEBYwEg&Usg=AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA
Here's the above example indexed in Google: https://www.google.com.au/#q="AFQjCNHXPrZZg0JU3O4yTGjWbijon1Q8OA"
Does anyone have any advice on those 2 errors?
Intermediate & Advanced SEO | | localdirectories0
-
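On the second issue in the question above: if the server happens to return page content for those malformed URLs rather than a 404, a self-referencing canonical tag is a common defense, pointing any crawled variant with stray parameters back at the clean URL. Sketch using the URL from the question:

```html
<!-- In the <head> of the clean directory page -->
<link rel="canonical"
      href="http://www.localsearch.com.au/Saunders-Beach,QLD/Electronic-Equipment-Sales-Repairs" />
```

If the malformed URLs already return 404, they will drop out of the index on their own and the Crawl Errors entries are largely cosmetic.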
Googlebot vs. Google mobile bot
Hi everyone 🙂 I seriously hope you can come up with an idea for a solution to the problem below, cause I am kinda stuck 😕 Situation: a client of mine has a webshop located on a hosted server. The shop is made in a closed CMS, meaning that I have very limited options for changing the code: limited access to the page head, and within the CMS I can only use JavaScript and HTML. The only place I have access to a server-side language is in the root, where a Default.asp file redirects the visitor to a specific folder where the webshop is located. The webshop has 2 "languages"/store views: one for normal browsers and Googlebot, and one for mobile browsers and the Google mobile bot. In the Default.asp (ASP classic) I do a test for user agent and redirect the user to either the main domain or the mobile subdomain. All good, right? Unfortunately not. Now we arrive at the core of the problem. Since the mobile shop was added at a later date, Google already had most of the pages from the shop in its index, and apparently uses them as entrance pages to crawl the site with the mobile bot. Hence it never sees the Default.asp (or outright ignores it), and this causes, as you might have guessed, a huge pile of duplicate content. Normally you would just place some user-agent detection in the page head and either throw Google a 301 or a rel=canonical. But since I only have access to JavaScript and HTML in the page head, this cannot be done. I'm kinda running out of options quickly, so if anyone has an idea as to how the BEEP I get Google to index the right domains for the right devices, please feel free to comment. 🙂 Any and all ideas are more than welcome.
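Since HTML in the page head is available, one avenue worth exploring (assuming the desktop and mobile pages map one-to-one) is Google's documented annotations for separate mobile URLs: the desktop page declares its mobile alternate, and the mobile page canonicalizes back to the desktop page. Hypothetical URLs:

```html
<!-- On the desktop page (www.example.com/product) -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="http://m.example.com/product">

<!-- On the corresponding mobile page (m.example.com/product) -->
<link rel="canonical" href="http://www.example.com/product">
```

This tells Google the two URLs are one document served to different devices, which addresses the duplicate-content half of the problem without requiring server-side access.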
Intermediate & Advanced SEO | | ReneReinholdt0 -
Google Places not appearing
Is it possible to be sandboxed for a Google Places page? One of our clinics has a Places page, and it was doing fine (http://www.google.com/maps/place?cid=5542269234389030356), but now whenever we set our location to Trinity, FL and try to search for weight loss, weight loss trinity, etc., it doesn't come up. It only comes up if we search medi weight loss trinity. Also, when we go into our Google Places dashboard and try to edit the pictures, it doesn't show the same pictures as on the actual location's page. For example, in our dashboard we have 5 pictures, but on the actual Places page, 3 pictures are showing (none of which are in our dashboard). Any ideas?
Intermediate & Advanced SEO | | AustinBarton0 -
Block Google Sitelinks for DSEO?
I am trying to manage DSEO for a client. The question is: would blocking a page from my client's Google Sitelinks cause that blocked sitelink page to be independently listed in the rankings, and therefore potentially push a negative listing further down the rankings? Or would the blocked sitelink not show up at all in the SERPs?
Intermediate & Advanced SEO | | bcmull0