Sites went from page 1 to page 40+ in results
-
Hello all
We are looking for any insight into why all but one of our sites were affected very badly in Google's rankings since the Panda updates.
Several of our sites have had major drops in their rankings: londonescape.com, dublinescape.com, and the Prague, Paris, Florence, Delhi and Dubai versions, among others (all escape.com URLs).
LondonEscape.net (now .com, changed after the rank drop) was ranked between 4th and 6th but is now down around 400th. DelhiEscape.net and MunichEscape.com were both number 1 for our main keywords for several years.
We also had two Stay sites, AmsterdamStay and NewYorkStay (both .com), ranked number 1 for years. NewYork has dropped to 10th place; so far the Amsterdam site has not been affected.
We are not really sure what we did wrong. MunichEscape and DelhiEscape should never have been page 1 sites (just 5 pages and a click-through to our main site, WorldEscape), but we never did anything to make them number 1.
The London, NewYork and Amsterdam sites have had regular new content added, all of it checked to make sure it's original.
**Since the rankings drop**
LondonEscape.com site
We have redirected the .net to the .com URL
Added a mountain of new articles and content
Redesigned the site / script
Got a fair few links removed from other sites, especially any with multiple links to us. A few I have not yet managed to get taken down.
So far, no improvement in rankings.
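For reference, the .net-to-.com move mentioned above is usually done with a site-wide 301. A minimal sketch, assuming an Apache host with mod_rewrite enabled (the exact rules depend on your hosting setup):

```apache
# .htaccess on the old .net host: 301-redirect every request to the
# same path on the .com domain, so each old URL maps to its new home
# rather than everything pointing at the .com home page.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?londonescape\.net$ [NC]
RewriteRule ^(.*)$ http://www.londonescape.com/$1 [R=301,L]
```

A page-by-page 301 like this (rather than a blanket redirect to the home page) is generally what preserves whatever link equity the old URLs had.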
We contacted Google, but they informed us we have NOT had a manual ban imposed on us, and we received NO emails from Google informing us we had done anything wrong.
We were hoping it was a 6-month ban, but we are way past that now.
Anyone have any ideas?
-
Are your sites cross-linked at all (I'm not seeing anything obvious)? The cascade effect across all of them almost suggests they were either (1) seen as a link network, or (2) you're using the same link strategy across all of them.
I'd watch the keyword stuffing - your titles are pretty aggressive, and then you repeat those lists in the copy at the top and bottom of the page. It's borderline at best. I don't think it's enough to get multiple sites penalized, but it may not be helping you right now.
You've got a duplicate of the London Escape home-page floating out there in the index:
http://www.londonescape.com/index.php?option=com_wesearch&task=loginVIAFacebook
Might want to add rel=canonical to the home-page, just to sweep up things like that.
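A minimal version of that tag, placed in the `<head>` of the home page (the URL shown is the obvious candidate; adjust it to whichever version you want treated as primary):

```html
<!-- Tells Google to fold parameterized duplicates
     (e.g. /index.php?option=com_wesearch&task=...) into the one
     canonical home-page URL. -->
<link rel="canonical" href="http://www.londonescape.com/" />
```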
You're indexed and I'm able to get the site ranking on very targeted keywords (exact-match and fairly unique), but you fall off the map for everything else. That's definitely indicative of a link-based penalty.
-
Hello Brian
Thanks for the comments.
We have been removing links since the site went down in the rankings, so that accounts for the slow hemorrhage of links.
We thought having the .com would be better than the .net, as was recommended by an SEO company we hired and then fired.
We had a links company get us links 5+ years back, and that is how 50% of our links have the same anchor text. Stupid, we now know, and it could well have something to do with the penalty.
Several of our other sites are in the same link boat, I guess, and also went down the rankings, but they had little content, so it is not surprising they dropped.
We never bothered with analytics, as our sites were all at the top for years. The drop came after the Penguin update; over a few days the sites all went down.
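A quick way to quantify the "50% same anchor" skew described above is to tally anchors from a backlink export. This is only a sketch: the inline sample data stands in for a CSV from whatever link tool you use, and the `anchor` field name is an assumption about that export.

```python
from collections import Counter

def anchor_distribution(rows):
    """Return each anchor text's share of total backlinks,
    case-insensitively, ordered most common first."""
    counts = Counter(row["anchor"].strip().lower() for row in rows)
    total = sum(counts.values())
    return {anchor: n / total for anchor, n in counts.most_common()}

# Sample data; in practice you'd load rows with csv.DictReader
# from your link tool's export file.
rows = ([{"anchor": "london hotels"}] * 5
        + [{"anchor": "LondonEscape"}] * 3
        + [{"anchor": "click here"}] * 2)
dist = anchor_distribution(rows)
print(dist["london hotels"])  # 0.5 -> half the profile on one phrase
```

If one commercial phrase dominates the way it does here, that is the kind of unnatural anchor profile Penguin was aimed at.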
-
Hello Charles.
Thanks for taking a look at londonescape.com; this is the correct site.
As far as I know, all our pages have meta descriptions.
Not sure what you mean by "a product description is in one place and the order button is on another page". Each property has its own page, and the order / book button is on that same page.
There is a fair bit of internal linking and all our content is 100% original text.
We asked Google and got this reply:
There's no need to file a reconsideration request for your site, because any ranking issues you may be experiencing are not related to a manual action taken by the webspam team.
Of course, there may be other issues with your site that affect your site's ranking. Google's computers determine the order of our search results using a series of formulas known as algorithms. We make hundreds of changes to our search algorithms each year, and we employ more than 200 different signals when ranking pages. As our algorithms change and as the web (including your site) changes, some fluctuation in ranking can happen as we make updates to present the best results to our users.
If you've experienced a change in ranking which you suspect may be more than a simple algorithm change, there are other things you may want to investigate as possible causes, such as a major change to your site's content, content management system, or server architecture. For example, a site may not rank well if your server stops serving pages to Googlebot, or if you've changed the URLs for a large portion of your site's pages. This article has a list of other potential reasons your site may not be doing well in search.
If you're still unable to resolve your issue, please see our Webmaster Help Forum for support.
Sincerely,
Google Search Quality Team

We are in the process of removing links we suspect may be spammy.
Many of our article pages do rank very high in Google; it's just the rankings for our main keywords that are in the 100s.
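For the links you can't get taken down, Google's disavow tool accepts a plain-text file, one entry per line. A minimal sketch (the domains shown are hypothetical examples, not your actual links):

```text
# Lines starting with # are comments.
# Disavow every link from an entire domain:
domain:spammy-directory-example.com
# Or disavow a single linking page:
http://www.example.com/spammy-links-page.html
```

Disavowing is generally treated as a last resort after removal requests fail, which matches the order you're already working in.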
-
Charles,
I took a look at the home page and two random pages of londonescape.com. All three had meta descriptions. Were you by chance referring to a different domain? Making that clear would help.
Rather than stating something is horrible, it's more helpful to state that it is difficult to use because a product description is in one place and the order button is on another page (an example from one of my own sites).
I'm looking at content pages from the main navigation of londonescape.com, such as Guest Information and Short Term Apartments, and see plenty of internal linking.