I redesigned a client's website and there has been a pretty massive drop in traffic, despite my efforts to significantly improve SEO.
-
Hi there,
I redesigned a client's website that was very old-fashioned and not responsive. I implemented 301 redirects and kept the content and internal linking structure very similar. The only things I changed were making the website responsive, improving the title tags, adding a bit more information, and improving the footer and H1 tags.
However, although clicks are fairly similar, search impressions have dropped about 60% on average over the past week.
The old site had some keywords linking to pages with no new content, so I removed those, as they seemed like black hat SEO tricks. There was also a huge list of "locations we deliver to" on the homepage, followed by around 500 cities/towns, which I removed. Could this be the cause of the drop? I assumed those would do more harm than good.
I'm fairly new to SEO, as you can probably tell. I'm looking for advice on what the cause may be and what steps I should take now.
Thanks for reading!
-
@badgergravling Can you tell me whether adding a forum would give any extra benefit to my main website's popularity?
-
Thanks for your reply, Dan.
It's an international site. Our target audience is in India, Middle Eastern countries, and East Asia.
Google has indexed the https version and it looks good, but even after all the on-page optimization, traffic is not going up; however, we are ranking for more keywords than ever.
I've attached a screenshot of traffic stats from January to June, if that will help pinpoint issues.
Thanks again
-
Hey Logan,
We did a redesign and migration from as.net to WordPress. We launched in February 2017 and are still struggling to get back to where we were: almost an 80% traffic loss.
Are there any specific pointers that get missed during these kinds of migrations? Is it normal to lose this much traffic?
Please help
-
Is it a small, local-type business, by any chance?
Despite what a lot of best-practice advice says about location pages, they still work for a lot of small, local businesses, particularly in uncompetitive niches. Generally, I'd say 500 cities/towns is massively overdoing it, and there are far less spammy ways of achieving something similar, but it may well have contributed to the hit...
If the clicks are the same, then it's actually fine: no one was visiting and buying via the impressions you've lost. However, I'd say it's worth looking at the location/local SEO optimisation again and redoing it in a less spammy way, so your client won't panic if they ever look at impressions in Google Search Console (pretty unlikely for most clients)...
-
Hi Logan,
Thanks for the response! Haha. That's what I thought originally; however, I read that a drop of around 5% is normal, so 50%+ had my alarm bells ringing! (Although the website only gets a small amount of traffic.)
There were a few other factors that worried me, like the old site using .htm URLs, but I 301-redirected every page to the new .html ones.
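For context, a blanket .htm-to-.html redirect like that can be done with a rule roughly like this (a sketch assuming an Apache server with mod_rewrite; my actual setup may differ):

```apache
# Sketch: 301-redirect every .htm URL to its .html counterpart.
# Requires mod_rewrite; place in .htaccess or the vhost config.
RewriteEngine On
RewriteRule ^(.*)\.htm$ /$1.html [R=301,L]
```

A rule like this avoids maintaining hundreds of one-off redirects, as long as every page really did keep the same path and only changed its extension.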
I will try not to worry and hopefully things will improve within the next week or so.
Thanks again.
-
Hi,
Patience, young padawan; it's only been a week!
It's basically impossible to redesign a website without some organic traffic loss. There's a lot for search engines to consider: everything they previously knew about your website is gone, and they have to relearn your site architecture, URL structure, content, metadata, etc.
It sounds like you covered all the most important bases and got rid of some old-school tactics. Give it some time; you should start to see a rebound in a few weeks. The worst thing you can do is panic and try to fix a bunch of stuff that isn't actually broken.
Related Questions
-
WordPress Tags and SEO
Good morning everyone,
I am trying to decide how to handle an issue on two WordPress websites. I recently acquired two new clients that used to do business with the same SEO company. Neither client knows of the other, but both had the same story about said SEO company: the usual complaints, which I won't get into.
My issue is that the old SEO company was basically spamming keywords and utilized tags to do this. For each of these clients, they had very thin, spammy blog posts written and then used a multitude of spammy tags as keywords. Here is an example: https://captainjacksboatingschool.com/middlesex-county-boating-safety-class/
Each one of these tags is creating duplicate content. How do I properly handle these tags? Do I delete them? Do I need to redirect them to one main page after deletion? I would much rather use plain-English, authority-based categories. In fact, I never use tags, only categories; they do not seem to have much SEO value. Both clients who were with this company have the same tag setup...
Any advice would be greatly appreciated, as I do not want to lose the customers' current rank just because I want to do things my way.
Thanks,
Don Silvernail
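If the tag archives do get deleted, one option the question raises is redirecting them all to a single main page. A sketch of what that could look like, assuming Apache with mod_rewrite and a hypothetical /blog/ target (a WordPress redirect plugin can achieve the same thing):

```apache
# Sketch: after deleting the spammy tag archives, 301-redirect
# any /tag/... URL to one main page so the old indexed tag URLs
# don't return 404s. Requires mod_rewrite.
RewriteEngine On
RewriteRule ^tag/ /blog/ [R=301,L]
```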
White Hat / Black Hat SEO | donsilvernail
-
Traffic going down in all sites in a niche
Hello,
A client has three ecommerce sites in a niche. Because of competition and a (possibly) non-manual penalty due to doorways and paid links (though I think it's mainly competition), our traffic is going down. What are the keys to increasing traffic at this point? Feel free to include tactics that cost money. Ahrefs (I love Moz though!) has some neat content tricks. Please give me the best tricks in the industry to increase traffic.
We're adding content to the main site of the three, and maybe that's what to focus on, but we're having trouble driving serious traffic with the content. We need serious traffic. We are experts in our field and capable of almost anything as far as information goes in our field.
Thanks.
White Hat / Black Hat SEO | BobGW
-
Competitor website: how do they get away with it?
Hi, we have been looking at competitors' websites to see how we can improve, and this website jumped out at me straight away as having spammy gateway pages, where three words were the only difference across all of the pages. Why does Google give them so much weight and still rank them so highly? I thought this is what Google was trying to avoid. Am I missing something here in terms of a great SEO opportunity? I checked for noindex or canonical tags and cannot see any. I'd love to hear some feedback. Cheers
White Hat / Black Hat SEO | PottyScotty
-
Lots of websites have copied the original content from my own website; what should I do?
1. Should I ask them to remove and replace the content with their own unique, original content?
2. Should I ask them to link to the URL where the original content is located?
3. Should I use a tool to easily track these "copycat" sites and automatically add links from their sites to mine?
Thanks in advance!
White Hat / Black Hat SEO | esiow2013
-
My site www.sriindustries.com dropped to the back pages after Penguin 2.1
After Penguin 2.1, my website dropped back to page 5 and beyond. Can you help me recover from this? It's breaking my head. I would also like to know how to rank at the top for local business searches (maps).
White Hat / Black Hat SEO | synchronyinfo
-
Negative SEO and when to use the Disavow tool?
Hi guys,
I was hoping someone could help me with a problem that has arisen on the site I look after. This is my first SEO job and I've had it about 6 months now. I think I've been doing the right things so far: building quality links from reputable sites with good DA, working with bloggers to push our products, and only signing up to directories in our niche. So our backlink profile is very specific, with few spammy links.
Over the last week, however, we have received a huge increase in backlinks, which has almost doubled our total of linking domains. I've checked the links in Webmaster Tools and they are mainly directories or web-stat websites like these: siteinfo.org.uk, deperu.com, alestat.com, domaintools.com, detroitwebdirectory.com, ukdata.com, stuffgate.com.
We've also just launched a new initiative where we will be producing totally new, good-quality content 4-5 times a week, and many of these new links are pointing to that page, which looks very suspicious to me. Does this look like negative SEO to anyone?
I've read a lot about the disavow tool, and opinions seem split on when to use it, so I was wondering if anyone has advice on whether to use it or not. It's easy for me to identify what these new links are, yet some of them have decent DA, so will they do any harm anyway?
I've also checked the referring anchors on Ahrefs, and over 50% of my anchor-term cloud is now terms totally unrelated to my site. This has happened over the last week, which also worries me. I haven't seen any negative impact on rankings yet, but if this carries on it will destroy my link profile.
So would it be wise to disavow these links as they come through, or to wait and see if they actually have an impact? It should be obvious to Google that there has been a huge spike in links, so the question is whether they would be ignored or whether I would be penalised.
Any ideas? Thanks in advance,
Richard
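For reference, the file the disavow tool accepts is just a plain-text list. A hypothetical sketch using a couple of the domains mentioned above (one `domain:` entry or full URL per line; lines starting with # are comments):

```text
# Hypothetical disavow.txt sketch (format per Google's disavow tool).
# Disavow everything from these linking domains:
domain:siteinfo.org.uk
domain:alestat.com
# Or disavow a single page (example URL, not a real link from this profile):
http://www.example.com/spammy-directory-page.html
```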
White Hat / Black Hat SEO | Rich_995
-
Dust.js Client-side JavaScript Templates & SEO
I work for a commerce company and our IT team is pushing to switch our JSP server-side templates over to client-side templates using a JavaScript library called Dust.js.
Dust.js is a client-side templating solution that separates the presentation layer from the data layer. The problem with front-end solutions like this is that they are not SEO-friendly, because all the content is served up with JavaScript. Dust.js has the ability to render your client-side content server-side if it detects Googlebot or a browser with JavaScript turned off, but I'm not sold on this being "safe".
Read about LinkedIn switching over to Dust.js: http://engineering.linkedin.com/frontend/leaving-jsps-dust-moving-linkedin-dustjs-client-side-templates http://engineering.linkedin.com/frontend/client-side-templating-throwdown-mustache-handlebars-dustjs-and-more
An explanation of this: "Dust.js server side support: if you have a client that can't execute JavaScript, such as a search engine crawler, a page must be rendered server side. Once written, the same dust.js template can be rendered not only in the browser, but also on the server using node.js or Rhino."
Basically, what would happen on the backend of our site is that we would detect the user-agent of all traffic, and once we found a search bot, serve our web pages server-side instead of client-side so the bots can index our site. Server-side and client-side content will be identical, and there will be NO black hat cloaking going on. But this technique is cloaking, right?
From Wikipedia: "Cloaking is a SEO technique in which the content presented to the search engine spider is different from that presented to the user's browser. This is done by delivering content based on the IP addresses or the User-Agent HTTP header of the user requesting the page. When a user is identified as a search engine spider, a server-side script delivers a different version of the web page, one that contains content not present on the visible page, or that is present but not searchable."
Matt Cutts on cloaking: http://support.google.com/webmasters/bin/answer.py?hl=en&answer=66355
Like I said, our content will be the same, but if you read the very last sentence from Wikipedia, it's the "present but not searchable" part that gets me. If our content is the same, are we cloaking? Should we be developing our site like this for ease of development and performance? Do you think client-side templates with server-side fallbacks are safe from getting us kicked out of search engines?
Thank you in advance for ANY help with this!
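To make the user-agent branching concrete, here is a minimal sketch of the kind of check described above. The function names are hypothetical and the bot pattern is illustrative, not exhaustive; real deployments often also verify crawler IPs via reverse DNS rather than trusting the header alone:

```javascript
// Minimal sketch: decide whether a request should get the pre-rendered
// (server-side) page or the client-side Dust.js shell, based on the
// User-Agent header. BOT_PATTERN is an illustrative, partial list.
const BOT_PATTERN = /googlebot|bingbot|yandexbot|baiduspider|slurp/i;

function isSearchBot(userAgent) {
  // Treat a missing header as a normal browser.
  return BOT_PATTERN.test(userAgent || "");
}

// In a Node.js request handler, the branch might look like:
//   if (isSearchBot(req.headers["user-agent"])) {
//     res.end(renderDustTemplateServerSide(page)); // same template, same data
//   } else {
//     res.end(clientSideShell(page)); // browser renders the Dust.js template
//   }
```

The key point for the cloaking question is that both branches render the same template with the same data; only where the rendering happens differs.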
White Hat / Black Hat SEO | Bodybuilding.com
-
How do you remove unwanted links, built by your previous SEO company?
We dropped significantly after the Penguin update (from page 1 for four keywords to ranking beyond 75 for all of them). I understand that trustworthy content and links (along with site structure) are the big reasons for staying strong through the update, and that sites that got these things wrong were penalized. In an effort to regain Google's trust, we are reviewing our site structure and making sure to produce fresh, relevant content on our site and social media channels every week. But how do we remove links that were built by our SEO company, some of which point from untrustworthy or irrelevant sites with low rankings? Should we try to email the webmasters of those sites (using data from Open Site Explorer)?
White Hat / Black Hat SEO | clairerichards