Taking a number of existing landing pages offline because of low traffic, cannibalization and thin content
-
Hello Guys,
I decided to take about 20% of my existing landing pages offline (about 50 out of 250, which were launched about 8 months ago).
Reasons are:
-
These pages received no organic traffic at all in these 8 months
-
Often really similar landing pages exist (just a minor keyword targeting difference, and I would call the content "thin")
-
Moreover, I had some Panda issues in October. Basically, I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50. I also noticed that for some keywords, when one landing page dropped out of the top 50, another landing page climbed from 50 into the top 10 in the same week; the next week the new landing page dropped to 30, the week after that out of the top 50, and the old landing page came back to the top 20 - but not to the top ten. This all happened in October. Did anyone observe such things as well?
Those are the reasons why I came to the conclusion to take these pages offline and integrate some of the good content into the other, similar pages, so that one page targets a broader set of keywords instead of two. I hope my remaining landing pages will benefit from this. I hope you all agree?
Now to the real question:
Should I redirect all the pages I take offline? They send basically no traffic at all, and none of them should have external links, so I would not give away any link juice. Or should I just remove the URLs in Google Webmaster Tools and then take them offline? Like I said, the pages are basically dead, and personally I see no reason for these 50 redirects.
Cheers,
Heiko
-
-
If you remove a URL and allow it to 404 you can either remove it in GWT as well, or wait for them to update it. I would remove it in GWT as well just to be sure.
There is no difference whether you keep the files on the server or not, unless the redirect comes down someday for a while (even for an hour), which could result in all of those pages being reindexed. Another potential issue is if the site is available on another domain or subdomain that points to the same folder, in which case your redirects might not work on the other domain, depending on how they were written.
For these reasons, I would go ahead and remove the files from the server just to be safe. You can back them up somewhere local, or somewhere on the server outside the "public_html" folder.
-
Thanks Everett for your response; the changes are in process and I will implement them this week. But it would be even better to also remove the non-redirected URLs in Webmaster Tools, right?
A technical question about the redirected URLs: is there any difference if I leave the redirected pages on the server or if I delete them?
-
I've done this many times with good results. If a page has no traffic and no external links, just remove it and allow it to 404 so the URL gets removed from the index. If a page has traffic and/or external links, 301 redirect it to the most appropriate page on the topic. In either case, remove/update internal links, including those within sitemaps.
Simple as that.
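That rule of thumb can be sketched as a small decision helper. This is purely illustrative, not from the thread: the function name, thresholds, and example paths are all made up, and in practice the visit and link counts would come from your analytics and OSE exports.

```python
# Illustrative sketch of the rule of thumb above:
# no traffic + no external links -> let the page 404;
# relevant remaining page exists -> 301 it there;
# otherwise -> review the page manually.

def removal_action(organic_visits, external_links, similar_page=None):
    """Return ('404', None), ('301', target) or ('review', None)."""
    if organic_visits == 0 and external_links == 0:
        # Dead page: delete it and let it 404 so it drops out of the index.
        return ("404", None)
    if similar_page is not None:
        # Still has some value: consolidate into the closest remaining page.
        return ("301", similar_page)
    # Has traffic or links but no obvious target: decide manually.
    return ("review", None)

# Classify a hypothetical batch of old landing pages:
pages = [
    ("/landing/red-widgets", 0, 0, None),
    ("/landing/cheap-widgets", 40, 2, "/landing/widgets"),
]
actions = {url: removal_action(v, l, s) for url, v, l, s in pages}
```

Running the batch above would mark the first page for a 404 and the second for a 301 to its consolidated target.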
-
It all makes sense.
-
-
Well, yes, I expect that the other pages will benefit from it, because I can basically carry the good content parts over to the similar pages. Moreover, I can point more internal links at the pages that are actually ranking and generating traffic. Of course, I could just remove all internal links to the dead pages, but I no longer see any sense in their existence.
-
I know that you don't get a penalty for duplicate content. But I think it makes more sense to have one (improved) page for a topic/keyword than to have two pages, one of which is basically dead from a traffic perspective. In their whole structure the pages are just too similar apart from the "content", and even if this cannot trigger manual actions, it can lead to Panda/Hummingbird issues you will never recognize.
-
Yeah, this action has nothing to do with the dead pages, you are right; I just wanted to mention it because I interpreted it as Google testing the performance of similar pages, which can lead to long-term decreases. For me that was just another reason to merge similar pages and think more in "topical hubs". I am talking about really similar pages, e.g. for 3-word keyword phrases where just the last word differs and the content is unique but basically tells the user the same thing as the other page...
-
-
A question: if the fluctuations were due to the different pages competing with each other, shouldn't you see the pages exchange places - one goes up, the other far down, then they swap places and keep dancing?
-
Yes, that makes sense. It's also what the people at Koozai describe in the link Sheena posted.
Yet my personal SEO religion has so far dictated that I never remove pages; every time I asked myself whether I should, I came to the conclusion that it was better not to.
Let me re-check your motivations for doing so:
- These pages received no organic traffic at all in these 8 months
That's horrible, but will removing them improve anything else? Maybe, or maybe not. You can only find out by trying it (testing).
- Often really similar landing pages exist (just a minor keyword targeting difference, and I would call the content "thin")
If you are worried about a duplicate content penalty: there's no such thing. Google doesn't penalize duplicate content; Google just makes a choice, picking one among the duplicate pages to rank. Matt Cutts on that here.
If you have multiple landing pages for similar keywords with thin content, improve the content. You can find authoritative voices advocating multiple interlinked landing pages for related keywords as a perfectly white-hat LSI SEO strategy.
- Moreover I had some Panda issues in October; basically I ranked with multiple landing pages for the same keyword in the top ten, and in October many of these pages dropped out of the top 50.
I doubt your algorithmic penalization is due to those zero-traffic landing pages mentioned above; remove them and see what happens, but I bet it won't change anything. Instead, I would look honestly at the whole website and ask myself: what spammy, keyword-stuffing, nasty dirty little things did I do in the past?
-
Yes, I checked; these pages don't have external backlinks and get link juice only through internal linking. As I will change the internal linking, and the pages I take down will not get any more internal links, this shouldn't make any difference...
I just want to avoid any redirect that is not necessary, to really make sure that only pages with a relevant, similar page left get a redirect. Makes sense, right?
-
Have you checked with OSE and other tools to see the page juice/authority they may have?
-
Thanks for your opinions!
There are no manual actions against the pages, so I shouldn't have to worry about that! Like I said, most of them are generating no traffic at all (for these I cannot see a good reason to redirect rather than just delete them from the index and take them down), and some URLs are just competing against each other with quite high ranking fluctuations, which is why I want to merge these competing pages.
I guess I will redirect the pages that still have a relevant, similar page left, but not redirect pages that basically had no traffic at all in 8 months and for which no really similar page exists.
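On an Apache server, that selective approach could look something like this .htaccess sketch (mod_alias; the paths and domain are made up for illustration). Only the pages with a close remaining equivalent get a rule; the deleted pages without one simply return 404 once their files are gone:

```apache
# Hypothetical .htaccess fragment: 301 only the pages that have a close
# remaining equivalent on the site.
Redirect 301 /landing/red-widgets-berlin http://www.example.com/landing/widgets-berlin
Redirect 301 /landing/cheap-widgets-berlin http://www.example.com/landing/widgets-berlin
# No rules for the truly dead pages: once their files are deleted,
# Apache serves a 404 for them automatically.
```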
-
This article is about removing blog posts, but I think it's still relevant: http://www.koozai.com/blog/search-marketing/deleted-900-blog-posts-happened-next/
The 'removals/redirects' & 'lessons learnt' sections are particularly important to consider.
-
It's possible, but it sounds like the ranking fluctuations are likely from multiple URLs competing for the same search queries ("Often really similar landing pages exist - just minor keyword targeting difference and I would call it "thin" content") rather than poor link profiles. He didn't mention any manual penalties either.
I agree that you would not want all 50 URLs redirecting to one or even just a few URLs. Only redirect the ones that are really related to the content of the remaining pages and let the rest drop off. Also make sure you have a killer 404 page that helps users get to the right pages.
-
I'm not so sure.
Common sense tells me that pages without any Page Authority, or those that may have been penalised (or indeed not indexed) for having spammy, thin content, etc., will only pass these **negative** signals on through a 301 redirect?
Also, surely if there are as many as 250 potential landing pages all redirecting (maybe even to one single URL), it'd raise alarm bells for a crawler?
-
What you're really doing is consolidating 'orphan SEO pages' into fewer, higher-value pages - which is a specific example Google provides as a "good reason to redirect one URL to another." I would 301 the pages to their most relevant, consolidated landing pages that remain.
Hope this helps!
-
Why not redirect? If you don't, you will keep seeing them as errors in WMT, which is not a good thing. Also, returning a 410 is in theory an option, but I tried it in the past and WMT ignores it.
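For completeness, on Apache a 410 can be returned with mod_alias as well - a hypothetical fragment, keeping in mind the observation above that WMT may treat it no differently than a 404:

```apache
# Send "410 Gone" instead of a plain 404 for a removed landing page
# (hypothetical path):
Redirect gone /landing/dead-page
```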