Forced to remove Categories with high volume & revenue
-
Hi everyone,
I've been forced to remove level 4 & 5 categories (e.g. example.com/level-2/level-3**/level-4/level-5/**) from our website, even though they get plenty of traffic and revenue and rank for some of our keywords. The argument is that customers were using refinements/filters more than clicking into categories, and a new backend system is coming into the business, so these need to be removed anyway.
We've done this before and seen a drop in visibility, revenue & traffic in these areas, but we're going ahead with another batch of removals anyway. I was wondering if anyone has experience fixing a problem like this? I've been told the categories will not be returning and that I have to 301 them, so I need to find a workaround to become eligible to rank for these keywords again.
I've been looking at using the refinements to mimic a category page (when a refinement is clicked: change the URL to a clean one, update the page title, meta description and H1, and remove the text from the core page), but I'm not sure what knock-on effects this will have, or if it will even work!
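To make the idea a bit more concrete, here's a rough sketch of what I have in mind (all category names, URLs and titles are invented for illustration): essentially a whitelist of refinement combinations that get promoted to clean, category-style URLs with their own unique metadata, while every other filter stays as a normal, non-indexed refinement.

```python
# Hypothetical sketch only: expose selected filter/refinement
# combinations as clean, category-style URLs with unique metadata.
# All names below are invented examples, not our real categories.

FILTER_PAGES = {
    # (parent category, refinement) -> clean URL plus unique metadata
    ("garden-furniture", "material=rattan"): {
        "url": "/garden-furniture/rattan/",
        "title": "Rattan Garden Furniture | Example Store",
        "h1": "Rattan Garden Furniture",
    },
}

def resolve_refinement(category: str, refinement: str):
    """Return indexable page metadata if this refinement combination is
    one we want to treat as a sub-category page; otherwise return None
    and leave it as an ordinary, non-indexed filter."""
    return FILTER_PAGES.get((category, refinement))
```

The point of the whitelist is that only deliberately chosen combinations get clean URLs; the long tail of filter permutations never becomes indexable, which should limit crawl-bloat side effects.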
Hope you can help! I've probably missed some details, so let me know if you need more info!
Thanks
-
Very hard to prove these things before they're done - good luck with getting buy-in for what you need to do and in undoing the worst of the damage.
-
Thanks Will! Yep, that sounds similar to what I've sent on to Development, where the filters actually become those sub-category pages. Unfortunately they think it's going to be a huge amount of work, so now I need to show the value of creating these pages before they start working on it. From the macro point of view, unfortunately, I had no choice but to redirect, and those redirects are all in place now. It's painful to do when you know it's going to damage performance, and after a couple of weeks the stats suggest it already has.
But it's great to have your feedback; it will definitely add weight to my pitch to get those filters working for us! The top-level idea might actually be a great workaround for now too!
-
Hi Frankie,
Sorry for the slow reply to this one. I hope it's still relevant to offer some thoughts.
First, at the top level, I would say that the stated reasons don't necessarily mean that you should not have the kinds of pages you describe. My first preference would be to modify the functionality so that the filters you describe users actually using are those sub-category pages. Even if this meant changing URLs (and hence 301 redirecting the pages you currently have), it is possible to have filter / facet pages be indexable and have unique URLs and meta information.
If that's not possible for whatever reason, I would separate my efforts into the micro and the macro:
- Micro: apply an 80:20 or 90:10 rule to the pages you are losing - find the small number of most important, highest traffic / conversion pages and find a way to keep versions of them (again, even if you have to 301 redirect the current URLs, you could recreate them as static content pages targeting those keywords if you had to)
- Macro: where you simply have no choice but to lose these pages, I think your best bet will be to redirect them to the absolute best (/ next best!) page on the site for those queries - these might be other (sub-)category pages, or they might be individual products or content pages, but at least for the highest-traffic end, it'd be worth specific research effort to identify the best redirect targets
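To make the macro point concrete: the redirect work really is just a hand-curated map from each removed URL to the next-best remaining page, which might be a parent category, a product, or a content page depending on the queries each URL ranked for. A toy sketch (all URLs here are invented):

```python
# Hypothetical one-to-one redirect map for removed level-4/5 URLs.
# Each target is the best remaining page for that URL's queries;
# sometimes that's the parent category, sometimes a content page.
REDIRECTS = {
    "/garden/furniture/chairs/recliners/": "/garden/furniture/chairs/",
    "/garden/furniture/chairs/deck-chairs/": "/content/deck-chair-buying-guide/",
}

def redirect_for(path: str):
    """Return (status, location) for a removed URL, or None if the
    path is not one of the removed pages."""
    target = REDIRECTS.get(path)
    return (301, target) if target else None
```

The key discipline is one researched target per removed URL rather than a blanket rule sending everything to the homepage or a single parent, which tends to be treated like a soft 404.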
One final thought: it's not always the case that the URL has to represent every level of the hierarchy. I don't know your underlying technology, but it might be possible to recreate some of these sub-categories as top-level categories if your CMS allows products to be in more than one category at once. I wrote this article about the difference between URL structures and site architecture that might give more clarity on what I mean here.