What is the best SEO strategy for discontinued products on ecommerce sites?
-
RebelsMarket.com is a marketplace for alternative fashion. We have hundreds of sellers who have listed thousands of products. Over 90% of the items do not generate any sales, and about 40% of the products have been on the website for more than three years.
We want to clean up the catalog and remove all listings older than two years that do not generate any sales. What is the best practice for removing thousands of listings from an ecommerce site? Do we 404 these products and show similar items?
Your help and thoughts are much appreciated.
-
James, I would still mark these as out of stock.
If these products don't get any organic search traffic anyway, it is OK to redirect them.
The message above was for established products that have been indexed by Google over a long period of time.
Please let me know if you have any questions. Also, if someone answers the question to your satisfaction, you should mark the comment as a good answer.
-
These are not out-of-stock products. These are items that don't sell and have not sold in years; we have listings older than five years that have no sales at all.
Would you still mark them as out of stock?
-
I have countless clients that get HUGE traffic from products that they have "discontinued".
You worked so hard to get those products to display on Google; why would you throw away all of that traffic with a 301 redirect to a different product, causing high bounce rates, or, even worse, send your visitors to a discontinued-product page?
I would simply put an "Out of Stock" notice on that product page and show related products below to direct customers to similar items, or maybe an "add to waitlist" option, so that if you decide to bring the product back you have immediate customers.
Amazon is a perfect example. For the most part, they do not delete or remove products. If you search for a product that is no longer in stock at Amazon, it will say out of stock while still allowing you to see the reviews on that product or other sellers offering similar products.
-
Hey,
If a product is out-of-stock temporarily, best practice is to link to alternative products, for example:
- Newer models or versions.
- Similar products from other brands.
- Other products in the same category that match in quality and price.
- The same product in different colours.
This provides a good service to customers and helps search engines find and understand related pages more easily.
If a product is out-of-stock permanently, there are three main options.
1. Product returns a 410 Gone (or 404 Not Found) status.
Google understands that 410 and 404 pages are inevitable, but the problem with creating too many of them is that it reduces the time search engine crawlers will spend visiting the pages that actually should rank. If this option is implemented, there should ideally be signposts to related products on the Not Found page.
2. 301 permanently redirect the old product to an existing product (e.g. a newer version or close alternative).
A dynamically generated message should clearly display on the page, e.g. "Product X is no longer available. This is a similar product/the replacement product." This option is recommended only if redirect chains can be minimised; for example, if product turnover is high, the following could happen in a short timeframe:
- Product 1 no longer exists and gets 301 redirected to Product 2.
- Product 2 no longer exists and gets 301 redirected to Product 3.
- Now a redirect chain exists: Product 1 redirects to Product 2 which then redirects to Product 3. Product 1 would need to be updated to redirect to Product 3, without the intermediate redirect to Product 2.
3. 301 permanently redirect the old product to its parent category.
A dynamically generated message should clearly display on the page, e.g. "Product X is no longer available. Please see similar products below."
As categories are likely to change less often than products, this is potentially easier to implement than option 2.
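The chain fix described under option 2 can be automated when listings are purged in bulk. Here is a minimal sketch (hypothetical URL paths, with a plain Python dict standing in for the redirect table) that flattens chains so every retired URL points straight at its final destination before the 301 rules are deployed:

```python
def flatten_redirects(redirects):
    """Resolve redirect chains so each old URL gets a single 301 hop.

    `redirects` maps old URL -> new URL; entries may chain.
    """
    flat = {}
    for source in redirects:
        target = redirects[source]
        seen = {source}
        # Follow the chain until we reach a URL that is not itself redirected.
        while target in redirects:
            if target in seen:  # guard against accidental redirect loops
                raise ValueError(f"redirect loop via {target}")
            seen.add(target)
            target = redirects[target]
        flat[source] = target
    return flat

# Hypothetical example matching the Product 1 -> 2 -> 3 scenario above:
rules = {
    "/product-1": "/product-2",
    "/product-2": "/product-3",
}
print(flatten_redirects(rules))
# {'/product-1': '/product-3', '/product-2': '/product-3'}
```

Running this once over the full redirect table before publishing the rules keeps every old URL one hop from its destination, which preserves the most link equity.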
-
I'd 301 redirect the discontinued lines to the main section pages, so
https://www.domain.com/product-type/a-red-sweater
would redirect to
https://www.domain.com/product-type/
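If that rule is applied sitewide, the target can be derived from the product URL itself by dropping the final path segment. A minimal sketch, assuming the `/product-type/product-slug` structure shown above:

```python
from urllib.parse import urlsplit, urlunsplit

def category_url(product_url):
    """Derive the parent section URL by dropping the last path segment.

    Assumes URLs follow the /product-type/product-slug pattern above.
    """
    parts = urlsplit(product_url)
    # Strip the product slug, keep the trailing slash on the section path.
    parent_path = parts.path.rstrip("/").rsplit("/", 1)[0] + "/"
    return urlunsplit((parts.scheme, parts.netloc, parent_path, "", ""))

print(category_url("https://www.domain.com/product-type/a-red-sweater"))
# https://www.domain.com/product-type/
```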
-
Can't speak for everyone, but I had this same thing come up with our ecommerce website. We added a feature to our ecommerce store that allowed us to "discontinue" a product, meaning that we removed it from site search and category listings. However, if you visited the page by direct URL, the product page would still load, say "discontinued", and display a list of related products in the hope that the customer would not bounce.
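That behaviour boils down to a small routing rule: serve the page normally for old links, but pull it out of internal discovery. A sketch with hypothetical field names, not tied to any particular platform:

```python
def product_response(product):
    """Decide how a product URL behaves, given a 'discontinued' flag.

    Discontinued items stay reachable by direct URL (HTTP 200) but are
    excluded from site search, category listings, and the XML sitemap,
    and show related products to reduce bounces.
    """
    if product.get("discontinued"):
        return {
            "status": 200,             # page still loads for old links
            "listed_in_search": False, # hidden from internal search
            "in_sitemap": False,       # not offered to crawlers
            "show_related": True,      # related products to catch the visitor
        }
    return {
        "status": 200,
        "listed_in_search": True,
        "in_sitemap": True,
        "show_related": False,
    }

print(product_response({"sku": "RS-100", "discontinued": True})["in_sitemap"])
# False
```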