Should You Keep Out-of-Stock Items Active on Your Site?
-
If you have sold-out products that will never come back in stock, should you remove those items and their URLs from your sitemap and site, or should you keep them active with a sold-out image? The reasoning for keeping them would be that search engines see your site as larger because of the extra products and URLs.
-
Also, it's critical to remember that a person who bought the product in the past may want to revisit the page for some reason (to see descriptions and such).
-
Before you keep these types of pages, you should determine whether they have any links or any traffic. If they are pulling traffic that you would not otherwise receive, you can use the page to tell the history of the item (such as an antique or other one-of-a-kind piece).
If it is a standard product, such as a pair of running shoes that has been replaced by a different model, you can make the page informative: explain that the item was replaced by a new model and describe the improvements that were made.
Both of the above showcase your helpfulness and knowledge.
However, if the page has no links and pulls no traffic, remove it and redirect its URL to the most relevant live page. You don't need useless pages on your site.
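To illustrate that remove-and-redirect step, here is a minimal sketch assuming an Express-based storefront; the retiredProducts map and the URL paths in it are hypothetical examples, not anything from the posts above.

```typescript
// Sketch: permanently redirect retired product URLs to the most
// relevant live page. Assumes Express; all paths are hypothetical.
import express from "express";

const app = express();

// Map each retired product URL to a direct replacement if one
// exists, otherwise to its category page.
const retiredProducts: Record<string, string> = {
  "/products/runner-pro-2019": "/products/runner-pro-2021", // replaced model
  "/products/limited-tee-blue": "/category/t-shirts", // no replacement
};

app.use((req, res, next) => {
  const target = retiredProducts[req.path];
  if (target) {
    // A 301 signals a permanent move, so search engines can pass
    // any link equity from the old URL to the target page.
    res.redirect(301, target);
    return;
  }
  next();
});

app.listen(3000);
```

The same mapping works at the web-server or CDN level; the point is that the redirect is permanent (301) rather than temporary (302).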
-
I would keep the sold-out items on my site and add a "we also recommend" section below the item on the product page to show the client that you have items similar to the one that is sold out. I wouldn't add a sold-out image; instead, I would code it so that if an item is sold out, the purchase button is replaced by a sold-out button/image. Depending on the products you carry, some of your sold-out items may carry weight within SEO and may show up for relevant keyword searches; you don't want to lose that.
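A rough sketch of that button swap, assuming a product record with an inStock flag (the Product interface and renderBuyAction function are illustrative names, not anything from the post above):

```typescript
// Sketch: render the buy action conditionally on stock status.
interface Product {
  name: string;
  price: number;
  inStock: boolean;
}

function renderBuyAction(product: Product): string {
  if (!product.inStock) {
    // Swap only the purchase button; the rest of the page, and its
    // indexable content, stays exactly as it was.
    return `<button class="sold-out" disabled>Sold Out</button>`;
  }
  return `<button class="add-to-cart">Add to Cart ($${product.price.toFixed(2)})</button>`;
}

// Example usage:
console.log(renderBuyAction({ name: "Runner Pro", price: 89.99, inStock: false }));
console.log(renderBuyAction({ name: "Runner Pro", price: 89.99, inStock: true }));
```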
-
It depends on the product. If the item is unique, such as a book that is now out of print, we keep it on the site with a pop-up pointing people either to a new edition or to the related category.
If it is a t-shirt design, we remove the item and redirect its URL to the category page, unless there is a new item that is a direct replacement.
You have to consider your audience. People will probably be grateful to know that a particular book they were looking for is out of print, but they probably don't care that a specific color and cut of shirt is unavailable.
-
I agree with Virage, as long as it makes business sense and the volume of your OOS (out-of-stock) products is not greater than your current products. I would not want to have 500 OOS products and 50 in-stock products on a site.
Stop linking to the page from your navigation (obviously); then, if somebody does navigate to your OOS product page from elsewhere on the web, they can see it's OOS, see related products, and so on. That's a much better user experience, in my opinion, than a 404.
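One related option, as a suggestion beyond what's discussed above: if you keep the page live, you can also flag the item as out of stock in schema.org Product markup, so search engines can read the availability directly. A sketch with hypothetical product data (the availability URLs are real schema.org ItemAvailability values):

```typescript
// Sketch: generate JSON-LD that marks a live product page as out of
// stock. Product data is hypothetical; the availability values come
// from schema.org's ItemAvailability enumeration.
interface ProductInfo {
  name: string;
  sku: string;
  price: number;
  currency: string;
  inStock: boolean;
}

function productJsonLd(p: ProductInfo): string {
  const data = {
    "@context": "https://schema.org",
    "@type": "Product",
    name: p.name,
    sku: p.sku,
    offers: {
      "@type": "Offer",
      price: p.price.toFixed(2),
      priceCurrency: p.currency,
      availability: p.inStock
        ? "https://schema.org/InStock"
        : "https://schema.org/OutOfStock",
    },
  };
  return `<script type="application/ld+json">${JSON.stringify(data)}</script>`;
}

// Example: a sold-out item that stays indexed with accurate status.
console.log(productJsonLd({ name: "Limited Tee", sku: "TEE-001", price: 25, currency: "USD", inStock: false }));
```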
-
I do keep out-of-stock product pages active for the very reason you mentioned: it's more unique content for the search engines to read. Also, if someone is searching for my out-of-stock item, I would still want them to find my site, because it is very likely we would have a similar alternative that they may purchase instead.
If anything, our product pages always include a ton of information, including PDFs and pictures, that is genuinely helpful from a consumer's point of view. Even if they do not end up purchasing the product from us, they can still research it with us!
It really depends on the nature of your website and your products, but there is great value in retaining unique content. If your product page is filled with useful product information, I'd say definitely keep it available for the search engines, and consider linking to alternative in-stock options for your visitors to pursue as well.