Should You Keep Out-of-Stock Items Active on Your Site?
-
If you have sold-out products that will never come back in stock, should you remove the items and URLs from your sitemap and site, or should you keep them active with a sold-out image? The idea would be that search engines will think your site is larger because of the extra products and URLs.
-
Also, it's critical to remember that a person who bought the product in the past may want to view their purchase history for some reason (to see descriptions and such).
-
Before you keep these types of pages, you should determine whether they have any links or traffic. If they are pulling traffic that you would not otherwise receive, you can use the page to tell the history of the item (such as an antique or other one-of-a-kind piece).
If it is a standard product, such as a pair of running shoes that has been replaced by a different model, you can make the page informative: explain that the item was replaced by a new model and describe the improvements that were made.
Both of the above showcase your helpfulness and knowledge.
However, if the page has no links and pulls no traffic, remove it and 301-redirect its URL to a relevant page. You don't need useless pages on your site.
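To make the remove-and-redirect advice concrete, here is a minimal sketch of how a site might resolve retired product URLs to their closest live replacement. The URL paths, map, and function names are hypothetical, not from this thread:

```python
# Hypothetical map of retired product URLs to the closest live page:
# a direct replacement model where one exists, otherwise the category.
RETIRED_PRODUCTS = {
    "/products/old-runner-v1": "/products/runner-v2",  # direct replacement
    "/products/vintage-lamp": "/category/lighting",    # no replacement: category
}

def resolve(path):
    """Return (status, location) for an incoming request path."""
    if path in RETIRED_PRODUCTS:
        # A 301 signals a permanent move, so search engines transfer the
        # old URL's link equity to the target page instead of dropping it.
        return 301, RETIRED_PRODUCTS[path]
    return 200, path  # live page: serve as-is
```

The same mapping could equally live in a web-server config; the key point is using a 301 to a genuinely relevant page rather than serving a 404.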
-
I would keep the sold-out items on my site and add a "we also recommend" section below the item on the product page to show the client that you have items similar to the one that is sold out. I wouldn't add a sold-out image; instead, I would code it so that when an item is sold out, the purchase button is replaced by a sold-out button or image. Depending on the products you carry, some of your sold-out items may carry SEO weight and show up for relevant keyword searches; you don't want to lose that.
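That button swap typically lives in the template layer. A minimal sketch in Python follows; the field names and markup are assumptions for illustration:

```python
def render_buy_button(product):
    """Render the purchase control for a product page.

    When stock hits zero, the add-to-cart button is swapped for a
    sold-out notice, so the page (and any SEO value it has earned)
    stays live while visitors can't attempt a purchase.
    """
    if product.get("stock", 0) > 0:
        return f'<button class="buy" data-sku="{product["sku"]}">Add to cart</button>'
    return '<span class="sold-out">Sold out</span>'
```

A "we also recommend" block would be rendered alongside this, driven by the same stock flag.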
-
It depends on the product. If the item is unique, such as a book that is now out of print, we keep it on the site with a pop-up pointing people to either a new version or the related category.
If it is a t-shirt design, we remove the item and redirect it to the category page for the item unless there is a new item that is a direct replacement.
You have to consider your audience. People will probably be grateful to know that a particular book they were looking for is out of print, but they probably don't care that a specific color and cut of shirt is unavailable.
-
I agree with Virage, as long as it makes business sense and the volume of your OOS products doesn't exceed that of your in-stock products. I would not want a site with 500 OOS products and only 50 in-stock products.
Stop linking to that page from your navigation (obviously); then, if somebody does navigate to your OOS product page from elsewhere on the web, they can see it's OOS and see related products and so on. That's a much better user experience, in my opinion, than a 404.
-
I do keep out-of-stock product pages active for the very reason you mentioned: it's more unique content for the search engines to read. Also, if someone is searching for my out-of-stock item, I would still want them to find my site, because it is very likely we would have a similar alternative they could purchase instead.
If anything, our product pages always include a ton of information, including PDFs and pictures, that just seem helpful from a consumer's POV. Even if they do not end up purchasing said product from us, they can still research it with us!
It thus depends on the nature of your website and your products, but there is great value in retaining unique content. If your product page is filled with useful product information, I'd say definitely keep it available for the search engines, and consider linking to alternative in-stock options for your visitors as well!
Related Questions
-
What is the feeling of "Here's where our site can help" text links used for conversions?
If you have an ecommerce site that is using editorial content on topics related to the site's business model to build organic traffic and draw visitors who might be interested in using the site's services eventually, what is the SEO (page ranking) impact -- as well as the impact on the visitors' perceptions about the reliability of the information on the site -- of using phrases like "Here is where [our site] can help you." in nearly every article. Note: the "our site" text would be linked in each case as a conversion point to one of the site's services pages to get visitors to move from content pages on a site to the sales pages on the site. Will this have an impact on page rankings? Does it dilute the page's relevance to search engines? Will the content look less authoritative because of the prevalence of these types of links? What about the same conversion links without the "we can help" text - i.e., more natural-sounding links that stem from the flow of the article but can lead interested visitors deeper into the ecommerce section of the site?
Algorithm Updates | Will-McDermott
-
Optimized site-wide internal links in footer - a problem?
Hello all - I am looking at a website with 8 heavily keyword optimized site-wide links in the footer. Yes, there are only 8 but it looks a bit spammy and I'm tempted to remove them. I imagine there's some possibility of a Google penalty too? What would your advice be? Thanks, Luke
Algorithm Updates | McTaggart
-
What is the point of XML site maps?
Given how Google uses Page Rank to pass link juice from one page to the next if Google can only find a page in an XML site map it will have no link juice and appear very low in search results if at all. The priority in XML sitemaps field also seems pretty much irrelevant to me. Google determines the priority of a page based on the number of inbound links to it. If your site is designed properly the most important pages will have the most links. The changefreq field could maybe be useful if you have existing pages that are updated regularly. Though it seems to me Google tends to crawl sites often enough that it isn't useful. Plus for most of the web the significant content of an existing page doesn't change regularly, instead new pages are added with new content. This leaves the lastmod field as being potentially useful. If Google starts each crawl of your site by grabbing the sitemap and then crawls the pages whose lastmod date is newer than its last crawl of the site their crawling could be much more efficient. The site map would not need to contain every single page of the site, just the ones that have changed recently. From what I've seen most site map generation tools don't do a great job with the fields other than loc. If Google can't trust the priority, changefreq, or lastmod fields they won't put any weight on them. It seems to me the best way to rank well in Google is by making a good, content-rich site that is easily navigable by real people (and that's just the way Google wants it). So, what's the point of XML site maps? Does the benefit (if any) outweigh the cost of developing and maintaining them?
Algorithm Updates | pasware
-
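The lastmod workflow described in the question above can be sketched: generate a sitemap containing only pages changed since the last crawl, so the file acts as a small "what changed" feed rather than a full mirror of the site. The URLs and dates below are made up for illustration:

```python
from datetime import date

def sitemap_entries(pages, changed_since):
    """Yield <url> entries only for pages modified after `changed_since`."""
    for url, lastmod in pages:
        if lastmod > changed_since:
            yield f"  <url><loc>{url}</loc><lastmod>{lastmod.isoformat()}</lastmod></url>"

def build_sitemap(pages, changed_since):
    # Wrap the filtered entries in the standard sitemaps.org envelope.
    body = "\n".join(sitemap_entries(pages, changed_since))
    return ('<?xml version="1.0" encoding="UTF-8"?>\n'
            '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
            f"{body}\n</urlset>")

pages = [
    ("https://example.com/", date(2012, 1, 10)),
    ("https://example.com/blog/new-post", date(2012, 2, 1)),
]
print(build_sitemap(pages, changed_since=date(2012, 1, 15)))
```

Whether crawlers actually trust lastmod is exactly the open question raised in the post; this only shows the mechanics.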
Why is site dropping in rank after we update it?
One of our sites - supereyes.com - appears to drop in rank after we update it. The client notified us of this today and I've verified that it did indeed drop in Google -- four spots since last week. He says this happens every time we make changes to the site, but then a week later it will go back up and is usually higher than where it was before. I have not verified this, but I'm very worried it may not rise again. In the past week, we've posted a new blog entry to their site and we've changed some of the content -- specifically, added their locations to the header, added a contact page and put two testimonials in their sidebar. We've also had someone submitting their site to directories and local business sites like Angie's List and so forth. There are about 16 new backlinks established in the past 2-3 weeks. Also, I should note, traffic is higher than it's ever been, but the client doesn't look at traffic. They only look at their Google results. Can anyone offer any insight into what's going on here and if I need to be worried the site won't rise again in the rankings?
Algorithm Updates | aloley
-
Client's site dropped completely from Google - AGAIN! Please help...
ok guys - hoping someone out there can help... (kinda long, but wanted to be sure all the details were out there) Already had this happen once - even posted in here about it - http://www.seomoz.org/q/client-s-site-dropped-completely-for-all-keywords-but-not-brand-name-not-manual-penalty-help Guy was a brand new client, all we did was tweak title tags and add a bit of content to his site since most was generic boilerplate text... started on our KW research and competitor research... in just a week, from title tag and content tweaks alone, he went from ranking on page 4-5 to ranking on page 3-4... then as we sat down to really optimize his site... POOF - he was gone from the Googs... He only showed up in "site:" searches and for exact matches of his business name - everything else was gone. Posted in here and on WMT - had several people check it out, both local guys and people from here (thanks to John Doherty for trying!) - but no one could figure out any reason why it would have happened. We submitted a reconsideration request, explaining that we knew we hadn't violated any quality guidelines, that he had less than 10 backlinks so it couldn't be bad linking, and that we had hardly touched the site. They sent back a canned response a week later that said there was no manual penalty and that we should "check our content" - mysteriously, the site started to show back up in the SERPs that morning (we got the canned response in the afternoon) There WAS an issue with NAP mismatch on some citations, but we fixed that, and that shouldn't have contributed to complete disappearance anyway. SO - the site was back, and back at its page 3 or 4 position... we decided to leave it alone for a few days just to be sure we didn't do anything... and then just 6 days later, when we were sitting down to fully optimize the site - POOF - completely gone again. We do SEO for a lot of different car dealers all over the country, and i know our strategies work. 
Looking at the competition in his market, he should easily be ranked page 2 or 3 with the very minimal tweaking we did... AND, since we didn't change anything since he came back, it makes even less sense that he was visible for a week and then gone again. So, mozzers... Anybody got any ideas? I'm really at a loss here - it makes zero sense that he's completely gone, except for his biz name... if nothing else, he should be ranking for "used cars canton"... Definitely appreciate any help anyone can offer -
Algorithm Updates | Greg_Gifford
-
Why is this site ranking 1st?
I'm a relative SEO newbie, so please go easy on me. I've been an SEOMOZ pro user for a few months and have used it to dramatically improve my organic rankings. However, for the life of me, I cannot determine why the site that currently ranks number one, does so. For the factors I can determine, they shouldn't be ranking where they are, but reality is different. Could someone please offer me some ideas? My target keyword is "photography classes edmonton" My site is www.bsop.ca and I'm targetting the Google Canada engine. Any and all assistance is appreciated.
Algorithm Updates | pburwell
-
Large site with faceted navigation using rel=canonical, but Google still has issues
First off, I just wanted to mention I did post this on one other forum so I hope that is not completely against the rules here or anything. Just trying to get an idea from some of the pros at both sources. Hope this is received well. Now for the question..... "Googlebot found an extremely high number of URLs on your site:" Gotta love these messages in GWT. Anyway, I wanted to get some other opinions here so if anyone has experienced something similar or has any recommendations I would love to hear them. First off, the site is very large and utilizes faceted navigation to help visitors sift through results. I have implemented rel=canonical for many months now to have each page url that is created based on the faceted nav filters, push back to the main category page. However, I still get these damn messages from Google every month or so saying that they found too many pages on the site. My main concern obviously is wasting crawler time on all these pages that I am trying to do what they ask in these instances and tell them to ignore and find the content on page x. So at this point I am thinking about possibly using robots.txt file to handle these, but wanted to see what others around here thought before I dive into this arduous task. Plus I am a little ticked off that Google is not following a standard they helped bring to the table. Thanks for those who take the time to respond in advance.
Algorithm Updates | PeteGregory
-
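The rel=canonical approach described in the faceted-navigation question above boils down to stripping the filter parameters so every filtered view points at the clean category URL. A hypothetical sketch follows; the parameter names are assumptions, not from the post:

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Hypothetical facet parameters that spawn duplicate URLs; anything
# else (e.g. pagination you want crawled) is left alone.
FACET_PARAMS = {"color", "size", "brand", "sort", "price"}

def canonical_url(url):
    """Strip faceted-navigation parameters so a filtered view's
    rel=canonical tag can point at the clean category URL."""
    parts = urlparse(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in FACET_PARAMS]
    return urlunparse(parts._replace(query=urlencode(kept)))

canonical_url("https://example.com/shoes?color=red&size=10")
# -> "https://example.com/shoes"
```

Whether to additionally block facet URLs in robots.txt is the trade-off the poster is weighing: disallowed URLs save crawl budget but can no longer consolidate signals through the canonical tag.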
Are you seeing changes in your sites today? Panda 2.2?
I've heard rumblings of some Panda sites recovering in the last few days and wondered if the talked about Panda 2.2 has been rolled out. My own site (which actually had a significant boost after Panda) has seen a significant increase in traffic today (started about noon EST yesterday) and a nice increase in Adsense revenue as well. How are your sites doing?
Algorithm Updates | MarieHaynes