Hundreds of thousands of 404's on expired listings - issue.
-
Hey guys,
We have a conundrum with a large e-commerce site we operate. Classified listings older than 45 days return 404s; there are hundreds of thousands of them, maybe millions. Note that Webmaster Tools caps its report at 100,000.
Many of these listings receive links.
Classified listings that are less than 45 days show other possible products to buy based on an algorithm.
It is not possible for Google to crawl expired listing pages from within our site. They are indexed because they were crawled before they expired, which means that many of them still show in search results.
-> My thought at this stage, for usability reasons, is to replace the 404s with content (other product suggestions) and add a meta noindex, to protect our crawl equity and get the pages we really want indexed prioritised.
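As a minimal sketch of that first option (the function name and markup here are hypothetical, not our actual code): serve the expired listing as a 200 with a noindex signal, both in the HTML and as an X-Robots-Tag header, alongside the product suggestions.

```python
# Hypothetical sketch: serve an expired listing as a 200 with a
# "noindex" directive so the page drops out of Google's index while
# users still see product suggestions.

def expired_listing_response(suggestions_html):
    """Build (status, headers, body) for an expired classified listing."""
    headers = {
        # HTTP-level equivalent of <meta name="robots" content="noindex">
        "X-Robots-Tag": "noindex",
        "Content-Type": "text/html; charset=utf-8",
    }
    body = (
        '<html><head><meta name="robots" content="noindex"></head>'
        "<body><h1>This listing has expired</h1>"
        + suggestions_html
        + "</body></html>"
    )
    return 200, headers, body
```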
-> Another consideration is to 301 each expired listing to its place in the category hierarchy, to pass any link juice. But since many of these listings are still findable in Google, we feel that may not be a great user experience.
-> Or should we just leave them as 404s? Google more or less says that's OK.
Very curious on your opinions, and how you would handle this.
Cheers,
Croozie.
P.S. I have read other Q&As regarding this, but given our large volumes and situation, I thought it was worth asking, as I'm not satisfied that the solutions offered would match our needs.
-
Wow! Thanks Ryan.
I'm sure it won't surprise you to know that I'm always reading eagerly when I see you respond to a question as well.
-
Thanks Ian, good to know. Again, good confirmation.
-
Hi Sha,
Spot on. Yes, that was my original thinking; then I switched to the school of 200s with meta noindex. But having you guys confirm this makes me realise that doing 301s to the parent category is most certainly the way to go.
Permanently redirecting will have the added benefit of effectively 'de-indexing' the original classifieds, and of course throwing a ton of link juice over to the category levels.
What a wonderful, helpful community!
Many thanks,
Croozie.
-
Sha, your responses consistently deliver outstanding, actionable advice. I love them: they're full of great ideas and demonstrate a lot of experience.
-
Hi Croozie,
Awesome work once again from Ryan!
Since your question feels like a request for suggestions on "how" to create a solution, just wanted to add the following.
When you say "classified listings" I hear "one-off, here for a while, gone in 45 days" content.
If that is the case, then no individual expired listing will ever be matched identically with another (unless it happens to be a complete duplicate of the original listing).
This means it would certainly be relevant to send any expired listing to a higher-order category page. If your site structure has a clear hierarchy, this is very easy to do.
For example:
If your listing URL is something like http://www.mysite.com/listings/home/furniture/couches/couch-i-hate.php, you can use URL rewrites to strip out the file name and 301 the listing to http://www.mysite.com/listings/home/furniture/couches/, which in most cases will offer a perfectly suitable alternative for the user.
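The filename-stripping step can be sketched in a few standard-library lines; this is an illustration of the idea (the function name is made up), not the site's actual rewrite rule, using the example URL above:

```python
# Illustrative sketch: strip the file name from an expired listing URL
# and return the parent-category URL it should 301 to.
from urllib.parse import urlsplit, urlunsplit
import posixpath

def category_redirect_target(listing_url):
    """Return the parent-category URL for an expired listing."""
    parts = urlsplit(listing_url)
    parent = posixpath.dirname(parts.path)  # drops 'couch-i-hate.php'
    if not parent.endswith("/"):
        parent += "/"
    # Rebuild the URL without query string or fragment.
    return urlunsplit((parts.scheme, parts.netloc, parent, "", ""))
```

In production this would more likely live in the web server's rewrite layer, but the logic is the same: drop the last path segment and redirect to the directory above it.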
There is another alternative you could consider if you have search built in: send the traffic to a relevant search. In the above example, mysite.com/search.php?s=couch.
Hope that helps,
Sha
-
We are now doing something similar with our site. We have several thousand products that have been discontinued and didn't think about how much link juice we were throwing away until we got Panda pounded. It's amazing how many things you find to fix when times get tough.
We started with our most popular discontinued products and are 301 redirecting them to either a new equivalent or the main category if no exact match can be found.
We are also going to be reusing the same product pages for annual products instead of creating new pages each year. Why waste all that link juice from past years?
-
If you perform a redirect, I recommend you offer a 301 header response, not a 200. The 301 response will let Google and others know the URL should be updated in their database. Google would then offer the new URL in search results. Additionally any link value can be properly forwarded to the new page.
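As a minimal sketch of the difference (the paths and the expired-listing mapping here are made up for illustration): a tiny WSGI app that answers a request for an expired listing with a 301 and a Location header, rather than a 200 at the old URL.

```python
# Hedged sketch: respond 301 for expired listings so crawlers update
# their index and forward link value; everything else gets a 200.

EXPIRED = {"/listings/couches/couch-i-hate.php": "/listings/couches/"}

def app(environ, start_response):
    path = environ.get("PATH_INFO", "/")
    if path in EXPIRED:
        # 301 Moved Permanently: the Location header names the new URL.
        start_response("301 Moved Permanently",
                       [("Location", EXPIRED[path])])
        return [b""]
    start_response("200 OK", [("Content-Type", "text/html")])
    return [b"<html>live listing</html>"]
```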
-
Thanks Ryan,
Massive response! Awesome!
It's interesting that you talk a lot about the 301's.
Are you suggesting this would be far preferable to simply producing a 200 status code page listing product choices based on an algorithm, which is what we currently offer our customers for listings less than 45 days old?
I suppose, to clarify, I'm worried that if we were to do that (produce 200 status code pages), then crawl equity would be reduced for Google, that we would be wasting a lot of their bandwidth on 200 status pages, when they could be better off crawling and indexing more recent pages.
Whereas with 301's to relevant products as you suggest, we solve that issue.
BTW, our 404 pages offer the usual navigation and search options.
Cheers,
Croozie.
-
Hi Croozie.
The challenge with your site is the volume of pages. Most large sites with 100k+ pages have huge SEO opportunities. Ideally you need a team that can manually review every page of your site to ensure it is optimized correctly. Such a team would be a large expense, which many site owners choose to avoid; the trade-off is that site quality and SEO suffer.
Whenever a page is removed from your site or otherwise becomes unavailable, a plan should be in place PRIOR to removing the page. The plan should address a simple question: how will we handle traffic to that page, whether it comes from a search engine, a bookmark, or a link? The suggested answer is the same whether your site has 10 pages or a million:
- If the product is being replaced with a very similar product, or you already carry one, you can choose to 301 the page to the new product. If the products are truly similar, the 301 redirect is a win for everyone.
Example A: You offer a Casio watch model X1000. You stop carrying this watch and replace it with Casio watch model X1001. It is the same watch design but the new model has a slight variation such as a larger dial. Most users who were interested in the old page would be interested in the new page.
Example B: You offered the 2011 version of the Miami Dolphins T-shirt. It is now 2012 and you have the 2012 version of the shirt which is a different design. You can use a 301 to direct users to the latest design. Some users may be unhappy and want the old design, but it is still probably the right call for most users.
Example C: You discontinue the Casio X1000 and do not have a very close replacement. You could 301 the page to the Casio category page, or you could let it 404.
The best thing to do in each case is to put on your user hat and ask yourself what would be the most helpful thing you can do to assist a person seeking the old content. There is absolutely nothing wrong with allowing a page to 404. It is a natural part of the internet.
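The decision rules in the examples above can be sketched as follows; the function name and URLs are illustrative, not a prescribed implementation:

```python
# Sketch of the removed-product decision: 301 to a close replacement
# if one exists, otherwise optionally 301 to the category page,
# otherwise let the URL return a natural 404.

def handle_removed_product(replacement_url=None, category_url=None,
                           prefer_category=True):
    """Return (status, location) for a removed product page."""
    if replacement_url:
        # A very similar product exists: redirect straight to it.
        return 301, replacement_url
    if category_url and prefer_category:
        # No close match: the category page is the next-best landing spot.
        return 301, category_url
    # No sensible target: a 404 is a natural part of the internet.
    return 404, None
```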
One last point. Be sure your 404 page is optimized, especially considering how many 404s you present. The page should have the normal site navigation along with a search function. Help users find the content they seek.