422 vs 404 Status Codes
-
We work with an automotive industry platform provider, and whenever a vehicle is removed from inventory, a 404 error is returned. Because inventory moves so quickly, we have a host of 404 errors in Search Console.
The fix the platform provider proposed was to return a 422 status code instead of a 404. I'm not familiar with how a 422 might impact our optimization efforts. Is this a good approach, given that there is no scalable way to 301-redirect all of those dead inventory pages?
-
Thanks Mike.
Your initial solution would be preferred, but it's not scalable. We are talking about over 100 websites with varying levels of inventory.
I was thinking along the lines of keeping the 404 or 410 status. It was just odd that the vendor proposed a 422 error when it's not a preferred option in Google's support pages. I was just wondering whether anyone has used the 422 response code before and, if so, why.
-
Personally, I think you should set up a process whereby every time a vehicle and/or part is removed, it is automatically 301-redirected to the previous step in the site navigation. So when "blue widget 3" is removed from the site, anyone landing on that page, or who has it bookmarked, winds up on the "Widget" category page. There may not be an easy way to do that right this second because of how many dead pages there already are, but if you get into the habit of doing it and slowly work toward fixing the others, you'll be in a good position to keep this from becoming an issue again.
If you really don't want to attempt that: 404s aren't necessarily horrible (though too many can be). If your site is properly serving 404s, you won't be penalized for them, but in this case you might want to consider using 410 status codes instead. A 410 is a stronger signal for removal than a 404, and since you don't plan on the product ever coming back, marking it Gone should get it removed from the index faster. It also helps keep you from competing against yourself in the SERPs when a new but similar product comes into stock.
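As a rough illustration of the 404-vs-410 logic above (the function name, the inventory lookup, and the URL slugs are made up for the example, not taken from any real platform), the decision might be sketched like this:

```python
# Sketch: pick the status code an inventory URL should serve.
# `live_slugs` holds currently listed vehicles; `sold_slugs` holds
# vehicles that were sold and will never return. Both are hypothetical.

def status_for_listing(slug, live_slugs, sold_slugs):
    """Return the HTTP status code for an inventory detail URL."""
    if slug in live_slugs:
        return 200  # listing is still live
    if slug in sold_slugs:
        return 410  # Gone: a stronger, permanent removal signal than 404
    return 404      # URL was never a listing we know about

print(status_for_listing("blue-widget-3", set(), {"blue-widget-3"}))
```

The point is simply that a sold vehicle is a known, permanent removal, so it can be distinguished from a genuinely unknown URL and served a 410 rather than a generic 404.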
-
Do pages of vehicles that are in inventory for a short time actually deliver monetizable traffic?
If the answer is no, because they are up for such a short amount of time, you would have to weigh the value of having them indexable in the first place against creating an ever-growing list of missing pages.
Having a lot of 404s or 422s is a bit of a negative. Is there really no way to add a 301-redirect step to the removal process?
Making the pages non-indexable via noindex once they are already indexed will not remove them. You either have to 301 them and/or request removal from Google's index. Is there a programmatic way to turn their removal into a 301 to the top inventory category page?
Good luck!
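The programmatic 301 suggested above could be as simple as mapping a removed listing's URL to its parent category. This is only a sketch under an assumed URL layout (`/inventory/<category>/<listing>`), which may not match the platform in question:

```python
# Sketch: turn a removed listing's path into a 301 target one level up.
# The "/inventory/<category>/<listing>" layout is a hypothetical example.

def redirect_target(path):
    """Return (status, location) for a removed inventory URL."""
    parts = [p for p in path.split("/") if p]
    if len(parts) >= 2 and parts[0] == "inventory":
        # Drop the listing segment; redirect to its category page.
        return 301, "/" + "/".join(parts[:-1]) + "/"
    # Fall back to the top inventory page for anything else.
    return 301, "/inventory/"

print(redirect_target("/inventory/trucks/blue-widget-3"))
```

A rule like this could run at removal time (writing a redirect entry) or live in the web server's fallback handler, so no per-page manual work is needed even across 100+ sites.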
-
A 422 is an Unprocessable Entity error, which I think will have about as much impact as a 404 (page-not-found error).
You could make pages non-indexable once a vehicle has been removed from the inventory. This shouldn't impact your SEO efforts.
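One way to make removed pages non-indexable without touching each page's HTML is an `X-Robots-Tag: noindex` response header alongside the error status. A minimal sketch (the function and its inputs are illustrative, not any platform's actual API):

```python
# Sketch: build the status and headers for a removed-listing response.
# Sending "X-Robots-Tag: noindex" tells crawlers to drop the URL,
# independent of whether the body contains a meta robots tag.

def removal_response(permanently_removed):
    """Return (status, headers) for a listing that no longer exists."""
    status = 410 if permanently_removed else 404
    headers = {"X-Robots-Tag": "noindex"}
    return status, headers

print(removal_response(True))
```

Note that a 404 or 410 already keeps a URL out of the index over time; the header mainly helps in edge cases, such as a grace period where removed listings still return a page.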