Usage of HTTP Status Code 303
-
Hello,
is there anybody who has experience with the 303 HTTP status code?
Our software development team would like to use 303 "See Other" instead of 301 to redirect old product links to the site root, rather than showing 404 errors.
What is the best practice for redirecting old product links that are gone, in an online-shop context?
Best regards
Steffen
-
I would recommend using a 301 redirect to the home page, as this will pass link juice. If the old links can be redirected to the specific product category instead, even better.
An alternative would be to keep serving the old page so it returns a 200 code, or to 301 it to a product-suggestion page. Having a "products like this" suggestion page and/or a product search page would likely convert better than a blanket 301 redirect to the home page.
Another thing you could do is create an intelligent "catch" page that takes the search parameter (if there is one) or the title of the referring page, uses it to query your product database, and serves up some relevant products.
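To make the routing concrete, here is a minimal sketch of that redirect logic: retired product URLs get a 301 to their category page where one is known, with a fallback 301 to a suggestions page instead of a plain 404. The paths and the category mapping are hypothetical examples, not from the original site.

```python
# Hypothetical mapping of retired product URLs to category pages.
RETIRED_PRODUCTS = {
    "/products/old-widget": "/category/widgets",
    "/products/legacy-gadget": "/category/gadgets",
}

def redirect_for(path: str):
    """Return (status_code, location) for an incoming request path.

    A 301 passes the old URL's link equity to the target. Unknown
    product URLs fall back to a suggestions page rather than a 404;
    everything else is left to normal 404 handling.
    """
    if path in RETIRED_PRODUCTS:
        return 301, RETIRED_PRODUCTS[path]
    if path.startswith("/products/"):
        return 301, "/product-suggestions"
    return 404, None
```

In practice this table would live in your web server or CMS redirect configuration rather than application code, but the decision logic is the same.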
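A sketch of that "catch" page idea, assuming the keywords arrive either in a `q` query parameter or as the referring page's title (both names and the product list are illustrative, not a real API):

```python
from urllib.parse import urlparse, parse_qs

# Illustrative product catalogue; in reality this would be a database query.
PRODUCTS = ["lithium battery", "alkaline battery", "battery charger", "led torch"]

def extract_keywords(url: str, referrer_title: str = "") -> list:
    """Pull search terms from the ?q= parameter, falling back to the
    referring page's title when no query parameter is present."""
    query = parse_qs(urlparse(url).query)
    terms = query.get("q", [referrer_title])[0]
    return [t for t in terms.lower().split() if t]

def suggest_products(url: str, referrer_title: str = "") -> list:
    """Return catalogue entries matching any extracted keyword."""
    keywords = extract_keywords(url, referrer_title)
    return [p for p in PRODUCTS if any(k in p for k in keywords)]
```

The catch page would then render these suggestions (with a 200 status) instead of a dead end.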
-
A 303 probably will not pass your link juice, if there is any. This is the difference: 301 status codes pass on roughly 90% of the link juice that inbound links give to your pages; a 303 does not carry that permanent-move signal.
For users it is good to redirect them to something else. The fact that a product's life has ended does not mean it will no longer be searched for. Keeping the old pages, at least in the sitemap, will not bloat your indexed pages much, so I would do that. That said, if there are no inbound links pointing to the pages you want to 303 redirect, technically it will not hurt your SEO.
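To make the mechanical difference concrete: both status codes send a `Location` header and move the visitor along, but 301 "Moved Permanently" is the signal search engines treat as a permanent move (and consolidate ranking signals on), while 303 "See Other" only says "see another resource for the answer to this request". A minimal sketch of the two raw responses:

```python
def redirect_response(status: int, location: str) -> str:
    """Build a bare-bones HTTP redirect response string.

    301 = Moved Permanently (permanent; the move signal crawlers honor)
    303 = See Other (per-request indirection; no permanence implied)
    """
    reasons = {301: "Moved Permanently", 303: "See Other"}
    return (
        f"HTTP/1.1 {status} {reasons[status]}\r\n"
        f"Location: {location}\r\n"
        f"\r\n"
    )
```

The wire format is nearly identical; the difference is entirely in what the status code promises about the future of the old URL.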
-
Hi,
our old content is definitely gone for good. We have a lot of volatile content with a lifetime of 6-12 months, sometimes shorter. I believe keeping the old URLs would blow up our number of indexed pages.
But my general question was about the 303 code. Do you have any experience with the difference between 301 and 303?
BR
-
Hello,
Are your products gone forever for sure? If you use a 301 or 303, visitors clicking your pages from the SERPs will see new content instead of a 404 error page, so it has its user-side benefits. However, if you are ranking for these products in Google and these keywords are bringing traffic to your site, I would think twice before deleting those pages. If you delete the actual content you are ranking with, both users and the engines will see totally new content, so if you lose your product-specific pages you will also lose your rankings sooner or later.
I would leave those pages in place but do a little reorganization on the landing page. I would push the current content down a bit and place a convincing one- or two-line text explaining why you have stopped selling those products (why users no longer need to search for them) and offer a better alternative for the product type they are looking for. Something like: "We have stopped selling lithium batteries because the new XY technology has twice the lifespan and charges in half the time. You can look at these astonishing products here."