Using a noindex tag instead of a 410 Gone status code on discontinued products?
-
Hello everyone, I am very new to SEO and I wanted to get some input & second opinions on a workaround I am planning to implement on our Shopify store. Any suggestions, thoughts, or insight you have are welcome & appreciated!
For those who aren't aware, Shopify as a platform doesn't allow sending a 410 Gone status code under any circumstances. When you delete or archive a product/page, it becomes unavailable on the storefront. Unfortunately, the only thing Shopify natively lets me do is set up a 301 redirect. So when we are forced to discontinue a product, customers currently get a 404 error when trying to visit the old URL.
My planned workaround is to automatically detect when a product has been discontinued and add a noindex meta tag to the product page. The product page will stay up but be unavailable for purchase. I am also adjusting the JSON-LD to list the product's availability as Discontinued instead of InStock/OutOfStock.
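For what it's worth, here is a rough sketch of how that could look in the product template. This is only illustrative: the `discontinued` tag, the currency lookup, and the exact JSON-LD shape are my assumptions, not anything Shopify dictates.

```liquid
{% comment %}
  Sketch for the product template's <head> section.
  Assumes discontinued products are marked with a "discontinued" product tag.
{% endcomment %}
{% if product.tags contains 'discontinued' %}
  <meta name="robots" content="noindex">
{% endif %}

<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": {{ product.title | json }},
  "url": {{ shop.url | append: product.url | json }},
  "offers": {
    "@type": "Offer",
    "priceCurrency": {{ cart.currency.iso_code | json }},
    "price": {{ product.price | divided_by: 100.0 | json }},
    {% if product.tags contains 'discontinued' %}
    "availability": "https://schema.org/Discontinued"
    {% else %}
    "availability": "https://schema.org/InStock"
    {% endif %}
  }
}
</script>
```

The `json` filter handles quoting/escaping, so the output stays valid JSON-LD even with special characters in the title.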
Then I let the page sit for a few months so that crawlers have a chance to recrawl it and remove it from their indexes. I think that is how it works?
Once 3 or 6 months have passed, I plan on archiving the product and then setting up a 301 redirect pointing to our internal search results page. The redirect will send the user to a search query aimed at similar products. That should prevent people with open tabs, bookmarks, and direct links to that page from receiving a 404 error.
I do have Google Search Console set up and integrated with our site, but manually telling Google to remove a page obviously only affects their index.
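For the later redirect step, Shopify's Admin REST API has a Redirect resource, so the 301s could be created programmatically once the product is archived instead of by hand. A hedged sketch (the store domain, API version, token, product handle, and search query below are all placeholders):

```shell
# Creates a 301 redirect from the archived product URL to internal search.
# All concrete values here are placeholders for illustration.
curl -X POST "https://your-store.myshopify.com/admin/api/2024-01/redirects.json" \
  -H "X-Shopify-Access-Token: $SHOPIFY_ACCESS_TOKEN" \
  -H "Content-Type: application/json" \
  -d '{"redirect": {"path": "/products/old-widget", "target": "/search?q=widget"}}'
```

Shopify serves redirects created this way as 301s automatically, so each discontinuation could trigger one API call after archiving.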
Will this work the way I think it will?
Will search engines remove the page from their indexes if I add the noindex meta tag after it has already been indexed?
Is there a better way I should implement this?
P.S. For those wondering why I am not disallowing the page URL in robots.txt: Shopify won't let me call collection or product data from within the template that assembles the robots.txt, so I can't automatically add product URLs to the disallow list.
-
@maribailey10 If I could I would, but we very rarely have products similar enough to the discontinued one for that approach to make sense. That's why I plan on sending them to a search query page instead.
Occasionally we are able to replace a discontinued product with a direct successor right away, but that rarely happens.
-
No, just try to interlink them with other similar products and edit the content accordingly.