Magento: Should we disable old URLs or delete the pages altogether?
-
Our developer tells us that a lot of 404 pages are being included in our sitemap, and that this is because we have put 301 redirects on the old pages pointing to new pages.
We're using Magento, and our current process is to simply disable the old page, which then makes it a 404. We then redirect this page, using a 301 redirect, to a new, relevant page. The reason for redirecting these pages is that the old pages are still being indexed in Google.
I understand that 404 pages will eventually drop out of Google's index, but I was wondering whether we were somehow preventing them from dropping out by redirecting the URLs, causing the 404 pages to be added to the sitemap.
My questions are:
1. Could we simply delete the unwanted pages entirely, so that they return a 404 and drop out of Google's index altogether?
2. Because the 404 pages are in the sitemap, does that mean Google will continue to index them?
-
Brilliant, thanks Dan.
This is what I'm going to do.
-
Hey Andy
To answer your questions:
1. If you're 301'ing the page, it's not really a 404 page; it's a 301. So yes, you can remove the 301 redirect, making it a true 404 page (check that it actually returns a 404 code using Fetch as Google or a tool like urivalet.com).
2. Having them in the sitemap won't prevent Google from removing them from the index, but it will throw errors. And while not that many people care about Bing, Bing is apparently super picky about having its XML sitemaps perfect.
So yes, I would just 404 them without the redirects.
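Dan's verification step can also be scripted for a whole list of URLs. A minimal sketch (the status-to-action mapping is my own assumption about sensible sitemap hygiene, not Moz or Google guidance) that reports the first status code a URL serves, without following redirects:

```python
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    """Stop urllib from following 301/302s so we see the first status code served."""
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        return None  # causes urllib to raise HTTPError carrying the redirect code

def first_status(url):
    """Return the first HTTP status a URL serves (e.g. 301, not its target's 200)."""
    opener = urllib.request.build_opener(NoRedirect)
    try:
        return opener.open(url, timeout=10).status
    except urllib.error.HTTPError as e:
        return e.code

def sitemap_action(status):
    """Suggested handling of a sitemap entry, given its real status code."""
    if status in (301, 302):
        return "remove from sitemap; list the redirect target instead"
    if status in (404, 410):
        return "remove from sitemap; Google should drop it from the index"
    return "keep"
```

Running `sitemap_action(first_status(url))` over every URL in the sitemap flags the 301'd and 404'd entries that are throwing the errors.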
Related Questions
-
Does creating too many parent pages damage my website's SEO?
I need to know how to keep my website structure well organised and ensure Google still recognises the key pages. I work for a travel company which needs to give customers various pieces of information on our website, and this needs to be well organised in terms of structure. For example, customers need information on airport pick-ups and drop-offs for each of our destinations, but this isn't something that needs to rank on Google. The logical site structure would be to create a parent page: thedragontrip.com/transfers/india. Is creating parent pages for unimportant content a bad idea?
Intermediate & Advanced SEO | nicolewretham
-
Old sub-domain removal and deletion of content
There are two questions here. I have waited for 2-3 weeks now and they are still not resolved.
1. An old sub-domain (blog.nirogam.com) is still indexed on Google, even though all of its pages have been redirected or 404'd to the main domain. We have no Webmaster Tools access for and no control over this old sub-domain; it has been deleted and no longer exists (we own the main domain only), though its hosting might still be there. How do I de-index and remove the pages for good? (Around ~1,000 pages.) I am trying this public tool - is there any better approach? Even after removing pages and submitting them via the tool, 600 pages are still indexed after 2-3 weeks!
2. We deleted a lot of thin-content/duplicate pages from the domain (nirogam.com) in WordPress - all these pages are still in Google's index. They are in the Trash folder now. This is causing an increase in 404s in Webmaster Tools, etc. I have served a 410 header (using a WordPress plugin) on all these pages, as they should not be redirected to anything. However, Google does not always fully understand a 410, and these pages still show up in Webmaster Tools, as read in this detailed post. All these pages are still indexed. How do I de-index these pages? Is there any other approach to stop the 404s and remove these pages for good?
Any feedback will be highly appreciated.
Intermediate & Advanced SEO | pks333
-
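For what it's worth, the 410 doesn't have to come from a WordPress plugin; it can be served at the web-server level. A minimal sketch, assuming Apache with mod_alias enabled (the paths are hypothetical):

```apache
# .htaccess: serve "410 Gone" for removed thin-content pages (hypothetical paths)
Redirect gone /old-thin-page/
Redirect gone /duplicate-page/
```

`Redirect gone` responds with a 410 status and no Location header, so nothing is redirected.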
When rebranding, what's the best thing to do with the new domain before rebranding?
A. Do nothing
B. Redirect to the legacy site (current domain)
C. Create a placeholder with information about the rebranding
D. Other... What do you think is best?
Intermediate & Advanced SEO | Maxaro.nl
-
How can I remove my old site's URLs from showing up in Google?
Hi everyone. We have had a new site up for over a year now. When I search site:sqlsentry.net, the old URLs still show up, and while those pages are redirected to .com, I'd like to get the .net URLs out of Google forever. What is the best way I can go about that?
Intermediate & Advanced SEO | Sika22
-
Preserving URL Structure from osCommerce to Magento
I have a website that is built on osCommerce, and I am planning to transition to Magento. I was told that the transition to Magento would change my URL structure. How do I preserve my current URL structure while migrating to the Magento platform so that I do not lose my backlink profile?
Intermediate & Advanced SEO | WebServiceConsulting.com
-
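One way to keep the backlink profile when the URL structure does change is a one-to-one 301 map from each old osCommerce URL to its new Magento URL. A minimal `.htaccess` sketch, assuming Apache with mod_rewrite (the product ID and target path are made up):

```apache
RewriteEngine On
# Old osCommerce product URL -> new Magento URL, one pair of rules per product
RewriteCond %{QUERY_STRING} ^products_id=123$
RewriteRule ^product_info\.php$ /widget-blue.html? [R=301,L]
```

The trailing `?` strips the old query string from the redirect target. Magento also has an admin-side URL rewrite tool that can hold the same mappings, which may be easier to maintain than hand-written rules.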
What do you do with the page of a product that has been deleted?
As anyone with an ecommerce website knows, products are constantly being added and removed. Once products are removed, the corresponding product pages are no longer reachable. Currently, if a product page is reached whose corresponding product has been deleted, I redirect to the search page. I am not sure if that is the correct, recommended technique from an SEO perspective. Should I try to show related products on the redirected page instead? Does anyone here know what is the best thing to do with these product pages?
Intermediate & Advanced SEO | amitramani
-
URL Parameters Causing Duplicate Page Titles
Thanks in advance. I'm getting duplicate page titles because SEOmoz keeps crawling through my URL parameters. I added forcefiltersupdate to the URL Parameters settings in Webmaster Tools, but it does not seem to have had an effect. Below is an example of the duplicate content issue I am having: http://qlineshop.com/OC/index.php?route=product/category&path=59_62&forcefiltersupdate=true&checkedfilters[]=a.13.13.387baf0199e7c9cc944fae94e96448fa Any thoughts? Thanks again. -Patrick
Intermediate & Advanced SEO | bamron
-
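Separately from the crawler settings, the standard fix for parameter-generated duplicates is a canonical tag that points every filtered variant back at the base category URL. A sketch using the example URL above (where exactly to emit this tag depends on the cart's theme templates):

```html
<!-- In the <head> of every filtered variant of this category page -->
<link rel="canonical"
      href="http://qlineshop.com/OC/index.php?route=product/category&amp;path=59_62" />
```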
Google's Exact Match Algorithm Reduced Our Traffic!
Google's first Panda update devalued our Web store, www.audiobooksonline.com, and our traffic went from 2,500-3,000 visits per month (mostly organic referrals) to 800-1,000. Google's under-valuing of our Web store continued to reduce our traffic to 400-500 over the past few months. From 4/5/2013 to 4/6/2013 our traffic dropped another 50%, because (I believe) of Google's implementation of the "exact match domain" algorithm. Even after Panda, and up to 4/5/2013, we were getting a significant amount of organic traffic for search terms such as "audiobooks online," "audio books online," and "online audiobooks." We no longer get traffic for these generic keywords.
What I don't understand is why a UK company, www.audiobooksonline.co.uk, with a very similar domain name, ranks #5 for "audio books online" and #4 for "audiobooks online" while we've almost disappeared from Google's rankings. By any measurement I am aware of, our site should rank higher than audiobooksonline.co.uk. Market Samurai reports for "audio books online" and "audiobooks online" show that our Web store is significantly "stronger" than audiobooksonline.co.uk, yet they show up on Google's first page and we are down several pages. I also checked a few titles on audiobooksonline.co.uk and confirmed they are using the same publisher descriptions that we and many other online book/audiobook merchants do = duplicate content.
We have never received notice that our Web store was being penalized. Why would audiobooksonline.co.uk rank so much higher than audiobooksonline.com? Does Google treat non-USA sites differently than USA sites?
Intermediate & Advanced SEO | lbohen