Ecommerce SEO - Indexed product pages are returning 404s due to product database removal. HELP!
-
Hi all,
I recently took over an e-commerce start-up project from one of my co-workers (who left the job last week). This previous project manager had uploaded ~2,000 products without setting up a robots.txt file, and as a result, all of the product pages were indexed by Google (verified via Google Webmaster Tools).
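For reference, the robots.txt that should have been in place during development would have looked something like this, as I understand it (a minimal sketch that blocks all crawlers):

```text
# robots.txt at the domain root — block all crawlers while the site is in development
User-agent: *
Disallow: /
```

From what I've read, robots.txt only blocks crawling, not indexing, so URLs Google already knows about can still show up; HTTP authentication or a noindex meta tag is apparently the more reliable way to keep a dev site out of the index.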
The problem came about when he deleted the entire product database from our hosting service (GoDaddy) and performed a fresh install of PrestaShop on our hosting plan. All of the created product pages are now gone, and I'm left with ~2,000 broken URLs returning 404s. Currently, the site does not have any products uploaded. As far as I know, I have to either:
- canonicalize the broken URLs to the new corresponding product pages,
or
- request that Google remove the broken URLs (I believe this is only a temporary solution, since Google honors URL removal requests for 90 days)
What is the best way to approach this situation? If I set up canonicalization, would I have to recreate the deleted pages (to match the URL addresses) and have those pages redirect to the new product pages (canonicalization)?
Alex
-
Everett,
You're right on the money. I don't think you could have summarized my problem any better. I will take Dana's and your advice and let them sit "indexed" for a while serving a 404. According to GWT's Index Status, the product pages were indexed about a month ago, so I guess it won't hurt to wait a few more weeks until those pages drop out of Google's index naturally, especially since the site development won't be done for another 6-7 weeks.
Thanks a bunch for all of your insights
-
Right on Everett. I agree 100%
-
I want to make sure everyone, including myself, understands you, Alex. Correct me if I'm wrong, but you're saying that the website is totally new (a start-up) and nothing (at least nothing owned by the company you're with) has ever been on that domain name. While building the site, the previous guy accidentally allowed the development version of the site to be indexed, and/or allowed product pages that you don't want on the site at all to be indexed. Since it is a brand-new site, those "old" pages that were deleted didn't have any external links, and didn't have any traffic from Google or elsewhere outside of the company.
IF that is the case, then you can probably just let those pages stay as 404s. Eventually, since nobody is linking to them, they will drop out of the index on their own.
I wouldn't use the URL removal tool in this case. For one thing, it is a dangerous tool, and if you don't have experience with this sort of thing it could do more harm than good. It should only take a few weeks for those URLs that were briefly live and indexed to go away if you are serving a 404 or 410 HTTP response code on those URLs.
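If you do want Google to treat them as intentionally gone, serving a 410 instead of a 404 is a one-line rule in .htaccess on an Apache host — a sketch assuming the old product URLs lived under a hypothetical /products/ path (adjust the pattern to your actual paths):

```apacheconf
# Serve "410 Gone" for everything under the old (hypothetical) product path
RedirectMatch gone ^/products/
```

Google generally drops 410s from the index a little faster than 404s, but either will get you there.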
I hope this helps. Please let us know if we have misinterpreted your problem.
-
Understood Alex. Yes, of course you would have to rebuild the pages first before you can 301, but it sounds like you are planning on rebuilding them (otherwise you wouldn't be able to use canonical tags either, because there wouldn't be a page to put them on).
I wouldn't just give up and ask Google to remove all of the old URLs. I agree with what Mike has to say about that below. A 302 is a good option if you are worried about the 404s sitting in the index while you are rebuilding your product pages. If you are still on the same platform (it sounds like that didn't change), I would suggest rebuilding as many of the old URLs as you can (if they were good SEO-friendly URLs). That way you could bypass the 301 redirect. If you want to create your pages so that product options are rolled in and separate colors of things no longer need separate pages, you can then choose whether to 301 redirect those old URLs or simply let them 404.
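If it helps, on an Apache host like GoDaddy's shared plans, those per-URL 301s can be plain one-per-line rules in .htaccess — a sketch with hypothetical paths, not your real URLs:

```apacheconf
# 301 each old product URL to its rebuilt equivalent (hypothetical paths)
Redirect permanent /store/red-widget /store/widget
Redirect permanent /store/blue-widget /store/widget
```

Any old URL you rebuild at exactly the same path needs no rule at all, which is why recreating the old URLs where possible lets you bypass redirects entirely.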
404s aren't necessarily always a bad thing. Regarding the 2,000 of them you have now, if some of those pages just need to go away, you can let them 404 and they will eventually drop out of Google's index. You aren't required to manually submit them via GWT in order for them to be removed.
-
Hi Mike,
Thanks for weighing in. Recreating all of the old pages seems like a pain in the butt... Besides, the site never launched, so these pages never had any traffic. Given that, do you think it's a good idea to go through the URL removal process in GWT and purge the broken links completely from Google's index?
- Alex
-
Hi Dana,
Thank you for your advice. I'm new at SEO, so I may be wrong but...
Mapping out the old/new URLs on a spreadsheet and setting up 301 redirects to the new URLs isn't a feasible option in my opinion, mainly because the new URLs literally do not exist (I have not created ANY product pages). If I'm following your suggestion, I would have to create new product pages and then 301 redirect the broken URLs to those newly created pages? Not quite sure if I'm understanding you correctly...
In addition, the previous project manager wasn't SEO-savvy (I'm not either... sigh...), so he didn't know that creating separate pages for a product with multiple attributes (such as flavor and size) would result in major duplicate content issues.
The site is going through some major design/layout overhaul, and I intend to come up with a SEO strategy before creating any categories or products.
Thus...
Do you think it's better to submit a URL removal request in GWT and get rid of the indexed URLs completely? I just re-read Google's policy on URL removal, and it states that as long as I have a 4xx (404 or 410, I'm assuming) returned for the URLs, Google will honor the removal request.
- Alex
-
Rel Canonical is not quite the right thing for this sort of issue.
If you're worried about the 404s sitting around too long and losing traffic for the moment, you can 302 everything to a landing page, category page, or homepage while you work on setting everything else up. You have two choices at this point: 1) recreate all of the old pages and old URLs, then remove the 302s; or 2) add new products and new URLs, then, as Dana said, map out all of your new product URLs and old URLs to determine which old URL should be 301 redirected where. Then set up the necessary 301s and test that they all work.
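On an Apache host, that temporary 302 can be a single .htaccess rule — a sketch assuming a hypothetical /products/ path, so swap in the real one:

```apacheconf
# Temporarily (302) send all old product URLs to the homepage
RedirectMatch temp ^/products/ /
```

Once the final old-to-new mapping is worked out, remove this rule and replace it with per-URL 301s (or just let the unmapped URLs 404).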
-
Hi Alex, I am sorry to hear about this. What a mess, no? If it were me, I wouldn't rely solely on the canonical tag. I would also create a spreadsheet and map all the old URLs to the new URLs and set up 301 redirects from the old to the new. 2,000 isn't too bad. You can probably knock them out in 2-3 days...but be sure to test all of the 301s and make sure they are performing the way you expect them to. Hope that helps a little!