Do I need to do 301s in this situation?
-
We have an e-commerce site built on Magento 2. We launched a few months ago, and had about 2K categories. The categories got indexed in Google for the most part. Shortly after launch, we decided to go with SLI for search and navigation because the native search/navigation was too slow given our database.
The navigation pages are now SLI-hosted navigation pages, meaning the URLs have changed and the pages are now served by SLI.
I have done 301s for the most popular categories, but I didn't do 301s for all of them, as we have to go through each category one by one and map it to the correct navigation page. Our new category sitemap only lists the new SLI category URLs.
Will the fact that we have not 301'd all of our former categories hurt us as far as SEO? Do I have to do 301 redirects for all former category pages?
-
Nigel,
Thanks for the feedback.
The product pages do not have the categories in the breadcrumbs. It is just home > product.
The thing is, the category pages that were indexed are already deindexed. And, we didn't build any links to the category pages as it was a new e-commerce site. I will try to see if there is a faster way of mapping the native categories to the hosted navigation pages.
Although it is much faster and more user-friendly, I do regret using hosted navigation on the site now, and I may stop using the hosted navigation service at the end of the year. I am wondering if I should hold off on the 301s, as we will have to reverse them if we cancel the service. What are your thoughts?
-
Hi Kevin
In short, yes, you do.
The thing is that if the categories are indexed in Google and someone clicks on one, it will 404, which is not a great user experience. Also, if those URLs are listed anywhere else (in directories, blogs or any other site), then visits from those sources will also 404. It's just very bad practice. Can't you just scrape all of the old URLs, put them in a CSV, add the new URLs alongside them, and copy the resulting redirects into .htaccess as a quick way of doing it?
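Something like this rough sketch would do it; the file names and URLs below are just placeholders, so swap in your real old-to-new mapping:

    # build_redirects.py - turn a CSV of "old_path,new_url" rows into 301 rules for .htaccess
    # (placeholder file names; no header row assumed, adjust to your actual category mapping)
    import csv

    with open("category-mapping.csv", newline="") as src, open("redirects.htaccess", "w") as out:
        for row in csv.reader(src):
            if len(row) != 2:
                continue  # skip blank or malformed lines
            old_path, new_url = row
            # e.g. /gifts/mugs.html -> https://nav.sli-hosted.example.com/category/mugs
            out.write(f"Redirect 301 {old_path} {new_url}\n")

Paste the generated Redirect 301 lines into your .htaccess and Apache handles the rest.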
By the way, having navigation pages hosted externally on non-company URLs sounds like an SEO disaster to me. Your site's natural hierarchy of home > cat > product now has the cat part on an external server. I can't see how this can possibly be good for SEO. Do your product URLs contain the category?
Regards Nigel
Related Questions
-
I need to know if the clicks on GSC are unique? Thanks for the answers
Intermediate & Advanced SEO | Binary_SEO
-
Lazy Loading of products on an E-Commerce Website - Options Needed
Intermediate & Advanced SEO | JBGlobalSEO
Hi Moz Fans. We are in the process of re-designing our product pages and we need to improve the page load speed. Our developers have suggested that we load the associated products on the page using lazy loading. While I understand this will certainly have a positive impact on page load speed, I am concerned about the SEO impact. We can have upwards of 50 associated products on a page, so we need a solution. So far I have found the following solution online, which uses lazy loading and escaped fragments. The concern here is with serving an alternate version to search engines. The solution was developed by Google not only for lazy loading, but for indexing AJAX content in general.
Here's the official page: Making AJAX Applications Crawlable. The documentation is simple and clear, but in a few words the solution is to use slightly modified URL fragments.
A fragment is the last part of the URL, prefixed by #. Fragments are not propagated to the server; they are used only on the client side to tell the browser to show something, usually to move to an in-page bookmark.
If instead of using # as the prefix you use #!, this instructs Google to ask the server for a special version of your page using an ugly URL. When the server receives this ugly request, it's your responsibility to send back a static version of the page that renders an HTML snapshot (the not-yet-indexed image in our case). It seems complicated, but it is not; let's use our gallery as an example. Every gallery thumbnail has to have a hyperlink like http://www.idea-r.it/...#!blogimage=<image-number> and when the crawler finds this markup it will change it to http://www.idea-r.it/...?_escaped_fragment_=blogimage=<image-number>
Let's take a look at what you have to answer on the server side to provide a valid HTML snapshot. My implementation uses ASP.NET, but any server technology will be good:

    var fragment = Request.QueryString["_escaped_fragment_"];
    if (!String.IsNullOrEmpty(fragment))
    {
        var escapedParams = fragment.Split(new[] { '=' });
        if (escapedParams.Length == 2)
        {
            var imageToDisplay = escapedParams[1];
            // Render the page with the gallery showing
            // the requested image (statically!)
            ...
        }
    }

What's rendered is an HTML snapshot, that is, a static version of the gallery already positioned on the requested image (server side). To make it perfect we have to give the user a chance to bookmark the current gallery image. 90% comes for free; we only have to parse the fragment on the client side and show the requested image:

    if (window.location.hash)
    {
        // NOTE: remove initial #
        var fragmentParams = window.location.hash.substring(1).split('=');
        var imageToDisplay = fragmentParams[1];
        // Render the page with the gallery showing the requested image (dynamically!)
        ...
    }

The other option would be to look at a recommendation engine to show a small selection of related products instead. This would cut the total number of related products down. The concern with this one is that we would be removing a massive chunk of content from the existing pages; some of it is not the most relevant, but it is content. Any advice and discussion welcome 🙂
-
Need assistance in improving SEO of website
Dear SEO Expert, We run the website www.guitarmonk.com. Moz has flagged some errors on our site, especially duplicate content. Beyond fixing those, we would welcome whatever else you suggest to improve the website for certain important keywords. Regards
Intermediate & Advanced SEO | Guitarmonk
-
Do I need to update my previous blog posts with my new SEO strategy?
Hi Everyone, I have published 46 articles so far on my blog. Recently I changed my SEO strategy, including changing main page titles, changing the target pages for each keyword, ... . Do you think it is a good idea to go back through all of my blog posts and update the internal linking to match? Thanks,
Intermediate & Advanced SEO | AlirezaHamidian
-
1 site on 2 domains (interesting situation, expert advice needed)
Dear all, I have read many posts about having the same content on two different domains and how to combine the two to avoid duplicate content. However, the history of my two domains makes this question really difficult.
Domain 1: chillispot.org ( http://www.opensiteexplorer.org/links?site=chillispot.org ). The original site was on this domain and started 9 years ago. At that time I was not the owner of the domain. The site was very popular, with lots of links to it. Then, after 5 years of operation, the site closed. I managed to save the content to:
Domain 2: chillispot.info ( http://www.opensiteexplorer.org/links?site=chillispot.info ). The content I put there was basically the same. Many links on external sites were changed to chillispot.info when they noticed the change, but lots of links are still unchanged and pointing to the .org domain. The .info domain is doing well in search engines (for example for the keyword 'chillispot'). Now I have managed to buy the original chillispot.org domain. As you can see, the domain authority of the .org domain is still higher than the .info one, and it has more valuable links.
Question is: what would be the best approach to offer content on both domains without being penalized by Google for duplicate content? Which domain should we keep the content on? The original .org one, which is still a better domain but has not been working for several years, or the .info one, which has had the content for several years now and is doing well in search engines? And then, after we decide this, what would be the best approach to send users to the real content? Thanks for the answers!
Intermediate & Advanced SEO | Fudge
-
Need advice on 301 domain redirection
Hello friends, We have two sites, spiderman-example.com and avengers-example.com, which sell the same products listed under similar categories. We are about to shut down avengers-example.com because we want to concentrate on building up a single brand, spiderman-example.com. Spiderman-example has comparatively more visitors and conversions than avengers-example, i.e. roughly 90% more traffic and conversions. Avengers-example has a small fraction of loyal customers who still search for the brand name, and there are a handful of keywords it ranks for on its own. So is it advisable to redirect avengers-example to spiderman-example using a 301 redirect? Will this help us gain any link juice from avengers-example? If so, how can we effectively redirect between the two domains with minimal loss of page authority and link juice, to strengthen spiderman-example? Off beat: the names "Avengers" and "Spiderman" were just used as examples; the actual site names have no relation to the ones mentioned above.
Intermediate & Advanced SEO | semvibe
-
Confusing 301 / Canonical Redirect Issue - Wizard Needed
Intermediate & Advanced SEO | 4Buck
I had two pages on my site with identical content. What I did was 301 redirect one page to the other. I also added canonical tag code to the page that holds the 301. Here is what I have: www.careersinmusic.com/music-colleges.aspx - this page was a duplicate and I needed it to resolve to www.careersinmusic.com/music-schools.aspx. Here is the code I used:

music-colleges.aspx:

    <%@ Page Language="VB" AutoEventWireup="false" CodeFile="music-colleges.aspx.vb" Inherits="music_colleges" %>
    <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">
    <html xmlns="http://www.w3.org/1999/xhtml">
    ...
    <link rel="canonical" href="http://www.careersinmusic.com/music-schools.aspx"/>
    ...

music-colleges.aspx.vb:

    Partial Class music_colleges
        Inherits System.Web.UI.Page

        Protected Sub Page_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
            Response.Status = "301 Moved Permanently"
            Response.AddHeader("Location", "http://www.careersinmusic.com/music-schools.aspx")
        End Sub
    End Class

The problem: for some reason, when the search "music colleges" is done in Google, I am #7. When the term "music schools" is done, I am around 119. I MUST be getting a penalty for some reason; I just cannot figure out why. Why perform well for one term and terribly for the other? All I can come up with is a duplicate content penalty or something along those lines. Also, music-colleges.aspx seems to still be in Google's index, even though the above 301 happened months ago. Thoughts? site:www.careersinmusic.com/music-colleges.aspx Any insight into this would be GREATLY appreciated. Many Thanks!
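One thing worth checking here (just a sketch, and it assumes Python with the requests library is available) is what the old URL actually returns at the HTTP level; if it comes back 200 with only the canonical tag rather than a 301, that alone would explain why it is still in the index:

    # quick check of what the old URL actually returns (sketch; needs the requests library)
    import requests

    resp = requests.get("http://www.careersinmusic.com/music-colleges.aspx", allow_redirects=False)
    print(resp.status_code)              # 301 here means the redirect header is being sent
    print(resp.headers.get("Location"))  # should point at the music-schools.aspx URL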
Need to migrate multiple URLs and trying to save link juice
I have an interesting problem, SEOmozers, and wanted to see if I could get some good ideas as to what I should do for the greatest benefit. I have an ecommerce website that sells tire sensors. We just converted the old site to a new platform and payment processor, so the site has changed completely from the original, just offering virtually the same products as before. You can find it at www.tire-sensors.com. We're ranked #1 for the keyword "tire sensors" in Google. We sell sensors for Ford, Honda, Toyota, etc., and tire-sensors.com has all of those listed.
Before I came along, the company I'm working for also had individual "mini ecommerce" sites created with only one brand of sensors and a URL to match that maker. Example: www.fordtiresensors.com is our site, only sells the Ford parts from our main site, and ranks #1 in Google for "ford tire sensors". I don't have analytics on these old sites, but Google Keyword Tool is saying "ford tire sensors" gets 880 local searches a month, and other brand-specific tire sensor terms are receiving traffic as well. We have many other sites doing the same thing: www.suzukitiresensors.com (ranked #2 for "suzuki tire sensors") only sells our Suzuki collection from the main site's inventory, etc.
We need to get rid of the old sites because we want to shut down the payment gateway and various other things those sites are using, and move to one consolidated system (aka www.tire-sensors.com). Would simply making each maker-specific URL (i.e. fordtiresensors.com) 301 redirect to our main site (www.tire-sensors.com) give us the most benefit, rankings, traffic, etc.? Or would that be detrimental to what we're trying to do, capturing the tire sensors market for all car manufacturers? Suggestions? Thanks a lot in advance! Jordan
Intermediate & Advanced SEO | JordanGodbey
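If the brand domains do get 301'd to the main site, pointing each one at its matching brand page rather than the homepage keeps those brand searches landing on relevant content, and a quick script can confirm the redirects resolve as intended. This is only a sketch: the target category paths are hypothetical and it assumes Python with the requests library:

    # sketch: confirm each old brand domain 301s to its intended page on the main site
    # (the target paths below are hypothetical; requires the requests library)
    import requests

    mapping = {
        "http://www.fordtiresensors.com/":   "https://www.tire-sensors.com/ford",    # hypothetical path
        "http://www.suzukitiresensors.com/": "https://www.tire-sensors.com/suzuki",  # hypothetical path
    }

    for old_url, expected in mapping.items():
        resp = requests.get(old_url, allow_redirects=False, timeout=10)
        location = resp.headers.get("Location", "")
        ok = resp.status_code == 301 and location.rstrip("/") == expected.rstrip("/")
        print(f"{old_url} -> {resp.status_code} {location} {'OK' if ok else 'CHECK'}")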