Maintaining SEO with Ecommerce Search Refinement
-
Hey Everyone,
I have an interesting scenario I'd appreciate some feedback on. I'm restructuring a client's site as part of a store redesign, and he had previously built a bunch of landing pages mostly for SEO value. Some of them aren't even accessible from the main navigation and target a lot of long-tail terms. These pages are generating organic traffic, but the setup isn't user-friendly, because it's cumbersome to drill down into the specific categories (which many of the landing pages cover) without going through three or four pages to get there. For example, if I want to buy orange shoes, I can see specific kinds of orange shoes, but not ALL the orange shoes, even though there is an SEO page for orange shoes that is otherwise unreachable from the main navigation.
If that wasn't too confusing: essentially, the usability solution is to implement search refinement so that specific subcategories can be drilled into in fewer steps.
My issue is that, even though I know it would benefit the site overall, I'm hesitant to implement this because of the SEO pages that already exist; I'm wary of destroying the organic traffic they're receiving.
My plan was to make sure the specific category pages are built with the keywords and content needed to attract those organic visits, but I'm still nervous that might not be enough.
Does anyone have suggestions for this situation, and more generally for maximizing SEO on a site with search refinement while minimizing losses? From a usability standpoint, search refinement is great, but how do you counter the significant SEO risks that come with it?
Thanks for your help!
-
Actual pages reached through refinement.
-
That's great; I was thinking of trying something similar. One follow-up, though: were the sub-content pages actually "pages," or were they accessible only through refinement?
For example, if we're talking about "Orange Shoes," did you actually have a dedicated page for orange shoes, or was it reachable only by applying the orange refinement to the shoes category?
-
Makes much more sense now.
I went through something very similar about nine months ago. We took the content from the landing pages and placed it on the sub-content pages, with products right on the page. We then created 301 redirects from the old pages to the new ones, and finally we tracked down the sites linking to the old pages and, where possible, got those links updated to the new URLs.
We did see a bit of a dance on the keywords, but over time we actually climbed higher in the rankings for the organic terms we were afraid of losing. The best part was that conversions increased, because customers get to the products much faster and the overall experience is better.
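In case it helps, here's a rough sketch of how the redirect step can be scripted. The paths and the nginx output are purely illustrative (we generated ours from a spreadsheet of old-to-new URLs); adapt the map and the rule format to your own server:

```python
# Build exact-match 301 redirect rules from a map of old landing-page
# paths to the new sub-content pages (paths here are hypothetical).
url_map = {
    "/orange-shoes-seo-page": "/shoes/orange",
    "/blue-sandals-seo-page": "/sandals/blue",
}

def nginx_redirects(url_map):
    """Emit one exact-match nginx location block per old path."""
    return [
        f"location = {old} {{ return 301 {new}; }}"
        for old, new in sorted(url_map.items())
    ]

for rule in nginx_redirects(url_map):
    print(rule)
```

The exact-match (`location =`) form keeps each old URL mapping one-to-one to its replacement, which is what you want for preserving link equity, rather than a blanket redirect to the homepage.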
-
Part of it is that I'm wary of deleting those pages, but I'm also worried that the search refinement itself won't be SEO-friendly.
-
I don't think I explained it very well, but yes, the second one. In implementing search refinement, I would be replacing the landing pages with sub-content pages that can be drilled into, and my concern is that I'll lose the working SEO on those pages. Even though they aren't very user-friendly, they're generating some long-tail traffic.
This restructuring is happening in conjunction with a store redesign, so it's not as if I'm implementing it for the heck of it. It's step one of a larger project, and I don't think tightening up the category structure will considerably improve the user experience without search refinement. But in implementing search refinement, I'm trying not to demolish the SEO either.
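On the refinement side, one idea I'm considering for keeping filter URLs from multiplying into duplicates is forcing the facets into a fixed order before building the path, so every route to the same product set resolves to one canonical URL. The facet names below are purely illustrative:

```python
# Normalize a set of applied filters into one canonical URL path by
# sorting facets into a fixed order (facet names are hypothetical).
FACET_ORDER = ["gender", "category", "color"]

def canonical_path(filters):
    """Return the canonical refinement URL for a dict of facet -> value."""
    parts = [filters[f] for f in FACET_ORDER if f in filters]
    return "/" + "/".join(parts)

# Both refinement routes map to the same canonical page:
print(canonical_path({"color": "orange", "category": "shoes", "gender": "men"}))
print(canonical_path({"category": "shoes", "color": "orange"}))
```

That way /men/shoes/orange and /orange/shoes/men can't both end up as separate indexable URLs, and any rel=canonical tag only has to point at the one fixed form.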
Hope that makes sense- thanks for helping.
-
I'm a little confused about what you're afraid of losing. From your description, it sounds like you're going to make the pages that are already receiving organic traffic more easily accessible from your main navigation?
Or are you saying you're considering replacing those landing pages with sub-content pages that can be drilled down into?
Please help me understand a little more clearly.