What to do when all products are one-of-a-kind (WYSIWYG) and URLs are continuously changing? Lots of 404s
-
Hey guys, I'm working on a website with one-of-a-kind, WYSIWYG products, and the URLs are continuously changing. There are a lot of duplicate page titles (56 currently), but that number is always changing too.
Let me give you guys a little background on the website. The site sells different types of live coral, so there may be anywhere from 20 to 150 corals of the same species. Each coral is a unique size, color, etc. When a coral gets sold, the site owner trashes the product, creating a new 404. Sometimes the URL gets indexed; other times it doesn't, since the corals sell within hours or days. I was thinking of optimizing each product for a keyword and reusing the URL by having the client update the picture and price, but that still leaves a lot more products than keywords.
Here is an example of the corals with the same title: http://austinaquafarms.com/product-category/acans/
Thanks for the help guys. I'm not really sure what to do.
-
Hey Aron
Just wanted to chime in on the WordPress bit. EGOL nailed the core answer, but on the noindex question: yes, you can noindex any pages you want, and this isn't going to cause any issues. Noindexed pages do not count toward Panda or low user metrics in the algorithm, so it's a great way to let the content exist without having it cause trouble in the SERPs.
-Dan
-
The way WooCommerce works is by creating a custom post type (similar to a regular blog post, but styled for products) and then dynamically adding the products to a product category page or anywhere else I want them displayed. Creating a custom post type for products also enables a lot of customization and even advanced reporting.
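For anyone unfamiliar with how that works under the hood, the registration looks roughly like the sketch below. This is illustrative only, not WooCommerce's actual code (the real registration passes many more arguments):

```php
<?php
// Rough sketch of registering a "product" custom post type in WordPress.
// Hypothetical simplified arguments; WooCommerce's real setup is more involved.
add_action( 'init', function () {
    register_post_type( 'product', array(
        'labels'      => array( 'name' => 'Products' ),
        'public'      => true,                    // each product gets its own crawlable URL
        'has_archive' => true,                    // enables a shop-style archive page
        'supports'    => array( 'title', 'editor', 'thumbnail' ),
        'taxonomies'  => array( 'product_cat' ),  // category pages come from this taxonomy
    ) );
} );
```

Because the post type is public, every product inevitably gets its own URL, which is why trashing sold corals keeps producing 404s.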
-
...but the product URL will always be there.
I agree... but if you never link to that page, then Google should not know about it.
I am not familiar with WordPress and WooCommerce; however, the shopping systems that I have used all allowed me to create "add to cart" buttons. I could place these anywhere on the site, even in PDF documents. I have never used the product pages that my shopping systems produce. Why? I think that I can make product pages that are better optimized for search and better arranged for customers. So, I have lots of pages on my site that list multiple items and almost no pages that list a single item. This has saved me a lot of time; I think my site competes a lot better, I think it makes for more convenient shopping, and I believe that I sell a lot more.
-
Well, I'm not sure it's possible to remove the product pages, since the site is built with WordPress and WooCommerce. Each product creates a URL. I can create a quick-view box with an add-to-cart button for a better user experience, but the product URL will always be there.
Thinking about what you said, I may have a few options, though I may be way off:
1. I can optimize the category page for the product keywords with more content and details, then noindex all the product pages.
2. I can optimize the single product pages, adding unique content to each of about 10 products, and noindex the rest.
Not sure if noindexing would be the right way to do it or if it would add more issues. Since I'm using WordPress and the Yoast SEO plugin, I can have the website owner check the noindex box on every new product created. What do you think?
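Yoast's per-post checkbox would work, but relying on the site owner to tick it on every new coral is fragile. A sketch of doing it site-wide in code instead, via WordPress's `wp_robots` filter (available since WordPress 5.7); this assumes the WooCommerce `product` post type and should be tested on staging first:

```php
<?php
// Apply noindex,follow to all single product pages so short-lived corals
// never enter the index in the first place. Sketch only, not tested on this site.
add_filter( 'wp_robots', function ( $robots ) {
    if ( is_singular( 'product' ) ) {
        $robots['noindex'] = true;
        $robots['follow']  = true; // still let crawlers follow links to category pages
    }
    return $robots;
} );
```

Note that the product pages must remain crawlable (not blocked in robots.txt), or Google will never see the noindex directive.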
As far as content we are in the process of putting together a full content strategy and will have many tutorials and other great care info along with industry tips and blog posts.
-
Awesome coral! Awesome. http://austinaquafarms.com/product-category/acans/
If this was my site, I would make it just a few pages. One page for each type of coral.
The individual product pages on the site right now have almost no information. So, I would put all of the information on one huge category page and optimize it perfectly for that type of coral. I would also add several authoritative paragraphs of text to the page, maybe in the right sidebar, with background info about the type of coral, how to care for it, and tips for making it do well in your tank. This extra content will make the page more competitive and will pull in traffic for long-tail keywords.
Eliminating the product pages will eliminate the 404 problems, simplify maintaining the site, and when a visitor lands on the Acans page they will say WOW! I think you will sell more from this presentation... no guarantees, just my gut.
I also believe it will pull all of the power that has seeped into the product pages back into the category page. In my experience, a compact site with a small number of pages competes a LOT better than a larger site with a bunch of thin-content pages.
Those thin-content pages also put this site at risk for Panda problems, if it doesn't have them already.