Unlimited Product Pages
-
While browsing through my Moz campaign, I noticed that my site is generating an unlimited number of product pages even though no products appear on them, e.g.:
http://www.interstellarstore.com/star-trek-memorabilia?page=16
http://www.interstellarstore.com/star-trek-memorabilia?page=100
http://www.interstellarstore.com/star-trek-memorabilia?page=200
I have no idea how to resolve this issue. I can't possibly 301 an unlimited number of pages, and I can see this being a big SEO problem. Any thoughts?
-
Everett, you've been incredibly helpful - thank you!
-
This is a deeper issue that will probably require the help of a developer, though it shouldn't take them long to fix. Basically, paginated category pages that have ZERO products should return a 404 status code because they really shouldn't exist. The fact that any content appears on those URLs is a bug that needs to be fixed; they should show "Page Not Found". Also, as you mentioned, the very last page in the set should NOT have a rel="next" tag, and there should be no links (whether href or JavaScript) to pages beyond the last one with products on it.
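In code, the fix described above might look like the minimal sketch below (Python with a hypothetical `PAGE_SIZE` and function name, not actual OpenCart code): any page beyond the last one that holds products returns a 404, and the final page emits no rel="next".

```python
import math

PAGE_SIZE = 20  # products per page -- illustrative, not the store's real setting

def paginate(total_products: int, page: int):
    """Return (http_status, rel_prev_page, rel_next_page) for a category page."""
    last_page = max(1, math.ceil(total_products / PAGE_SIZE))
    if page < 1 or page > last_page:
        # Empty page in the set: serve a real "Page Not Found", not a soft 404
        return 404, None, None
    rel_prev = page - 1 if page > 1 else None
    # The last page with products must NOT link onward via rel="next"
    rel_next = page + 1 if page < last_page else None
    return 200, rel_prev, rel_next
```

With 45 products this yields three pages, so a request for ?page=200 gets a genuine 404 instead of an empty page.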
-
Oh I understand now, thank you!
**Re #1:** Would putting Rel Prev on the last page in the set resolve the issue? Or is this a deeper issue within my website code which needs to be resolved by not allowing pages to be created beyond the end of the paginated set?
-
#1. The way you're handling paginated URLs with products on them is fine, but the problem is when you get to the end of the paginated set and the system continues to make pages even though there are no more products to show. That is the issue I was trying to help you resolve, which is not the same as paginated pages that show products.
#2. Yes, the canonical for the brand pages should be the URL listed above as "Category". Thus both the "Category" URL and the "Brand" URL in your example should list the Category URL as canonical.
I agree that this is likely the cause of your duplicate title issues.
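As a sketch of that canonical rule (assuming brand filters arrive as query-string parameters; the /page/2/ path-style pages from Issue One keep their own self-referencing canonicals and are not covered here), the canonical URL can be derived by stripping the query string:

```python
from urllib.parse import urlsplit, urlunsplit

def canonical_for(url: str) -> str:
    """Point any parameterised category variant at the clean category URL."""
    parts = urlsplit(url)
    # Drop query string and fragment so brand/page filters all
    # canonicalise to the bare category path
    return urlunsplit((parts.scheme, parts.netloc, parts.path, "", ""))
```

The malformed ?page=2%2F brand URL and the clean category URL would then both declare the same canonical.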
-
Thank you for the link, but I'm still unsure about these two specific issues.
Issue One:
So I have say the two below links, which are the same category but just page 1 and 2.
Page 1: http://www.interstellarstore.com/star-wars-memorabilia
Page 2: http://www.interstellarstore.com/star-wars-memorabilia/page/2/
Should Page 1 be the Canonical for all subsequent pages, or are Page 2 and onward considered pages in their own right? From what I understand, the former would cause a Duplicate Title problem. The latter, however, would surely cause problems for Google, as it would ignore all of the products on Page 2 and onward?
Currently I have Rel Canonical and Rel Prev/Next in place; am I correct in thinking this is the right way to do it?
Issue Two:
One of the links below is the main category (Canonical), while the other is reached indirectly via the Brand option available on our site.
Category: http://www.interstellarstore.com/star-wars-memorabilia
Brand: http://www.interstellarstore.com/star-wars-memorabilia?page=2%2F
Am I correct in thinking that the Canonical URL displayed on the Brand page should in fact be the Category URL above?
I'm sorry for all the questions; I'm just trying to make heads or tails of the issue. Currently there are over 20,000 Duplicate Title issues and I'm thinking this may be the cause.
-
It means the page has a rel="canonical" tag that tells search engines it is the right page to show in search results, as opposed to a tag that says another page should show up instead. You can learn more here.
-
What do you mean by 'self-referencing rel="canonical" tags'?
-
Google may have figured out that they don't want those pages in their index, but the problem still exists and needs to be fixed. The pages show a 200 status code, do not have a robots noindex tag, and have self-referencing rel="canonical" tags. All of these things are problems, but if you just have the system return a 404 status code on those pages that don't list any products, you should be fine.
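Those three symptoms can be checked mechanically. The sketch below is a hypothetical helper (the function name and its simplistic regex tag-matching, which assumes rel= appears before href=, are my own): given a fetched page's status code and HTML, it reports which soft-404 symptoms remain, and a real 404 clears all three at once.

```python
import re

def empty_page_problems(status: int, html: str, url: str) -> list:
    """For a paginated URL that lists no products, report soft-404 symptoms."""
    if status == 404:
        return []  # a real 404 resolves everything at once
    problems = [f"returns {status} instead of 404"]
    if not re.search(r'<meta[^>]+name=["\']robots["\'][^>]*noindex', html, re.I):
        problems.append("no robots noindex tag")
    # Simplistic parse: assumes rel= comes before href= in the link tag
    m = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)',
                  html, re.I)
    if m and m.group(1) == url:
        problems.append("self-referencing canonical")
    return problems
```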
-
Hello Everett, according to Oleg these pages aren't being indexed, so the resulting soft 404s aren't a problem, correct?
-
Hello Oleg. I noticed it by accident. Instead of entering page 2, I entered page 20 and noticed the problem. I then entered 200, 1,000 and so on, and found that no matter what number I entered, a page with no products always showed up.
Thanks for letting me know about Google not picking up on them, that makes me feel a whole lot better!
-
Hello Spacecollective,
Your shopping cart is throwing soft 404 errors on paginated URLs that shouldn't exist. These pages need to return a 404 status code so they can be removed from the index. And yes, it can be a big problem.
I did a quick search for "OpenCart soft 404 errors" and here's a forum post that might help you out:
http://forum.opencart.com/viewtopic.php?f=20&t=102748
Good luck!
-
I can see the links working in reverse (page=200 links to page=199 via rel="prev" code), but what is the largest page number, and where did that link come from?
The good news is that Google isn't picking up on those URLs, so you should be fine rankings-wise (it's a Moz crawler issue) - https://www.google.com/search?q=site:http://www.interstellarstore.com/star-trek-memorabilia%3Fpage%3D&start=0