@terentyev - sorry, I can't edit my questions once submitted (and they wait for approval - why?). The statement should read that my question SHOULD be very specific, whereas my original question was much more general - you answered that question very nicely. Sorry for any misunderstanding.
AspenFasteners
Company: Aspen Fasteners
Latest posts made by AspenFasteners
-
RE: What happens to crawled URLs subsequently blocked by robots.txt?
@terentyev thanks for the reply. We have no reason to believe these URLs are backlinked. These aren't consumer products that individuals are interested in; our site is a wholesale B2B store selling very narrow categories in bulk quantities, typically for manufacturing. Therefore, there is almost zero chance of backlinks anywhere for something as specific as a particular size/material/package quantity of a product.
We have already initiated a canonicalization project, but we are stuck between two concerns from sales: 1) we can't wait for canonicalization (which is complex) - we need sales now, and 2) don't touch robots.txt, because MAYBE the individual products are indexed.
So that is why my question is very specific: do we KNOW that Google will immediately de-index URLs blocked by robots.txt?
-
What happens to crawled URLs subsequently blocked by robots.txt?
We have a very large store with 278,146 individual product pages. Since these are all various sizes and packaging quantities of fewer than 200 product categories, my feeling is that Google would be better off making sure our category pages are indexed.
I would like to block all product pages via robots.txt until we are sure all category pages are indexed, then unblock them. Our product pages rarely change and have no ratings or product reviews, so there is little reason for a search engine to revisit a product page.
The sales team is afraid that blocking a previously indexed product page will result in it being removed from the Google index, and would prefer to submit the categories by hand, 10 per day, via requested crawling.
Which is the better practice?
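A robots.txt sketch of the blocking idea described above. This is hedged: the `/*-p/` pattern is hypothetical and assumes product URLs share a distinguishing path segment that category URLs lack, which would need to be confirmed against the store's actual URL scheme before deploying.

```
# Hypothetical sketch - assumes product pages live under a "-p/" path
# segment, distinct from the "-s/" category pages shown elsewhere on
# this page. Verify against the real URL structure first.
User-agent: *
Disallow: /*-p/
```

Note that robots.txt blocks crawling, not indexing: a previously indexed URL can remain in the index (without a fresh snippet) after being blocked, which is the crux of the question above.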
-
RE: Is one page with long content better than multiple pages with shorter content?
What if the long, detailed content is on a page for which you cannot possibly win rankings because huge, long-standing enterprises with high PA and DA already dominate the first few SERPs? Would it not be better to move the details of the content to the specific pages where they apply?
-
Is one page with long content better than multiple pages with shorter content?
(Note: the links are from a sandbox site, which has very low DA and PA.)
If you look at this page, you will see at the bottom a lengthy article detailing all of the properties of the product categories in the links above.
http://www.aspensecurityfasteners.com/Screws-s/432.htm
My question is, is there more SEO value in having the one long article in the general product category page, or in breaking up the content and moving the sub-topics as content to the more specific sub-category pages? e.g.
http://www.aspensecurityfasteners.com/Screws-Button-Head-Socket-s/1579.htm
http://www.aspensecurityfasteners.com/Screws-Cap-Screws-s/331.htm
http://www.aspensecurityfasteners.com/Screws-Captive-Panel-Scre-s/1559.htm
-
RE: Duplicate content warning: Same page but different urls???
Thanks, Tim. Do you have any examples of what those problems might be? With such a large catalog, managing those rel canonical tags will be difficult (I don't even know if the store allows them; it's a hosted store solution, and little code customization is allowed).
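For reference, a rel canonical tag is a single line in the page `<head>`. A minimal sketch using the duplicate-path URLs mentioned elsewhere on this page (whether the hosted store allows injecting this into the template is exactly the open question here):

```html
<head>
  <!-- Placed on the duplicate page (8315.htm), this tells search
       engines to consolidate signals onto the preferred URL. -->
  <link rel="canonical" href="http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8314.htm" />
</head>
```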
-
RE: Looks like keyword stuffing, but it isn't
Thanks for the excellent reply, Hurf. Unfortunately, with Volusion as our web store we don't have the option of displaying anything but the full product name on listings by category, unless we engage a Volusion partner to modify our store or we migrate 80K products to a different and more complex product/option structuring model.
For the sake of the clients, 12 products per page would be maddening. Text search is very ineffective when it comes to parsing product dimensions, so we produce long listings, sorted by size, so the user can quickly locate their product.
In the short term, I think it makes most sense to pay for custom programming to reduce the product description to just the dimension and quantity when displayed on a category page.
-
Looks like keyword stuffing, but it isn't
(There was a similar older question on the forum, but it wasn't really answered so please forgive me if this looks like a repeated question)
We have a massive web store with 80k "commodity products" (and this number will only increase) which aren't sold the way normal products are sold (i.e. by brand and model). Commodity products are sold by specification, so their product names are actually descriptions of the product. In our case: industrial fasteners (nuts, bolts, washers, screws, etc.) sold in bulk.
If you click on the link below, you will see that our catalog involves a tremendous amount of repetition: the products all appear the same but vary by dimensions and/or package quantities. The solutions the web store software offers to solve this problem cause issues for us (i.e. displaying the dimensions and quantities ONLY under a common header), but more importantly, we are concerned that search engines are seeing this as keyword stuffing and penalizing the pages.
http://www.aspenfasteners.com/Step-Bolts-Inch-Standard-s/407.htm
If we can't change the presentation of the page, should we be concerned, and if so, how do we let a search engine know that the repetition is legitimate?
-
RE: Duplicate content warning: Same page but different urls???
We have a similar problem, but much more complex to handle as we have a massive catalog of 80,000 products and growing.
The problem occurs legitimately because our catalog is so large that we offer different navigation paths to the same content.
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8314.htm
http://www.aspenfasteners.com/Self-Tapping-Sheet-Metal-s/8315.htm
(If you look at the "You are here" breadcrumb trail, you will see the subtle differences in the navigation paths, with 8314.htm, the user went through Home > Screws, with 8315.htm, via Home > Security Fasteners > Screws).
Our hosted web store does not offer us htaccess, so I am thinking of excluding the redundant navigation points via robots.txt.
My question: is there any reason NOT to do this?
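A sketch of what that robots.txt exclusion might look like, using the specific duplicate URL from the example above (an assumption: that the redundant Security Fasteners navigation paths can be enumerated by their category IDs, or matched by a common pattern):

```
User-agent: *
# Block the redundant Home > Security Fasteners > Screws path (8315.htm),
# keeping the canonical Home > Screws path (8314.htm) crawlable.
Disallow: /Self-Tapping-Sheet-Metal-s/8315.htm
```

One reason for caution: robots.txt stops crawling but does not de-index, and blocked pages can no longer pass link equity through their internal links, which is why canonical tags are usually preferred for duplicate navigation paths when the platform supports them.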
-
RE: Trying to reduce pages crawled to within 10K limit via robots.txt
Hi Andres!
Sorry, I thought I answered this earlier. If I understand correctly, wildcards ARE allowed, according to this reply to my question on the topic: http://www.seomoz.org/q/does-rogerbot-read-url-wildcards-in-robots-txt
Hope THIS reply sticks this time!
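As a rough illustration of how a wildcard-aware crawler might evaluate a Disallow pattern - a sketch only: the helper below is hypothetical and mirrors the common interpretation where `*` matches any character sequence and a trailing `$` anchors the end of the URL; it is not rogerbot's actual implementation.

```python
import re

def robots_pattern_to_regex(pattern: str) -> re.Pattern:
    # Translate a robots.txt path pattern: '*' matches any character
    # sequence; a trailing '$' anchors the end of the URL path.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile(regex + ("$" if anchored else ""))

def is_blocked(path: str, disallow_pattern: str) -> bool:
    # A rule applies if the pattern matches a PREFIX of the path
    # (re.match, not fullmatch, since robots rules are prefix rules).
    return robots_pattern_to_regex(disallow_pattern).match(path) is not None

# Hypothetical patterns against category-style URLs from this page:
print(is_blocked("/Screws-Cap-Screws-s/331.htm", "/*-s/"))  # True
print(is_blocked("/about.html", "/*-s/"))                   # False
```

The prefix-match behavior is why a rule like `Disallow: /*-s/` would sweep up every category page, not just one.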