Can someone evaluate this page so I can continue adding others?
-
Hi,
I am adding a bunch of similar category stickers, and I am not aiming for thorough SEO on these, since there will be hundreds of them coming. I just want to include the relevant keywords that people might use in Google image search to bring them to our site. They are all related to JDM (Japanese Domestic Market), so I decided to include "JDM" at the end of all the SEO titles. I am writing completely different short descriptions for each of these stickers, and the Related Products change as well. I just want to achieve something like Amazon or eBay listings do: not perfect SEO, since I cannot spend too much time optimizing each sticker, but I don't want to NOINDEX, FOLLOW them either — hence the different related products for all items and the unique short descriptions. If you check one of the pages: http://www.redrockdecals.com/rising-sun-wakaba-leaf-sticker-red-black-jdm
Do you think I am on the safe side so I don't hurt my overall SEO? Thanks!
-
Ya, a 15% non-indexed rate is not bad. There was a Q&A here earlier that was looking at similar things: http://moz.com/community/q/some-urls-in-the-sitemap-not-indexed. That should help ease your mind!
DA isn't a function of how many pages your site has, but how many other credible sites link to your site: http://moz.com/learn/seo/domain-authority. Cheers!
-
Thanks so much. Our DA in Moz is currently just 20. GWT shows that exactly 1000 pages have been submitted via the sitemap, and 853 of these are indexed. I do have some NOINDEX, FOLLOW items as well, so perhaps those are widening the gap. How does this number sound? Not that bad, I guess?
At the moment it seems like I shouldn't use NOINDEX even though the descriptions might be similar. Instead, I should use Related Products and try to get more reviews.
There must be a way to increase the DA of our domain, since we have around 1000 products. I guess something else is stopping our site from ranking better. Perhaps it's the lack of reviews and of more unique pages, meaning related products as well...?
-
You'll be able to answer the first question via GWT. Compare your sitemap-submitted page count to the number of pages Google says it has placed in its index. The larger the percentage gap between pages submitted and pages indexed, the stronger the indication that some pages are simply not being listed. Related products and upsell items are another way to make your pages more unique as well.
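As a quick arithmetic sketch of that comparison, using the submitted/indexed counts quoted elsewhere in this thread (1000 submitted, 853 indexed):

```python
# Sketch of the GWT sitemap comparison described above, using the
# counts mentioned in this thread: 1000 submitted, 853 indexed.
submitted = 1000  # pages submitted via sitemap
indexed = 853     # pages Google reports as indexed

gap = submitted - indexed
non_indexed_rate = gap / submitted * 100
print(f"{gap} pages not indexed ({non_indexed_rate:.1f}%)")
# -> 147 pages not indexed (14.7%)
```

That works out to roughly the 15% non-indexed rate mentioned earlier.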
If you start not ranking for your branded search, that would be an indicator of a general de-rank. Something like that, though, is much more common for sites with spammy links.
Amazon and eBay get listed in Google because they are massively represented on the internet. One of Google's shorthand ways of describing ranking is placing sites in order of the likelihood that someone would encounter them just by clicking and surfing through the internet. Trying to break down how Amazon and eBay get indexed to the level of on-page elements doesn't really apply; those companies determine their design elements purely based on UX and sales.
-
Thank you so much, Ryan. Very helpful. But if I have products with similar descriptions, or some even with exactly the same descriptions, just the titles and images being different, will Google de-rank our overall website, or more likely simply avoid listing the duplicates?
As far as I understand, it's useful to add different Related Products and Upsell items to similar products just to make the HTML code more different and unique, even if the descriptions are the same... Of course, customer reviews would also help a bunch. Isn't this how Amazon products and eBay items with very short and not-that-unique descriptions still get listed in Google?
-
The page looks fine, but people run into NOINDEX, FOLLOW considerations when their pages aren't only similar to other pages on their domain, but also similar to other pages in the rest of the index. As their domain gains strength overall, they're able to see more of their similar product pages get indexed, especially if those pages grow in unique content over time as well: reviews, videos, photos of the product in use, etc.
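For reference, the NOINDEX, FOLLOW directive discussed in this thread is a robots meta tag in a page's head, i.e. a meta tag named "robots" with content "noindex, follow". A minimal sketch (with a made-up page snippet, not a page from the poster's site) of checking a page's HTML for that directive:

```python
# Minimal sketch: detect a robots meta tag in page HTML and read its
# directives. The sample markup below is hypothetical, for illustration.
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        # Attribute names are lowercased by HTMLParser; values are not.
        if tag == "meta" and attrs.get("name", "").lower() == "robots":
            content = attrs.get("content", "")
            self.directives += [d.strip().lower() for d in content.split(",")]

page = '<html><head><meta name="robots" content="noindex, follow"></head><body></body></html>'
parser = RobotsMetaParser()
parser.feed(page)
print(parser.directives)  # -> ['noindex', 'follow']
```

A page carrying this tag is dropped from the index, but its links still pass signals, which is why it comes up as a middle ground for thin product pages.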
Related Questions
-
Remove a page after redirection
Hi, I had a page, e.g. www.example.com/page1, and I 302-redirected it to www.example.com/page2. After that I fetched this page (page2) with GSC, and the page was indexed in the SERP. Can I remove the old redirected page (www.example.com/page1) now? Will this removal harm my page?
Technical SEO | Tormar0 -
Website SEO Product Pages - Condense Product Pages
We are managing a website that has seen consistently dropping rankings over the last 2 years (http://www.independence-bunting.com/). Our long-term strategy has been purely content-based and of high quality, but it isn't seeing the desired results. It is an ecommerce site with a lot of pages, most of which are category or product pages. Many of the product pages have duplicate or thin content, which we currently see as one of the primary reasons for the ranking drops.
The website has many individual products which have the same fabric and size options but different designs, so it is difficult to write valuable content that differs between several products with similar designs. Right now each design has its own product page. We have a dilemma, because our options are:
A. Combine similar designs of a product into one product page where the customer must choose a design, a fabric, and a size before checking out. This way we can have valuable content and don't have to duplicate that content on other pages, or try to find more to say about something there really isn't anything else to say about. However, this process will remove between 50% and 70% of the pages on the website. We know the number of indexed pages is important to search engines, and if they suddenly see that half of our pages are gone, we may cause more negative effects, despite the fact that we are aiming to provide more value to the user, not less.
B. Leave the product pages alone and try to write more valuable content for each product page, which will be difficult because there really isn't that much more to say, or more valuable ways to say it. This is the "safe" option, as it reduces our potential negative impact, but we won't necessarily see much positive trending either.
C. Test solution A on a small percentage of the product categories to see any impact over the next several months, before making sitewide updates to the product pages if we see positive impact, or reverting to the old way if we see negative impact.
Any sound advice would be of incredible value at this point, as the work we are doing isn't having the desired effects and we are seeing consistently dropping rankings. Any information would be greatly appreciated. Thank you,
Technical SEO | Ed-iOVA0 -
Should I deindex my pages?
I recently changed the URLs on a website to make them tidier and easier to follow. I put 301s in place to redirect all the previous page names to the new ones. However, I didn't read Moz's guide, which says I should leave the old sitemap online for a few weeks afterwards. As a result, Webmaster Tools is showing duplicate page titles (which means duplicate pages) for the old versions of the pages I have renamed. Since the old versions are no longer on the sitemap, Google can no longer access them to find the 301s I have put in place. Is this a problem that will fix itself over time, or is there a way to speed up the process? I could use Webmaster Tools to remove these old URLs, but I'm not sure if this is recommended. Alternatively, I could try to recreate the old sitemap, but this would take a lot of time.
Technical SEO | maxweb0 -
Why is this page not ranking but is indexed?
I have a page http://jobs.hays.co.uk/jobs-in-norfolk and it is indexed by Google but will not show up for any keywords I try. Any ideas?
Technical SEO | S_Curtis0 -
How to Delete a Page on the Web?
Google reports, and I have confirmed, that the following old page is still present on the Web. http://www.audiobooksonline.com/The_Great_American_Baseball_Box_Greatest_Moments_from_the_Last_80_Years_original_audio_collection_compact_discs.html This page hasn't been in our site's directory for some time and is no longer needed by us. What is the best way to fix this Google-reported crawl error?
Technical SEO | lbohen0 -
Page not Accessible for crawler in on-page report
Hi All, We started using SEOMoz this week and ran into an issue regarding crawler access in the on-page report module. The attached screenshot (SEOMoz.png) shows that the HTTP status is 200, but SEOMoz still says that the page is not accessible to crawlers. What could this be? Page in question: http://www.tiasnimbas.edu/Executive_MBA/pgeId=307 Regards, Coen
Technical SEO | TiasNimbas -
Consolidate page strength
Hi, Our site has a fair amount of related/similar content that has historically been placed on separate pages. Unfortunately, this spreads our page strength across multiple pages. We are looking to combine this content onto one page so that our page strength is focused in one location (optimized for search). The content is extensive, so placing it all on one page isn't ideal from a user-experience perspective (better to separate it out). We are looking into different approaches:
1. One main "tabbed" page with query-string params to separate the pages. We'll use an AJAX-driven design, but for non-JS browsers we'll gracefully degrade to separate pages with query-string params: www.xxx.com/content/?pg=1, www.xxx.com/content/?pg=2, www.xxx.com/content/?pg=3. We'd then rel canonical all three pages to just www.xxx.com/content/.
2. Same concept, but use an AJAX crawlable hash-bang (#!) design. Load everything onto one page, but the page could get quite large, so latency will increase.
I don't think there is much difference between options 1 & 2 from an SEO perspective; we'll mostly be relying on Google honoring the rel canonical tag. Have others dealt with this issue where you have lots of similar content? From a UX perspective you want to separate/classify it, but from an SEO perspective you want to consolidate? It really is very similar content, so using a rel canonical makes sense. What have others done? Thoughts?
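A minimal sketch of the canonical scheme in option 1, using the www.xxx.com placeholder URLs from the question: each query-string variant declares the same canonical target, so ranking signals consolidate on one page.

```python
# Sketch of the rel-canonical setup for option 1: every paginated
# variant points at the same canonical URL. URLs are the placeholder
# addresses from the question, not a real site.
canonical = "http://www.xxx.com/content/"
variants = [f"{canonical}?pg={n}" for n in (1, 2, 3)]

for url in variants:
    # Each variant's <head> would carry this tag.
    print(f'{url} -> <link rel="canonical" href="{canonical}" />')
```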
Technical SEO | NicB10 -
On Page 301 redirect for html pages
For PHP pages you've got:
<?php
Header( "HTTP/1.1 301 Moved Permanently" );
Header( "Location: http://www.example.com" );
?>
Is there anything for HTML pages? Or is placing this line:
redirect 301 /old/old.htm http://www.you.com/new.php
in the .htaccess the only way to properly 301 redirect HTML pages? Thanks!
Technical SEO | shupester