Why would my product pages no longer be indexed in Google?
-
Our UK site has 72 pages in our sitemap. 30 of them are product pages which take a productid parameter. Prior to 1st Feb 2011, all pages were indexed in Google, but since then all of our product pages seem to have dropped from the index.
If I check in Webmaster Tools, I can see that we have submitted 72 pages and 42 are indexed.
I realise we should have better URL structuring and I'm working on that, but do you have any ideas on how we can get our product pages back into Google's index?
-
Hi Ian,
I see there are some 40 URLs with product in the URL from your site that are indexed in Google. Can you share with us how you were able to get those back in the index and any tips you may have learned from the experience? If you have a good writeup, do consider submitting to YOUmoz as well.
Thanks!
-
Yes it is. Going through this whole thing myself with a few sites.
-
Thanks for your comments. As well as rewriting descriptions, I have a couple of things in development that should help, e.g. customer reviews.
Just frustrating...
-
Thanks Alfredo,
The content was actually written by us. The other 6,000 links are duplicates of ours. We are the manufacturer of the product and we wrote the product descriptions. I find it ironic that the original author is penalised and dropped from the index while the 6,000 pages containing duplicate content manage to stay in it.
I can update the content, but it's very frustrating.
-
It may take some time, but you'd better start making a move on your content. Think about what else you can bring into your product descriptions that other sites don't have (apart from rewriting the descriptions). Providing additional data/value could be the easier way out, especially if you already have data floating around or can collect it from your current traffic through surveys etc.
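If you do collect reviews or other extra data, marking it up can also help your pages stand out in the results. A minimal sketch of aggregate review markup, where the product name, rating, and review count are all hypothetical placeholders:

```html
<!-- Hypothetical example: product page with aggregate review data.
     The name, description, and rating values are placeholders. -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Example Widget",
  "description": "Original description written by the manufacturer.",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": 38
  }
}
</script>
```

The point is less the exact format than having data on your pages that the 6,000 copies don't have.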
-
Alfredo is correct.
I assume you are selling through some sort of franchise and your product content is supplied to you.
You need to create original content for them all. Also, sitemaps should not contain every page; they should only contain your most important pages. See the Whiteboard Friday with Duane Forrester from Bing, still on the SEOmoz front page.
If making original content for all pages is too hard, create some summary pages for each category and get those to rank.
While your product pages used to be in the index, they probably did not rank well as they were duplicates, but they would have been useful for link sculpting; only pages in the index can be used for link sculpting.
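On the sitemap point: a trimmed sitemap that lists only the most important pages might look something like this (the domain and paths are placeholders, not the poster's actual URLs):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical trimmed sitemap: homepage and category/summary pages only,
     rather than every parameterised product URL -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.co.uk/</loc>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.co.uk/category/widgets</loc>
    <priority>0.8</priority>
  </url>
</urlset>
```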