Campaign shows 5,000 warnings from a shared database feed; we made the pages nofollow and noindex. Are we OK now?
-
One of our campaigns shows 5,000 warnings for duplicate content, duplicate meta descriptions, and duplicate URLs.
These pages come from an XML database feed that is shared throughout the industry. We made the pages nofollow and noindex, but the Moz crawl still shows the warnings. There are no warnings in Webmaster Tools.
Should we ignore these warnings and consider ourselves OK, or is there more work to do?
-
I think best practice is to make them "noindex, follow": keep the pages out of the index, but let crawlers follow their links. You'll still get the warnings in Moz. They're OK to ignore once you have noindexed the pages.
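For reference, the "noindex, follow" directive is set with a standard robots meta tag in each page's head section (the tag below is the standard form; the page it would sit in is hypothetical):

```html
<!-- Keep this feed-generated page out of the index, but let crawlers
     follow its links so link equity still flows to the rest of the site. -->
<meta name="robots" content="noindex, follow">
```

The same directive can also be sent as an X-Robots-Tag HTTP response header, which is useful for non-HTML resources.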
Related Questions
-
An informational product page AND a shop page (for same brand)
Hi all, this is my first foray into e-commerce SEO. I'm working with a new client who sells upscale eBikes online. Since his products are expensive, he wants to have informational pages about the brands he sells, e.g. www.example.com/brand. However, these brands also have category pages in his online shop, e.g. www.example.com/shop/brand. I'm worried about keyword cannibalization and about adding an extra step/click to reach the shop (right now the navigation menu takes you to the information page, and from there you have to click through to the shop). I'm pretty sure it would make more sense to have ONE killer shopping page that includes all the brand information, but I want to be 100% sure before I advise him to take this big step. Thoughts?
Technical SEO | MouthyPR1
-
Home Pages of Several Websites are disappearing / reappearing in Google Index
Hi, I periodically use the Google site: command to confirm that our clients' websites are fully indexed. Over the past few months I have noticed a very strange phenomenon which is happening for a small subset of our clients' websites: basically, the home page keeps disappearing and reappearing in the Google index every few days. This is isolated to a few of our clients' websites, and I have also noticed that it is happening for some of our clients' competitors' websites (over which we have absolutely no control). In the past I have been led to believe that the absence of the home page in the index could imply a penalty of some sort. This does not seem to be the case, since these sites continue to rank the same in various Google searches regardless of whether or not the home page is listed in the index. Below are some examples of our clients' sites where the home page is currently not indexed, although they may be indexed by the time you read this and try it yourself. Note that most of our clients are in Canada.
My questions are:
1. Has anyone else experienced/noticed this?
2. Any thoughts on whether this could imply some sort of penalty, or could it just be a bug in Google?
3. Does Google offer a way to report stuff like this?
Note that we have been building websites for over 10 years, so we have long been aware of issues like www vs. non-www, canonicalization, and meta content="noindex" (been there, done that in 2005). I could be wrong, but I do not believe that the site would keep disappearing and reappearing if something like this were the issue. Please feel free to scrutinize the home pages to see if I have overlooked something obvious; I AM getting old.
site:dietrichlaw.ca - this site has continually ranked in the top 3 for [kitchener personal injury lawyers] for many years.
site:burntucker.com - since we took over this site last year it has moved up to page 1 for [ottawa personal injury lawyers]
site:bolandhowe.com - #1 for [aurora personal injury lawyers]
site:imranlaw.ca - continually ranked in the top 3 for [mississauga immigration lawyers]
site:canadaenergy.ca - ranks #3 for [ontario hydro plans]
Thanks in advance! Jim Donovan, President, www.wethinksolutions.com
Technical SEO | wethink0
-
Search pages showing up as soft 404 in WMT
Hi, we are getting a lot of "site search" pages showing up in WMT as soft 404s, and we want to know the best way to stop this. All search pages are already noindex, follow, but maybe we should block them in robots.txt as well. Would the below help to solve this?
User-agent: *
Disallow: /?s=
Disallow: /search/
Any other suggestions or direction to prevent these pages showing up as soft 404s would be appreciated. Thanks
Technical SEO | nomad-202323
-
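The Disallow rules proposed above can be sanity-checked locally with Python's standard-library robots.txt parser (a minimal sketch; example.com stands in for the poster's actual domain):

```python
# Minimal sketch: validate the proposed robots.txt rules with Python's
# standard-library parser. example.com is a placeholder domain.
import urllib.robotparser

rules = """\
User-agent: *
Disallow: /?s=
Disallow: /search/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Internal search pages should now be blocked for all crawlers...
print(rp.can_fetch("*", "https://example.com/search/widgets"))  # False
print(rp.can_fetch("*", "https://example.com/?s=widgets"))      # False
# ...while ordinary pages remain crawlable.
print(rp.can_fetch("*", "https://example.com/widgets"))         # True
```

One trade-off worth noting: a page blocked in robots.txt cannot be crawled at all, so crawlers will never see the noindex tag already on those search pages.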
Test site got indexed in Google - what's the best way of getting the pages removed from the SERPs?
Hi Mozzers, I'd like your feedback on the following: the test/development domain where our site builder works got indexed, despite all warnings and advice. The content on these pages is in active use by our new site. Thus, to prevent duplicate content penalties, we have put a noindex in our robots.txt. However, of course, the pages are currently still visible in the SERPs. What's the best way of dealing with this? I did not find related questions, although I think this is a mistake that is often made. Perhaps the answer will also be relevant for others besides me. Thank you in advance, greetings, Folko
Technical SEO | Yarden_Uitvaartorganisatie0
-
Moving Some Content From Page A to Page B
Page A has written content, pictures, and videos. The written content from Page A is being moved to Page B. When Google next crawls the pages, will Page B receive credit for the content? Will there be any issues because this content originally belonged to Page A? Page A is not a page I want to rank for (it just has great pictures and videos for users). Can I 301 redirect from Page A to B since the written content from A has been deleted, or is there no need? Again, I intend to keep Page A live because it provides good value for users to see the pictures and videos.
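For context, a 301 (permanent) redirect from one page to another is a one-line server rule; a minimal Apache .htaccess sketch with hypothetical paths (whether a redirect is appropriate here is exactly what the question is asking):

```apache
# Permanently redirect the old content page to the new one
Redirect 301 /page-a /page-b
```

Note that a redirected Page A would no longer be reachable by users, which conflicts with keeping it live for its pictures and videos.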
Technical SEO | khi50
-
Getting More Pages Indexed
We have a large e-commerce site (Magento-based) and have submitted sitemap files for several million pages within Webmaster Tools. The number of indexed pages seems to fluctuate, but currently fewer than 300,000 pages are indexed out of 4 million submitted. How can we get the number of indexed pages higher? Changing the crawl-rate settings and resubmitting sitemaps doesn't seem to have an effect on the number of pages indexed. Am I correct in assuming that most individual product pages just don't carry enough link juice yet to be considered important enough by Google to be indexed? Let me know if there are any suggestions or tips for getting more pages indexed.
Technical SEO | Mattchstick0
-
Importance of an optimized home page (index)
I'm helping a client redesign their website, and they want a home page that's primarily graphics and/or Flash (or jQuery). If they are able to optimize all of their key sub-pages, what is the harm in terms of SEO?
Technical SEO | EricVallee340
-
HTML and "noindex, follow"
I'm just learning about HTML, and I was wondering: can a "noindex, follow" meta tag be put into a dynamic HTML page?
Technical SEO | EricVallee340