From what I can gather about Magento, the Layered Nav can create seemingly endless URLs. Even if you were to use one of the modules created to make them 'friendly', you would still technically have reams of duplicate pages... right? All nicely rewritten, but effectively with the same titles and meta...
You may be able to put a wildcard disallow in the robots.txt file for the parameter 'dir=', which is associated with all the filters. I don't know how well this will work, or whether Google may on occasion ignore it or find a way into the layered pages anyway? Does anyone know? And what if the spider entered the site through a direct link to a filtered page... would the robots.txt file go by the wayside in that instance?
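For what it's worth, the wildcard disallow I have in mind would look something like this (just a sketch; it assumes 'dir=' genuinely appears in every filtered URL on your store, which is worth checking, and relies on Googlebot supporting the * wildcard in robots.txt, which it does):

    User-agent: *
    # Block any URL containing the 'dir=' query parameter,
    # whether it is the first parameter or a later one
    Disallow: /*?dir=
    Disallow: /*&dir=

Bear in mind robots.txt only stops crawling, not necessarily indexing, so this is far from a guaranteed fix.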
You could, in theory, also use WMT's URL Parameters tool to tell Google not to index pages with the 'dir=' parameter. Again, I am not sure of the success rate with this.
It's one of those areas with many open and unanswered discussions, but nothing definitive anywhere to address the issue. Yet Magento is very popular, and when you look at the sites of people who use it, you can see they have somehow found a way to sort this out. I'd love to be a fly on the wall in their office!