On a large ecommerce site, the main navigation links to URLs that include a legacy parameter. The parameter doesn't actually change page content in any way: it doesn't narrow or specify content, nor does it currently track sessions. We've set the canonical for these URLs to the version without the parameter. (We did this when we started seeing that Google was stripping out the parameter in the majority of SERP results itself.)
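For context, here's roughly what our canonical setup does: a minimal sketch in Python, assuming the legacy parameter is called sessid (a hypothetical name; ours differs) and using example.com stand-in URLs. It strips the parameter from a requested URL and points rel=canonical at the result.

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

LEGACY_PARAM = "sessid"  # hypothetical name; the real parameter differs

def canonical_url(url: str) -> str:
    """Return the URL with the legacy parameter stripped out."""
    parts = urlsplit(url)
    # Keep every query pair except the legacy parameter.
    kept = [(k, v) for k, v in parse_qsl(parts.query) if k != LEGACY_PARAM]
    return urlunsplit(parts._replace(query=urlencode(kept)))

def canonical_tag(url: str) -> str:
    """The <link rel="canonical"> tag emitted in the page <head>."""
    return '<link rel="canonical" href="%s">' % canonical_url(url)

print(canonical_tag("https://example.com/shoes?sessid=abc123&color=red"))
# -> <link rel="canonical" href="https://example.com/shoes?color=red">
```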
We're now trying to work out the best way to configure this parameter in WMT (Search Console).
Our options are:
1. ‘No: Doesn't affect page content’, in which case the Crawl field in WMT is auto-set to ‘Representative URL’.
(Note that it's unclear how ‘Representative URL’ is defined. Google's documentation suggests that a representative URL is a canonical URL, and we've specifically set our canonicals to the parameter-free version, so does this contradict that setting? See the sketch after this list for how we understand the two crawl behaviors.)
OR
2. ‘Yes: Changes, reorders, or narrows page content’
And then it’s a question of how to instruct Googlebot to crawl these pages:
'Let Googlebot decide' OR 'No URLs'.
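To make the difference concrete, here is a toy model of how we understand the two crawl settings would treat a set of discovered URLs. This is my own simplification of Google's documented behavior, not Google's actual logic, and it again assumes the hypothetical sessid parameter: ‘Representative URL’ collapses URLs that differ only by the parameter and crawls one per group, while ‘No URLs’ skips any URL carrying the parameter outright.

```python
from urllib.parse import urlsplit, parse_qsl

LEGACY_PARAM = "sessid"  # hypothetical parameter name

discovered = [
    "https://example.com/shoes?sessid=a1",
    "https://example.com/shoes?sessid=b2",
    "https://example.com/shoes",
]

def group_key(url):
    """The URL with the legacy parameter ignored."""
    parts = urlsplit(url)
    kept = tuple(sorted((k, v) for k, v in parse_qsl(parts.query)
                        if k != LEGACY_PARAM))
    return (parts.scheme, parts.netloc, parts.path, kept)

# 'Representative URL': URLs differing only by the parameter are
# collapsed into one group, and one URL per group gets crawled.
representatives = {group_key(u): u for u in discovered}
print(len(representatives))   # 1 -> one crawl covers all three URLs

# 'No URLs': any URL carrying the parameter is skipped entirely.
crawlable = [u for u in discovered
             if LEGACY_PARAM not in dict(parse_qsl(urlsplit(u).query))]
print(crawlable)              # ['https://example.com/shoes']
```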
The fundamental issue is whether the parameter settings are an index signal or a crawl signal. Google documents them as crawl signals, but if we instruct Google not to crawl the URLs in our navigation, how will it find those pages and pass equity to the canonical URLs?
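A crude illustration of that worry (a deliberately simplified model, not how Googlebot actually works, again with the hypothetical sessid parameter): rel=canonical is an on-page signal, so if the parameterized URLs our nav links to are never fetched, those links are effectively dead ends for discovery and equity flow.

```python
# Toy crawl model: every page's nav links to the parameterized URLs.
nav_links = [
    "https://example.com/shoes?sessid=nav",
    "https://example.com/shirts?sessid=nav",
]

def followable(links, crawl_param_urls):
    """Links the crawler will actually follow under each setting."""
    return [u for u in links if crawl_param_urls or "sessid=" not in u]

print(followable(nav_links, crawl_param_urls=True))   # both nav links followed
print(followable(nav_links, crawl_param_urls=False))  # [] -> nav is a dead end
```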
Thoughts?
Posted by Susan Schwartz, Kahena Digital staff member