Q Parameters
-
I'm having several site issues and I want to see if the q parameter in the URL is the cause.
Both of these URLs get indexed, and any capitalization combination brings up another indexed page:
http://www.website.com/index.php?q=contact-us
and
http://www.website.com/index.php?q=cOntact-us
The other issue is Google crawl errors. The website has been receiving an increasing number of spam crawl errors. I've read that this is a common issue and is most likely a Googlebot problem. Would removing the q parameter fix this entirely?
Here is an example:
http://www.website.com/index.php?q=uk-cheap-chloe-bay-bag-wholesale-shoes
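To make the problem (and one possible fix) concrete, here is a minimal sketch, not the site's actual code, of how index.php could handle both symptoms while q is still in use: 301-redirect capitalization variants to a single lowercase version, and return a real 404 for q values that don't match an actual page. The $validPages whitelist and the slugs in it are hypothetical.

<?php
// index.php - hypothetical sketch, not the actual site code.
$validPages = ['home', 'contact-us', 'about-us'];   // hypothetical whitelist of real page slugs

$q = isset($_GET['q']) ? $_GET['q'] : 'home';

// Collapse capitalization variants: ?q=cOntact-us gets a 301 to ?q=contact-us
if ($q !== strtolower($q)) {
    header('Location: /index.php?q=' . urlencode(strtolower($q)), true, 301);
    exit;
}

// Unknown q values (e.g. spammy "cheap-...-wholesale-shoes" URLs) get a real 404,
// so they stop resolving as normal pages instead of piling up as crawl errors.
if (!in_array($q, $validPages, true)) {
    http_response_code(404);
    exit('Page not found');
}

// ...otherwise render the requested page as usual.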
-
Thanks Ryan. I'm going to remove the parameters. I'm glad these issues are related.
-
Using a parameter to determine which page to bring up is problematic.
Search engines are getting better at crawling these types of URLs, but why leave anything to chance? And from a human perspective, a traditional static URL is light-years better.
Which is easier for you to tell a friend about?
coolsite.com/neat-article or coolsite.com/index.php?q=neat-article
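If index.php has to remain the single entry point, the clean URLs above can still be served by it behind the scenes. A minimal front-controller sketch, assuming the web server is already set up to send requests for non-existent paths to index.php; the $pages map and template file names are hypothetical:

<?php
// index.php - hypothetical front-controller sketch, not the actual site code.
// Assumes the web server routes requests for non-existent files to this script.
$pages = [
    ''             => 'templates/home.php',         // coolsite.com/
    'neat-article' => 'templates/neat-article.php', // coolsite.com/neat-article
    'contact-us'   => 'templates/contact-us.php',   // coolsite.com/contact-us
];

$slug = strtolower(trim(parse_url($_SERVER['REQUEST_URI'], PHP_URL_PATH), '/'));

if (isset($pages[$slug])) {
    include $pages[$slug];       // the visitor only ever sees the static-looking URL
} else {
    http_response_code(404);     // anything else is a genuine 404
    include 'templates/404.php';
}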
Related Questions
-
Does Google add parameters to the URL Parameters tool in Webmaster Tools?
I am seeing new parameters added to (and sometimes removed from) the URL Parameters tool. Is there anything that would add parameters to the tool, or does it have to be someone internal? FYI - they always have no date in the Configured column, no effect set, and crawl set to "Let Google decide".
Technical SEO | merch_zzounds
-
How can I best handle parameters?
Thank you for your help in advance! I've read a ton of posts on this forum on this subject, and while they've been super helpful, I still don't feel entirely confident in what the right approach is. Forgive my very obvious noob questions - I'm still learning!
The problem: I am launching a site (coursereport.com) which will feature a directory of schools. The URL for the schools directory will be coursereport.com/schools, and the directory can be filtered by a handful of fields:
Focus (ex: "Data Science")
Cost (ex: "$<5000")
City (ex: "Chicago")
State/Province (ex: "Illinois")
Country (ex: "Canada")
When a filter is applied to the directory page, the CMS produces a new page with URLs like these:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
coursereport.com/schools?cost=$>5000&city=buffalo&state=newyork
My questions:
1) Is the above parameter-based approach appropriate? I've seen other directory sites take a different approach (below) that would transform my examples into more "normal" URLs:
coursereport.com/schools?focus=datascience&cost=$<5000&city=chicago
VERSUS
coursereport.com/schools/focus/datascience/cost/$<5000/city/chicago (no params at all)
2) Assuming I use either approach above, isn't it likely that I will have duplicate content issues? Each filter does change on-page content, but there could be instances where two different URLs with different filters applied produce identical content (ex: focus=datascience&city=chicago OR focus=datascience&state=illinois). Do I need to specify a canonical URL to solve for that case? I understand at a high level how rel=canonical works, but I am having a hard time wrapping my head around which versions of the filtered results ought to be specified as the preferred versions. For example, would I just take all of the /schools?focus=X combinations and call that the canonical version within any filtered page that contained other additional parameters like cost or city?
Should I be changing page titles for the unique filtered URLs?
I read through a few Google resources to try to better understand how to best configure URL params via Webmaster Tools. Is my best bet just to follow the advice in the article below, define the rules for each parameter there, and not worry about using rel=canonical?
https://support.google.com/webmasters/answer/1235687
An assortment of the other stuff I've read, for reference:
http://www.wordtracker.com/academy/seo-clean-urls
http://www.practicalecommerce.com/articles/3857-SEO-When-Product-Facets-and-Filters-Fail
http://www.searchenginejournal.com/five-steps-to-seo-friendly-site-url-structure/59813/
http://googlewebmastercentral.blogspot.com/2011/07/improved-handling-of-urls-with.html
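For what it's worth, here is a minimal sketch of the rel=canonical pattern this question is circling: a filtered page whose canonical keeps only one facet (focus) and drops the rest. The choice of focus as the primary facet, the host, and the template code are all assumptions for illustration, not coursereport.com's actual implementation.

<?php
// Hypothetical sketch: canonical tag for a filtered /schools page.
// Keeps only the "focus" facet and drops cost/city/state/country.
// Adjust scheme/host to whatever the site's preferred canonical host is.
$base = 'https://www.coursereport.com/schools';
$canonical = isset($_GET['focus'])
    ? $base . '?focus=' . urlencode($_GET['focus'])
    : $base;
?>
<link rel="canonical" href="<?php echo htmlspecialchars($canonical, ENT_QUOTES); ?>">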
Technical SEO | alovallo
-
50,000 pages or a page with parameters
I have a site with about 12k pages on a topic... each of these pages could use several more pages to go into deeper detail. So, for SEO purposes, I am wondering whether it would be better to have something like 50,000 new pages, one for each subtopic, or to have one page that I pass parameters to, with the content built on the fly in the code-behind. The drawback to the one-page-with-parameters option is that the URL would not be static, though the effort to implement it would be minimal; I am also not sure how Google would index a single page with parameters. The drawback to the 50k-pages model is the dev effort, and possibly committing some faux pas by unleashing so many links to my internal pages. I might also have to mix ASPX with HTML because my project can't be that large. Has anyone here ever had this sort of choice to make? Is there a third way I am not considering?
Technical SEO | Banknotes
-
Why is Google Webmaster Tools ignoring my URL parameter settings?
I have set up several URL parameters in Webmaster Tools that do things like select a specific product's colour or size. I have set each parameter in Google to "Narrows" the page and selected "Crawl no URLs", but in the duplicate content section each of these is still shown as two pages with the same content. Is this just normal, i.e. is it showing me that they are the same anyway, or is Google deliberately ignoring my settings (which I assume it does when it is sure it knows better or thinks I have made a mistake)?
Technical SEO | mark_baird
-
Language parameter
Hi there, I have a quick technical question regarding on-site language. Is it possible to use a parameter to define the language of the site? My site is www.vallnord.com and its default language is Catalan. I would like the site to display in the language of the user, perhaps based on the browser. Thanks, G.
Technical SEO | SilbertAd
-
Dynamic Parameters in URL
I have received lots of warnings because of long URLs. Most of them are because my website has many attributes to filter products, and each time the user clicks on one, it is added to the URL. Please see my site here: www.theprinterdepo.com
The warning is: Although search engines can crawl dynamic URLs, search engine representatives have warned against using over 2 parameters in any given URL.
The question to the community is: what should I do? These attributes really help the user find products more easily. I could remove some of the attributes, but I am not sure if my ecommerce solution (Magento) allows changing this behavior so that it does not use query-string parameters.
Technical SEO | levalencia1
-
Will a "blog=example "parameter at the end of my URLs affect google's crawling them?
For example, I'm wondering whether www.example.com/blog/blog-post is better than www.example.com/blog/blog-post?blog=example. I'm currently using the www.example.com/blog/blog-post?blog=example structure as our canonical page for content. I'm also wondering, if the parameter doesn't affect crawling, whether it would hurt rankings in any way. Thanks!
Technical SEO | Intridea
-
Canonical for stupid _GET parameters or not? [deep technical details]
Hi, I'm currently working on www.kupwakacje.pl, which is something like a travel agency. People can search for holidays and buy/reserve them. I know about plenty of problems on my website, and thanks to SEOmoz I will hopefully be able to fix them, but one is crucial and kind of hard to fix, I think. The search engine is provided by an external party in the form of a simple API which, in the end, responds with formatted HTML - which is completely stupid and pointless, but that's not the main problem. Let's dive in:
So, for example, the visitor goes to the homepage, selects Egypt and hits the search button. He will be redirected to www.kupwakacje.pl/wczasy-egipt/?ep3[]=%3Fsp%3D3%26a%3D2%26kt%3D%26sd%3D10.06.2011%26drt%3D30%26drf%3D0%26px and this is not a joke 😉 'wczasy-egipt' is my invention, obviously, and it means 'holidays-egypt'. I've tried to have at least 'something' in the URL that makes Google see it is indeed related to Egypt. The rest, the complicated ep3[] thing, is a bunch of encoded parameters. This thing renders, in the first step, a list of hotels; in the next, a hotel-specific offer; and in the next, the reservation page.
The problem is that all the links generated by this so-called API only change sub-parameters inside the ep3[] parameter, so for example clicking on a single hotel changes the URL to: www.kupwakacje.pl/wczasy-egipt/?url=wczasy-egipt/&ep3[]=%3Fsid%3Db5onrj4hdnspb5eku4s2iqm1g3lomq91%26lang%3Dpl%26drt%3D30%26sd%3D10.06.2011%26ed%3D30.12.1999%26px%3D99999%26dsr%3D11%253A%26ds%3D11%253A%26sp%3D which obviously does not look very different from the first one.
What I would like to know is: shall I make all pages starting with 'wczasy-egipt' rel-canonical to the first one (www.kupwakacje.pl/wczasy-egipt/), or shouldn't I? Google recognizes the page according to Webmaster Central, and recognizes the URL, but reports mass duplicate content. What about positioning my website for the hotel names - i.e. long-tail optimization? I know it's a long and complicated post; thanks for reading, and I would be very happy with any tip or response.
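As a purely illustrative aside: if the "canonical everything to the base listing" option were taken, each /wczasy-egipt/?ep3[]=... variant would carry something like the tag below in its head. This only shows what that option looks like, not a recommendation, and the markup is hypothetical rather than taken from kupwakacje.pl.

<?php // Hypothetical: emitted in the <head> of every filtered/encoded variant. ?>
<link rel="canonical" href="http://www.kupwakacje.pl/wczasy-egipt/">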
Technical SEO | macic