Robots.txt question
-
I noticed something weird in Google's robots.txt Tester.
I have this line in my robots.txt:
Disallow: display=
But whatever URL I give the tool to test, it says the URL is blocked and highlights this line.
For example, this line is meant to block pages like
http://www.abc.com/lamps/floorlamps?display=table
but if I test
http://www.abc.com/lamps/floorlamps (or any other page)
it shows as blocked due to Disallow: display=
Am I doing something wrong, or is Google just acting strangely? I don't think pages without display= are actually being blocked.
-
Yes, there is a bug in your robots.txt. A Disallow path must begin with a forward slash, so Disallow: display= is not a valid rule, which is why the tester is behaving oddly. To block any URL whose query string starts with display=, write:
Disallow: /*?display=
or, to also match it when it is not the first parameter:
Disallow: /*display=
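If you want to sanity-check a pattern offline before trusting the tester, here is a minimal sketch of Google-style Disallow matching, where `*` matches any run of characters and a trailing `$` anchors the end of the URL (the pattern `/*?display=` used below is one possible rule, not necessarily the one you end up with):

```python
import re
from urllib.parse import urlsplit

def rule_to_regex(pattern: str) -> re.Pattern:
    # Google-style matching: '*' is a wildcard, a trailing '$' anchors the end.
    anchored = pattern.endswith("$")
    body = pattern[:-1] if anchored else pattern
    regex = "".join(".*" if ch == "*" else re.escape(ch) for ch in body)
    return re.compile("^" + regex + ("$" if anchored else ""))

def is_blocked(url: str, disallow: str) -> bool:
    # Match the rule against the path plus query string, as crawlers do.
    parts = urlsplit(url)
    path = parts.path or "/"
    if parts.query:
        path += "?" + parts.query
    return bool(rule_to_regex(disallow).match(path))

print(is_blocked("http://www.abc.com/lamps/floorlamps?display=table", "/*?display="))  # True
print(is_blocked("http://www.abc.com/lamps/floorlamps", "/*?display="))                # False
```

This mirrors the documented Googlebot behavior for wildcards, but always confirm the final rule in the robots.txt Tester itself.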