Robots.txt - blocking JavaScript and CSS, best practice for Magento
-
Hi Mozzers,
I'm looking for some feedback regarding best practices for setting up the robots.txt file in Magento.
I'm concerned we are blocking bots from crawling information that is essential for ranking.
My main concern is with blocking JavaScript and CSS: are you supposed to block them or not?
You can view our robots.txt file here
Thanks,
Blake
-
As Joost said, you should not block access to files that help in the reading/rendering of the page.
Looking at your Robots file, I would look at the following two exclusions. Do they block anything else that runs on a live page that Google should be seeing?
Disallow: /includes/
Disallow: /scripts/
-Andy
-
Best practice is not to block access to JS/CSS anymore, to allow Google to properly understand the website and determine mobile-friendliness.
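To sanity-check what a given set of rules actually blocks, Python's standard-library robots.txt parser can be run against them offline. A minimal sketch, assuming hypothetical Magento-style paths (example.com and the file names are placeholders, not Blake's actual site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical Magento-style robots.txt with the exclusions in question.
robots_txt = """\
User-agent: *
Disallow: /includes/
Disallow: /scripts/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A JS file under /includes/ would be blocked from Google's renderer...
print(rp.can_fetch("Googlebot", "http://example.com/includes/slider.js"))        # False
# ...while theme CSS outside the disallowed folders stays crawlable.
print(rp.can_fetch("Googlebot", "http://example.com/skin/frontend/styles.css"))  # True
```

Note the stdlib parser does plain prefix matching; Google's own matcher also supports wildcards, so the Fetch and Render / robots.txt Tester tools in Search Console remain the authoritative check.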
Related Questions
-
Robots blocked by pages webmasters tools
A mistake was made in the software. How can I solve the problem quickly? Help me.
Intermediate & Advanced SEO | mihoreis
-
What is the best SEO way for a shop
Hi there! A client wants to sell some products on their future website, but just a small range (most of the website will not be an online shop). The idea is to add a "shop" button in the menu to send customers to this shop. I would like your opinion about how I should structure this shop. What do you think is best for SEO: "www.website.com/shop" or "shop.website.com"? Thank you in advance for your answers!
Intermediate & Advanced SEO | EnjinFrance
-
Linking to other people's YouTube videos related to the "How to" articles on our website - are there best practices?
Hi All, We are currently writing some "How to" articles on our tool hire website, and as there are a lot of DIY-related YouTube videos out there, we thought it would be good to link to some of these at the bottom of our articles. From an SEO perspective, are there any dos and don'ts regarding how we should implement this? We are unable to produce our own videos, so linking to others would be our preferred option. Does anyone know if this would give an SEO ranking benefit, even though it's an outbound link to someone else's video? Thanks, Pete
Intermediate & Advanced SEO | PeteC12
-
JavaScript Issue? Google not indexing a microsite
We have a microsite that was created on our domain but is not linked to from anywhere except within some JavaScript elements on pages on our site; the link is in one jQuery slide panel. The microsite is not being indexed at all - when I do site:(microsite name) on Google, it doesn't return anything.
I think it's because the link only exists in a JavaScript element, but my client assures me that if I submit it to Google for crawling the problem will be solved. Maybe so, but my point is that if you just create a simple HTML link from at least one of our site pages, it will get indexed no problem. The microsite has been up for months and it's still not being indexed - another, newer microsite that's been up for a few weeks and has simple links to it from our pages is indexing fine. I have submitted the URL for crawling, but had to use the google.com/webmasters/tools/submit-url/ method as I don't have access to the top-level domain WMT account.
P.S. When we put the microsite URL into the SEOBook spider-test tool it returns lots of lovely information - but that just tells me the page is findable and does exist, right? That doesn't mean Google's necessarily going to index it, as I am surmising... Moz hasn't found it in the 5 months the microsite has been up and running. What's going on here?
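A quick illustration of the gap described above - the first link exists in the static HTML and is discovered without executing any scripts, while the second only exists after jQuery runs (the URLs and selector here are made up for illustration):

```html
<!-- Crawlable: a plain anchor in the HTML source is seen on the first crawl pass. -->
<a href="http://microsite.example.com/">Visit our microsite</a>

<!-- Fragile: this link only exists in the DOM after script execution,
     so a crawler that does not render JavaScript never discovers it. -->
<script>
  $('#slide-panel').append(
    '<a href="http://microsite.example.com/">Visit our microsite</a>'
  );
</script>
```

Google can render JavaScript, but rendering is deferred and less reliable than plain HTML links, which is consistent with the newer, plainly-linked microsite indexing fine while this one stalls.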
Intermediate & Advanced SEO | Jen_Floyd
-
Best .htaccess guides in your bookmark?
It would be helpful if you can share .htaccess guides you're currently using. Thanks in advance! 🙂
Intermediate & Advanced SEO | esiow2013
-
SEO Best Practice for a multi-language and multi-country website
Hello Moz Community,
I hope someone can help me identify the best action to take on an on-page optimization confusion I am currently having. The website I am trying to optimize is http://www.riafinancial.com/locations/us/home.aspx. There is an option to view a country-specific version of the page, or a language version (there are two drop-down menus at the top, one for country and one for language).
When viewing a country-specific version of the page, the URL changes depending on the country selected. Some country versions also update the content to the language of that country, but some remain English. For example, when viewing the France version of the page (http://www.riafinancial.com/locations/FR/home.aspx), the content is updated to the French version, but when viewing the China version (http://www.riafinancial.com/locations/CN/home.aspx), the content is in English. This is because we have not yet translated for all countries (this will eventually all be translated).
Now, when viewing by language, the URL does NOT change. For example, on http://www.riafinancial.com/locations/us/home.aspx you can choose French, German, Italian, Polish, etc. The content of the page will change based on the language chosen, but the URL (including page titles and meta descriptions) will not change.
My question is: how should I approach this for on-page optimization? Canonical? Hreflang? Any input, feedback, recommendation or suggestion will be greatly appreciated. Thanks!
Sharon
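One relevant constraint here: hreflang only works when each language/country variant has a distinct URL, so the language switcher described above (same URL, different content) cannot be annotated as-is. A sketch of what the annotations could look like once each variant is separately addressable - the two country URLs are taken from the question, but the x-default choice is an assumption:

```html
<!-- Each variant page lists all alternates, including itself. -->
<link rel="alternate" hreflang="en-us" href="http://www.riafinancial.com/locations/us/home.aspx" />
<link rel="alternate" hreflang="fr-fr" href="http://www.riafinancial.com/locations/FR/home.aspx" />
<!-- Fallback for users whose language/country matches no listed variant. -->
<link rel="alternate" hreflang="x-default" href="http://www.riafinancial.com/locations/us/home.aspx" />
```

Each page in the set must carry the full reciprocal set of annotations (including a self-referencing one) for search engines to honour them.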
Intermediate & Advanced SEO | RiaMT
-
About robots.txt to resolve duplicate content
I have trouble with duplicate content and titles. I have tried many ways to resolve them, but because of the web code I am still stuck, so I have decided to use robots.txt to block the duplicate content.
The first question: how do I use a command in robots.txt to block all URLs like these:
http://vietnamfoodtour.com/foodcourses/Cooking-School/
http://vietnamfoodtour.com/foodcourses/Cooking-Class/
.......
User-agent: *
Disallow: /foodcourses
(Is that right?)
And the parameter URLs:
http://vietnamfoodtour.com/?mod=vietnamfood&page=2
http://vietnamfoodtour.com/?mod=vietnamfood&page=3
http://vietnamfoodtour.com/?mod=vietnamfood&page=4
User-agent: *
Disallow: /?mod=vietnamfood
(Is that right? I have a folder containing the module; could I use: Disallow: /module/*?)
The second question is: which takes priority, robots.txt or the meta robots tag? What if I use robots.txt to block a URL, but on that URL my meta robots tag is "index, follow"?
Intermediate & Advanced SEO | magician
-
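For checking patterns like the ones asked about here: Google matches robots.txt rules as prefix matches, with * as a wildcard and $ as an end anchor. A small homemade sketch of that matching logic (my own helper, not a library API), run against the URLs in question:

```python
import re

def rule_matches(rule_path: str, url: str) -> bool:
    """Google-style robots.txt rule matching: a rule is a prefix match,
    '*' matches any run of characters, '$' anchors the end of the URL."""
    pattern = re.escape(rule_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url) is not None

# "Disallow: /foodcourses" blocks everything starting with that prefix.
print(rule_matches("/foodcourses", "/foodcourses/Cooking-School/"))   # True
# "Disallow: /?mod=vietnamfood" blocks the parameter URLs by prefix.
print(rule_matches("/?mod=vietnamfood", "/?mod=vietnamfood&page=2"))  # True
# A trailing "*" adds nothing: "/module/*" behaves like "/module/".
print(rule_matches("/module/*", "/module/vietnamfood/page-2"))        # True
```

On the second question: a crawler that is blocked by robots.txt never fetches the page, so it never sees the meta robots tag at all - robots.txt effectively takes priority for crawling, although a blocked URL can still appear in the index (without a snippet) if it is linked from elsewhere.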
Robots.txt file - How to block thousands of pages when you don't have a folder path
Hello.
Just wondering if anyone has come across this and can tell me if it worked or not.
Goal: To block review pages.
Challenge: The URLs aren't constructed using folders; they look like this:
www.website.com/default.aspx?z=review&PG1234
www.website.com/default.aspx?z=review&PG1235
www.website.com/default.aspx?z=review&PG1236
So the first part of the URL is the same (i.e. /default.aspx?z=review) and the unique part comes immediately after - so not as a folder. Looking at Google's recommendations, they show examples for ways to block 'folder directories' and 'individual pages' only.
Question: If I add the following to the robots.txt file, will it block all review pages?
User-agent: *
Disallow: /default.aspx?z=review
Much thanks,
Davinia
Intermediate & Advanced SEO | Unity
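The proposed rule should work: to robots.txt matchers the ? is not special, so /default.aspx?z=review is simply a literal prefix shared by every review URL. A quick check with Python's standard-library parser, using the www.website.com URLs from the question:

```python
from urllib.robotparser import RobotFileParser

# The exact rule proposed in the question.
robots_txt = """\
User-agent: *
Disallow: /default.aspx?z=review
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# Every review URL shares the literal prefix, so all are blocked...
print(rp.can_fetch("*", "http://www.website.com/default.aspx?z=review&PG1234"))  # False
# ...while the same page with no query string stays crawlable.
print(rp.can_fetch("*", "http://www.website.com/default.aspx"))                  # True
```

Google additionally supports wildcards in rules, and the robots.txt Tester in Search Console is the authoritative way to confirm behaviour against Googlebot itself.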