Removing CSS & JS Files from Index
-
Hi,
Google has indexed a few .CSS and .JS files that belong to our WordPress plugins and themes. I had them blocked via robots, but realized this doesn't prevent indexation (and can likely hurt us since Google wants to access these files).
I've since removed the robots.txt instructions and submitted a removal request via Search Console, but I want to make sure they don't come back.
Is there a way to put a noindex tag within .CSS and .JS files? Or should I do something with .htaccess instead?
-
I figured .htaccess would be the best route. Thank you for researching and confirming. I appreciate it.
-
Hi Tim,
Assigning a noindex tag to these files will not block them, only prevent them from showing in SERPs. This is the intended goal, and the reason I deleted my robots.txt file, which prevented crawling.
-
There's quite a big difference between crawling directives, which block access, and indexing directives. This article by (former?) Moz user S_ebastian_ is a good foundational read.
This article at developers.google.com is a good second read. If I'm understanding it right, Google thinks in terms of crawling directives vs indexing / serving directives.
My attempt at a tl;dr:
crawling = looking, using in any way :: controlled via robots.txt
indexing / serving = indexing, archiving, displaying snippets in results, etc :: controlled via html meta tags or web server htaccess (or similar for other web servers).
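To make the crawling side of that split concrete, here is a minimal sketch using Python's standard `urllib.robotparser`, with a made-up robots.txt and example.com URLs. It shows that a Disallow rule only answers "may a compliant bot fetch this URL?"; it carries no indexing directive at all:

```python
# Sketch: robots.txt governs fetching (crawling), not indexing.
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that blocks a theme/plugin asset directory.
robots_txt = """\
User-agent: *
Disallow: /wp-content/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

css_allowed = rp.can_fetch("*", "https://example.com/wp-content/style.css")
page_allowed = rp.can_fetch("*", "https://example.com/about/")
print(css_allowed)   # False -- crawling blocked, yet the URL can still end up indexed
print(page_allowed)  # True
```

Note the asymmetry this illustrates: a blocked URL can still appear in SERPs (Google just can't fetch it), which is exactly the situation described in the original question.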
I'm not convinced yet that asking for noindex via .htaccess causes the same sort of grief that Disallow in robots.txt causes.
-
I would seriously think again when it comes to blocking/no-indexing your CSS and JS files - Google has in the past stated that if they cannot fully render your site properly then this could lead to poorer rankings.
You will also likely see these flagged as errors in Search Console.
Check out this great article from July this year which goes into more details.
-
I haven't encountered undesirable .css or .js indexing myself (yet), but as you surmised, this .htaccess directive might be worth trying:
<FilesMatch "\.(txt|log|xml|css|js)$">
Header set X-Robots-Tag "noindex"
</FilesMatch>
Google seems to support it
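As a sanity check of what that header looks like on the wire, here is a small self-contained sketch: a throwaway local HTTP server standing in for Apache (the filename and handler are made up for illustration) that attaches `X-Robots-Tag: noindex` to a stylesheet response, which is effectively what the FilesMatch rule does:

```python
# Sketch: a local server (standing in for Apache) that sets the
# X-Robots-Tag header on a CSS response, as the FilesMatch rule would.
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class NoindexHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Type", "text/css")
        # Equivalent of: Header set X-Robots-Tag "noindex"
        self.send_header("X-Robots-Tag", "noindex")
        self.end_headers()
        self.wfile.write(b"body { color: #000; }")

    def log_message(self, *args):
        pass  # keep the demo quiet

server = HTTPServer(("127.0.0.1", 0), NoindexHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/style.css"
with urlopen(url) as resp:
    header_value = resp.headers.get("X-Robots-Tag")
print(header_value)  # noindex

server.shutdown()
```

On a live site you could do the same check with `curl -I https://your-site/path/style.css` and look for the `X-Robots-Tag` line in the response headers.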
-
Unless I'm severely misreading the links provided, which I've read before, it seems Google is stating that they read, render, and sometimes index .CSS and .JS files. Here's an article written a week after the second article you posted.
The aforementioned WordPress plugin and theme files hosted on my server are indeed showing up in Google SERPs.
I do not want to prevent Googlebot from reaching these files as they're needed for optimal site performance, but I do want them to be no-indexed. Thus, I don't want to block crawling via robots.txt; I only want to prevent indexing.
Let me know if I'm misunderstanding.
-
TL;DR - You're worried about a problem that doesn't exist.
Googlebot doesn't index CSS or JS files. It indexes text files, HTML, PDF, DOC, XLS, etc., but it doesn't index style sheets or JavaScript files.
All you need in WordPress is to create a robots.txt file in the directory where WP is installed, with this content:
User-agent: *
Disallow:
Sitemap: http://site/sitemap-file-name.xml
And that's all. This has been explained many times:
http://googlewebmastercentral.blogspot.bg/2014/05/understanding-web-pages-better.html
http://googlewebmastercentral.blogspot.bg/2014/10/updating-our-technical-webmaster.html