Eliminate render-blocking JavaScript and CSS recommendation?
-
Our site's last red-flag issue is the "eliminate render-blocking JavaScript and CSS" message. I don't know how to do that, and while I might be able to make progress by spending hours or days cutting, pasting, and guessing, I'd rather not. Does anyone know of a plugin that will just do this? Or, if not, how much would it cost to get a web developer to do it?
Also, if there is no plugin (and it didn't look like there was when I checked), how long do you think this would take someone who knows what they're doing to complete?
The site is: www.kempruge.com
Thanks for any tips and/or suggestions,
Ruben
-
Yes, it's over a month, and for most of that month our page speed score averaged 66 alongside the 3-second load time. Now, with the adjustments I've made and the switch to a new hosting company, we're at an 81 as of this morning. So I guess if 3 seconds at a 66 isn't terrible, we should be in an acceptable range following the improvements.
Either way, thanks so much for the industry stats and the article. It's easy to find "how to make your website faster" info, but MUCH more difficult to find an article that I can trust. Thanks for the tip!
Ruben
-
Hi Ruben,
That analytics data covers a month or so, right? Just to make sure we're not talking about an unusually fast or slow day!
3 seconds is not too bad; it can depend a lot on the type of site you have and the industry. Check this for a recent rundown of stats by country/industry. Also check out this article for a good set of tactics for reducing load times.
I would start with some of the easier fixes in the above article (if you haven't already) before you try to adjust the script-rendering issues, especially if you don't have an in-house person who is comfortable doing it. If you have already done all of that, then it really comes down to weighing the effort of finding someone to diagnose and make the needed changes to the site code against how much load/rendering time it will shave off. Personally, I think it might not be worth it, but others may disagree.
-
Thanks Lynn! Yes, they are from Google PageSpeed Insights. Attached are our page speed times from GA. Unfortunately, I'm not sure whether they're okay or not; I just don't know enough, other than that faster is usually better.
Your thoughts?
Thanks,
Ruben
-
Hi,
Are you getting this flag from Google PageSpeed Insights? Render-blocking scripts are basically scripts that are called at the beginning of the page (usually in the head) but are not really used for that page, or for the content of that page that is immediately visible, so downloading them first delays the rendering of the page. Depending on the structure of your site/code, the plugins used, etc., fixing this could be as simple as moving a couple of lines of code in the template, or... quite complicated indeed.
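To make that concrete, here is a minimal sketch (script names are hypothetical) of the simple end of that spectrum:

```html
<!-- Render-blocking: the browser pauses parsing to fetch and run this -->
<head>
  <script src="/js/plugin.js"></script>
</head>

<!-- Non-blocking: defer downloads in parallel and runs after the HTML is
     parsed; async runs as soon as the file arrives, in no set order -->
<head>
  <script src="/js/plugin.js" defer></script>
  <script src="/js/analytics.js" async></script>
</head>
```

Moving the script tag to just before the closing body tag achieves much the same effect in templates where you can't add those attributes.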
What are your page load times in Google Analytics looking like? I had a look at your page and it seemed to load pretty fast, so I would check load times in GA and see whether the problem is really as urgent as you think. The PageSpeed Insights tool will flag everything it sees, but sometimes it gives you false positives, and other times it just mechanically recommends things that are not a huge issue in the grand scheme of things!
Related Questions
-
Block session id URLs with robots.txt
Hi, I would like to block all URLs with the parameter '?filter=' from being crawled by including them in robots.txt. Which directive should I use:
User-agent: *
Disallow: ?filter=
or
User-agent: *
Disallow: /?filter=
In other words, is the forward slash at the beginning of the disallow directive necessary? Thanks!
Intermediate & Advanced SEO | Mat_C
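For what it's worth, a hedged sketch of how Google's documented matching rules treat these patterns; rules are prefix matches from the start of the URL path, so the leading-slash form is the safe one, and a wildcard is needed to catch the parameter on deeper paths:

```
User-agent: *
# Prefix match from the start of the path: only blocks URLs that
# begin with "/?filter=", e.g. example.com/?filter=red
Disallow: /?filter=

# Wildcard form: blocks "?filter=" after any path,
# e.g. example.com/shoes?filter=red
Disallow: /*?filter=
```
-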
SEO Implications of firewalls that block "foreign connections"
Hello! A client's IT security team has firewalls on the site with GEO blocking enabled. This is to prevent foreign connections to applications, as part of contractual agreements with their own clients. Does anyone have any experience with workarounds for this? Thank you!
Intermediate & Advanced SEO | SimpleSearch
-
Is link equity / link juice lost to a blocked URL in the same way that it is lost to a nofollow link?
Hi, if there is a link on a page that goes to a URL that is blocked in robots.txt, is the link juice lost in the same way as when you add nofollow to a link on a page? Any help would be most appreciated.
Intermediate & Advanced SEO | Andrew-SEO
-
Does Google credit links from iframes or links created by JavaScript, and if so, is one more powerful than the other?
Consider this example, because I want to be clear about what I mean. You have two websites; let's call them www.a.com and www.b.com. On www.a.com/some/page, there is an iframe something like this:
<iframe src="www.b.com/some/special/path"></iframe>
The content of this iframe is a bunch of pictures, text, and numbers, as well as a group of links, linking each picture to www.b.com. For example, the links might be:
www.b.com/content/1
www.b.com/content/2
www.b.com/content/3
Questions:
1) When Google crawls www.a.com/some/page, does it pass link juice to www.b.com/content/*?
2) Does Google instead consider these to be internal links within b.com itself, because the links to www.b.com/content/* actually come from b.com, since the domain of the iframe is www.b.com/some/special/path?
3) Is any amount of link juice passed from www.a.com/some/page to www.b.com/some/special/path, because this is the src= element of an iframe that a.com is hosting?
Now consider an alternative setup, where instead of using an iframe, the contents of the iframe described above are added to the page dynamically using JavaScript and a call to an API endpoint at b.com, resulting in the links being added directly to the body of a.com without being wrapped in an iframe element (see the sketch below).
Questions:
4) Do these links that were created after page load still get crawled and credited by Google? (I have heard in the past that Google was going to start crawling JavaScript; I just don't know if this is known for a fact yet.)
5) Do links created on the client side hold the same weight as links served directly via back-end HTML generation? If both the links within the iframe and the links within the JavaScript embed method pass link juice, is one preferred over the other? Is one known to be more effective than the other? Thanks!
Intermediate & Advanced SEO | adriandg
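A minimal sketch of the client-side variant described in questions 4 and 5, with a hypothetical endpoint and response shape:

```html
<!-- On www.a.com/some/page: the links are added after load,
     not served in the initial HTML -->
<div id="b-content"></div>
<script>
  // Hypothetical API; assumes b.com allows cross-origin requests and
  // returns JSON like [{"url": "https://www.b.com/content/1", "title": "..."}]
  fetch('https://www.b.com/api/content')
    .then(function (res) { return res.json(); })
    .then(function (items) {
      var container = document.getElementById('b-content');
      items.forEach(function (item) {
        var link = document.createElement('a');  // each link lands directly in
        link.href = item.url;                    // a.com's DOM, with no iframe
        link.textContent = item.title;           // boundary around it
        container.appendChild(link);
      });
    });
</script>
```
-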
Block Googlebot from submit button
Hi, I have a website where the Googlebot runs many searches on our internal search engine. We can put noindex on the results page, but we want to stop the bot from calling the ajax search button (a GET form), because each call passes a request to an external API with associated fees. So we want to stop it from crawling the form's submit, without noindexing the search page itself. The "nofollow" tag doesn't seem to apply to a button's submit. Any suggestions?
Intermediate & Advanced SEO | Olivier_Lambert
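A common approach here, sketched with a hypothetical endpoint path, is to disallow the GET endpoint the button calls in robots.txt rather than trying to nofollow the button itself (rel="nofollow" only exists on links, not on form submits):

```
User-agent: *
# Leave the search page crawlable, but block the GET endpoint
# behind the submit button (the /ajax/search path is hypothetical)
Disallow: /ajax/search
```
-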
301 redirect recommendations
One of the clients we are working with has two sites: the main one with a PR5 and a separate one with a PR4. We are planning on doing a 301 from the PR4 to a page on the PR5. Is it best to do:
www.PR4.com ----> www.PR5.com/relevantPR4page
or
www.PR4.com/page ----> www.PR5.com/relevantPR4page
Most pages on the PR4 site can logically fit into one PR5 page. However, the PR4 site has About Us, Contact Us, blog (with posts), FAQ, Applications, and Legal Resources pages, which are all pretty outdated. The PR4 site is kind of messy, and we are not sure it will be easy to 301 each page individually with the user in mind. Can we do a sitewide 301 redirect from the root of PR4.com to a page on PR5.com and also do deeper 301s, e.g. PR4.com/PR4page ---> PR5.com/relevantPR4page?
Intermediate & Advanced SEO | Bryan_Loconto
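Sketched below, assuming an Apache host and with hypothetical paths, is what page-level 301s plus a catch-all could look like in the PR4 site's .htaccess:

```apache
# mod_alias applies rules in order of appearance, so list the
# specific page-level redirects first
Redirect 301 /about-us https://www.PR5.com/relevantPR4page
Redirect 301 /faq https://www.PR5.com/relevantPR4page

# Catch-all: everything else on the old site goes to the one page
RedirectMatch 301 ^/(.*)$ https://www.PR5.com/relevantPR4page
```
-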
Large Site SEO - Dev Issue Forcing URL Change - 301, 302, Block, What To Do?
Hola, Thanks in advance for reading and trying to help me out. A client of mine recently created a large-scale company directory (500k+ pages) in Drupal v6, while the "marketing" pages of their site were still in manually hard-coded HTML. They redesigned their "marketing" pages, but used Drupal v7. They're now experiencing server conflicts, with the two instances of Drupal unable to communicate or live on the same server. Eventually the directory will be upgraded to Drupal v7, but that could take weeks to months, and the client does not want to wait for the re-launch. The client wants to push the new marketing site live, but also does not want to ruin the overall SEO value of the directory. They have a few options, and I'm looking to help guide them down the path of least resistance:
Option 1: Move the company directory onto a subdomain and the "marketing site" onto the www subdomain. The client gets to push their redesign live, but large-scale 301s to the directory cause major issues by shaking up the structure of the site, with ripple effects that pull pages out of the index for days to weeks. Rankings and traffic drop, subdomain authority gets lost, and the company directory's health looks bad for weeks to months. However, the 301s maintain partial SEO value and some long-tail traffic still exists. Once the directory is moved to Drupal v7, it cancels the 301s to the subdomain and reverts to the original www URLs.
Option 2: Block the company directory from search engines with robots.txt and meta instructions, essentially cutting it off from the established marketing pages. No major scaling-301 ripple effect; the directory takes a few weeks to filter out of the index and traffic is completely lost, but once Drupal v7 is upgraded and the directory is re-opened, it will slowly regain SEO value and get close to its old rankings, traffic, etc.
Option 3: 302 redirect? Lose all accumulated SEO value temporarily... hmm.
Option 4: Something else?
As you can see, this is not an ideal situation. However, a decision has to be made and I'm looking to choose the lesser of the evils. Any help is greatly appreciated. Thanks again. -Chris
Intermediate & Advanced SEO | Bacon
-
Block Google Sitelinks for DSEO?
I am trying to manage DSEO for a client. The question is: would blocking a page from my client's Google sitelinks cause that blocked sitelink page to be independently listed in the rankings, and therefore potentially push a negative listing further down? Or would the blocked sitelink not show up at all in the SERPs?
Intermediate & Advanced SEO | bcmull