Can you use multiple videos without sacrificing load times?
-
We're using a lot of videos on our new website (www.4com.co.uk), but we quickly discovered that this has a negative impact on load times. We use a third party (Vidyard) to host our videos, but we also tried YouTube and didn't see any difference.
I was wondering if there's a way of using multiple videos without this load speed issue, or whether we just need to go with a different approach.
Thanks all, appreciate any guidance!
Matt
-
Thank you very much for that. My guys are having a look both into Wistia and into if/how we can defer videos using either Vidyard or YouTube.
Thanks again,
Matt
-
I use Wistia as well and recommend them, but I do not recommend using their plug-in.
You can defer loading of the videos so that the site loads very quickly and is almost not affected at all.
- https://varvy.com/pagespeed/defer-videos.html
- https://varvy.com/pagespeed/defer-many-javascripts.html
- Use this tool to check your JavaScript: https://varvy.com/tools/js/
- Use this for an overall page speed test: https://varvy.com/pagespeed/
- **Best practices:** https://kinsta.com/learn/page-speed/
- https://varvy.com/pagespeed/defer-loading-javascript.html
- https://varvy.com/pagespeed/critical-render-path.html
How to defer videos
To do this we need to mark up our embed code and add a small and extremely simple piece of JavaScript. I will show the method I actually used for this page.
The HTML
<iframe width="560" height="315" src="" data-src="//www.youtube.com/embed/OMOVFvcNfvE" frameborder="0" allowfullscreen=""></iframe>
In the above code I took the embed code from YouTube and made two small changes. The first change is that I made the "src" empty by removing the URL from it, as shown below.
src=""
The second change is that I took the URL I cut from "src" and added it to "data-src".
data-src="//www.youtube.com/embed/OMOVFvcNfvE"
The JavaScript
The script that calls the external JavaScript file should be placed in your HTML just before the closing </body> tag (near the bottom of your HTML file). Here "defer.js" is the name of the external JS file.
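A minimal sketch of what defer.js could contain, following the varvy.com defer-videos approach linked above (treat this as illustrative rather than the exact script): once the whole page has loaded, it copies each iframe's "data-src" back into "src", so the videos are fetched only after everything else is done.
// defer.js - load deferred iframes after the rest of the page has finished
function init() {
  var vidDefer = document.getElementsByTagName('iframe');
  for (var i = 0; i < vidDefer.length; i++) {
    // only touch iframes that carry a deferred URL in data-src
    if (vidDefer[i].getAttribute('data-src')) {
      vidDefer[i].setAttribute('src', vidDefer[i].getAttribute('data-src'));
    }
  }
}
// run after everything else has loaded, so the videos never block the initial render
window.onload = init;
And the one-line tag that calls the external file, placed just before the closing </body> tag:
<script src="defer.js"></script>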
I hope this helps, Tom
-
I'm very doubtful that hosting the video off-site would have much effect on site speed, especially with YouTube. Personally I use Wistia, mainly due to the level of analytics that they provide. The only time this may be an issue is if you have a quantity of videos on a single page; in that case I would try to split them onto several different pages by means of categories or something.
To me it sounds like there may be a programming problem.
The other thing is that it may not be the videos that are slowing the site down.
Just a few thoughts; I don't know if it helps.
Related Questions
-
Good to use disallow or noindex for these?
Hello everyone, I am reaching out to seek your expert advice on a few technical SEO aspects related to my website. I highly value your expertise in this field and would greatly appreciate your insights.
Below are the specific areas I would like to discuss:
a. Double and Triple filter pages: I have identified certain URLs on my website that have a canonical tag pointing to the main /quick-ship page. These URLs are as follows:
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black
https://www.interiorsecrets.com.au/collections/lounge-chairs/quick-ship+black+fabric
Considering the need to optimize my crawl budget, I would like to seek your advice on whether it would be advisable to disallow or noindex these pages. My understanding is that by disallowing or noindexing these URLs, search engines can avoid wasting resources on crawling and indexing duplicate or filtered content. I would greatly appreciate your guidance on this matter.
b. Page URLs with parameters: I have noticed that some of my page URLs include parameters such as ?variant and ?limit. Although these URLs already have canonical tags in place, I would like to understand whether it is still recommended to disallow or noindex them to further conserve crawl budget. My understanding is that by doing so, search engines can prevent the unnecessary expenditure of resources on indexing redundant variations of the same content. I would be grateful for your expert opinion on this matter.
Additionally, I would be delighted if you could provide any suggestions regarding internal linking strategies tailored to my website's structure and content. Any insights or recommendations you can offer would be highly valuable to me. Thank you in advance for your time and expertise in addressing these concerns. I genuinely appreciate your assistance. If you require any further information or clarification, please let me know. I look forward to hearing from you. Cheers!
Technical SEO | williamhuynh
-
301 Redirect Timing Questions
Hey all, quick question on 301 redirects and the timing of creating them when transitioning from an old site to a new site. Does the timing matter? Can redirects interfere with DNS propagation? (That seemed to happen to us when we did redirects minutes after repointing someone's DNS A record to the new site.) And lastly, how long AFTER a new site launch can one still submit redirects and not lose the Google juice? All the best,
Technical SEO | WorldWideWebLabs
-
Multiple H1 tags in Squarespace
Hi. I'm using Squarespace, and I've noticed they assign the page title and site title h1 tag status. So if I add an on-page h1 tag, that's three in total. I've seen what Matt Cutts said about multiple h1 tags being acceptable (although that video was back in 2009 and a lot has changed since then). But I'm still a little concerned that this is perhaps not the best way of structuring for SEO. Could anyone offer me any advice? Thanks.
Technical SEO | The_Word_Department
-
What is the best way to use canonical tag
Hi, I have been researching this since yesterday and have looked at this subject many times before, but still cannot get my head around it. I ran a report on my site which was very useful; I used http://www.juxseo.com for my site www.in2town.co.uk and it brought me some useful information. Part of that info was that I should have a canonical tag on my home page, which would improve my SEO. Now, I am using sh404sef for my friendly URLs and I am using Joomla 3.0, and when I approached the makers of sh404sef to ask about the tag they said I would need to be careful using it, as it could damage my site and my rankings. I have read lots of information but still do not have a clear understanding of it. Can anyone please explain the best way to use this, and whether I should be using it where I may have some sort of duplicate page? Any help to understand this would be great.
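For reference, the canonical tag the report is recommending is a single link element in the page's head, along these lines (the exact URL here is only a placeholder based on the site mentioned above):
<head>
  <!-- placeholder example: tells search engines which URL is the preferred version of this page -->
  <link rel="canonical" href="https://www.in2town.co.uk/">
</head>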
Technical SEO | ClaireH-184886
-
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource on content removal, but I am still unsure how to remove already indexed content. When I use robots.txt alone, the URLs remain in the index, though no crawl budget is wasted on them; but still, having e.g. 100,000+ completely identical login pages within the omitted results might not mean anything good. When I use meta noindex alone, I keep my index clean, but I also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together for existing content, I am asking Google to ignore my content, but at the same time I am blocking it from crawling the pages and seeing the noindex tag. Robots.txt and URL removal together is still not a good solution, as I have failed to remove directories this way; it seems that only exact URLs can be removed like that. I need a clear solution which solves both issues (index and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time I add the meta noindex tag to all the pages within the directory. The number of indexed pages should start decreasing (while useless page crawling increases), and once the number of indexed pages is low or zero, I would put the directory back into robots.txt and keep the noindex on all of the pages within the directory. Can this work the way I imagine, or do you have a better way of doing it? Thank you in advance for all your help.
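For reference, the meta noindex tag discussed above is a single element in each page's head, along these lines; as noted, it only works while the directory is not blocked in robots.txt, otherwise Googlebot never sees it.
<head>
  <!-- asks search engines to drop this page from the index; only effective if the page is NOT blocked in robots.txt -->
  <meta name="robots" content="noindex">
</head>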
Technical SEO | Dilbak
-
URLs with or without .html ending
Hello, can anyone show me some authority info on whether links are better with or without a .html ending? Thanks in advance
Technical SEO | sesertin
-
If non-paying customers only get a 2 min snippet of a video, can my video length in sitemap.xml be the full length?
I am working on a website whose primary content is videos. They have an assortment of free videos, but the majority are viewable only with a subscription to the site. If you don't have a subscription, you can see a 2-minute clip of the video's contents, but the full videos can be anywhere from 10 minutes to 1.5 hours. When I am auto-generating the sitemap.xml, can I put the full length of the videos for paying members in the XML in the video:duration property? Or, because publicly only 2 minutes is available (unless you pay for a membership), is that frowned upon?
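For context, the video:duration value in a Google video sitemap is the length in seconds, so a full 1.5-hour video entry would look roughly like this (all URLs, titles, and the player location below are placeholders):
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:video="http://www.google.com/schemas/sitemap-video/1.1">
  <url>
    <loc>https://www.example.com/videos/full-video-page</loc>
    <video:video>
      <video:thumbnail_loc>https://www.example.com/thumbs/full-video.jpg</video:thumbnail_loc>
      <video:title>Full video title</video:title>
      <video:description>Short description of the video.</video:description>
      <video:player_loc>https://www.example.com/player?video=123</video:player_loc>
      <!-- length of the full member-only video in seconds (1.5 hours here), not the 2-minute public preview -->
      <video:duration>5400</video:duration>
    </video:video>
  </url>
</urlset>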
Technical SEO | nbyloff