Can you use multiple videos without sacrificing load times?
-
We're using a lot of videos on our new website (www.4com.co.uk), but our immediate discovery has been that this has a negative impact on load times. We use a third party (Vidyard) to host our videos but we also tried YouTube and didn't see any difference.
I was wondering if there's a way of using multiple videos without seeing this load speed issue or whether we just need to go with a different approach.
Thanks all, appreciate any guidance!
Matt
-
Thank you very much for that. My guys are having a look into Wistia, and also into whether and how we can defer videos using either Vidyard or YouTube.
Thanks again,
Matt
-
I use Wistia as well and recommend them, though I do not recommend using their plugin.
You can defer loading of the videos so that the site loads very quickly and is almost not affected at all.
- https://varvy.com/pagespeed/defer-videos.html
- https://varvy.com/pagespeed/defer-many-javascripts.html
- Use this tool to list the JavaScript loaded on a page: https://varvy.com/tools/js/
- For an overall page speed test: https://varvy.com/pagespeed/
- **Best practices:** https://kinsta.com/learn/page-speed/
- https://varvy.com/pagespeed/defer-loading-javascript.html
- https://varvy.com/pagespeed/critical-render-path.html
How to defer videos
To do this we need to mark up our embed code and add a small and extremely simple piece of JavaScript. I will show the method I actually used for this page.
The HTML
```html
<iframe width="560" height="315" src="" data-src="//www.youtube.com/embed/OMOVFvcNfvE" frameborder="0" allowfullscreen=""></iframe>
```
In the above code I took the embed code from YouTube and made two small changes. The first change is that I emptied the "src" attribute by removing the URL from it, as below.
src=""
The second change is that I took the URL I cut from "src" and added it to "data-src".
data-src="//www.youtube.com/embed/OMOVFvcNfvE"
The JavaScript
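As a rough sketch of the approach described here (the function name `init` and the exact loop are illustrative, not a specific library API), the script simply copies each iframe's "data-src" into "src" once the page has finished loading:

```javascript
// After the full page has loaded, copy each iframe's data-src into src,
// so the video embeds don't block the initial render.
function init() {
  var vidDefer = document.getElementsByTagName('iframe');
  for (var i = 0; i < vidDefer.length; i++) {
    if (vidDefer[i].getAttribute('data-src')) {
      vidDefer[i].setAttribute('src', vidDefer[i].getAttribute('data-src'));
    }
  }
}
window.onload = init;
```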
Script to call the external JavaScript file
This should be placed in your HTML just before the closing </body> tag (near the bottom of your HTML file), where "defer.js" is the name of the external JS file, as shown below.
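For example, assuming the defer function above has been saved to a file named defer.js:

```html
<!-- Loading the deferred script at the end of the body keeps it off the critical render path -->
<script src="defer.js"></script>
```

Alternatively, the same function can be pasted directly into an inline script block in that position; either way it only runs after the page has loaded.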
I hope this helps, Tom
-
I'm very doubtful that hosting the videos off-site would have much effect on site speed, especially with YouTube. Personally I use Wistia, mainly for the level of analytics they provide. The only time this may be an issue is if you have a large number of videos on a single page; in that case I would try to split them across several different pages by means of categories or something similar.
To me it sounds like there may be a programming problem.
The other thing is that it may not be the videos that are slowing the site down.
Just a few thoughts; I don't know if it helps.