Loading images below the fold? Impact on SEO
-
I got this from my developers. Does anyone know if this will be an SEO issue?
We hope to lazy-load images below the fold where possible, to increase render speed - are you aware of any potential issues with this approach from an SEO point of view?
-
Happy to help!
-
Thanks Tom!
As always, an amazing response.
Best
-
Hi Chris, sorry for the late reply. Absolutely, you can do this by using a plugin, Cloudflare, or PHP code:
- https://wordpress.org/plugins/wp-deferred-javascripts/
- https://wordpress.org/plugins/defer-css-addon-for-bwp-minify/
Another plugin that handles this, while providing an administration area to configure it manually, is Autoptimize, which lets you define specific CSS code independently of your theme's CSS stylesheet:
- http://www.oxhow.com/optimize-defer-javascript-wordpress/
- https://seo-hacker.com/optimizing-site-speed-asynchronous-deferred-javascript/
- http://www.laplacef.com/how-to-defer-parsing-javascript-in-wordpress/
The solution to this problem is removing those render-blocking scripts. But if you simply remove them, some plugins may not work properly. So, the best solution for smooth rendering is:
1. Remove them from your page source.
2. Use a single script, hosted by Google, as the alternative.
3. Push the new script down to the end of the page (before the closing </body> tag).
Here is how to do it.
Copy the following code and paste it into your theme's functions.php file.
function optimize_jquery() {
    if ( ! is_admin() ) {
        // Remove the locally hosted scripts from the head section.
        wp_deregister_script( 'jquery' );
        wp_deregister_script( 'jquery-migrate.min' );
        wp_deregister_script( 'comment-reply.min' );

        // Match the protocol of the current request
        // (isset() avoids a PHP notice when HTTPS is not set).
        $protocol = ( isset( $_SERVER['HTTPS'] ) && $_SERVER['HTTPS'] == 'on' ) ? 'https:' : 'http:';

        // Re-register jQuery from Google's CDN and load it in the footer (last argument: true).
        wp_register_script( 'jquery', $protocol . '//ajax.googleapis.com/ajax/libs/jquery/1.9.0/jquery.min.js', false, '1.9.0', true );
        wp_enqueue_script( 'jquery' );
    }
}
add_action( 'template_redirect', 'optimize_jquery' );
Save the file and you are done! Now recheck the source of any page and you won't see those scripts in the head section. Instead, you will see the Google-hosted JavaScript source at the end of the page.
That’s all! Now the visible section of your page will be rendered smoothly.
Defer Loading JavaScript
Another suggestion from the Google PageSpeed tool is "Defer JavaScript". This issue comes up when you use inline JavaScript, such as the scripts for a Facebook like box or button, a Google Plus button, a Twitter button, etc. If you defer the JavaScript, the scripts are triggered only after the entire document has loaded.
How to defer JavaScript in WordPress
1. Create a JavaScript file and name it defer.js.
2. Place the JavaScript code that you want to defer into the defer.js file. For instance, if you want to defer the Facebook like box script, paste the following into that file.
(function(d, s, id) {
    var js, fjs = d.getElementsByTagName(s)[0];
    if (d.getElementById(id)) return;
    js = d.createElement(s);
    js.id = id;
    js.src = "//connect.facebook.net/en_GB/all.js#xfbml=1&appId=326473900710878";
    fjs.parentNode.insertBefore(js, fjs);
}(document, 'script', 'facebook-jssdk'));
3. Save the file and upload it to your theme folder.
4. Now, copy the following code and paste it into the head section of the page. In WordPress, open your theme's header.php file and paste the code before the closing head tag.
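The snippet itself is missing from this copy of the post, but the loader these tutorials rely on is the standard load-event pattern that Google's PageSpeed documentation recommended at the time; a minimal sketch:
<script type="text/javascript">
    // Load defer.js only after the whole page (including images) has finished loading.
    function downloadJSAtOnload() {
        var element = document.createElement("script");
        element.src = "defer.js"; // adjust this to the correct path (see below)
        document.body.appendChild(element);
    }
    // Hook the window load event, with fallbacks for older browsers.
    if (window.addEventListener)
        window.addEventListener("load", downloadJSAtOnload, false);
    else if (window.attachEvent)
        window.attachEvent("onload", downloadJSAtOnload);
    else
        window.onload = downloadJSAtOnload;
</script>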
Make sure to put the correct path of defer.js. For example, the source path should be like this:
/wp-content/themes/theme_name/defer.js
I hope that helps,
Tom
-
Happy I could help!
-
Thomas,
Can this be implemented on a WordPress site?
Apologies for hijacking!
-
What a great response! Just what I was looking for. Thank you!
-
Lazy loading images is not as good as deferring them, because lazy loading can cause JavaScript issues that simply don't arise if you defer the images instead.
If you defer images you will have an easier time, and the method discussed here does not hurt search engine optimization. In fact, it helps, because increased load speed (or what people perceive as increased load speed) always benefits the end user.
Here is the best way
https://www.feedthebot.com/pagespeed/defer-images.html
This is where we defer the images without lazy loading.
In the scenario of a one page template, there is no reason to do all the things that lazy loading does (observe, monitor and react to a scroll position).
Why not just defer those images and have them load immediately after the page has loaded?
How to do it
To do this we need to mark up our images and add a small and extremely simple piece of JavaScript. I will show the method I actually use for this site and others. It uses a base64 image, but do not let that scare you.
The html
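The article's code blocks didn't survive in this copy, so here is a sketch of the markup it describes: the real image URL sits in a data-src attribute, while src holds a tiny base64 placeholder (a 1x1 transparent GIF below; the image path is a stand-in).
<img src="data:image/gif;base64,R0lGODlhAQABAIAAAAAAAP///yH5BAEAAAAALAAAAAABAAEAAAIBRAA7"
     data-src="/images/your-real-image.jpg" alt="Describe the image here">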
The javascript
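And a sketch of the matching script (the function name is mine): on window load it copies each data-src into src, so the deferred images only start downloading once everything else has rendered.
<script>
    // Swap every data-src into src once the page has fully loaded.
    function initDeferredImages() {
        var images = document.getElementsByTagName('img');
        for (var i = 0; i < images.length; i++) {
            if (images[i].getAttribute('data-src')) {
                images[i].setAttribute('src', images[i].getAttribute('data-src'));
            }
        }
    }
    window.onload = initDeferredImages;
</script>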
-
I have looked for information on this in the past and come up empty-handed. With page speed, Google really pits you against best SEO practices - I think if you follow most of the PageSpeed Insights suggestions you can severely limit your SEO. How many images are you talking about, and how does Google render the page in Fetch as Google?
Related Questions
-
SEO + Structured Data for Metered Paywall
I have a site that will have 90% of the content behind a metered paywall, so all content is accessible in a metered way. All users who aren't logged in will have access to 3 articles (of any kind) in a 30-day period. If they try to access more in a 30-day period they will hit a paywall. I was reading this article on how to handle structured data with Google for content behind a paywall: https://www.searchenginejournal.com/paywalls-seo-strategy/311359/ However, the content is not ALWAYS behind a paywall, since it is metered. So if a new user comes to the site, they can see the article (regardless of what it is). Is there a different way to handle content that will be SOMETIMES behind a paywall because of a metered strategy? Theoretically I want 100% of the content indexed and accessible in SERPs; it will just be accessible depending on the user's history (cookies) with the site. I hope that makes sense.
Technical SEO | triveraseo
-
SEO - New URL structure
Hi, Currently we have the following URL structure for all pages, regardless of the hierarchy: domain.co.uk/page, such as domain/blog name. Can you please confirm the following: 1. What is the benefit of organising the pages as a hierarchy, i.e. domain/features/feature-name or domain/industries/industry-name or domain/blog/blog name etc.? 2. This will create too many 301s - what is Google's tolerance of redirects? Is it worth us changing the URL structure, or would you only recommend adding breadcrumbs? Many thanks, Katarina
Technical SEO | Katarina-Borovska
-
Image Sitemap
I currently use a program to create our sitemap (XML). It doesn't support creating image sitemaps. Can someone suggest a program that would create an image sitemap? Thanks.
Technical SEO | Kdruckenbrod
-
Express js and SEO?
Hi fellow Mozzers, I have been tasked with providing some SEO recommendations for a website that is to be built using Express.js and Angular. I wondered whether anyone has had any experience with such a framework? On checking a website built with this stack and viewing it as Googlebot etc. using the following tools, it appears as though most of the content is invisible: http://www.webconfs.com/search-engine-spider-simulator.php http://www.browseo.net/ Obviously this is a huge issue, and I wonder if there are any workarounds or recommendations to assist (even if it means moving away from this - would love to hear about it).
Technical SEO | musthavemarketing
-
Can a CMS affect SEO?
As the title says, really. I run www.specialistpaintsonline.co.uk, and 6 months ago when I first got it, it had bad links that Google had penalised, so it lost its value. However, the penalty was lifted in September; the site now complies with all guidelines, and SEO work has been done and is constantly monitored. The issue I have is that sales and visits have not gone up. We are failing fast, and running on 2 or 3 sales a month isn't enough to cover any sort of cost, let alone wages. Hence my question: can the CMS have anything to do with it? I'm at a loss and going grey; any help or advice would be great. Thanks in advance.
Technical SEO | TeamacPaints
-
How much will changing IP addresses impact SEO?
So my company is upgrading its Internet bandwidth. However, apparently the vendor has said that part of the upgrade will involve changing our IP address. I've found two links that indicate some care needs to be taken to make sure our SEO isn't harmed: http://followmattcutts.com/2011/07/21/protect-your-seo-when-changing-ip-address-and-server/ http://www.v7n.com/forums/google-forum/275513-changing-ip-affect-seo.html Assuming we don't use an IP address that has been blacklisted by Google for spamming or other black hat tactics, how problematic is it? (Note: The site hasn't really been aggressively optimized yet - I started with the company less than two weeks ago, and just barely got FTP and CMS access yesterday - so honestly I'm not too worried about really messing up the site's optimization, since there isn't a lot to really break.)
Technical SEO | ufmedia
-
Changing DNS -- SEO implications?
Hey Moz, We're migrating an old site on an old server over to a new server/DNS. The plan is to keep the same URL structure and reuse our existing URL's. As long as we make minimal changes to each page's content, we should be able to update our DNS entry and get all the pages recreated and assigned to their correct URLs without any reduction in SEO rankings. Is this correct? This site gets a lot of organic traffic and ranks highly on some challenging keywords, so it's key that we retain our rankings as much as possible. I've read that it's wise to lower the DNS time-to-live to one hour, about a day before the move, to help Google crawl the DNS a little quicker. Are there any other recommendations you guys can offer or past experiences?
Technical SEO | stephen_reply
-
200 Redirects for SEO instead of 301
We are working with a company on re-platforming our website. On a call yesterday they outlined a strategy to use 200 redirects for our top keywords instead of 301s. I am not familiar with this type of redirect and was wondering if anyone could provide some more insight.
Technical SEO | EvergladesDirect