Tools to check mobile speed performance
-
Hey Guys,
Looking at a site which has mobile versions of each page, for example: m.domain.com.au.
Some of these pages have images which are over 1MB.
I want to quickly identify the pages with large image files. Are there any good tools which can do this?
Cheers.
-
Run Screaming Frog on your subdomains and check the Images tab in the report, then sort by image size and you'll find the large images.
Download Screaming Frog from here: http://www.screamingfrog.co.uk/seo-spider/
-
You can use Google PageSpeed Insights:
https://developers.google.com/speed/pagespeed/insights/
If your site is on WordPress, you can also install this plugin:
https://wordpress.org/plugins/google-pagespeed-insights/
You can also use WebPageTest, GTmetrix, or Pingdom Tools. Even a simple *nix tool like wget can make a local mirror of the site where you can see the file sizes (the command is wget -r -m http://m.domain.com.au).
Alternatively, you can use desktop crawlers (like my own SEOSpyder), which also show image sizes in bytes.
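If you'd rather script the check, here's a minimal Python sketch (standard library only; m.example.com is a placeholder for the real m-dot domain) that collects a page's image URLs and flags those whose Content-Length exceeds 1MB via HEAD requests:

```python
# Sketch: collect a page's <img> URLs and flag those over ~1 MB.
# m.example.com is a placeholder for the real m-dot domain.
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import Request, urlopen


class ImgCollector(HTMLParser):
    """Gather the src attribute of every <img> tag."""

    def __init__(self):
        super().__init__()
        self.srcs = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                self.srcs.append(src)


def image_urls(page_url, html):
    """Return absolute URLs for all images referenced in the page HTML."""
    parser = ImgCollector()
    parser.feed(html)
    return [urljoin(page_url, src) for src in parser.srcs]


def oversized_images(page_url, html, limit=1_000_000):
    """HEAD each image and return (url, bytes) pairs above the limit."""
    heavy = []
    for url in image_urls(page_url, html):
        with urlopen(Request(url, method="HEAD")) as resp:
            # Servers may omit Content-Length; treat missing as 0.
            size = int(resp.headers.get("Content-Length") or 0)
        if size > limit:
            heavy.append((url, size))
    return heavy
```

Running oversized_images() over each URL in the m-dot sitemap would give a per-page list of heavy images.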
Related Questions
-
Silo Architecture and Mobile First
This goes to the age-old SEO argument: how many links in the navigation? We are a well-known brick-and-mortar brand. We have 20,000 SKUs and over 500 categories and subcategories. 95%+ of our backlinks go to the home page. We don't have a blog, but it's in the works. Our site is not responsive; it serves up different versions based on device type, but is not an "M Dot". Our rankings are pretty strong in spite of a large number of technical SEO issues (different discussion).

Currently, our e-commerce desktop site is "siloed" (I'm new to the company; I didn't do it). The home page links via the top nav to categories. The category pages link to subcategories via sidebar navigation, or via images on the category pages (instead of product images). It's pretty close to textbook silos, and it's very near how I would have designed it. This silo architecture passes the most link juice to our categories, which target our highest-search-volume (head) terms. The categories pass link juice (albeit significantly less) to our subcats, which target secondary terms. In terms of search volume and commercial value, our tiers line up very neatly. On average, the targeted subcat terms get about 1/6 of the volume of our head terms. The silo concept has been around forever and is evangelized by Bruce Clay and other respected SEOs. Every time I've siloed an e-commerce site, the rankings improve dramatically, so who am I to argue? So, what's the problem? Read on...

Our mobile navigation, on the other hand, links to every category and subcategory via flyout navigation (I didn't do this, either). In theory, this distributes an equal amount of link juice to all categories and subcategories. It robs link juice from our categories and passes it to subcategories. Right now, this isn't a problem. Rankings are based on the desktop site, and minor adjustments are made for mobile rankings.

When Mobile First rolls out, our mobile nav will be the default navigation for Google, and in theory, link juice distribution across the site will change radically and potentially harm our rankings for our head terms.

I always study site architecture for a number of respected e-commerce sites. Target and Walmart, for example, link to every category and subcategory through their mobile and desktop navigation. Wayfair takes a silo approach on mobile and desktop, linking in tiers. I would argue that Walmart and Target have so much DA/TF/CF that they don't give a damn about targeted link juice distribution; it's all about UX. Wayfair's backlink profile is strong, but it's not Walmart or Target, so they need to be concerned about link juice distribution, hence the silo approach.

Have the Google spokespeople said anything about this? I see this as a potential landmine across the industry. Is this something I should be concerned about? Has anyone had any experience with de-siloing a website? Am I making a big deal out of a non-issue?

Please, no arguments about usability. UX is absolutely part of the equation. Usability is a ranking factor, but if our rankings and traffic take a nose dive, UX isn't going to matter. This is a theoretical discussion on link juice distribution, and I know that compromises need to be made between SEO and UX.
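To put rough numbers on this worry, here's a toy sketch of the "each link passes equity divided by outlinks" model. Only the ~500 total categories and subcategories figure comes from the question; the 20-top-category split and the even-split model itself are simplifying assumptions for illustration:

```python
# Toy model: each page splits its link equity evenly across its outlinks.
# The 20 top categories and 500 total category/subcategory pages are
# illustrative assumptions, not measured figures.
def equity_per_link(page_equity, outlinks):
    return page_equity / outlinks

home_equity = 1.0

# Siloed desktop nav: home links only to the 20 top categories.
siloed_share = equity_per_link(home_equity, 20)    # 0.05 per category

# Flat mobile flyout: home links to all 500 categories + subcategories.
flat_share = equity_per_link(home_equity, 500)     # 0.002 per page

# Under a mobile-first default, each head-term category page would receive
# roughly 25x less home-page equity than under the siloed navigation.
ratio = siloed_share / flat_share
```

Under these assumptions the flat flyout dilutes each category's share of home-page equity by a factor of roughly 25, which is the crux of the de-siloing concern.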
Intermediate & Advanced SEO | Satans_Apprentice
-
How to speed up the transition to new 301-redirected landing pages?
Hi SEOs, I have a question about moving local landing pages from many separate pages towards integrating them into a search results page. Currently we have many separate local pages (e.g. www.3dhubs.com/new-york). For both scalability and conversion reasons, we'll integrate our local pages into our search page (e.g. www.3dhubs.com/3d-print/Bangalore--India).

**Implementation details:** To mitigate the risk of a sudden organic traffic drop, we're currently running a test on just 18 local pages (Bangalore = 1 of 18). We applied a 301 redirect from the old URLs to the new URLs 3 weeks ago. Note: we didn't yet update the sitemap for this test (technical reasons) and will only do this once we 301 redirect all local pages. For the 18 test pages I manually told the crawlers to index them in Webmaster Tools. That should do, I suppose.

**Results so far:** The old URLs of the 18 test cities are still generating >99% of the traffic while the new pages are already indexed (see: https://www.google.nl/webhp?sourceid=chrome-instant&ion=1&espv=2&ie=UTF-8#q=site:www.3dhubs.com/3d-print/&start=0). Overall organic traffic on test cities hasn't changed.

Questions:
1. Will updating the sitemap for this test have a big impact? Google has already picked up the new URLs, so that's not the issue. Furthermore, the 301 redirect on the old pages should tell Google to show the new page instead, right?
2. Is it normal that search impressions will slowly shift from the old page towards the new page? How long should I expect it to take before the new pages are consistently shown over the old pages in the SERPs?
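One quick sanity check while waiting for the shift: confirm each old URL answers with a single clean 301 (not a 302, and not a redirect chain), since anything else slows consolidation. A hedged Python sketch using only the standard library; the URLs in the checks are just the pattern from the question:

```python
# Sketch: fetch a URL's redirect status and Location header without
# following the redirect, so chains and 302s are visible.
import http.client
from urllib.parse import urlsplit


def fetch_redirect(url, timeout=10):
    """Return (status, Location header) for url, not following redirects."""
    parts = urlsplit(url)
    conn_cls = (http.client.HTTPSConnection if parts.scheme == "https"
                else http.client.HTTPConnection)
    conn = conn_cls(parts.netloc, timeout=timeout)
    try:
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        return resp.status, resp.getheader("Location")
    finally:
        conn.close()


def is_clean_301(status, location, expected_target):
    # One permanent hop straight to the new URL: no 302s, no chains.
    return status == 301 and location == expected_target
```

Looping fetch_redirect() over the 18 old URLs and feeding the results to is_clean_301() would quickly surface any 302s or chained hops in the test set.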
Intermediate & Advanced SEO | robdraaijer
-
Google Webmaster Tools: fixing 20,000+ crawl errors
Hi, I'm trying to gather all the 404 crawl errors on my website after a recent hacking that I've been trying to rectify and clean up. Webmaster Tools states that I have 20,000+ crawl errors, but I can only download a sample of 1,000 errors. Is there any way to get the full list, instead of correcting 1,000 errors, marking them as fixed, and waiting for the next batch of 1,000 errors to be listed in Webmaster Tools? The current method is quite time-consuming, and I want to take care of all errors in one shot instead of over the course of a month.
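Until the full list is available, one way to squeeze more out of each 1,000-row sample is to group the 404 URLs by pattern, since hacked-site errors usually cluster under a few injected directories that a single redirect or robots rule can clear in bulk. A sketch in Python; the "URL" column header is an assumption about the export format, so adjust it to the actual CSV:

```python
# Group 404 URLs from a crawl-errors export by first path segment to find
# patterns fixable with one rule. "URL" as the column header is an
# assumption about the export format.
import csv
from collections import Counter
from urllib.parse import urlsplit


def top_error_patterns(rows, n=5):
    """Return the n most common first path segments among error URLs."""
    counts = Counter()
    for row in rows:
        path = urlsplit(row["URL"]).path.strip("/")
        counts[path.split("/")[0] if path else "(root)"] += 1
    return counts.most_common(n)


def load_errors(csv_path):
    """Read the exported crawl-errors CSV into a list of dict rows."""
    with open(csv_path, newline="") as fh:
        return list(csv.DictReader(fh))
```

If, say, 80% of the sample sits under one injected directory, a single 410/redirect rule for that prefix clears that whole slice regardless of how many rows the export shows.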
Intermediate & Advanced SEO | FPK
-
Do I miss traffic (thus, page value) by using the GWMT Parameter Handling Tool?
I'm working through duplicate content issues. The tracking code or the session ID in the URL is being recognized as a different page than the original. Example: www.example.com is duplicate content to www.example.com?_nk=x&ad=y&_ga=z, which is tied to a marketing campaign. If my setup in the URL parameter tool is set to Effect = None and Crawl = Representative URL, then:
1. Do I miss all the traffic being driven to the ?_nk page?
2. With a representative URL, there would still be two indexed listings: the .com and the .com?_nk... right? Neither is good.
Redirects of all the URLs are not an option because there are hundreds that would need to be redirected. I also don't want to slow down page load time with excessive redirects, which has been the case when adding 100+ redirects for the recent website migration we did.
Intermediate & Advanced SEO | johnnybgunn
-
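If mass redirects are off the table, another option (assuming the page templates can be edited) is a rel="canonical" on every parameterized URL pointing at the parameter-free version, so both listings consolidate without redirects. A small Python sketch of deriving that canonical; the parameter names are taken from the example above:

```python
# Derive the canonical URL by dropping known tracking parameters.
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Parameter names from the example in the question.
TRACKING_PARAMS = {"_nk", "ad", "_ga"}


def canonical_url(url):
    """Strip tracking parameters, keeping any meaningful query params."""
    parts = urlsplit(url)
    kept = [(k, v) for k, v in parse_qsl(parts.query)
            if k not in TRACKING_PARAMS]
    return urlunsplit((parts.scheme, parts.netloc, parts.path,
                       urlencode(kept), ""))
```

The template would then emit `<link rel="canonical" href="...">` with the stripped URL, which consolidates the signals onto one listing while the campaign URLs keep receiving their traffic.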
Mobile SEO: are multiple link alternate media tags (one per device) acceptable on a desktop page?
Hi all, I hope someone can answer this question, because I haven't found a clear solution on the internet so far. I have one desktop website (let's say www.example.com) and a different mobile website for each main device (let's say iphone.example.mobi, android.example.mobi, and winphone.example.mobi).

In order to optimize my mobile websites, according to the Google guideline for the separate-URLs configuration above, I should add a link alternate media tag to the desktop page and a canonical tag to the corresponding mobile page, in order to create a connection between them. But I need to keep a 1-to-1 connection between desktop page and mobile page (Google recommends having 1 desktop page linked to 1 mobile page and vice versa, and discourages 1-to-many connections).

In my case, I would have to add to a single desktop page (e.g. www.example.com/category1/) three link alternate media tags (one for iphone.example.mobi, one for android.example.mobi, and one for winphone.example.mobi). Furthermore, I would have to add to every corresponding mobile page on the 3 mobile site versions a canonical tag pointing to my desktop page www.example.com/category1/.

Now my worries are: is having a single desktop page with 3 different link alternate tags pointing to 3 different mobile websites (one each) aligned with the Google mobile SEO guideline or not? If not, how should I configure my desktop website and my 3 mobile web applications (iPhone, Android, Windows Phone) in order to follow the Google requirements for the separate-URLs configuration? Thanks, Massimliano
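For concreteness, here's a sketch of the markup the question describes: three alternate links on the desktop page and a canonical on each mobile page. Whether several alternates per desktop page satisfies Google's guideline is exactly the open question, so treat this as an illustration of the setup being asked about, not a recommendation; the hostnames are the question's placeholders:

```python
# Generate the tags described in the question: three rel="alternate" links
# on the desktop page (one per device site) and a rel="canonical" on each
# mobile page. Hostnames are the question's placeholders.
MOBILE_HOSTS = ["iphone.example.mobi", "android.example.mobi",
                "winphone.example.mobi"]


def desktop_alternate_tags(path):
    """Alternate links for the desktop page's <head>, one per mobile site."""
    media = "only screen and (max-width: 640px)"
    return [
        f'<link rel="alternate" media="{media}" href="http://{host}{path}">'
        for host in MOBILE_HOSTS
    ]


def mobile_canonical_tag(path):
    """Canonical link for the mobile page's <head>, back to the desktop URL."""
    return f'<link rel="canonical" href="http://www.example.com{path}">'
```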
Intermediate & Advanced SEO | AdiRste
-
Bad performance for a low-competition term
Hi everybody. I've been working on this page for some time: http://www.double-glazing-forum.com/anglian-windows.aspx. Until several months ago, it ranked really well for the terms 'Anglian windows' and 'Anglian windows reviews'. However, following a Google update it tanked and has got worse ever since. Here's what I've done to try and fix it:
- Added 800 words of unique copy
- Added YouTube videos
- Replaced scraped press releases with unique descriptions that link to the source
- Analysed the backlink profile and uploaded a disavow file containing all bad links
- Contacted webmasters to remove those links where possible
Getting a bit low on ideas now, so any help would be great!
Intermediate & Advanced SEO | Blink-SEO
-
Can't set preferred domain in Webmaster Tools
I have entered my domain name as xxxxxtours.com (without www) in Webmaster Tools, and I have redirected to the www version using the .htaccess file. So I want to set the preferred domain to display URLs as www.xxxxtours.com. When I try, it gives the error shown in the attached image. I have verified the site. Waiting for expert help. Ar5qx.png
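For reference, a minimal .htaccess sketch of the non-www to www 301 (the domain is the question's placeholder; adjust the RewriteCond to the real hostname). Note that this Webmaster Tools error usually means the www version of the site also has to be added and verified as a separate property before a preferred domain can be set:

```apache
# Force the www hostname with a single 301 (requires mod_rewrite).
RewriteEngine On
RewriteCond %{HTTP_HOST} ^xxxxtours\.com$ [NC]
RewriteRule ^(.*)$ http://www.xxxxtours.com/$1 [R=301,L]
```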
Intermediate & Advanced SEO | innofidelity
-
Page crawl check after modifications without waiting 7 days
I have made modifications to my site and uploaded them, so I want to check the remaining errors, but Moz crawls the site once every 7 days. Is there any way to check before then? Thank you.
Intermediate & Advanced SEO | innofidelity