Long load time
-
My site takes roughly twice as long per KB to load as my competitors' sites.
It is hosted on shared hosting with GoDaddy.com.
Any ideas why this may be happening?
-
To be fair, your site isn't really overly slow.
www.appliance-repair-ny.com loads in an average of 3.9 seconds and is 194 KB.
www.all-appliance-repair-ny.com loads in an average of 5.6 seconds and is 327 KB.
www.newyorkappliancerepair.net loads in an average of 1.5 seconds and is 115 KB.
And I think those timings are measured from Sweden. Your server is in Arizona, so it will be quicker from New York.
You could gzip your CSS, but it's not really going to give you a big improvement.
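If you do want to try it, on Apache-based shared hosting (which GoDaddy's Linux plans generally are) compression can usually be switched on from a .htaccess file. A minimal sketch using mod_deflate, assuming that module is actually available on your plan:

<IfModule mod_deflate.c>
  # Compress text-based responses (HTML, CSS, JavaScript) before they are sent
  AddOutputFilterByType DEFLATE text/html text/plain text/css
  AddOutputFilterByType DEFLATE text/javascript application/javascript application/x-javascript
</IfModule>

If nothing changes after adding it, it's worth checking with the host whether mod_deflate is enabled on shared plans at all.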
Yes, shared hosting will always be slower than a dedicated server, but for the cost I don't think it's worth moving to a dedicated server and CDN delivery.
If you really wanted to track it, you could add Webmaster Tools and (do you not have analytics on the page?) put _gaq.push(['_trackPageLoadTime']); into your Google Analytics snippet. That would let you see what load times Google records for your page.
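For reference, this is roughly where that line sits in the classic asynchronous tracking snippet (UA-XXXXX-X is just a placeholder for your own web property ID):

<script type="text/javascript">
  var _gaq = _gaq || [];
  _gaq.push(['_setAccount', 'UA-XXXXX-X']); // placeholder account ID
  _gaq.push(['_trackPageview']);
  _gaq.push(['_trackPageLoadTime']); // samples load times into the Site Speed report

  (function() {
    // standard asynchronous loader for ga.js
    var ga = document.createElement('script'); ga.type = 'text/javascript'; ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www') + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0]; s.parentNode.insertBefore(ga, s);
  })();
</script>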
Speed is unlikely to be a defining ranking factor for you and you should concentrate your efforts more on acquiring links, reviews, and local optimisation.
-
-
What are your site and your competitors' sites? It could be a lot of things.
Related Questions
-
Why are Arabic URLs considered too long by the Moz Pro audit?
I am running a multi-language website (Ar/En): https://drmoamenada.com/ When I carry out an audit using Moz Pro, I see many issues related to long URL length on Arabic pages, although they don't exceed 65 characters in Arabic. Can you help me with this issue, please?
Technical SEO | MoamenNada -
How fast should a page load to get a green light in Google's PageSpeed?
So, I'm trying to get a big e-commerce site to work on their page loading issues. Their question left me without an answer: how fast should a site be so that it gets a green light in Google's PageSpeed test? Is there a number in seconds? Do we know that?
Technical SEO | ziiiva123 -
Homepage Not Ranking in Google - How long do old (not current) bad SEO practices exert influence?
I'm trying to get to the bottom of a problem I have with the Google ranking for mauiactivities.com - it's far below what I would hope for. My research so far has uncovered the following, and any advice on where to go from here would be appreciated. Edit: No problems with Bing or Yahoo - the site is #1 for the primary keyword 'maui activities'. 1. Running a site:http://www.mauiactivities.com search in Google reveals that the homepage doesn't rank. At all. I looked through the 17 pages of results and can't spot it. Edit: I have now, after fresh checks after submitting the homepage through Search Console, found it at #1 - still, the following applies ... 2. I've found that the domain (before it was purchased by my client in 2011) had some bad inbound links, specifically from scubamaui.com (no longer active). The links were white, on a white background. This web archive snapshot will reveal all. 3. Those bad links were 'cleaned up' (i.e. they don't show in the web archive) from 2014, and as mentioned above, the website is now 'down'. 4. Search Console doesn't show a manual penalty. 5. When I search for 'tropical divers maui' in Google I find www.mauiactivities.com is the 4th result. To me, this indicates a current relationship with the dead site (Tropical Divers Maui). No other term comes close to ranking so high for the homepage. So, to summarise - can the old, dead Tropical Divers Maui website still be affecting the Google ranking, and what would you suggest I do next?
Technical SEO | jsherwin -
JS loading blocker
Is there a tool or Chrome extension I can use to load a page, identify the .js files on the page, 'uncheck' selected .js and load the page again to check it still loads correctly? Even better would be the ability to defer scripts, or move them to the end of the file, to test.
Technical SEO | MickEdwards -
Long title problem
I'm getting an incredible number of 4xx errors and long titles from a small website (northstarpad.com); over 13k 4xx errors and almost 20k "title element is too long" warnings. The number keeps climbing, but the site shouldn't have more than a couple hundred pages. When I look at the 4xx errors they are clearly being generated by some program, since they have multiple and repeating keywords. Here's an example: http://northstarpad.com/category/wedding-photographer-farmington-michigan/pet-photography/wedding-photography/pet-photography/wedding-photography/wedding-photography/wedding-photography/pet-photography/wedding-photography/wedding-photography/pet-photography/ I looked at the FTP files and plugins and couldn't see anything that could cause it, but I'm a beginner so no surprise there. Any suggestions on where to look or how to fix this?
Technical SEO | dwerkema -
Timely use of robots.txt and meta noindex
Hi, I have been checking every possible resource for content removal, but I am still unsure how to remove already-indexed content. When I use robots.txt alone, the URLs remain in the index; no crawl budget is wasted on them, but having 100,000+ completely identical login pages sitting in the omitted results still can't be a good thing. When I use meta noindex alone, I keep my index clean, but I also keep Googlebot busy crawling these no-value pages. When I use robots.txt and meta noindex together for existing content, I ask Google to ignore my content, but at the same time I block it from crawling the pages and seeing the noindex tag. Robots.txt plus URL removal is still not a good solution either, as I have failed to remove directories this way; it seems only exact URLs can be removed like that. I need a clear solution which solves both issues (index and crawling). What I am trying now is the following: I remove these directories (one at a time, to test the theory) from the robots.txt file, and at the same time I add the meta noindex tag to all the pages within the directory. The number of indexed pages should start decreasing (while crawling of useless pages increases), and once the number of indexed pages is low or zero, I put the directory back into robots.txt and keep the noindex on all of the pages within it. Can this work the way I imagine, or do you have a better way of doing so? Thank you in advance for all your help.
Technical SEO | Dilbak -
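For the robots.txt and meta noindex question above, a minimal sketch of the two phases described (the /login/ directory is just an illustrative path):

Phase 1 - lift the robots.txt block so Googlebot can recrawl the pages and see the noindex tag:

User-agent: *
# Disallow: /login/  (temporarily removed while the pages are being deindexed)

and every page inside that directory carries:

<meta name="robots" content="noindex, follow">

Phase 2 - once those URLs have dropped out of the index, restore the block so no further crawl budget is spent on them:

User-agent: *
Disallow: /login/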
Duplicate pages, overly dynamic URLs and long URLs in Magento
Hi there, I've just completed the first crawl of my Magento site and SEOmoz has picked up thousands of duplicate pages, overly dynamic URLs and long URLs due to the sort function, which appends variables to URLs when sorting products (e.g. www.example.com?dir=asc&order=duration). I'm not particularly concerned that this will affect our rankings, as Google has stated that they are familiar with the structure of popular CMSs and Magento is pretty popular. However, it completely dominates my crawl diagnostics, so I can't see if there are any real underlying issues. Does anyone know a way of preventing this? Cheers, Al.
Technical SEO | WendyWuTours -
When is the best time to submit a sitemap?
What changes to a website constitute resubmitting a sitemap? For example, if I add new in-site links, should I then resubmit? Or is it more for changes to URLs, Page titles, etc?
Technical SEO | MichaelWeisbaum