Long load time
-
My site takes double the time per KB compared to my competitors' sites.
It's hosted on shared hosting with GoDaddy.com.
Any ideas why this may be happening?
-
To be fair, your site isn't really overly slow.
www.appliance-repair-ny.com loads in an average of 3.9 seconds and is 194 KB.
www.all-appliance-repair-ny.com loads in an average of 5.6 seconds and is 327 KB.
www.newyorkappliancerepair.net loads in an average of 1.5 seconds and is 115 KB.
And I think that's measured from Sweden. Your server is in Arizona, so it will be quicker from New York.
You could gzip your CSS, but it isn't really going to give you a big improvement.
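For reference, on a Linux shared host this usually comes down to a few lines of mod_deflate config. A minimal sketch, assuming Apache with .htaccess overrides and mod_deflate available (common on shared hosting, but worth verifying with GoDaddy):

```apache
# .htaccess sketch: gzip text assets via mod_deflate.
# Assumes Apache with mod_deflate enabled - not guaranteed on every shared plan.
<IfModule mod_deflate.c>
  AddOutputFilterByType DEFLATE text/html text/css text/javascript application/javascript
</IfModule>
```

You can confirm it's working by checking for a Content-Encoding: gzip response header on your CSS files.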
Yes, shared hosting will always be slower than a dedicated server, but for the cost I don't think it's worth moving to a dedicated server and CDN delivery.
If you really wanted to track it, you could add Webmaster Tools and (do you not have Analytics on the page?) put _gaq.push(['_trackPageLoadTime']); into your Google Analytics snippet. This would let you see what times Google thinks your page is loading in.
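That call goes into the _gaq command queue of the classic asynchronous ga.js snippet. A minimal sketch, where 'UA-XXXXX-Y' is a placeholder for your own property ID:

```javascript
// Async Google Analytics (ga.js era) command queue with page-load timing.
// 'UA-XXXXX-Y' is a placeholder property ID - substitute your own.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);
_gaq.push(['_trackPageview']);
_gaq.push(['_trackPageLoadTime']); // samples load times into the Site Speed report
// The standard ga.js loader <script> goes below this queue setup.
```

(Google later folded _trackPageLoadTime into default Site Speed sampling, so newer snippets may not need the explicit call.)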
Speed is unlikely to be a defining ranking factor for you, and you should concentrate your efforts more on acquiring links, reviews, and local optimisation.
-
-
What are your site and your competitors' sites? It could be a lot of things.
Related Questions
-
Help Center/Knowledgebase effects on SEO: Is it worth my time fixing technical issues on no-indexed subdomain pages?
We're a SaaS company and have a pretty extensive help center resource on a subdomain (help.domain.com). This has been set up and managed over a few years by someone with no knowledge of SEO, meaning technical things like 404 links, bad redirects, and http/https mixed content have not been paid attention to. Every page on this subdomain is set to NOT be indexed in search engines, but we do sometimes link to help pages from indexable posts on the main domain. After spending time fixing problems on our main website, our site audits now flag almost solely errors and issues on these non-indexable help center pages every week. So my question is: is it worth my time fixing technical issues on a help center subdomain whose pages are all non-indexable in search engines? I don't manage this section of the site, so getting fixes done is a laborious process that requires going through someone else - something I'd rather only do if necessary.
Technical SEO | | mglover19880 -
URLs too long, but run an eCommerce site
Hi, When I started out I was pretty green to SEO, and didn't consider the usability/SEO impact of URL structure. Flash forward, I'm 5 years deep into using the following: mysite.com/downloads/category/premium-downloads/sub-category/ ("category" is quite literally one rung in the URL - thanks, WordPress - however "sub-category" is a placeholder) I run a digital downloads store, and I now have 100s of internal links beholden to this hideous category linking structure. Not to mention external links at Google Ads, etc. I would LOVE to change this, but if I were to do so, what should I consider? For instance, is there a checklist for making a change like this? I was thinking of changing it to something like the following: mysite.com/shop/c/premium/sub-category/ And also, how much damage, if any, would this be doing to my SEO? Thanks in advance,
Technical SEO | | LouCommaTheCreator
Lou1 -
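A URL migration like the one described above is normally paired with 301 redirects from every old path to its new counterpart. A hedged .htaccess sketch, assuming Apache and using the asker's illustrative old/new paths (not real URLs):

```apache
# .htaccess sketch: 301 the old category URLs to the new, shorter scheme.
# Paths are illustrative, taken from the example structure in the question.
<IfModule mod_rewrite.c>
  RewriteEngine On
  RewriteRule ^downloads/category/premium-downloads/(.*)$ /shop/c/premium/$1 [R=301,L]
</IfModule>
```

Internal links and the XML sitemap should also be updated to the new URLs, so crawlers reach them directly rather than through the redirects.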
An immediate and long-term plan for expired Events?
Hello all, I've spent the past day scouring guides and walkthroughs and advice and Q&As regarding this (including on here), and while I'm pretty confident in my approach to this query, I wanted to crowd source some advice in case I might be way off base. I'll start by saying that Technical SEO is arguably my weakest area, so please bear with me. Anyhoozles, onto the question (and advance apologies for being vague): PROBLEM I'm working on a website that, in part, works with providers of a service to open their own programs/centers. Most programs tend to run their own events, which leads to an influx of Event pages, almost all of which are indexed. At my last count, there were approximately 800 indexed Event pages. The problem? Almost all of these have expired, leading to a little bit of index bloat. THINGS TO CONSIDER A spot check revealed that traffic for each Event occurs for about a two-to-four week period then disappears completely once the Event expires. About half of these indexed Event pages redirect to a new page. So the indexed URL will be /events/name-of-event but will redirect to /state/city/events/name-of-event. QUESTIONS I'M ASKING How do we address all these old events that provide no real value to the user? What should a future process look like to prevent this from happening? MY SOLUTION Step 1: Add a noindex to each of the currently-expired Event pages. Since some of these pages have link equity (one event had 8 unique links pointing to it), I don't want to just 404 all of them, and redirecting them doesn't seem like a good idea since one of the goals is to reduce the number of indexed pages that provide no value to users. Step 2: Remove all of the expired Event pages from the Sitemap and resubmit. This is an ongoing process due to a variety of factors, so we'd wrap this up into a complete sitemap overhaul for the client. We would also be removing the Events from the website so there are not internal links pointing to them. 
Step 3: Write a rule (well, have their developers write a rule) that automatically adds noindex to each Event page once it's expired. Step 4: Wait for Google to re-crawl the site and hopefully remove the expired Events from its index. Thoughts? I feel like this is the simplest way to get things done quickly while preventing future expired events from being indexed. All of this is part of a bigger project involving the overhaul of the way Events are linked to on the website (since we wouldn't be 404ing them, I would simply suggest that they be removed entirely from all navigation), but ultimately, automating the process once we get this concern cleaned up is the direction I want to go. Thanks. Eager to hear all your thoughts.
Technical SEO | | Alces0 -
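Step 3 of the plan above (automatically noindexing events once they expire) could be sketched roughly like this. The event object, its endDate field, and the function name are all hypothetical, standing in for whatever the site's developers actually expose:

```javascript
// Hypothetical sketch: pick a robots meta tag based on whether an event
// has ended. Assumes a server-rendered template with an event object
// carrying an ISO end date; names are illustrative, not the site's API.
function robotsMetaFor(event, now = new Date()) {
  const expired = new Date(event.endDate) < now;
  // "noindex, follow" keeps the page out of the index while still letting
  // crawlers follow (and pass equity through) its links.
  return expired
    ? '<meta name="robots" content="noindex, follow">'
    : '<meta name="robots" content="index, follow">';
}
```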
How fast should a page load to get a green light in Google's PageSpeed?
So, trying to get a big e-commerce site to work on their page loading issues. Their question left me without an answer: how fast should a site be so that it gets a green light in Google's PageSpeed test? Is there a number in seconds? Do we know that?
Technical SEO | | ziiiva1230 -
Pages with a short lifetime... example Flash Sales Ecommerce Sites?
Hello everyone, I am managing an ecommerce website and I am not sure what policy I need to make for a lot of the product pages. The product pages, for example a Givenchy Bag, go live on a specific date and come down a few days later, like Groupon. Please shed some light in this dark tunnel.
Technical SEO | | MTalhaImtiaz
Thanks and regards,0 -
Long Domain Name - Subpage URL Question
I have a long domain name, so domainname/services/page title can get pretty lengthy. I have a services page as a summary page, since there are a few of them, with more detail on the actual pages. In this situation, would it be better to do domainname.com/services/service-name, which can exceed the suggested 70 characters, or would it be a better idea to do domain.com/service-name and just have them under the services menu? Is there any advantage/disadvantage to going out 2-3 tiers, or having the subpages of those services off the domain root instead of as children of the services page? Please let me know if any clarification is needed. Thanks!
Technical SEO | | tgr0ss0 -
Updating RSS feed times without changing content
My question is as the title reads: if I have an RSS feed in an XML file and from time to time I update the pubDate and time, will this have a positive effect on my website in terms of RSS aggregators coming to my site, thinking it was recently updated, and creating links to these pages? Or will they be able to determine that there is nothing new by comparing it to the old page they may have stored, thus doing nothing or maybe even hurting the website?
Technical SEO | | mickey112 -
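For context, pubDate is a per-item RFC 822 timestamp in RSS 2.0, and the guid element is what most aggregators use to decide whether an item is new. Bumping pubDate while guid and content stay the same is exactly the mismatch they can detect. A minimal illustrative item (domain and values are placeholders):

```xml
<!-- Minimal RSS 2.0 item sketch; URL and dates are illustrative. -->
<item>
  <title>Example post</title>
  <link>https://example.com/post</link>
  <guid>https://example.com/post</guid>
  <pubDate>Tue, 05 Jun 2012 09:00:00 GMT</pubDate>
</item>
```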
Google Places Citations - How Long to Aggregate?
Hi, On Google Places we have clients that have bad data (incorrect name, address, phone number) on aggregator sites (Citysearch, MerchantCircle, Yelp), which prevents those sites from being pulled into the Google Places accounts. We've been manually correcting listings for a while, and many times the corrections still haven't pulled into the Google Places listings for months on end despite the data matching. What are your experiences with correcting aggregated Google Places data? How long has it taken for this data to pull into your Places accounts?
Technical SEO | | qlkasdjfw0