Server response time: restructure the site or create a new one? SEO opinions needed.
-
Hi everyone,
The internal structure of our existing site increases server response time (6 sec), which is way above Google's 0.2 sec standard, and also makes prospects leave the site before it's loaded.
Now we have two options (same price):
- restructure the site's modules, panels, etc.
- create a new site (recommended by our developers)
Both options would keep the same design and functionality.
Which option would the SEO community recommend?
-
Yes, correct - multiple CSS files & JavaScript will not affect server response time. I think Ryan was referring to page load speed.
-
Hello.
Before starting from scratch, try to optimize Drupal. There are a few simple settings that speed up Drupal considerably:
- Go to Administer » Site configuration » Performance and enable "Aggregate and compress CSS files" and "Aggregate JavaScript files".
- On the same page, activate the caches: "Cache pages for anonymous users" and "Cache blocks".
See if that helps while you track down the source of the problem.
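For reference, the same settings can be flipped from the command line with Drush. This is a sketch assuming Drush is installed and run from the site root of a Drupal 7 install; the variable names match the checkboxes on the Performance page:

```shell
# Assumes Drush on a Drupal 7 site; variable names correspond to
# the checkboxes under Administer » Site configuration » Performance.
drush vset preprocess_css 1   # "Aggregate and compress CSS files"
drush vset preprocess_js 1    # "Aggregate JavaScript files"
drush vset cache 1            # "Cache pages for anonymous users"
drush vset block_cache 1      # "Cache blocks"
drush cc all                  # clear caches so the changes take effect
```

Handy if you manage several sites and want the settings scripted rather than clicked.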
-
There is one huge thing being missed here by both of you. Google's Insights grades you on server response time. Server response time has no bearing on whether a site loads 1 CSS file or 30 CSS files. It has no bearing on how many JS files are loaded, or whether their parsing is deferred. If you follow every suggestion Pingdom gives you to a T, it will not affect your server response time one bit.
The only way to improve your server response time is to reduce your site's processing time, not its loading time in the browser. To do that you are going to have to explore server-side caching, MySQL optimization, and things like that.
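On the MySQL side, a common first step is the slow query log, which shows which queries are eating the page-generation time. A sketch, assuming you have admin access to the database; the 1-second threshold is illustrative:

```shell
# Assumes MySQL admin access; logs any query taking longer than 1 second.
# These settings last until the server restarts; put them in my.cnf to persist.
mysql -u root -p <<'SQL'
SET GLOBAL slow_query_log = 'ON';
SET GLOBAL long_query_time = 1;
SHOW VARIABLES LIKE 'slow_query_log_file';
SQL
```

Once a few slow queries surface, indexing or caching those is usually where the response-time wins come from.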
These might be helpful reads as well:
https://support.google.com/analytics/answer/2383341?hl=en
http://www.blogaid.net/server-response-time-vs-page-load-time-for-google-crawl
-
We'll make it as fast as possible! Thanks John. We just need to figure out whether we should restructure the existing site or build it from scratch.
-
Ryan
Yes. I would not worry about the speed variations - there are too many variables in each test, e.g. which server did the test use?
My view on page speed is to forget "times" and "time ranges" on the various tools. If you have identified page speed as an issue - which you have - focus on what you know you can and should fix. Don't just fix the minimum; on page speed, fix the maximum. I believe page speed is a key ranking factor.
-
John, thanks for the tool. The site has multiple CSS files for the same types of content, and too many different modules, panels and blocks for a simple site. Btw, Google's page speed test shows times ranging between 2.1 sec and 6.5 sec. Have you ever seen variance like that?
-
Lesley is correct: it is important to understand the cause of the issues before you move forward.
I am not sure if you are familiar with tools.pingdom.com - but run a free test of your site there, then review the Performance tab and see what your loading problems are. Also, 0.2 of a second is best in class - if you can get below 2 seconds I would be happy with that. I'm not suggesting you don't aim for 0.2, just that it is onerous and likely not time-efficient.
The positive is that I have seen, several times, that dropping a site from 6 seconds to 2 seconds got me an uplift in rankings without doing anything else!
-
I am not familiar with Drupal, when you say you are restructuring is that something internal in Drupal? Or does that mean you are changing the page structure of your site, like for instance moving pages around? Or are you removing some widgets and things like that from pages?
-
It's Drupal 7. We're not redesigning, we're restructuring. Yes, the server takes too much time to generate the pages; they're dynamic.
-
Server response time is tied to two factors: the first is the DNS lookup, the second is the time it takes your server to generate a page and spit it out. Generally both of those can be improved without having to redesign your site. What is your site currently developed in? Is it constantly changing?
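You can see those two factors separately with curl's timing variables: `time_namelookup` is the DNS lookup and `time_starttransfer` is when the first byte of the generated page arrives (the server response time Google measures). A sketch; the URL is a placeholder, so point it at your own site:

```shell
# Breaks the response down: DNS lookup vs. time to first byte vs. full transfer.
# URL is a placeholder; set URL=https://yoursite.example before running.
URL="${URL:-https://www.example.com/}"
curl -s -o /dev/null \
  -w 'dns: %{time_namelookup}s  ttfb: %{time_starttransfer}s  total: %{time_total}s\n' \
  "$URL" || echo "request failed (no network?)"
```

If `ttfb` dominates `total`, the time is going into page generation on the server, not into shipping CSS/JS to the browser.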