Server response time: restructure the site or create a new one? SEO opinions needed.
-
Hi everyone,
The internal structure of our existing site drives server response time up to 6 seconds, far above Google's 0.2-second guideline, and makes prospects leave before the site finishes loading.
Now we have two options (same price):
- restructure the existing site's modules, panels, etc.
- create a new site (recommended by our developers)
Both options will extend the same design and functionality.
Which option would the SEO community recommend?
-
Yes, correct - multiple CSS files & JavaScript will not affect server response time. I think Ryan was referring to page load speed.
-
Hello.
Before starting from scratch, try optimizing Drupal. A few simple settings can speed it up considerably:
- Go to the Administer » Site configuration » Performance page and enable "Aggregate and compress CSS files" and "Aggregate JavaScript files".
- On the same page, activate the cache: "Cache pages for anonymous users" and "Cache blocks".
See if that helps while you track down the source of the problem.
-
There is one huge thing being missed here by both of you. Google's PageSpeed Insights grades on server response time. Server response time has no bearing on whether a site loads 1 CSS file or 30, and no bearing on how many JS files are loaded or whether their parsing is deferred. If you follow every suggestion Pingdom gives you to a T, it will not change your server response time one bit.
The only way to improve your server response time is to reduce your site's processing time, not the loading time in the browser. To do that, you will have to explore server caching, MySQL optimization, and the like.
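To see the distinction concretely, here is a minimal, self-contained sketch (plain Python standard library, nothing Drupal-specific): a local test server deliberately "thinks" for a second before sending its first byte. The time to first byte approximates server response time (what processing time and server-side caching affect); everything after it is transfer and rendering, which CSS/JS aggregation affects.

```python
import http.client
import http.server
import threading
import time

class SlowHandler(http.server.BaseHTTPRequestHandler):
    """Simulates a server that spends ~1s generating a dynamic page."""
    def do_GET(self):
        time.sleep(1.0)  # "processing" time: templates, DB queries, etc.
        body = b"<html>" + b"x" * 50000 + b"</html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo output quiet
        pass

# Start the test server on a random free port.
server = http.server.HTTPServer(("127.0.0.1", 0), SlowHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

start = time.monotonic()
conn = http.client.HTTPConnection("127.0.0.1", port)
conn.request("GET", "/")
resp = conn.getresponse()          # returns once the headers arrive
ttfb = time.monotonic() - start    # ~ server response time
resp.read()                        # download the body
total = time.monotonic() - start   # ~ total fetch time
conn.close()
server.shutdown()

print(f"Server response time (TTFB): {ttfb:.2f}s")
print(f"Total fetch time:            {total:.2f}s")
```

Shrinking the body (fewer/aggregated CSS and JS files) only narrows the gap between the two numbers; the first number moves only if the `sleep` - the server-side processing - gets shorter.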
These might be worth reading as well:
https://support.google.com/analytics/answer/2383341?hl=en
http://www.blogaid.net/server-response-time-vs-page-load-time-for-google-crawl
-
We'll make it as fast as possible! Thanks, John. We just need to figure out whether we should restructure the existing site or build it from scratch.
-
Ryan
Yes. I do not worry about the speed variations - there are too many variables in each test (e.g., which server did the test use?).
My view on page speed: forget the "times" and "time ranges" reported by the various tools. If you have identified page speed as an issue, which you have, focus on what you know you can and should fix. Don't just fix the minimum - fix the maximum. I believe page speed is a key ranking factor.
-
John, thanks for the tool. The site has multiple CSS files for the same types of content, and too many different modules, panels, and blocks for such a simple site. By the way, Google's page speed test shows times ranging between 2.1 and 6.5 seconds. Have you ever seen that kind of variance?
-
Lesley is correct: it is important to understand why the issues exist before you move forward.
I am not sure if you are familiar with tools.pingdom.com - it tests your site for free. Run your site through it, then review the performance tab to see what your loading problems are. Also, 0.2 seconds is best in class; if you can get below 2 seconds, I would be happy with that. I'm not suggesting you don't aim for 0.2 - just that it is onerous and likely not time-efficient.
The positive is that, several times, I have seen dropping a site from 6 seconds to 2 seconds produce an uplift in rankings without doing anything else!
-
I am not familiar with Drupal. When you say you are restructuring, is that something internal to Drupal? Or does it mean you are changing the page structure of your site, for instance moving pages around, or removing some widgets and things like that from pages?
-
It's Drupal 7. We're not redesigning; we're restructuring. Yes, the server takes too much time to generate the pages - they're dynamic.
-
Server response time is tied to two factors: the DNS lookup, and the time it takes your server to generate a page and send it out. Generally both can be improved without having to redesign your site. What is your site currently developed in? Is it constantly changing?
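A quick way to look at the first factor in isolation - a minimal sketch using only the Python standard library, with "localhost" standing in for your real hostname:

```python
import socket
import time

host = "localhost"  # substitute your site's hostname

# Factor 1: DNS lookup -- resolving the name to an IP address.
t0 = time.monotonic()
addr = socket.getaddrinfo(host, 80)[0][4][0]
dns_time = time.monotonic() - t0

print(f"{host} resolved to {addr} in {dns_time * 1000:.1f} ms")

# Factor 2 (page generation) is the gap between sending the request and
# receiving the first byte of the response; curl reports both factors with:
#   curl -s -o /dev/null -w "%{time_namelookup} %{time_starttransfer}\n" <url>
```

If the DNS portion is already small, the remaining response time is almost entirely page-generation work on the server.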