Server response time: restructure the site or create a new one? SEO opinions needed.
-
Hi everyone,
The internal structure of our existing site increases server response time to around 6 seconds, far short of Google's 0.2-second standard, and it also makes prospects leave the site before it has loaded.
Now we have two options (same price):
- restructure the existing site's modules, panels, etc.
- create a new site (recommended by our developers)
Both options would keep the same design and functionality.
Which option would the SEO community recommend?
-
Yes, correct - multiple CSS files and JavaScript will not affect server response time. I think Ryan was referring to page load speed.
-
Hello.
Before starting from scratch, try to optimize Drupal. There are a few simple things you can do that speed Drupal up considerably:
- Go to Administer » Site configuration » Performance and enable "Aggregate and compress CSS files" and "Aggregate JavaScript files".
- On the same page, turn on the caching options: "Cache pages for anonymous users" and "Cache blocks".
See if that helps while you track down the source of the problem.
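If it's easier to script than to click through the admin UI, the same options can also be forced from settings.php. This is only a sketch using the standard Drupal 7 variable names - worth double-checking against your install:

```php
// settings.php overrides matching the Performance-page checkboxes above
// (standard Drupal 7 variable names; a sketch, not a tuned config).
$conf['preprocess_css'] = TRUE;          // Aggregate and compress CSS files
$conf['preprocess_js']  = TRUE;          // Aggregate JavaScript files
$conf['cache']          = TRUE;          // Cache pages for anonymous users
$conf['block_cache']    = TRUE;          // Cache blocks
$conf['page_cache_maximum_age'] = 3600;  // Let external caches keep pages for 1 hour
```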
-
There is one huge thing being missed here by both of you: Google PageSpeed Insights grades you on server response time. Server response time has no bearing on whether a site loads 1 CSS file or 30 CSS files. It has no bearing on how many JS files are loaded or whether their parsing is deferred. If you follow every suggestion Pingdom gives you to a T, it will not affect your server response time one bit.
The only way to affect your server response time is to reduce the processing time of your site, not the loading time in the browser. To reduce server response time you are going to have to explore server caching, MySQL optimization, and things like that.
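For example, one common form of server caching on a Drupal 7 site is swapping the default database cache for Memcache. A rough sketch of the settings.php wiring, assuming the contrib Memcache module and a memcached daemon are installed (the module path and server address are illustrative):

```php
// settings.php - point Drupal 7's cache system at memcached instead of MySQL.
// Assumes the contrib Memcache module is installed; adjust the path to your site.
$conf['cache_backends'][]       = 'sites/all/modules/memcache/memcache.inc';
$conf['cache_default_class']    = 'MemCacheDrupal';
$conf['cache_class_cache_form'] = 'DrupalDatabaseCache'; // form cache must stay in the DB
$conf['memcache_servers']       = array('127.0.0.1:11211' => 'default');
```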
These might help to read as well:
https://support.google.com/analytics/answer/2383341?hl=en
http://www.blogaid.net/server-response-time-vs-page-load-time-for-google-crawl
-
We'll make it as fast as possible! Thanks John. Just need to figure out whether we should restructure the existing site or rebuild it from scratch.
-
Ryan
Yes. I would not worry about the speed variations - there are too many variables in each test, e.g. which server location did the test run from?
My view on page speed is to forget the "times" and "time ranges" the various tools report. If you have identified page speed as an issue - which you have - focus on what you know you can and should fix. Don't just fix the minimum - on page speed, fix the maximum. I believe page speed is a key ranking factor.
-
John, thanks for the tool. The site has multiple CSS files for the same types of content, and too many different modules, panels and blocks for such a simple site. By the way, the Google PageSpeed test shows times ranging between 2.1 and 6.5 seconds. Have you ever seen that kind of variation?
-
Lesley is correct - it is important to understand why the issues exist before you move forward.
I am not sure if you are familiar with tools.pingdom.com, but you can test your site there for free. Then review the Performance tab and see what your loading problems are. Also, 0.2 of a second is best in class; if you can get below 2 seconds I would be happy with that. I'm not suggesting you don't aim for 0.2 - just that it is onerous and likely not time-efficient.
The positive is that I have several times seen dropping a site from 6 seconds to 2 seconds give an uplift in rankings without doing anything else!
-
I am not familiar with Drupal. When you say you are restructuring, is that something internal to Drupal? Or does that mean you are changing the page structure of your site, for instance moving pages around? Or are you removing some widgets and things like that from the pages?
-
It's Drupal 7. We're not redesigning, we're restructuring. Yes, the server takes too much time to generate the pages; they're dynamic.
-
Server response time is tied to two factors: the DNS lookup, and the time it takes your server to generate a page and spit it out. Generally both of these can be improved without having to redesign your site. What is your site currently developed in? Is it constantly changing?
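If you want to see how those two factors break down for a given page, a rough way is to pull the timing info cURL records - a PHP sketch only, with a placeholder URL:

```php
<?php
// Rough sketch: split response time into DNS lookup vs. time to first byte
// (the server's page-generation time). The URL below is a placeholder.
$ch = curl_init('https://www.example.com/');
curl_setopt($ch, CURLOPT_RETURNTRANSFER, TRUE);
curl_exec($ch);

$dns  = curl_getinfo($ch, CURLINFO_NAMELOOKUP_TIME);     // DNS lookup
$ttfb = curl_getinfo($ch, CURLINFO_STARTTRANSFER_TIME);  // first byte back from the server
curl_close($ch);

printf("DNS lookup: %.3fs, time to first byte: %.3fs\n", $dns, $ttfb);
```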