[Nottingham] Zero-PHP wiki!?

Graeme Fowler graeme at graemef.net
Tue May 18 13:01:40 UTC 2010


On Tue, 2010-05-18 at 13:35 +0100, Martin wrote:
> Why cannot a web page be displayed ALWAYS more quickly than 0.5s?

Database/interpreter considerations aside, connection latency is a big
killer.

I recently spent an (ash-enforced) extra 3 weeks in Australia (on top of
a 3-week holiday!); the speed-of-light delay over those distances pushed
the page load time for various work-related webapps to well over 5s for
most things.

From a page design POV, having as much of the meta-content - JavaScript,
CSS definitions and so on - included in the page itself means that a
minimal number of TCP sessions is required to fetch all the content.

If you look at a number of websites - The Register, for example - and
analyze them with something like Firebug+PageSpeed, you should get a good
handle on how many requests need to be processed to fetch a full page,
and how that number can be reduced.
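
If you'd rather script a rough version of that count, something like the
Python below works on a locally saved copy of the page ("page.html" is
just a placeholder). It only sees references in the HTML itself, so it's
a lower bound - resources pulled in by CSS or JavaScript won't show up:

# Rough request-count estimate for a saved HTML page. Not a substitute
# for Firebug/PageSpeed; it ignores anything fetched at runtime.
from html.parser import HTMLParser

class ResourceCounter(HTMLParser):
    def __init__(self):
        super().__init__()
        self.urls = set()

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("script", "img", "iframe") and attrs.get("src"):
            self.urls.add(attrs["src"])
        elif tag == "link" and attrs.get("href"):
            self.urls.add(attrs["href"])

counter = ResourceCounter()
with open("page.html") as f:      # placeholder: a saved copy of the page
    counter.feed(f.read())
print(len(counter.urls), "external references found")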

If it takes even 0.1s to do the SYN/SYN+ACK/ACK ... data ... FIN/FIN+ACK
sequence for a single element, and you have 25 individual elements on a
page - images, .js files, CSS, AJAX calls etc. - then the maths for a
single-threaded connection is easy: 25 x 0.1s = 2.5s before the server
has done any real work.
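
As a back-of-envelope sketch (the 0.1s figure is purely illustrative):

# 25 elements fetched one after another, each on a fresh TCP connection.
per_element_overhead = 0.1   # seconds: handshake + request + teardown
elements = 25                # images, .js files, CSS, AJAX calls, etc.
print("serial fetch overhead: %.1fs" % (per_element_overhead * elements))
# -> serial fetch overhead: 2.5s, all of it pure connection overhead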

Browsers (and servers) do try to help as much as they can by using
parallel connections, keepalives and so on, but some pages are now so
complex that there can be more than 100 individual calls for each one.
That's a lot of overhead!
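
Roughly, with illustrative numbers (six parallel connections is a
typical per-host browser limit, and only the first request on each
kept-alive connection pays the handshake):

# Same back-of-envelope, but with parallel, reused connections.
overhead_per_request = 0.05   # seconds per request on a warm connection
handshake = 0.1               # seconds to open each new connection
requests = 100
parallel_connections = 6
cost = handshake + (requests / parallel_connections) * overhead_per_request
print("rough parallel fetch overhead: %.1fs" % cost)
# -> well under 1s for the same 100 requests, but still not free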

I guess that's a long way of saying: before you start looking at the
extra bits, get the basics of the design right first. Then, and only
then, can you get extra performance out of your webapps.

Graeme
