Something I had neglected for some time, but have increasingly started to spend time on, is analysing the raw traffic of a website. Locally, transfer times are practically zero, so a website can appear user friendly and fast.
But is it really so for the end user?
I have to consider that the web applications I am building are used in South Africa while the server is in the Netherlands. Latency is high and throughput is comparable to a modem. A large page might be 200 kB; at 5 kB/s that is a 40-second transfer.
What I have found really useful lately is a tool like Fiddler. You can learn a lot by studying what is actually sent to and from a web application. This way you also discover what is redundant. HTML is mostly redundant!
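One quick way to convince yourself of how redundant typical HTML is: compress it. The snippet below (a made-up table fragment, purely for illustration) shows that repetitive markup squeezes down to a small fraction of its original size, which means most of those bytes carry structure, not information.

```python
import zlib

# A typical chunk of table-based HTML: the markup repeats,
# only the data varies from row to row.
html = "".join(
    "<tr><td class='cell'>Row %d</td><td class='cell'>Value %d</td></tr>" % (i, i)
    for i in range(100)
)

compressed = zlib.compress(html.encode("utf-8"))
ratio = len(compressed) / len(html)
print("original: %d bytes, compressed: %d bytes" % (len(html), len(compressed)))
```

On a slow link the difference between sending the markup and sending only the data is exactly this kind of ratio.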
How do you avoid large pages? Use XML and an XSL transformation: send only a minimum of information over the wire and transform it into an HTML user interface on the client.
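A minimal sketch of the idea (the element names and the `products.xsl` filename are made up for illustration): the server sends only the data, and a stylesheet the browser can cache does the rendering.

```xml
<!-- What goes over the wire: just the data, no markup. -->
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="products.xsl"?>
<products>
  <product name="Widget" price="9.95"/>
  <product name="Gadget" price="14.50"/>
</products>

<!-- products.xsl, fetched once and cached:
     turns the data into an HTML table on the client. -->
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/products">
    <table>
      <xsl:for-each select="product">
        <tr>
          <td><xsl:value-of select="@name"/></td>
          <td><xsl:value-of select="@price"/></td>
        </tr>
      </xsl:for-each>
    </table>
  </xsl:template>
</xsl:stylesheet>
```

Every extra row of data then costs one small `product` element instead of a full block of table markup.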