Most browsers support gzip compression. If you serve large HTML pages (40K or more), compressing them can make a big difference: a 50K page might shrink to only 5-6K once compressed.
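To get a rough idea of the savings, you can compress a sample page by hand with the same gzip/cStringIO modules used below. This is just an illustrative sketch; the sample HTML and the exact figures it prints are made up:

    import gzip, cStringIO

    # Roughly 45K of repetitive HTML, purely for illustration
    html = '<p>Hello, world!</p>\n' * 2200

    # Compress it in memory, exactly like the handler below does
    zbuf = cStringIO.StringIO()
    zfile = gzip.GzipFile(mode='wb', fileobj=zbuf, compresslevel=9)
    zfile.write(html)
    zfile.close()

    print '%d bytes -> %d bytes' % (len(html), len(zbuf.getvalue()))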
Doing this with CherryPy is easy. It only takes a few lines of code in the initNonStaticResponse special function:
    import gzip, cStringIO

    def initNonStaticResponse():
        # Only compress the page if the client said it accepts gzip
        if request.headerMap.get('accept-encoding', '').find('gzip') != -1:
            # Compress the page body in memory
            zbuf = cStringIO.StringIO()
            zfile = gzip.GzipFile(mode='wb', fileobj=zbuf, compresslevel=9)
            zfile.write(response.body)
            zfile.close()
            # Replace the body with the compressed data and tell the client
            response.body = zbuf.getvalue()
            response.headerMap['content-encoding'] = 'gzip'
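To check that compression is actually being applied, you can request a page with an Accept-Encoding: gzip header and decompress the result by hand. The URL below is only a placeholder for wherever your CherryPy server happens to be listening:

    import gzip, cStringIO, urllib2

    # Placeholder URL for a locally running CherryPy server
    req = urllib2.Request('http://localhost:8000/',
                          headers={'Accept-Encoding': 'gzip'})
    data = urllib2.urlopen(req).read()

    # The body comes back gzipped, so decompress it to recover the HTML
    html = gzip.GzipFile(fileobj=cStringIO.StringIO(data)).read()
    print len(data), 'compressed bytes,', len(html), 'bytes of HTML'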