[Nottingham] Debian devotion [was: OE Reply Fixer]

Simon Huggins nottingham at mailman.lug.org.uk
Wed Mar 5 10:51:01 2003


On Wed, Mar 05, 2003 at 10:11:31AM +0000, Robert Davies wrote:
> On Wednesday 05 March 2003 08:17, you wrote:
> > On Wed, Mar 05, 2003 at 07:54:51AM +0000, Robert Davies wrote:
> > > For dialup, surely downloading source patches and applying them, then
> > > re-compiling, linking and installing automatically, will be the
> > > fastest way.
> > With 1meg CM or whatever you've got then grabbing source patches will
> > indeed be quick; however, you then have to patch and compile.  I know
> > people who run gentoo and all their computers seem to *do* is be
> > compiling something.
> > Depends what you want to do really.
> Well, my Debian dialup, all it seemed to do was download something; if
> you do not have a flat-rate package it is a royal pain.

That's unusual.  My stable systems rarely see updates and when they do
they are only security updates or the point releases.  (The point
releases being mostly security or evil bug fixes).

> I'm going to try Gentoo out, once it's installed then have a working
> KDE and X environment, then I'd try to keep that stable, as indeed
> source changes there will result in a lot of re-compilation.

The other problem I would have is dedicating enough disk space to all
this compilation.  For instance, OpenOffice is supposed to take 4GB[0].
I imagine X is hardly small when it's compiling either.

> All depends on what trade-offs you want: if you have a lot of CPU and
> diskspace, but poor bandwidth, then using source makes sense.  Even the
> patch RPMs which are available now for SuSE, on something like the
> kernel, result in 12MB downloads compared to 15 or so.  The source
> patches are rarely bigger than 100KB, and often trivially small.

Well, perhaps ;)
There's a reasonable amount of context in diffs, though when compressed
this does get fairly small.  Even bzipped kernel patches can be several
megs, but yes, still less than the full source, though potentially more
than a binary package.
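(If anyone wants a feel for why context-heavy diffs still compress down
to nearly nothing, the mechanics are easy to play with locally.  This is
a made-up toy example — filenames and contents are mine, nothing from an
actual kernel tree.)

```shell
# Toy demonstration of source patching (all filenames hypothetical).
# Make an "old" and a "new" version of a source file.
mkdir -p old new
echo 'printf("hello\n");' > old/main.c
echo 'printf("hello, world\n");' > new/main.c

# A unified diff carries context lines, but compresses well.
# (diff exits 1 when files differ, hence the || true.)
diff -u old/main.c new/main.c > fix.patch || true
gzip -k fix.patch

# Applying the patch brings the old tree up to date.
patch old/main.c < fix.patch

echo "patch: $(wc -c < fix.patch) bytes, gzipped: $(wc -c < fix.patch.gz) bytes"
```

The same `patch -p1` dance is all that's happening at kernel scale, just
with a much bigger diff.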

> > Is the i386 vs optimisations really that big a step?  Are there any
> > decent benchmarks on it in normal use for instance?
> It's become more important with P4 and Athlon; gcc-3.2 is getting
> cleverer too.  The P4 has some annoying braindamage, which Intel
> expects software developers to work around, according to their
> optimisation guides.

Right.

> In the past my experience was it wrung out 5-10% for architecture; with
> using -O3 sensibly too, it gave up to 30% increased performance on CPU
> intensive applications, so gzip & bzip, crypto libraries, ssh, glibc
> and gcc itself

Those choices seem very sensible.

> would qualify for special attention.  There's some news out about
> someone re-compiling gcc itself with optimisation (I always used to do
> it on Sun anyway) and finding a 20% increase in compile speed.

But once you get into gentoo it's not an "Oh well, I'll recompile bzip
but won't bother with xterm because once it's loaded, it's fine"
attitude.  You have to do it all.  I guess that's why I like binary
packages.  I should probably try out apt-build and see if I can see any
difference with anything.
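(For anyone who wants to poke at the -O3/architecture question without
committing to a full source distro, here's the sort of toy comparison I
mean.  Everything below is illustrative: the benchmark is trivial, and
-march=native is the newer-gcc way of saying -march=athlon or whatever
matches your box.)

```shell
# Toy sketch: generic build vs. CPU-tuned build of the same program.
cat > bench.c <<'EOF'
#include <stdio.h>
int main(void) {
    long long sum = 0;
    for (long long i = 0; i < 1000000; i++)
        sum += i * i;          /* something mildly CPU-bound */
    printf("%lld\n", sum);
    return 0;
}
EOF

gcc -O1 -o bench-generic bench.c              # baseline optimisation
gcc -O3 -march=native -o bench-tuned bench.c  # aggressive + CPU-specific

# Both must agree on the answer; time them to see any speed difference.
./bench-generic
./bench-tuned
```

Whether the tuned binary is measurably faster depends entirely on the
workload, which is rather the point about benchmarking this stuff in
normal use.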


[0] From the site at:
    http://www.linux-debian.de/openoffice/install_build_howto.html
    Though I note that they've somehow managed to shave a gig off this
    (see front page) so this may not be entirely valid

-- 
Simon  [ huggie@earth.li ] *\     "Clear?" - Holly. "No." - Lister.  \**
****** ]-+-+-+-+-+-+-+-+-[ **\                     "Tough." - Holly.  \*
****** [  Htag.pl 0.0.22 ] ***\                                        \