[Sussex] Re: Gentoo problems.

Geoff Teale gteale at cmedltd.com
Tue Feb 24 16:26:23 UTC 2004


On Tue, 2004-02-24 at 15:57 +0000, Steve Dobson wrote:
> You'd think that, wouldn't you?  But there is some evidence that the
> reverse is true - see "The Cathedral and the Bazaar".  Personally I think
> that the biggest factor contributing to this is that if you have a big
> enough group of testers then all bugs become shallow.

Because of the _very_ tight regulations around our industry we currently
have as many testers as we have developers.  As a result we're now
two weeks into a 5 week bug fixing spree, killing off many, many bugs that
the other companies I have worked for would never have caught for lack of
resources.  Was this because those other companies saw no value in
testing?  No.  Simply put, it was because there was no money hanging on
the end of QA.  Payment was and is based on fulfilling functional
requirements in a superficially stable way, not on exercising every
logic path available.

My point here is that it is not _just_ the software houses that are
negligent in this regard, it is the customers as well.  Most customers
don't demand enough from their suppliers up front, because they don't
see any value in understanding what they're dealing with (technical
issues are, after all, what you're paying the software house to deal
with!).  I don't know a company that wouldn't expect "correctly
calculated figures" to be a prerequisite for any external accounting
organisation they did business with, but basic notions of QA in software
seem to be less important to customers than "more features, more
quickly".  If customers demanded quality rather than "flash" then the
industry would soon change to meet the demand.


> Few, if any commercial projects have the man power for testing and 
> developing that the Linux Kernel, Debian and Gentoo have available to 
> them.

True.  One of the main gains that Red Hat get from Fedora (as opposed to
the old "Red Hat Linux" line) is a large community of users itching to
do their pre-release testing and bug fixing.  In return Red Hat continue
(as they have done for years) to employ developers working on a lot of
the software we all take for granted (Alan Cox, Havoc Pennington and the
like). 

Most of the big companies who GPL software do so for precisely these
reasons - they can make a bundle selling support and services and save a
load of testing money by giving the software to communities who do their
own support.  Some people see this as cynical, though I do not.  I know a
lot of Debian developers have been pissed off by Red Hat's and Gentoo's
communities' work being used for commercial gain (although this always
seems to go hand in hand with petty "Debian rules, distro X sux" statements)
but the simple truth is that F/OSS tools have been advanced a lot by these
companies and we all get to benefit from it - equally, we all have the
same opportunity to build businesses around that software.

> Another problem with commercial timescales and budgets is that they are
> based on "fiddled" engineering estimates.  The original estimates are
> poor to begin with, and the better at estimating you get, the more likely
> you are to extend your estimates.  Management, when it doesn't like what
> it sees, moves the estimates to the left.  And what engineer is going
> to state that "it will take six weeks, not four"?  Is it any wonder that
> projects overrun?

Yes.  Even here estimation causes a little tension.  Working with
eXtreme Programming techniques improves our lives a lot (XP encourages a
more Bazaar-like approach than most formal methodologies) because we
only ever deal with very small units of work - we also find that we are
able to adjust to our customers' ever-changing needs a lot better.

One truth - if you are an open source developer (or your company employs
them) you find that the bugs that need fixing (or the features that need
adding) in your 3rd-party software get dealt with much more quickly.  We
fixed a bug in PyGTK just this week - we benefit from it immediately, and
other users will get it shortly.

> Here is an example.  One of my old companies bid for a new bit of work
> in a new sector.  The Board wanted to move the company in that direction
> so they made it a "must win" bid.  To help, they reduced both the timescale
> and the cost.
> 
> The project, in order to "meet" the timescale needed to run far more 
> overtime than in the forecast.  As a result the 20% profit margin was
> eaten up.  The Board had the senior software engineer investigate.
> 
> The report was not good reading (for the Board).  It basically said that
> Engineering executed the project to within a reasonable margin of error.
> If the Board decided to reduce the bid to win the business, then the Board
> could not blame Engineering for executing the project as Engineering
> originally said it would.

That's actually more rational than most companies, who will simply blame
engineering for not doing the impossible!

Here we're lucky enough to be told about the real situations we are
facing.  If we have a do-or-die piece of functionality to produce then
we pull together as a team and do our best, because we understand what is
required and why it matters.

-- 
Geoff Teale
Cmed Technology <gteale at cmedltd.com>
Free Software Foundation <tealeg at members.fsf.org>




