[Gllug] Trends in programming (was: long thread about anything)

Walter Stanish walter.stanish at saffrondigital.com
Wed Nov 10 18:08:41 UTC 2010


>> Well call me impatient but I'd rather someone summarised those nuggets
>> for me in a short blog post,
>
> Here's a few:
> http://camltastic.blogspot.com/2008/09/what-can-ocaml-do-that-you-cant-do-in.html

Followed a link there and learned that OCaml is derived from ML, which
was itself invented by Robin Milner, a Turing Award recipient and UK
academic.

Sadly he passed away in March this year, which is a real shame, since we
might otherwise have met: later in the year I presented at a conference
there, 'Code Generation 2010', which drew some of the few people in the
world who can definitively be said to be commercially tackling the same
ideas he seemed to be working on mathematically.

A really interesting group of people: there were representatives from the
Object Management Group (stewards of UML), vendors of commercial
model-driven software development "modelling software", academics and
industry people.  There seemed to be a sense that, as 'believers' in code
generation, everyone was on safe and common ground, ground that is normally
hard to find, since "redefine your entire software development process"
isn't exactly the easiest sell in most organisations.  An interesting
conference, one for which I had absolutely no particular expectations, and
which pleasantly surprised me.

Anyway, on Mr. Milner's archived site at http://www.cl.cam.ac.uk/archive/rm135/
he states: "to understand Ubiquitous Computing we must make more systematic
use of models, which are increasingly common in our discipline. This strand is
not mathematical, but proposes how models can form a hierarchy, allowing
them to be combined and higher models to explain lower ones."

This is really the essence of programming.  However, because the
classical 'stored program and CPU' computing model of the 'von Neumann
machine' (really of Turing's conception) is largely broken for today's focus
on parallel systems, Mr. Milner was working on a solution: "a rigorous
mathematical model, based on Bigraphs. It is proposed as a platform on
which to build more abstract models, especially for Ubiquitous Computing.
It can be considered an abstract machine, called the Ubiquitous Abstract
Machine. In contrast with the von Neumann machine, it models the
concurrent and interactive behaviour of populations of mobile
communicating agents."
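Milner's actual mathematics is well beyond a mailing-list post, but the
flavour of "populations of mobile communicating agents" can be sketched in
a few lines.  The following toy reduction semantics is entirely my own
illustration, in the spirit of his earlier CCS/pi-calculus work rather
than his bigraph calculus: two agents running in parallel synchronise when
one sends and the other receives on the same named channel.

```python
from dataclasses import dataclass

@dataclass
class Nil:            # the inert agent: nothing left to do
    pass

@dataclass
class Send:           # send on a channel, then continue as `cont`
    chan: str
    cont: object

@dataclass
class Recv:           # receive on a channel, then continue as `cont`
    chan: str
    cont: object

@dataclass
class Par:            # two agents running side by side
    left: object
    right: object

def step(p):
    """One reduction step: a sender and a receiver sharing a channel
    synchronise and both continue; otherwise try to step a sub-agent.
    Returns the reduced process, or None if no step is possible."""
    if isinstance(p, Par):
        l, r = p.left, p.right
        if isinstance(l, Send) and isinstance(r, Recv) and l.chan == r.chan:
            return Par(l.cont, r.cont)
        if isinstance(l, Recv) and isinstance(r, Send) and l.chan == r.chan:
            return Par(l.cont, r.cont)
        sub = step(l)
        if sub is not None:
            return Par(sub, r)
        sub = step(r)
        if sub is not None:
            return Par(l, sub)
    return None

# Two agents: one sends "ping" then stops; the other waits for it.
example = Par(Send("ping", Nil()), Recv("ping", Nil()))
print(step(example))  # one synchronisation step; both agents become Nil
```

A real bigraphical model also tracks where agents are located, not just
who they can talk to; this sketch keeps only the communication half.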

So - on the questions...

Is higher level modelling going to become a trend in computing?  Has UML
failed to deliver?  Are its failings really its own, or are they inherent in
modelling itself?  Does all modelling break at a certain degree of complexity?
Is visual logic expressive enough?  Why are models never complete?
Isn't a model just a program that doesn't have to answer to execution?
Are DSLs (domain-specific languages) the answer?  Aren't most general
libraries simply DSLs with fewer language features and more pragmatism?
What do the language features add?  And *is all this really helping us to
solve problems any faster*?  What does the history of the development
of mathematical notation tell us about our ability to use new models to
reason with entirely new problem spaces?  Are all languages just
mental crutches doomed forever to incompletely approximate perfect
systems of thought? Why do all the great mathematicians go mad?

<insert opinions here>

Nomination for the best random modelling tool that people possibly
haven't heard of - mscgen:
  http://www.mcternan.me.uk/mscgen/
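On the "library as DSL" question above, the line really is thin: a
handful of ordinary functions can already behave like an embedded DSL.
As a toy illustration (the function name and structure are my own
invention, assuming mscgen's `a->b [label="..."]` arc syntax), here is a
few-line Python "DSL" that builds a message sequence chart and renders it
as mscgen input:

```python
def chart(entities, messages):
    """Render a list of (src, dst, label) messages as mscgen source."""
    lines = ["msc {"]
    lines.append("  " + ", ".join(entities) + ";")    # declare entities
    for src, dst, label in messages:
        lines.append(f'  {src}->{dst} [label="{label}"];')
    lines.append("}")
    return "\n".join(lines)

print(chart(["client", "server"],
            [("client", "server", "request"),
             ("server", "client", "response")]))
```

Feed the output to mscgen and you get a rendered sequence chart; the
Python layer is a "higher model" in Milner's sense, explained by the
lower mscgen one.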

- Walter
-- 
Gllug mailing list  -  Gllug at gllug.org.uk
http://lists.gllug.org.uk/mailman/listinfo/gllug
