[GLLUG] Bash Bug

chris procter chris-procter at talk21.com
Thu Sep 25 17:28:49 UTC 2014

>On 25 September 2014 10:14, Sunny Aujla <sunnyfedora99 at googlemail.com> wrote:

>Thought I'd share this with everyone.
>So I'm finding the linked Red Hat article quite difficult to read (being poorly written and discussing useless diversions) and therefore fail to actually understand the issue here. 
>So my attempt at understanding it, and please correct me if I'm wrong, is:
>Bash (like any shell) has access to environment variables, and it loads those variables when it starts up. If those variables contains a function then bash will execute that code. So (and this is the bit where it gets a bit hazy) if you are running a service such as mod_cgi in apache2/httpd and those CGI scripts are running in bash, then they can somehow create an environment variable that will be loaded by other bash instances?
>Also there doesn't appear to be a synchronised release of updates such as with Heartbleed. So is this issue not as severe, or was it not disclosed properly?

So when you launch a bash shell it takes the environment 
variables inherited from its parent, and because environment variables 
can hold function definitions, it evaluates them. The bug is that it 
evaluates the whole string, so any trailing code after the function 
body also gets evaluated (i.e. run). For example:
export X='() { :;}; echo vulnerable'
bash -c "echo this is a test"

creates an environment variable X with the value '() { :;}; echo vulnerable'. When bash -c is run it creates a new bash shell and evaluates the X environment variable: the '() { :;};' part is a valid shell function that does nothing, but the 'echo vulnerable' part also gets executed and prints "vulnerable". Only then does "echo this is a test" run. If instead of "echo vulnerable" you had "rm -rf /" then bad things happen.
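Those two lines are the canonical one-liner test for the bug (a sketch; the behaviour described is for an unpatched bash):

```shell
# Canonical Shellshock test: put a function-looking string in the
# environment of a child bash.
# A vulnerable bash evaluates X on startup and prints "vulnerable"
# before "this is a test"; a patched bash imports X as a plain string
# and prints only the echo.
env X='() { :;}; echo vulnerable' bash -c "echo this is a test"
```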

CGI takes the query string and all of the other HTTP headers in 
the request and turns them into environment variables, then invokes 
the CGI script, which reads those environment variables to get the 
parameters passed in. So if your CGI script is written in bash, a 
malicious attacker could send a suitably crafted HTTP header that 
runs code when the CGI script starts.
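A minimal sketch of what such a bash CGI script looks like (a hypothetical script, not one from the post). The point is that variables like HTTP_USER_AGENT and QUERY_STRING are already in bash's environment before the first line runs:

```shell
#!/bin/bash
# Hypothetical bash CGI script. The web server (e.g. mod_cgi) has already
# exported the request's headers and query string as environment variables
# (HTTP_USER_AGENT, QUERY_STRING, ...) before exec'ing this script, so a
# vulnerable bash evaluates those attacker-controlled strings at startup,
# before any line below executes.
echo "Content-Type: text/plain"
echo
echo "query string: $QUERY_STRING"
echo "user agent: $HTTP_USER_AGENT"
```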

So (stealing from a comment on hacker news) running: 

curl -H 'User-Agent: () { :;}; rm -rf /' http://<webserver>/<shell script>.cgi

against a bash CGI script would execute rm -rf / on the webserver. It runs 
as the apache (httpd) user, however, so it's not (quite) as bad as that 
makes it sound, but still not fun.

There are a few other places you can set environment variables and then 
invoke a shell (ssh, postfix etc), but CGI is the easiest to exploit.
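For the ssh case, the commonly cited vector (an assumption drawn from public Shellshock write-ups, not spelled out above) is a key restricted with a forced command: sshd still exports the client's requested command into the environment, so a vulnerable bash evaluates it.

```shell
# Config fragment (hypothetical authorized_keys entry), not a runnable script:
#
#   command="/usr/local/bin/backup-only" ssh-rsa AAAA... user@host
#
# When the client connects with e.g.
#   ssh host '() { :;}; /bin/id'
# sshd exports SSH_ORIGINAL_COMMAND='() { :;}; /bin/id' before running the
# forced command; if that command (or the login shell) is a vulnerable bash,
# the trailing /bin/id runs, bypassing the restriction.
```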

It works on Solaris and OS X as well, but not on ESX, which uses busybox and hence the ash shell.

I'd still say Heartbleed is worse though: who runs bash CGI scripts?

