[Klug-general] audio systems/servers/apis.

James Morris jwm.art.net at gmail.com
Sat Apr 23 10:22:14 UTC 2011


On 23 April 2011 07:35, Peter Childs <pchilds at bcs.org> wrote:
> The issue is that many apps talk to a back-end API when they should be
> talking to a front-end API, and end up doing:
>
>
> app -> alsa -> pulse -> alsa -> out

This is where an application is coded to talk directly to ALSA. If
Pulse weren't running it would be:

app -> alsa.
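The extra hop through Pulse happens because most distributions ship an
ALSA configuration that redirects the default PCM device to PulseAudio
through the alsa-plugins "pulse" plugin, so even apps coded against raw
ALSA get rerouted. A sketch of that stock configuration (the exact file
location varies by distro):

```
# Distro-shipped ALSA config (sketch): make the ALSA "default"
# device a proxy that feeds PulseAudio.
pcm.!default {
    type pulse    # alsa-plugins pulse plugin: PCM goes to the Pulse daemon
}
ctl.!default {
    type pulse    # mixer controls are proxied to Pulse as well
}
```

Remove or override that and "app -> alsa" really does go straight to
the hardware driver.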

and that would be it. As mentioned before, ALSA can do mixing via its
dmix plugin, but (according to the PulseAudio Wikipedia entry) "It
(DMIX) does not provide the advanced features (such as device
aggregation, timer-based scheduling, and network audio) of PulseAudio."
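For comparison, enabling dmix by hand looks roughly like this in
~/.asoundrc (a sketch; the card name, rate, and ipc_key are example
values):

```
# ~/.asoundrc sketch: software mixing inside ALSA itself via dmix.
pcm.!default {
    type plug           # converts whatever format/rate apps ask for
    slave.pcm "dmixed"
}

pcm.dmixed {
    type dmix
    ipc_key 1024        # any unique key; shared by all mixing clients
    slave {
        pcm "hw:0,0"    # first sound card, first device (example)
        rate 48000
    }
}
```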


> gstreamer does
>
> app -> gstreamer -> pulse -> alsa -> out (I believe)

Well, that's fairly understandable. The app (i.e. a media player) is
trying to play a file or stream in a particular codec; GStreamer
handles the codec, and Pulse hands the resulting PCM audio on to ALSA,
which outputs it to the soundcard.
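That chain can be exercised from the shell with gst-launch (GStreamer
0.10, current at the time): playbin2 picks a decoder for whatever codec
the file uses, and pulsesink hands the decoded PCM to the Pulse daemon.
The file path here is hypothetical:

```
# Hypothetical file path; playbin2 autodetects the container/codec,
# pulsesink passes decoded PCM to PulseAudio, which drives ALSA.
gst-launch-0.10 playbin2 uri=file:///home/user/music/example.ogg \
    audio-sink=pulsesink
```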

I am certain of three things here: I don't want to reinvent any audio
file format/codec handling library, I don't want to reinvent any kind
of audio routing server, and I don't want to reinvent hardware level
drivers.

>
> Phonon is a slightly incomplete API in that it can't currently handle
> input, but...
>
> app -> phonon -> gstreamer -> pulse -> alsa -> out



More information about the Kent mailing list