App development is rarely a simple task. Modern connected devices are often accompanied by feature-rich applications that users expect to use on multiple platforms. Integrating multimedia, such as audio and video, is increasingly the norm rather than the exception.

How do we do it here at Witekio? We can’t give away ALL our secrets, but I recently sat down with one of our chief engineers to pick their brains about GStreamer.

Getting Started with GStreamer

GStreamer acts as a versatile multimedia framework and provides a standardized interface for codec and format support. With low system requirements and cross-platform support, it’s great for embedded application development. It’s easy to extend to app-specific functions or new hardware, and there are bindings for most programming languages, making it a versatile tool for any device application developer.

First things first, you need to install the packages.

On an Ubuntu desktop:

$ sudo apt-get install libgstreamer1.0-dev gstreamer1.0-tools

On Yocto Linux:

$ bitbake gstreamer1.0

On Windows:

Download the installer (.msi) from the official GStreamer website: gstreamer.freedesktop.org

Once installed, we get a few tools and libraries to help us develop applications with GStreamer.

  • gst-launch-1.0: Prototype GStreamer pipelines at the command line without writing any wrapping application code – we’ll be using this in a few examples
  • gst-inspect-1.0: Essentially man for GStreamer plugins/elements
  • gst-play-1.0: Play a media file using a pre-generated pipeline, good for sanity-checking

We get the C library by default, which is based on the GLib ecosystem (GNOME, not GNU’s glibc). Bindings for other languages such as Python, Ruby, and Java (great for Android application development) are available from GStreamer’s website.


Hello World? The Basics.

The greatest journey begins with a single step, so let’s start with the smallest application we can think of.

$ gst-launch-1.0 videotestsrc pattern=smpte ! video/x-raw,width=1280,height=720 ! autovideosink

This essentially just tells GStreamer to generate some 720p SMPTE frames and output them to your default video output device.

One core aspect of any GStreamer application is the “pipeline”. Pipelines are chains of elements that create, process, and output media. The key concepts are:

  • Source: Input of media data – could be a file or a piece of hardware, e.g. a camera or microphone
  • Property: Elements often offer properties that can alter their behavior.
  • Caps Filter: Each element has a list of capabilities (or caps) that describe what the element can handle. A Caps Filter will force a given configuration. Without this, the elements will negotiate some safe defaults.
  • Sink: Where the data will ultimately end up, could be a file, frame buffer, etc.
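Mapping those concepts onto a concrete command looks like this (the pattern and caps values here are arbitrary, illustrative choices):

```shell
# videotestsrc   -> source element
# pattern=ball   -> a property set on the source
# video/x-raw... -> caps filter forcing a resolution and frame rate
# autovideosink  -> sink element
gst-launch-1.0 videotestsrc pattern=ball \
    ! video/x-raw,width=640,height=480,framerate=30/1 \
    ! autovideosink
```

Drop the caps filter and the elements will negotiate their own defaults instead.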

Let's get serious - Developing a multimedia app

Pretty much any application will need to perform the following steps:

  • Set up the pipeline contents
  • Start the pipeline
  • Wait for events (as the media is processed)
  • Perform cleanup
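Here is a minimal C sketch of those four steps, using gst_parse_launch to build the pipeline from a textual description (error handling kept to a bare minimum):

```c
#include <gst/gst.h>

int main (int argc, char *argv[])
{
  /* Setup: initialise GStreamer and build the pipeline */
  gst_init (&argc, &argv);
  GError *error = NULL;
  GstElement *pipeline = gst_parse_launch (
      "videotestsrc num-buffers=100 ! autovideosink", &error);
  if (pipeline == NULL) {
    g_printerr ("Failed to build pipeline: %s\n", error->message);
    g_clear_error (&error);
    return 1;
  }

  /* Start the pipeline */
  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  /* Wait for events: block until EOS or an error arrives on the bus */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_ERROR | GST_MESSAGE_EOS);

  /* Cleanup */
  if (msg != NULL)
    gst_message_unref (msg);
  gst_object_unref (bus);
  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (pipeline);
  return 0;
}
```

A real application would typically attach a watch to the bus and run a GLib main loop instead of blocking, but the shape of the four steps stays the same.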

Let’s make a pipeline that produces an MP4 video file. This involves handling separate video and audio streams, and ultimately multiplexing them into a single container.
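A self-contained test recording might look like the following – videotestsrc and audiotestsrc stand in for real capture hardware, and x264enc/voaacenc assume those encoder plugins are present on your system:

```shell
# Roughly ten seconds of test video and audio, muxed into one MP4.
# The queues keep the two branches from starving each other at the muxer.
gst-launch-1.0 -e \
    videotestsrc num-buffers=300 ! x264enc ! queue ! mux. \
    audiotestsrc num-buffers=430 ! audioconvert ! voaacenc ! queue ! mux. \
    mp4mux name=mux ! filesink location=recording.mp4
```

The `name=mux` / `mux.` syntax is how gst-launch links two separate branches into the same muxer element.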

Note the -e argument to gst-launch-1.0. We need it to send EOS (end of stream) on shutdown; you don’t want to end up with broken files!

Pipelines with heavier processing tasks, such as software encoding, image processing, and waiting for file I/O, can lead to poor performance – especially on embedded platforms. Mobile applications often rely heavily on rich multimedia and reliable streaming. Keeping your pipeline delay-free will help you avoid issues such as low-FPS recordings, stuttering audio, high latency on live feeds, and so on.

GStreamer provides controls to split big pipelines across multiple threads (pthreads, or the platform equivalent) with the queue element: each queue creates a boundary, and the elements downstream of it run in their own streaming thread.
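For instance, dropping a queue in front of an expensive software encode moves that work off the capture thread (x264enc is chosen here purely as an illustration):

```shell
# First queue: decouples frame generation from encoding.
# Second queue: decouples encoding from muxing and file I/O.
gst-launch-1.0 -e videotestsrc num-buffers=300 \
    ! queue ! x264enc \
    ! queue ! mp4mux ! filesink location=out.mp4
```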

Organizing pipelines

As you can imagine, more complex functions lead to more complex pipelines. If you find your code is getting messy, the best thing to do is to chuck it in a Bin! Fear not, we aren’t recommending you give up your career as an application developer: a Bin is a special element that encapsulates a group of other elements, effectively a sub-pipeline. This keeps things tidy and makes code reuse much easier.

GStreamer ships with several handy Bins that implement common use cases, such as:

  • Playbin: Plays a given media file to the given audio/video sinks (what gst-play-1.0 uses under the hood)
  • Camerabin: Implements a video recording device
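Both are usable straight from the command line too. For example (the media path is a placeholder – point it at a real file):

```shell
# playbin works out the demuxer/decoder chain for you
gst-launch-1.0 playbin uri=file:///path/to/movie.mp4
```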

Don’t forget that GStreamer pipelines are automatically assigned a state of GST_STATE_NULL upon creation. To actually use the pipeline you must set it to GST_STATE_PLAYING.

The four simple states for pipelines are:

  • NULL: Default initial state (stopped)
  • READY: Global resources are allocated, but no data is flowing yet
  • PAUSED: The elements will accept and process data, but the sinks block after prerolling the first buffer
  • PLAYING: The sink will perform rendering/writing

Need help with your Device Application Development?

So there you have it, a crash course in GStreamer. You can do so much more with GStreamer, and the plugin architecture lets you easily implement features like new filters, software codecs, hardware support, and networking. These tools should help you get started embedding audio and video into your apps, but if you are looking for help developing feature-rich apps, Witekio is your trusted partner in embedded application development. 

GStreamer is just one of the many multimedia frameworks we use here at Witekio to craft efficient and user-friendly apps. With over 20 years of experience in embedded software and IoT app development, our secure-by-design approach will ensure you get the most out of your device. 

Get in touch 👇

Patrick HADDAD - Copywriter
03 January 2024