Wim Taymans is a Fedora contributor and the creator of PipeWire, the system service that takes audio and video handling under Linux to the next level. By combining the power of PulseAudio and JACK, and adding a video equivalent of those audio services, PipeWire lets Linux become a premier content-creation platform for audio engineers and video creators alike.

Q: So it's been two years since we talked about PipeWire here as part of the Fedora Workstation 34 launch. What are your development highlights since we last spoke?

So many! We’ve done more than 40 releases since then.

A big part of the work was to close the gap between PulseAudio and PipeWire. We made the transition in Fedora 34 with quite a few features still missing: things that were not enabled by default in PulseAudio but that people often used, such as the various network sinks and sources. We also added echo cancellation and many of the other missing modules, and finally we added S/PDIF passthrough, AirPlay support and multiple sample rates. Most of these new modules now have more features than their PulseAudio equivalents.

We've also added something that I had wanted to do for a long time: a filter-chain. This lets you describe a graph of filters that process the audio. We've been building filters for 3D sound, reverbs, delays and equalizers with this.
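For readers who have not seen the filter-chain yet, here is a rough sketch of what such a configuration can look like: a virtual sink that runs a single built-in peaking-EQ biquad before handing the audio to the real output. The node names and control values are made up for illustration, and exactly how you load the fragment can vary, so treat the module-filter-chain documentation as the reference.

```
# Hypothetical example: a virtual "EQ" sink built with module-filter-chain.
# Names, values and comments are illustrative only.
context.modules = [
    {   name = libpipewire-module-filter-chain
        args = {
            node.description = "Example EQ Sink"
            media.name       = "Example EQ Sink"
            filter.graph = {
                nodes = [
                    {
                        type    = builtin
                        name    = eq_band_1
                        label   = bq_peaking     # built-in biquad peaking filter
                        control = { "Freq" = 1000.0 "Q" = 1.0 "Gain" = -3.0 }
                    }
                ]
            }
            # The capture side shows up as a sink that applications can play to ...
            capture.props  = { node.name = "effect_input.eq"  media.class = Audio/Sink }
            # ... and the playback side sends the filtered audio on to the real output.
            playback.props = { node.name = "effect_output.eq" node.passive = true }
        }
    }
]
```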

PipeWire now also has support for some of the more advanced network audio protocols. We’ve added experimental support for AVB. We have working AES67 support and we support low-latency synchronization of AES67 streams with a PTP clock. We’ve also added support for Apple MIDI and extended the protocol a little to also handle raw audio and OPUS compressed audio.

We switched to WirePlumber 0.4 in Fedora 35 and deprecated our old session manager. A lot of work has gone into the next version, WirePlumber 0.5, which includes a rework of the event system and should hopefully be released soon.

Other than that, the code has matured a lot. Performance has increased and there are fewer bugs. The wire protocol is now fully documented. The scheduling has improved a lot and gained more features.

Q: So there are three main parts to PipeWire: the PulseAudio support, which was mostly done already back in Fedora Workstation 34; the JACK support, which was there, but with gaps; and finally the video support, which was there, but with no real-world users yet. So, to start with JACK, where are we with JACK support now?

We have implemented most of the missing JACK features. We now have support for freewheeling, which allows you to export an Ardour project, and latency reporting, which is necessary to correctly align tracks on a timeline.
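As a small illustration of the two JACK APIs involved, here is a minimal client (the client and port names are made up) that queries the reported playback latency and toggles freewheel mode; under PipeWire it runs against the pipewire-jack replacement libraries like any other JACK application.

```c
/* Build (assumed): gcc latency-probe.c -o latency-probe $(pkg-config --cflags --libs jack) */
#include <stdio.h>
#include <jack/jack.h>

int main(void)
{
    /* Open a client against whatever JACK server the library finds;
     * with PipeWire this is the pipewire-jack replacement library. */
    jack_client_t *client = jack_client_open("latency-probe", JackNullOption, NULL);
    if (client == NULL)
        return 1;

    jack_port_t *out = jack_port_register(client, "out",
            JACK_DEFAULT_AUDIO_TYPE, JackPortIsOutput, 0);

    jack_activate(client);

    /* Latency reporting: how many frames of playback latency does the
     * graph report downstream of this port? This is what a DAW needs
     * to align recorded tracks on the timeline. */
    jack_latency_range_t range;
    jack_port_get_latency_range(out, JackPlaybackLatency, &range);
    printf("playback latency: %u - %u frames\n", range.min, range.max);

    /* Freewheeling: run the graph as fast as possible instead of in
     * real time, which is what an offline export uses. */
    jack_set_freewheel(client, 1);
    /* ... render ... */
    jack_set_freewheel(client, 0);

    jack_client_close(client);
    return 0;
}
```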

Compatibility with applications was improved a lot over time and required quite a few iterations to get it right. We’re now able to run the JACK stress test successfully, which required some fundamental changes in core memory management and scheduling. 

We've also recently started implementing the last missing pieces of JACK, which are NETJACK2 support and a FireWire audio driver called FFADO. The FFADO support likely needs some more work, because we have not been able to actually test it: we don't have the hardware.

We support JACK as a driver (with jackdbus), which gives the PipeWire graph the same low latency as JACK. This should also make it possible for people who used PulseAudio together with jackdbus to migrate to PipeWire.

We now also have an IRQ-based ALSA driver that works the same way as the JACK driver and achieves the same low latency as JACK. Initial benchmarks also show that PipeWire uses slightly fewer instructions per cycle than JACK, which is a bit surprising; more tests need to be done.

Q: And likewise what is the state of the video support?

On the video front we added support for DMABUF and DRM modifier negotiation. This makes it possible for a video provider such as the compositor to provide a stream in the most efficient format possible, when the client supports it.

We've also improved the scheduler to make it possible to handle headless compositors and throttling of the frame rate.

We've also spent a lot of time improving the libcamera plugins. We're at the point where PipeWire camera support is being added to browsers, and we should be able to handle that now. We also have initial support for Vulkan video filters.

There is a GSoC project to implement video conversion filters using Vulkan, which would make it possible to link and process video streams in more cases.

Since the launch in Fedora 34, there are now also a couple of native PipeWire patchbays, such as Helvum and qpwgraph, that can reroute video streams as well. The most recent PipeWire version has improved support for relinking video streams.
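On the command line, the pw-link tool exposes the same kind of (re)routing that those patchbays offer; the port names below are placeholders made up for the example, so list the real ones on your system first.

```
# List the available output and input ports (video ports included):
pw-link --output
pw-link --input

# Link a (hypothetical) video output port to a video input port:
pw-link "obs:video_output_0" "my-recorder:video_input_0"

# And disconnect it again:
pw-link -d "obs:video_output_0" "my-recorder:video_input_0"
```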

Q: So when PipeWire got merged into Fedora the message was that the official audio APIs for PipeWire would be the PulseAudio and JACK APIs. We have seen some projects come along since then using the PipeWire stream API instead. Is the message still to use the PulseAudio and JACK APIs for new development, or has your thinking on that changed over the last couple of years?

The message is still to use the PulseAudio and JACK APIs. They are proven, they work, and they are fully supported.

I know some projects now use the pw-stream API directly. There are some advantages to using this API, such as lower latency than the PulseAudio API and more features than the JACK API. The problem is that I've come to realize that the stream API (and the filter API) are not the ultimate APIs. I want to move to a combination of the stream and filter APIs in the future.
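For context, this is roughly what using the pw-stream API directly looks like today: a small playback client, closely following the pattern of the examples shipped with PipeWire, that renders a sine tone into whatever sink the session manager picks. Treat it as a sketch rather than a reference implementation; the stream name and tone parameters are arbitrary.

```c
/* Build (assumed): gcc sine-tone.c -o sine-tone $(pkg-config --cflags --libs libpipewire-0.3) -lm */
#include <math.h>
#include <pipewire/pipewire.h>
#include <spa/param/audio/format-utils.h>

#define RATE     48000
#define CHANNELS 2

struct data {
        struct pw_main_loop *loop;
        struct pw_stream *stream;
        double accumulator;
};

/* Called from the realtime thread whenever the graph wants more audio. */
static void on_process(void *userdata)
{
        struct data *data = userdata;
        struct pw_buffer *b;
        int16_t *dst;

        if ((b = pw_stream_dequeue_buffer(data->stream)) == NULL)
                return;
        if ((dst = b->buffer->datas[0].data) == NULL)
                return;

        int stride = sizeof(int16_t) * CHANNELS;
        int n_frames = b->buffer->datas[0].maxsize / stride;

        for (int i = 0; i < n_frames; i++) {
                data->accumulator += 2 * M_PI * 440.0 / RATE;   /* 440 Hz tone */
                if (data->accumulator >= 2 * M_PI)
                        data->accumulator -= 2 * M_PI;
                int16_t val = sin(data->accumulator) * 0.2 * 32767.0;
                for (int c = 0; c < CHANNELS; c++)
                        *dst++ = val;
        }

        b->buffer->datas[0].chunk->offset = 0;
        b->buffer->datas[0].chunk->stride = stride;
        b->buffer->datas[0].chunk->size = n_frames * stride;

        pw_stream_queue_buffer(data->stream, b);
}

static const struct pw_stream_events stream_events = {
        PW_VERSION_STREAM_EVENTS,
        .process = on_process,
};

int main(int argc, char *argv[])
{
        struct data data = { 0 };
        const struct spa_pod *params[1];
        uint8_t buffer[1024];
        struct spa_pod_builder builder = SPA_POD_BUILDER_INIT(buffer, sizeof(buffer));

        pw_init(&argc, &argv);

        data.loop = pw_main_loop_new(NULL);
        data.stream = pw_stream_new_simple(
                        pw_main_loop_get_loop(data.loop), "sine-tone",
                        pw_properties_new(PW_KEY_MEDIA_TYPE, "Audio",
                                          PW_KEY_MEDIA_CATEGORY, "Playback",
                                          PW_KEY_MEDIA_ROLE, "Music", NULL),
                        &stream_events, &data);

        /* Offer one format; the graph converts and routes as needed. */
        params[0] = spa_format_audio_raw_build(&builder, SPA_PARAM_EnumFormat,
                        &SPA_AUDIO_INFO_RAW_INIT(.format = SPA_AUDIO_FORMAT_S16,
                                                 .channels = CHANNELS,
                                                 .rate = RATE));

        pw_stream_connect(data.stream, PW_DIRECTION_OUTPUT, PW_ID_ANY,
                        PW_STREAM_FLAG_AUTOCONNECT |
                        PW_STREAM_FLAG_MAP_BUFFERS |
                        PW_STREAM_FLAG_RT_PROCESS,
                        params, 1);

        pw_main_loop_run(data.loop);

        pw_stream_destroy(data.stream);
        pw_main_loop_destroy(data.loop);
        pw_deinit();
        return 0;
}
```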

Q: So one of the goals for PipeWire was to help us move to a future of containerized applications using Flatpaks. How are we doing on that goal today? Does PipeWire allow us to sandbox JACK applications, for instance?

Yes, sandboxed JACK applications are available right now. You can run a Flatpak Ardour that uses the PipeWire JACK client libraries inside the Flatpak.

What we don't have yet is fine-grained access control for those applications through the portal. We currently give Flatpak applications restricted permissions. They don't have, for example, write access to objects, and so can't change nodes or hardware volumes.

For video, we currently have more fine-grained control because it is managed by the portal and the permission store. We're working on getting that kind of control for audio applications as well.

Q: So one vision we had for PipeWire, when we started out, was that it would unify consumer and pro audio and make use-case distributions like Fedora JAM or Ubuntu Studio redundant, instead allowing people, regardless of use case, to just use Fedora Workstation and install both consumer and pro-audio applications through Flatpak. With what you are saying, are we basically there with that, once PipeWire 1.0 releases?

In a sense, yes. It's possible to run those applications from Flatpak out of the box on Fedora Workstation, without having to deal with a custom JACK/PulseAudio setup like most of those pro-audio distributions do.

There is, however, the aspect of integrating the various applications and configuring the pro-audio cards, such as sample rates and latency, which is not yet covered as nicely in Fedora Workstation as in the specific use-case distros.

Q: One of the things people have praised PipeWire for is improving support, under Linux, for Bluetooth headsets etc. What are the most recent developments there and what is next for it?

Bluetooth has improved a lot. We've added support for new codecs. PipeWire now has a vendor-specific OPUS codec that allows PipeWire clients to use OPUS over Bluetooth. This was developed in parallel with the Google OPUS vendor codec, so the two are not compatible yet.

We've also tracked kernel and bluez5 development and added support for the upcoming Bluetooth LE standard with the LC3+ codec. We added infrastructure for identifying, grouping, and handling the latency of separate Bluetooth earpieces.

We made quite a few tweaks to improve compatibility with devices; we changed the way we handle rate control and data transfer. There is now also support for Bluetooth MIDI, so (with some changes to the bluez5 config files) you can pair your Bluetooth MIDI keyboard and play on it in PipeWire.

For mobile phones, we now also support offloading Bluetooth handling to hardware.

We're still following the kernel and bluez5 for the latest Bluetooth LE changes.

Q: Another change since we last spoke is the switch to WirePlumber as the session manager. Is that transition completed in your mind now, and has it yielded the benefits hoped for?

The transition is certainly completed and is in many ways successful.

I think the hope was also that people would use the Lua scripting to customize their experience. We have definitely seen people do this for specific use cases, but it has not been widespread.

I also think some of the barriers to that adoption have been removed in the upcoming 0.5 version, which is shaping up quite nicely now.

We've also not seen applications use the WirePlumber library yet. I think this is partly because the PulseAudio compatibility is so good that there is no need for native applications yet. I heard someone is working on a pavucontrol rewrite in Rust and was looking to use the WirePlumber API.

Q: I know you worked on a set of patches to add PipeWire camera handling to OBS Studio. What is the status of that work?

There are two pending features for OBS. One is support for cameras using the portal and PipeWire; the other is exporting the OBS scenes as a PipeWire stream.

Most of that code works and is ready to merge. It just needs some cleanup and Georges Stavracas is working on that now.

Q: PipeWire camera support for Firefox and Chrome is another major milestone. What can you tell us about that effort?

PipeWire camera support is now merged in Firefox 116 and you can enable it with an about:config flag. I'm not sure about Chrome.

When the browser and OBS patches are all merged, it would, for example, be possible to create a scene in OBS Studio and then route the exported video stream as a camera source into the web browser. You would be able to route video just like you can with audio!

Q: One thing that you and I have talked a lot about over the last few years is how to make the user interface smarter about dealing with sources that supply both audio and video. Examples here are things like HDMI interfaces or webcams with microphones. Currently, since the Linux kernel treats these as completely independent devices, the UI tends to as well. But for a user it will often make sense if they are seen as a pair, or at a minimum labeled in a way that makes it 100% clear that these devices are part of the same hardware. I think a lot of people would think of such things as 'PipeWire' features. But in reality, since this is policy based, is it actually WirePlumber where we would need to implement such things?

It would be a WirePlumber feature, yes. One idea is to use the udev device tree to link those devices together. It requires some heuristics to get it right.
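To make that idea a bit more concrete, here is a purely hypothetical sketch of the kind of heuristic involved; this is not WirePlumber code, and the sysfs paths are just examples. It walks up the udev tree from a V4L2 node and from an ALSA card and checks whether they hang off the same USB device.

```c
/* Build (assumed): gcc same-device.c -o same-device -ludev */
#include <stdio.h>
#include <string.h>
#include <libudev.h>

/* Return the sysfs path of the closest USB-device ancestor of a sysfs node.
 * Device refs are intentionally leaked to keep the sketch short; the
 * returned string stays valid for the lifetime of the leaked device. */
static const char *usb_parent_path(struct udev *udev, const char *syspath)
{
    struct udev_device *dev = udev_device_new_from_syspath(udev, syspath);
    if (dev == NULL)
        return NULL;

    struct udev_device *parent = udev_device_get_parent_with_subsystem_devtype(
            dev, "usb", "usb_device");
    return parent ? udev_device_get_syspath(parent) : NULL;
}

int main(void)
{
    struct udev *udev = udev_new();

    /* Example sysfs paths; real code would enumerate devices instead. */
    const char *cam = usb_parent_path(udev, "/sys/class/video4linux/video0");
    const char *mic = usb_parent_path(udev, "/sys/class/sound/card1");

    if (cam && mic && strcmp(cam, mic) == 0)
        printf("video0 and card1 appear to be the same USB device\n");

    udev_unref(udev);
    return 0;
}
```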

Q: There was work on adding AVB support to PipeWire. How is that work going?

Not great. The code is there but it doesn’t work or we can’t test it due to lack of hardware and infrastructure. AVB is very specialized and we need somebody with the hardware and skills to help us out with that.

We're focusing on getting AES67 working instead, which runs on more readily available hardware. We've added support for PTP clocks in PipeWire and in the RTP senders and receivers. We've had success integrating Dante devices in AES67 mode with PipeWire.

Q: Apart from maturing and optimizing PipeWire what are the features you are looking into going forward?

PipeWire is an IPC mechanism for multimedia. The most interesting stuff will happen in the session manager, the modules, the applications and the tools around all this. I hope to see more cool tools to route video and set up video filters etc.

I think, at some point, we would like to do an API/ABI break and clean up the native API to remove some of the cruft that has accumulated over the years. This should not impact a lot of applications, though, as they use the PulseAudio and JACK APIs, which will of course remain unchanged.

There are some ideas to unify the stream and filter API and remove some of the features of the stream API that turned out to be a bad idea. It probably also needs some work in the session manager.

Q: Thank you so much for talking with us, Wim. Any final words for the community?

Just a big thank you to the community for embracing PipeWire and helping us get to this point. The number of people who have contributed to PipeWire over the last couple of years is astounding: not only people contributing patches, but also a lot of people testing PipeWire with various kinds of software, helping us close out JACK and PulseAudio compatibility bugs, and writing articles or YouTube videos about PipeWire. I hope the community will keep working with us as we focus on providing new features through WirePlumber now, and as we get applications out there ported to use PipeWire for video camera handling too.
