IRC channel logs

2019-11-27.log


***Server sets mode: +nt
<damo22>why would the version number of libpciaccess shown by `ldd netdde` be different to `ldd pci-arbiter`
<damo22>ive killed them both and rerun it
<damo22> http://ftp.ports.debian.org/debian-ports/pool-hurd-i386/main/h/hurd/ is this the latest hurd?
<damo22>pci-arbiter from hurd 1:0.9.git20191029-3 does not link to pciaccess
<damo22>when i compile my version of pci-arbiter everything works with latest netdde and kernel
<damo22>and libpciaccess 0.16
***GeneralDuke1 is now known as GeneralDuke
<gnu_srs1>(14:09:29) damo22: wtf my pc rebooted and boot sector was gone: Did you restore your computer?
<damo22>yeah
<damo22>i had a bootable usb key and restored grub
<damo22>not sure what actually happened though
<damo22>im on AC power and have a laptop battery
<damo22>it shouldnt just reboot
<gnu_srs1>Hopefully not related to rump drivers on that box?
<damo22>no it was in linux
<damo22>rump drivers are fine
<damo22>i fixed the outstanding upstream rump bug with pci-userspace, switched the makefiles to new style and tested on hurd
<damo22>it passes the CI test in linux on travis
<damo22> https://github.com/rumpkernel/pci-userspace/pull/5
<damo22>alextee[m]: youre writing a DAW right?
***nckx is now known as _nckx
***_nckx is now known as nckx
<damo22>ArneBab: are you interested in sound on hurd?
<damo22>youpi: when a translator is invoked on a file, can i define an RPC that can be used to gather high resolution (buffer position, system timestamp) pairs on a sound device?
<youpi>you can define RPCs just like you want
<damo22>are RPCs fast like IRQs?
<youpi>not that fast
<youpi>they are basically like a system call
<youpi>though a bit heavier since you really switch address space
<youpi>though CR3-tagging in the cache should help here nowadays
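
A minimal sketch of what such an RPC could look like from the client side, assuming a hypothetical pcm_get_position routine served by a translator on a hypothetical /dev/pcm0 node; MIG would generate the stub from a .defs declaration like the one quoted in the comment. None of these names exist in the Hurd today.

    /* Hypothetical client side of the RPC damo22 describes: ask the
       translator on a sound node for a (buffer position, timestamp)
       pair.  Routine name, node path and the .defs below are assumptions,
       not an existing Hurd interface.  */
    #include <hurd.h>
    #include <mach.h>
    #include <mach/time_value.h>
    #include <fcntl.h>
    #include <stdio.h>

    /* What MIG would generate from:
         routine pcm_get_position (pcm: io_t;
                                   out frame_position: natural_t;
                                   out stamp: time_value_t);  */
    extern kern_return_t pcm_get_position (io_t pcm,
                                           natural_t *frame_position,
                                           time_value_t *stamp);

    int
    main (void)
    {
      io_t pcm = file_name_lookup ("/dev/pcm0", O_READ, 0);
      natural_t pos;
      time_value_t stamp;

      if (pcm != MACH_PORT_NULL
          && pcm_get_position (pcm, &pos, &stamp) == KERN_SUCCESS)
        printf ("frame %u at %d.%06d\n", pos, stamp.seconds, stamp.microseconds);
      return 0;
    }
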
<damo22>the problem with ALSA is that timing of I/O is too coarse, audio IRQ is used directly to know when the buffer is ready to fill/empty, but we could do better if we knew the location of the buffer pointer at other times
<damo22>and use DMA with timer IRQ to schedule more accurate fills of the audio buffer
<youpi>yes, I was surprised that it's hard for applications to know how well things are progressing
<youpi>so as to control the audio content pipelining
<damo22>applications should have accurate latency reporting
<damo22>back to them
<damo22>i could use some help to design the kind of RPCs i would need to do this properly
<damo22>i have been advised that class1 usb audio devices are a good start because they are the least complex
<damo22>the ones that are class compliant
<damo22>one solution that was proposed by Paul Davis was to use audio IRQs as a source of timing information, and recover the jitter using a delay locked loop
<damo22>then you can predict the buffer pointer in between the audio IRQs
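
A sketch of that delay-locked loop idea (after Fons Adriaensen's well-known DLL for filtering audio timestamps); the coefficients and names here are illustrative, not taken from any existing driver.

    #include <math.h>

    /* Feed the DLL the raw timestamp of each audio IRQ; it filters out the
       jitter and lets you estimate the buffer pointer in between IRQs.  */
    struct dll
    {
      double t0, t1;   /* filtered start of current / next period (seconds) */
      double nper;     /* filtered period length */
      double b, c;     /* loop gain coefficients */
    };

    static void
    dll_init (struct dll *d, double now, double period, double bandwidth_hz)
    {
      double w = 2.0 * M_PI * bandwidth_hz * period;
      d->b = sqrt (2.0) * w;
      d->c = w * w;
      d->nper = period;
      d->t0 = now;
      d->t1 = now + period;
    }

    /* Call once per audio IRQ with the raw arrival time of that IRQ.  */
    static void
    dll_on_irq (struct dll *d, double irq_time)
    {
      double e = irq_time - d->t1;   /* timing error of this period */
      d->t0 = d->t1;
      d->t1 += d->b * e + d->nper;
      d->nper += d->c * e;
    }

    /* Predicted playback position, in frames, at an arbitrary time NOW,
       where BASE is the frame count at the last IRQ.  */
    static double
    dll_position (const struct dll *d, double now,
                  double frames_per_period, double base)
    {
      return base + frames_per_period * (now - d->t0) / (d->t1 - d->t0);
    }
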
<youpi>(I'm amazed that this is not a "solved" problem, considering that we've been programming audio cards for games for several decades now)
<damo22>i think he meant that the audio IRQ only occurs when the buffer pointer is at a known location, so you can do this
<damo22>but the linux kernel currently does not do this, instead it uses the IRQ arrival to trigger the transfer of samples directly
<damo22>paul has been frustrated with the linux guys for at least a decade
<damo22>we need an audio framebuffer interface similar to how the video framebuffer is
<youpi>personally I see it as a pipeline where both the producer and the consumer monitor somehow how well this is going
<youpi>the hardware notifying the consumer the exact progression
<youpi>but that's only the overall picture :)
<damo22>paul says driving everything from the arrival of an audio IRQ is a limiting design, but that is what ALSA does
<youpi>for me the audio IRQ should only be a way to check that the progression is as expected, yes
<damo22>yeah it can be collected with its timestamp
<damo22>so you know when it arrived
<youpi>there can even be some delay, as long as you know when it ticked
<damo22>right
<youpi>so you can compute how late/early you are feeding audio
<damo22>yeah it has to be adaptive
<damo22>ideally, you can write only a few samples ahead of the playback pointer
<youpi>(I have been working on this issue in speech-dispatcher, to adapt the production of speech synthesis according to playback progress, and the situation with the pulseaudio interface is not really satisfactory)
***foggy68 is now known as foggy67
<damo22>you need DMA bus mastering for that
<foggy67>hello
<foggy67>youpi : does the sound work now with GNU-HURD?
<damo22>we are discussing the problems with Linux audio
<foggy67>ha :)
<damo22>if you want a summary of what I think is wrong with it, (and a bunch of other linux experts) https://github.com/linuxaudio/Linux-Audio-Workgroup/wiki/Towards-Fixing-ALSA
<foggy67>I am using gnu/hurd and x window for entertainment
<youpi>foggy67: there's an experimental version of mplayer that uses rump to produce audio
<youpi>but that's only proof of concept
<youpi>damo22 is thinking about laying a proper layer
<foggy67>youpi: is the concept OK?
<youpi>which concept?
<foggy67>the concept of using rump to produce audio
<youpi>I don't know actually
<youpi>damo22: does BSD have audio drivers?
<youpi>do they behave like ALSA?
<damo22>its a netbsd usb kernel driver in userspace
<youpi>iirc they'd use an OSS interface
<damo22>im not sure
<damo22>if its OSS, then its probably worse than ALSA
<damo22>because it uses an open() read() write() interface
<youpi>there are also ioctls()
<damo22>with no timing considerations
<youpi>SOUND_PCM_GETOSPACE, SOUND_PCM_GETISPACE
<youpi>you can monitor progression with that
<youpi>SOUND_PCM_GETTRIGGER SOUND_PCM_SETTRIGGER SOUND_PCM_SETSYNCRO SOUND_PCM_GETIPTR SOUND_PCM_GETOPTR SOUND_PCM_MAPINBUF SOUND_PCM_MAPOUTBUF
<youpi>I don't know the details, but they look nice
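
For reference, a quick illustration of those calls against the interface declared in <sys/soundcard.h>; error handling is omitted.

    #include <sys/ioctl.h>
    #include <sys/soundcard.h>

    /* Query how far playback has progressed on an already-open /dev/dsp fd.  */
    void
    report_progress (int dsp_fd)
    {
      audio_buf_info ospace;   /* free space in the output buffer */
      count_info optr;         /* playback pointer and byte count */

      ioctl (dsp_fd, SOUND_PCM_GETOSPACE, &ospace);
      ioctl (dsp_fd, SOUND_PCM_GETOPTR, &optr);

      /* ospace.bytes  - bytes writable without blocking
         optr.bytes    - total bytes output since the stream started
         optr.ptr      - current offset of the DMA pointer in the buffer */
    }
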
<damo22>hmm
<damo22>is that OSSv4?
<youpi>possibly, I just looked at linux/soundcard.h
<damo22>i dont think the pointers are very accurate
<damo22>they probably update with each sound irq
<youpi>they could make computations to report between irqs
<youpi>the driver knows the rate, so it can compute
<damo22>yea, do they?
<youpi>I don't know, but that interface doesn't prevent doing it
<damo22>ok
<ArneBab>damo22: yes
<damo22>sounds good then
<damo22>OSS went through a few license changes, proprietary, etc, then they bumped the version and its now open i think
<damo22>but linux never upgraded to 4
<damo22>theyre stuck on 3
<youpi>it's actually available as debian packages, but the people who pushed for it didn't continue maintaining the packages
<damo22>okay
<ArneBab>damo22: I want to help make it more practical for applications to use sound on the Hurd with user-controlled access-management now that the foundations are in place — and I managed to get some funding for it: https://nlnet.nl/project/Hurd-Audio/
<damo22>thats cool
<ArneBab>it’s not a lot of funding, but it can keep me going for half a year at 5 hours per week.
<damo22>rather than redesigning the user api from scratch, i propose we implement something that can talk to jackd server daemon, because there are many applications built on jack
<damo22>it basically provides ports that you can connect and route easily
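
To give the flavour of that port model, a minimal libjack client registering one output port and filling it from a pull-style process callback; the client and port names are arbitrary.

    #include <jack/jack.h>
    #include <math.h>
    #include <unistd.h>

    static jack_port_t *out_port;
    static jack_nframes_t sample_rate;
    static double phase;

    /* JACK calls this whenever it wants NFRAMES more frames: a pull model.  */
    static int
    process (jack_nframes_t nframes, void *arg)
    {
      jack_default_audio_sample_t *buf = jack_port_get_buffer (out_port, nframes);
      for (jack_nframes_t i = 0; i < nframes; i++)
        {
          buf[i] = 0.2 * sin (phase);
          phase += 2.0 * M_PI * 440.0 / sample_rate;
        }
      return 0;
    }

    int
    main (void)
    {
      jack_client_t *client = jack_client_open ("sine", JackNullOption, NULL);
      if (!client)
        return 1;
      sample_rate = jack_get_sample_rate (client);
      out_port = jack_port_register (client, "out", JACK_DEFAULT_AUDIO_TYPE,
                                     JackPortIsOutput, 0);
      jack_set_process_callback (client, process, NULL);
      jack_activate (client);
      sleep (30);
      jack_client_close (client);
      return 0;
    }

Once it is running, `jack_connect sine:out system:playback_1` (or a patchbay like qjackctl) routes it to the hardware or to any other client's input port, which is the routing being described.
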
<ArneBab>can we use it to only grant a specific application access to the hardware?
<ArneBab>I guess we could limit access to jackd
<damo22>yeah you can limit who has access to jackd
<ArneBab>switchable at runtime?
<damo22>er, anyone can run a jackd
<ArneBab>then we can limit the jackd
<ArneBab>where I want to get to: When an application wants access to any audio, it should request the right group and the user should be prompted to add it
<ArneBab>or reject
<ArneBab>(same for a microphone)
<damo22>interesting
<damo22>well we definitely need to limit who has access to sound, and it should have the ability to share it
<ArneBab>we can do that by providing a translator the program can touch to get access and a second one a user-program can write to to grant access
<ArneBab>or simpler: a user program which watches the requests and prompts the user to add the group to the application in question
<ArneBab>then programs don’t need to start with audio-privileges
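
A rough sketch of the first half of that idea using libtrivfs, modelled on the standard trivial-translator skeleton: a translator whose node a program opens to request audio access, with a hook where a user-side prompter could be plugged in. The names, the policy and the missing prompter are all placeholders.

    #include <hurd/trivfs.h>
    #include <mach.h>
    #include <fcntl.h>
    #include <error.h>
    #include <stdlib.h>

    int trivfs_fstype = FSTYPE_MISC;
    int trivfs_fsid = 0;
    int trivfs_support_read = 0;
    int trivfs_support_write = 0;
    int trivfs_support_exec = 0;
    int trivfs_allow_open = O_READ;

    void
    trivfs_modify_stat (struct trivfs_peropen *cred, io_statbuf_t *st)
    {
    }

    error_t
    trivfs_goaway (struct trivfs_control *cntl, int flags)
    {
      exit (0);
    }

    /* Every open() of our node lands here: record who is asking and
       hand the request to a user-side prompter (not shown).  */
    static error_t
    check_open (struct trivfs_control *cntl, struct iouser *user, int flags)
    {
      return 0;   /* or EPERM until the user grants access */
    }
    error_t (*trivfs_check_open_hook) (struct trivfs_control *,
                                       struct iouser *, int) = check_open;

    int
    main (void)
    {
      mach_port_t bootstrap;
      struct trivfs_control *fsys;

      task_get_bootstrap_port (mach_task_self (), &bootstrap);
      if (bootstrap == MACH_PORT_NULL)
        error (1, 0, "must be started as a translator");

      trivfs_startup (bootstrap, 0, 0, 0, 0, 0, &fsys);
      ports_manage_port_operations_one_thread (fsys->pi.bucket,
                                               trivfs_demuxer, 0);
      return 0;
    }
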
<damo22>there are many things to consider, but i think hurd has the potential to have better sound support than linux
<ArneBab>and ideally this would happen transparently for programs
<ArneBab>definitely, yes
<damo22>i sent an email to bug hurd to start a conversation
<damo22>we documented what was wrong with ALSA
<damo22>its been a work in progress over a while
<damo22>hopefully our design will not have the same flaws
<ArneBab>it will have flaws for sure, but hopefully not the same, yes :-)
<damo22>youpi suggests using OSS because we can still use their api to compute buffer positions between the arrival of irqs
<damo22>and it wouldnt make much sense to rewrite complete drivers from scratch
<youpi>and application support :)
<ArneBab>that’s the hardest, yes
<damo22>youpi: do you mean reuse OSS as the external facing api?
<damo22>and rejig it internally to fix the buffers?
<ArneBab>I just signed in to bug-hurd again — would be happy if you could send me a digest of the conversation so far
<youpi>yes
<youpi>ArneBab: there's just the post of damo22 so far
<damo22>the conversation has only been on here and my one post
<damo22> https://lists.gnu.org/archive/html/bug-hurd/2019-11/msg00086.html
<ArneBab>thank you!
<damo22>youpi: we could do that with OSS but id still want to implement jackd on top of OSS, its ugly to use
<damo22>do you really want every audio application to call ioctl() and stuff?
<damo22> https://jackaudio.org/
<ArneBab>damo22: alsa is nowadays mostly used via pulseaudio which is annoying but provides what most applications want: A way to just put in some data and have it audible for the user
<damo22>with jack, you get inter-process audio routing
<damo22> https://jackaudio.org/applications/ can all talk to each other
<ArneBab>what’s the reason that jack did not get taken up?
<damo22>because of ALSA's flaws and jack uses ALSA
<damo22>you configure it manually
<damo22>and it hogs the sound card
<damo22>it basically opens the device and locks it to a samplerate
<damo22>its great if you are doing a session of recording
<ArneBab>I mean: why do most desktop applications use alsa instead of jack?
<damo22>well we could implement the ALSA api
<damo22>then jack would be optional on top
<damo22>the problems outlined in that document would still be there
<ArneBab>does jack on linux need root/sudo to start?
<damo22>no
<damo22>jack might not be the right solution either
<damo22>it expects to be the only instance using the sound card
<ArneBab>we could also layer virtual sound cards on top of the hardware that get muxed and give applications the impression of being the only application
<ArneBab>though that would mean mixing sound there
<ArneBab>(which isn’t easy)
<damo22>mixing sound is easy if you have the same sized buffers
<ArneBab>if :-)
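
For the trivial case being referred to, where the streams already agree on rate, format and buffer size, mixing really is just a saturating sum:

    #include <stdint.h>
    #include <stddef.h>

    /* Mix two signed 16-bit buffers of equal length into OUT, with clipping.  */
    void
    mix_s16 (int16_t *out, const int16_t *a, const int16_t *b, size_t frames)
    {
      for (size_t i = 0; i < frames; i++)
        {
          int32_t s = (int32_t) a[i] + (int32_t) b[i];
          if (s > INT16_MAX) s = INT16_MAX;
          if (s < INT16_MIN) s = INT16_MIN;
          out[i] = (int16_t) s;
        }
    }
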
<damo22>i highly recommend to read that doc
<ArneBab>I already started, but need to pause for a while now
<damo22>it raises some important issues that ALSA doesnt handle
<damo22>me too i need to sleep
<damo22>apparently ALSA has two modes of API, one is a push model (should be deprecated) and the other is a pull model
<damo22>we need a pull model at the lowest level so the process running the sound card decides when to schedule samples to and from the sound card
<damo22>and provides an audio framebuffer api to us to use
<damo22>we dont want the user to decide when to write samples
<damo22>the soundcard's DMA interface drives low level I/O
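
A rough sketch of what such a lowest-level pull-model interface could look like; the names are purely hypothetical and chosen only to pin down the shape being discussed (the driver owns the DMA ring and calls the client back when it decides samples are needed).

    #include <stdint.h>
    #include <stddef.h>

    struct audio_fb;   /* opaque handle owned by the driver/translator */

    /* The driver invokes this when *it* wants FRAMES more frames,
       handing the client a window straight into the DMA ring.  */
    typedef void (*audio_pull_fn) (void *arg, int16_t *dst, size_t frames);

    int audio_fb_open     (const char *node, struct audio_fb **fb);
    int audio_fb_set_pull (struct audio_fb *fb, audio_pull_fn fn, void *arg);
    /* High-resolution (frame position, timestamp) pair, cf. the RPC above.  */
    int audio_fb_position (struct audio_fb *fb,
                           uint64_t *frame, uint64_t *timestamp_ns);
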
<ArneBab>Different from Linux, the Hurd can easily have a filesystem node which is backed by an active program
<damo22>goodnight
<ArneBab>cu
***Emulatorman__ is now known as Emulatorman
<Gooberpatrol66>should you guys try to use pipewire? from what i heard it emulates jack and pulseaudio support, so that would get you the broadest application compatibility