#+title: EmacsConf 2021 alternate streaming solution
#+date: <2021-12-08>
[[https://libreau.org][LibreAustralia]] hosted an [[https://libreau.org/past.html#emacsconf21][alternate EmacsConf 2021 stream for APAC timezones]] on 28th November. It was a fun event to organise.
How do you stream an event like this with a fully free software stack? Initially I envisioned a server-side streaming solution like the one I used for the inaugural event: using ffmpeg to feed a local video file to icecast:
#+begin_src sh
ffmpeg -re -i ./video.webm -codec copy -content_type video/webm icecast://source:password@localhost:8000/live.webm
#+end_src
This works very well with one video, but with multiple videos one will need to [[https://trac.ffmpeg.org/wiki/Concatenate][concatenate them]]. The concat idea has two problems:
1. Not all videos can be concatenated. In fact, in most of my experiments, the output video could not play past the portion corresponding to the first input video.
2. There is absolutely no control over the playback. Once the stream starts, the whole event is scripted, and to adjust the schedule one has to kill the stream first.
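For reference, the concatenation attempt can be sketched with ffmpeg's concat demuxer, which plays a list of files back to back into one stream (the file names here are hypothetical, and the inputs must share the same codecs and parameters):
#+begin_src sh
# List the talk videos in playback order (hypothetical file names):
cat > playlist.txt <<'EOF'
file 'talk1.webm'
file 'talk2.webm'
EOF
# The concat demuxer plays the listed files back to back into one stream:
ffmpeg -re -f concat -safe 0 -i playlist.txt -codec copy \
  -content_type video/webm icecast://source:password@localhost:8000/live.webm
#+end_src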
Problem 2 can be fixed by utilising the fallback mountpoint with fallback-override:
#+begin_src xml
<mount>
<mount-name>/live.webm</mount-name>
<fallback-mount>/fallback.webm</fallback-mount>
<fallback-override>1</fallback-override>
</mount>
#+end_src
This way the stream never dies, provided a standby video plays on the fallback mountpoint.
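To keep the fallback mountpoint occupied, a standby clip can be looped to it with ffmpeg (a sketch; the file name and credentials are placeholders):
#+begin_src sh
# -stream_loop -1 repeats the input indefinitely; whenever the main
# mountpoint goes live, fallback-override switches listeners over to it.
ffmpeg -re -stream_loop -1 -i ./standby.webm -codec copy \
  -content_type video/webm icecast://source:password@localhost:8000/fallback.webm
#+end_src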
Unfortunately not all videos transition smoothly between the main and the fallback mountpoints. Some transitions cause unpleasant visual artefacts lasting a dozen seconds; worse, others turn the audio into a high-pitched scratching noise that never recovers. For certain videos these problems occur even when a video transitions to itself.
It may be possible to use ffmpeg to re-encode the videos so that they transition smoothly, which is something to figure out in the future.
That's pretty much a dead end for server-side streaming.
On to desktop streaming, which offers the ultimate flexibility in playback control but is heavier on bandwidth and computing resources. One idea was OBS Studio, which unfortunately does not offer icecast among its /streaming/ options; instead it requires a hack of /recording/ to an icecast mountpoint.
I experimented with a setup from [[https://kelar.org/~bandali/][Amin Bandali]], which seems to me to use OBS Studio as an ffmpeg wrapper. Unfortunately I would get a segfault unless streaming at a minimal resolution.
Inspired by [[https://libremiami.org][LibreMiami]]'s watch party, I decided to try out [[https://owncast.online/][Owncast]]. It was extremely easy to set up, and I could get an acceptable streaming performance with some low settings.
However, as Amin pointed out, owncast uses rtmp as the streaming protocol, which probably means encoding to mp4, [[https://audio-video.gnu.org/docs/formatguide.html][a patent-encumbered format]].
How about streaming to BBB with screen share plus monitored system audio as the source? A test with [[https://zaeph.net/][Leo Vivier]] showed performance similar to owncast. The downside of BBB is that it requires javascript and is less accessible to viewers than icecast.
What worked, in the end, was direct ffmpeg-to-icecast streaming (thanks to [[https://sachachua.com][Sacha Chua]]):
#+begin_src sh
ffmpeg -loglevel 0 -ar 48000 -i default -re -video_size 1280x720 -framerate 25 -f x11grab -i :0.0+0,20 -cluster_size_limit 2M -cluster_time_limit 5100 -content_type video/webm -c:v libvpx -b:v 1M -crf 30 -g 125 -deadline good -threads 4 -f webm icecast://source:pass@host:8000/live.webm
#+end_src
The captured area is shifted down by 20 pixels so as not to grab the title bars of the player and the emacs window.
The performance of this approach was better than any of the other desktop streaming solutions, probably due to its bare ffmpeg setup without any bells and whistles.
I also used an [[https://www.gnu.org/software/emms/][EMMS]] playlist to interleave the talk videos with standby music tracks. Had the buffer times between talks not been so short, the whole event could have been autopiloted with elisp [[https://www.gnu.org/software/emacs/manual/html_node/elisp/Timers.html][=run-at-time=]]!
#+caption: Standby emacs session during the stream
[[../assets/e21stream.png]]