mathr.co.uk/zoomasm 2.1 released!

Mostly small bugfixes; the only new feature is jump-to-time buttons for waypoints in the timeline editor.

For 3.0 I want to change the API for colouring GLSL from "one function with loads of arguments, with values prefilled for the current pixel" to "one function with no arguments, plus an API of function calls to get values for the current pixel". That will enable extra features like "get value for a neighbouring pixel by offset" or "get value at absolute screen coordinates", which should allow implementing colouring algorithms like embossing flat iteration bands, and simplify overlays like the zoom depth display.
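
As a sketch of the difference (the function names here are invented for illustration, not the actual API):

```glsl
// Old style (sketch): values arrive prefilled as arguments.
// vec3 colour(int n, float de, vec2 coord, ...);

// New style (sketch): no arguments; values are fetched via API calls,
// which can then also accept a pixel offset or absolute coordinates.
vec3 colour()
{
  float de  = getDE();            // value for the current pixel
  float deN = getDE(ivec2(0, 1)); // neighbouring pixel, e.g. for embossing
  return vec3(clamp(de - deN + 0.5, 0.0, 1.0));
}
```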

I added a new feature to zoomasm: the timestamps for each waypoint in the timeline window are now buttons that jump to the corresponding time.

The idea for this emerged from a discussion on fractalforums.org, but I don't have a specific link handy, and I can't remember if I thought of it or someone else suggested it.

Hopefully I will have time to release a new 2.1 version this week, before working on breaking changes for 3.0.

Worked on another zoomasm bug this morning.

Bug: if you select a custom output size, but then enter the dimensions of an existing size preset, the entry boxes disappear and you can't enter any custom sizes any more (you can only choose from the presets).

Fix: when searching for a matching preset, start the search at index 0 (the custom size) instead of index 1 (the first preset). The first match is chosen, which means the "custom" size takes priority over presets, even when the values are the same.

Also applies to other things with custom+preset dropdowns like FPS and audio bitrate.
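
A minimal sketch of the fix (the struct and list here are illustrative, not zoomasm's actual code): index 0 is the custom entry, and starting the scan there means an exact duplicate of a preset still resolves to custom, so the entry boxes stay visible.

```c
typedef struct { int width, height; } Size;

/* Return the index of the first entry matching (w, h).
   Index 0 is the "custom" entry; scanning from 0 instead of 1
   means custom wins ties against presets with the same dimensions. */
int find_size_index(const Size *sizes, int count, int w, int h)
{
    for (int i = 0; i < count; ++i)   /* was: int i = 1, skipping custom */
        if (sizes[i].width == w && sizes[i].height == h)
            return i;
    return 0;  /* no match at all: fall back to custom */
}
```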

Found a bug in zoomasm's 360 projection (the distance estimate scaling was wrong).

Trying to do the maths by hand is too hard, so I copied my implementation from my fragm-examples repository, minus the macro hell, plus some quaternion-to-rotation-matrix code ported from Python that I found online.
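
For reference, the standard quaternion-to-rotation-matrix formula looks like this (a sketch in C; the variable names are mine, not from the ported code):

```c
/* Convert a unit quaternion (w, x, y, z) to a row-major 3x3 rotation
   matrix m[9], using the standard formula. */
void quat_to_mat3(double w, double x, double y, double z, double m[9])
{
    m[0] = 1 - 2*(y*y + z*z);  m[1] = 2*(x*y - w*z);      m[2] = 2*(x*z + w*y);
    m[3] = 2*(x*y + w*z);      m[4] = 1 - 2*(x*x + z*z);  m[5] = 2*(y*z - w*x);
    m[6] = 2*(x*z - w*y);      m[7] = 2*(y*z + w*x);      m[8] = 1 - 2*(x*x + y*y);
}
```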

Now it looks OK-ish in the view; I need to render some tests at various orientations and inject spatial metadata for viewing in VLC to be more sure I got it right...

Starting to get back into practicing in the C programming language after months of absence.

The tones (everything but percussion) in this one are mostly:

```
samphold(&s->sh[0], sin(pow(2.0, k) * cos(30 * twopi * 64 * t +
pow(2, 4 * cos(twopi * t)) * sin(ceil(pow(4, 1 + cos(twopi * t + 0.25 * twopi *
sin(4 * twopi * t)))) * 64 * twopi * t)) * pow(2.0, cos(twopi * 16 * t) * 2.0)),
wrap(pow(4, 1 + cos(16 * twopi * t)) * wrap(400 * 64 * t)));
```

so a bit of wave shaping (first `sin()`), phase modulation (inside of first `cos()`), strange-rhythm sequencing (the `ceil()` of a wiggly function) and bitcrush (using `samphold()` to lower sample rate). Two UGENs (for want of a better concept): a 16 bar phasor for the `t` value in [0..1), and the samphold; the rest is stateless maths.
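
A minimal sketch of how a samphold UGEN can act as a bitcrusher (the state struct and exact semantics are my assumption, not the actual library code): the output only updates when the trigger phasor wraps around (decreases), holding the last value in between, which lowers the effective sample rate.

```c
typedef struct { double trigger, value; } SampHold;

/* Update the held value when the trigger phasor wraps (decreases),
   otherwise keep returning the previously held sample. */
double samphold(SampHold *s, double x, double trigger)
{
    if (trigger < s->trigger)
        s->value = x;
    s->trigger = trigger;
    return s->value;
}
```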

The code is duplicated with minor modifications for the other channel, and there's some simple percussion, all fed through a resonant high pass filter for the bass drone and then a multiband compressor.

Same problem in 2021, only the backup is now over 500MB and the text file is over 5MB.

Adding some tags to help find this thread in case next year is more of the same.

I released hgmp-0.1.2, which, thanks to a patch from Sylvain Henry, now supports ghc-9's new bignum mechanism (the ghc-bignum package replaces the integer-gmp and integer-simple packages).

ghc-9 is still in the release candidate phase; I'm not sure when the final version will be out.

hgmp is a Haskell binding to GMP, a library for arbitrary precision integer and rational number arithmetic. The Glasgow Haskell Compiler usually uses GMP for its Integer type; the hgmp package exposes it through the Foreign Function Interface so you can interoperate between GHC's Integer and Rational and other languages' mpz_t and mpq_t.

hackage.haskell.org/package/hg

claude boosted

Open call for female artists to play online @ Sonic Electronics - OFF Iklectik - contact laura@netzzz.net

claude boosted

'web sound space' 

We were discussing yesterday whether to iterate a sound installation for 'browser format'. I think it's a radically different space (and it would be an entirely new piece). Some thoughts:

- personal, intimate space
- noisy space ("between tabs", "between coffee and e-mail")
- framed space, postage-stamp space. who normally has their laptop wired up to a good sound system?
- volatile space (open a tab, close a tab, forgotten)
- radiophonic space (tending to your computer is not unlike listening to a radio receiver? also: "radio background"; remote connection)
- connected space (when going via a server vs. front-end only)?
- what's the source of time? does it start when you open the tab, does the piece "always exist" and you merely tune in?
- multiphonic space (there can be different tabs running)
- controlled space (mute / unmute)
...
- what else?

game AI algorithms 

So far this week I've implemented three game AI algorithms for a game that @netzzz devised.

1. randomly pick moves. keep track of the best valid move found. repeat until time limit.

2. methodical exhaustive search of moves ordered by increasing complexity. keep best valid move. continue until time limit.

3. use fancy heuristics to cascade through the space of neighbouring submoves. if time limit is reached give the best valid move found so far.
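
Algorithm 1 can be sketched like this (a hypothetical C sketch; `score_move` is an invented placeholder standing in for validity checking and scoring, since the game itself isn't described here):

```c
#include <stdlib.h>
#include <time.h>

/* Placeholder scoring: stands in for "is this move valid, and how
   good is it" in the real game. */
static int score_move(int move) { return (move * 37) % 101; }

/* Algorithm 1: sample random moves until the time budget runs out,
   keeping track of the best one seen so far. */
int random_search(double seconds, int num_moves)
{
    clock_t deadline = clock() + (clock_t)(seconds * CLOCKS_PER_SEC);
    int best_move = 0, best_score = -1;
    do {
        int move  = rand() % num_moves;  /* pick a random candidate move */
        int score = score_move(move);
        if (score > best_score) { best_score = score; best_move = move; }
    } while (clock() < deadline);
    return best_move;
}
```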

Algorithm 2 easily beats algorithm 1 with the same time limit.

Algorithm 3 is quite competitive and uses vastly less CPU time: less than 0.1 seconds for a whole match, while I had the other algorithms using several seconds of wall-clock time per move (albeit probably overkill); the algorithms are all parallelized.

If I reduce the time allowance for Algorithm 2 to match Algorithm 3's speed, Algorithm 3 wins very comfortably.

The algorithms are not perfect yet, they can still miss some complex moves in the end game with many flips.

claude boosted

IKLECTIK [off-site] presents,

CORPORA ALIENA - ECLECTIC ELECTRONICS

Wednesday 13 January 2021 | 8.30pm (GMT)

IKLECTIK Youtube ch: youtu.be/9LA67Iw3Q24
IKLECTIK FB Page: facebook.com/IKLECTIK
IKLECTIK TWITCH ch: twitch.tv/iklectik

Corpora Aliena is pleased to present an IKLECTIK [off-site] online presentation of Eclectic Electronics, featuring leading international and pioneering musicians, composers and sound artists, including: Nnja Riot – Jimmie Peggie – Laura Netz – James L. Malone – and Bernhard Living.

Programme:
Jimmie Peggie
Bernhard Living
Nnja Riot
James L. Malone
Laura Netz

notes on running Zoom in a VM 

Finally, start or join a meeting, and hope that screensharing the visuals window and computer audio does the Right Thing and works at acceptable quality... fingers crossed... then enjoy the performance!

notes on running Zoom in a VM 

Finally start Zoom, and set the settings to share screen with full manual options always. Sound to "same as system", "low" microphone background noise processing ("optimize for music"), don't adjust levels automatically.

Check that Zoom can make noise to the speakers, and hope for the best w.r.t. sharing computer sound when screen sharing (I haven't found a level meter for that...)

Maybe that needs fiddling in the PulseAudio volume controls to get the right sound from the soundcard (disable the USB webcam mic) into Zoom (the PulseAudio JACK source seems to do the trick, with system × PulseAudio in the qjackctl audio connections).

notes on running Zoom in a VM 

Start the custom visuals program so it can connect to JACK (signal from usb soundcard) and claim the USB webcam (`/dev/video0` or whatever in the VM) before starting Zoom.

Start Pd with the `-sleepgrain 0.1` option so MIDI timing isn't so terrible, configure it to use ALSA MIDI, and connect it to the USB soundcard in the qjackctl connections window (ALSA tab, not MIDI tab). Load the synth control patch and enter some values to start triggering notes.

Adjust the volume of the soundcard input and monitor passthrough level for best fit to the environs. The visuals software should be reacting to the sound and levels should be showing in the Pulse Audio volume control applet.

notes on running Zoom in a VM 

In the VM, configure JACK to use the USB Audio soundcard, then start jackd.

Set up pulseaudio jack bridge:

/etc/pulse/default.pa:
```
load-module module-jack-source
load-module module-jack-sink
set-default-source jack_in
set-default-sink jack_out
```

restart pulse:

```
pulseaudio --kill
pulseaudio --start
```

if it doesn't work (no `PulseAudio` in qjackctl patch bay in addition to `system`) try

```
pactl load-module module-jack-source
pactl load-module module-jack-sink
```

notes on running Zoom in a VM 

Startup script:

```
#!/bin/bash
QEMU_AUDIO_DRV=pa \
QEMU_PA_SAMPLES=8192 \
QEMU_AUDIO_TIMER_PERIOD=99 \
QEMU_PA_SERVER=/run/user/1000/pulse/native \
qemu-system-x86_64 \
-soundhw hda -device qemu-xhci \
-m 8G -hda zoom.img -cpu host -accel kvm -smp 8 \
-vga virtio -display gtk,gl=on -show-cursor -usb -device usb-tablet \
-usb -device usb-host,vendorid=0x041e,productid=0x4095 \
-usb -device usb-host,vendorid=0x0582,productid=0x0074
```

I don't think that PA stuff is really needed any more now that I added usb-host passthrough for my external soundcard.

Don't forget to `chown` the relevant `/dev/bus/usb/foo/bar` device nodes to the user running QEMU, otherwise the passthrough won't work. Use `lsusb` to see bus address and ids. This resets on host reboot, and USB devices can change bus address too (maybe some udev magic could fix it but I don't need to run Zoom often).

notes on running Zoom in a VM 

There's no way I'm letting that stuff on my main OS installs, so I am running it in virtualized Debian Buster in QEMU. Bit tricky getting everything sorted, what with needing passthrough for host OpenGL, USB soundcard, USB webcam, and JACK running in the VM with PulseAudio bridge. I have custom software processing the webcam and sound from the USB soundcard (via JACK), and screensharing the window and system sound (not USB webcam mic, hopefully!) seems to work now.

Figured out how to plot wakes implicitly.

Given a wake with parameter ray angles $s_-, s_+$: for each pixel $c$ in the image, trace the dynamic rays at those angles towards the Julia set; $c$ is in the wake if and only if they land together.
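
In pseudocode (names invented for illustration):

```
in_wake(c, s_minus, s_plus):
    a = land(trace_dynamic_ray(c, s_minus))  # landing point of ray at angle s-
    b = land(trace_dynamic_ray(c, s_plus))   # landing point of ray at angle s+
    return a == b                            # land together <=> c is in the wake
```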

An application of Theorem 2.5 from arxiv.org/abs/1709.09869 "A survey on MLC, Rigidity and related topics" by Anna Miriam Benini.

Previously I had been tracing the two parameter rays into a polygonal boundary and filling that using rasterization. To do: benchmark and compare the two methods in various scenarios.

claude boosted
post.lurk.org

Welcome to post.lurk.org, an instance for discussions around cultural freedom, experimental, new media art, net and computational culture, and things like that.