Feels nice to finally get my monolith port to write text onscreen.

Worgle, my org-mode literate programming tangler, can now generate dot graphs via another program of mine called worgmap.

Pictured is the named code block structure of the core worgle file, visualized in dot + Graphviz.

My new tungsten sphere. It is literally the definition of an expensive paperweight, but damn if it doesn't blend right in with the rest of my workstation.

Getting back into the regular routine of composing etudes. This time, I'm attempting to make my sounds "bleed" into the visual domain, something I've always wanted to get into.

Here's the first one. It's a 1MB video file, so I hope I'm not abusing the attachment system too much.

Sound parameters can now control visual elements. The LFO controlling the frequency of this siren is also mapped to the scale of the circle being drawn.
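
The mapping is simple enough to sketch in a few lines of C. These names are stand-ins rather than monolith's actual API; the point is one modulator value feeding both domains:

    #include <math.h>

    /* One LFO value in [0, 1] drives both the siren and the circle.
       Hypothetical names; ranges are illustrative. */
    typedef struct { float freq_hz; float radius_px; } av_params;

    static av_params map_lfo(float t_seconds) {
        /* 0.5 Hz sine LFO, rescaled from [-1, 1] to [0, 1] */
        float lfo = 0.5f * (1.0f + sinf(6.2831853f * 0.5f * t_seconds));
        av_params p;
        p.freq_hz   = 300.0f + 600.0f * lfo;  /* siren sweep: 300-900 Hz */
        p.radius_px = 20.0f  + 100.0f * lfo;  /* circle scale: 20-120 px */
        return p;
    }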

The hardest part here was actually tuning the FFmpeg parameters to generate an mp4 file from the input h264 video and wav audio file.

On OSX, FFmpeg seems to want to use AAC by default. And if it ain't AAC, QuickTime will not play it. It also turns out that the audio bitrate is set really low by default, which caused audible glitches and artifacts.
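
For future reference, the invocation I landed on had roughly this shape (the filenames here are stand-ins, and the exact bitrate value is illustrative):

    ffmpeg -i in.h264 -i in.wav -c:v copy -c:a aac -b:a 192k out.mp4

-c:v copy leaves the already-encoded h264 stream untouched, while -c:a aac and -b:a pin the audio codec and bump the bitrate above the default.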

It still looks/sounds like crap, but I'm just pleased I got it working.

Okay! More x264 developments. I built x264 into my monolith ecosystem. Scheme code is evaluating forthy (Runt) code. Runt code is drawing circles to pixel framebuffers. Pixel framebuffers are being written to h264 files.

Synched sound comes next...

More silly colors. But this time, the video encoding happens directly via the x264 C API instead of writing the raw YUV/PNG frames.
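
The core of that encode loop looks something like the following. This is a trimmed-down sketch rather than the actual monolith code; the dimensions, preset, and frame count are placeholders:

    #include <stdio.h>
    #include <x264.h>

    int main(void) {
        int w = 320, h = 240;
        x264_param_t param;
        x264_param_default_preset(&param, "medium", NULL);
        param.i_width = w;
        param.i_height = h;
        param.i_fps_num = 30;
        param.i_fps_den = 1;
        param.i_csp = X264_CSP_I420;

        x264_t *enc = x264_encoder_open(&param);
        x264_picture_t pic, pic_out;
        x264_picture_alloc(&pic, X264_CSP_I420, w, h);

        FILE *out = fopen("out.h264", "wb");
        x264_nal_t *nal;
        int nnal;
        for (int i = 0; i < 150; i++) {
            /* fill pic.img.plane[0..2] with Y, Cb, Cr data here */
            pic.i_pts = i;
            int size = x264_encoder_encode(enc, &nal, &nnal, &pic, &pic_out);
            /* NAL payloads are sequential in memory, so one write covers them */
            if (size > 0) fwrite(nal[0].p_payload, 1, size, out);
        }
        /* drain frames still buffered inside the encoder */
        while (x264_encoder_delayed_frames(enc)) {
            int size = x264_encoder_encode(enc, &nal, &nnal, NULL, &pic_out);
            if (size > 0) fwrite(nal[0].p_payload, 1, size, out);
        }

        x264_picture_clean(&pic);
        x264_encoder_close(enc);
        fclose(out);
        return 0;
    }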

The next step is to get this to draw more interesting things (duh), and also to sync it with generated sound inside my software ecosystem.

Inspired by a recent HN post on the h264 video codec, I finally got around to playing with the sample code that ships with the x264 library. I managed to figure out how to generate raw YUV frames from RGB buffers. The color test below is what I fed into that sample program.
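
The per-pixel conversion is just the usual full-range BT.601 matrix. A sketch (this is the textbook formula, not lifted from the x264 sample code):

    #include <stdint.h>

    /* Full-range BT.601 RGB -> Y'CbCr. For I420 output, Y is kept
       per pixel; Cb and Cr then get subsampled per 2x2 block. */
    static void rgb_to_ycbcr(uint8_t r, uint8_t g, uint8_t b,
                             uint8_t *y, uint8_t *cb, uint8_t *cr) {
        *y  = (uint8_t)( 0.299f    * r + 0.587f    * g + 0.114f    * b);
        *cb = (uint8_t)(-0.168736f * r - 0.331264f * g + 0.5f      * b + 128.0f);
        *cr = (uint8_t)( 0.5f      * r - 0.418688f * g - 0.081312f * b + 128.0f);
    }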

For this 5-second 30fps test video, the raw YUV data was 21.9 MB, while the encoded h264 data was 4 KB. The mp4 file below was converted from the h264 data via FFmpeg, and is 6 KB.
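
In other words, roughly a 5,600:1 reduction: 21.9 MB is about 22,400 KB, against 4 KB of h264.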

The next step is to do away with the intermediate YUV file entirely and leverage the x264 API to generate the file directly. That way, I can generate longer videos without needing to worry about disk space.

I'm very excited to actually get things up and running. This approach is *so* much cleaner than trying to generate video files using FFmpeg on a bunch of PNG files. It might actually motivate me to do some serious audio-visual compositional work (which has been on my TODO list for years).

A curated selection of some of my stones. I definitely have a type that I enjoy.

From left to right:

- childhood stone, picked up on a beach in Nahant, MA
- stone picked up from a beach in Seattle, WA
- stone picked up at Race Point Beach, Cape Cod, MA
- palm stone, lapis lazuli, purchased on etsy
- palm stone, obsidian, purchased on amazon

Managed to get the standard C "hello world" working on Classic Mac OS via the PCE emulator (the same one used by archive.org for their in-browser software emulation).
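
For the record, the program in question is just the classic:

    #include <stdio.h>

    int main(void) {
        printf("Hello, world!\n");
        return 0;
    }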

The cutting came to an abrupt end after I broke my blade. As you can see, I didn't dull it too badly this time, so that's an improvement. Pluses and minuses.

I cut rings in two different sizes today. The larger rings are 16-gauge stainless steel wire, and the smaller ones are 18-gauge. I intend to use both for 4-in-1 maille weaves.
