Working on my #FractalFlame renderer.
Better #DensityEstimation: doing it with linear histograms instead of logarithmic ones makes it work with "keep doubling" batch sizes, instead of having to rerun it after every small constant-size batch. This sped up one test from 18s to 12s.
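A minimal sketch of the scheduling idea (all names hypothetical, not the actual renderer code): with a linear histogram the density estimation pass only needs to run each time the accumulated sample count has doubled, so for N equal batches it runs O(log N) times instead of N times.

```c
/* de_passes: count how many density-estimation passes a "keep doubling"
   schedule performs over a run of equal-sized sample batches.
   (Sketch only; density_estimation() stands in for the real pass.) */
int de_passes(long batches, long batch_size) {
    long total = 0;   /* samples accumulated into the linear histogram */
    long next = 1;    /* run the next pass once total reaches this */
    int passes = 0;
    for (long i = 0; i < batches; ++i) {
        total += batch_size;          /* accumulate one batch */
        if (total >= next) {
            /* density_estimation(histogram);  <- hypothetical */
            ++passes;
            next = 2 * total;         /* wait until the count doubles again */
        }
    }
    return passes;
}
```

For 1024 batches of 1000 samples this runs 11 passes rather than 1024, consistent with the speedup coming from doing the pass far less often.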
Also proper #BézierCurve interpolation of the #MoebiusTransformation via #Slerp of the multiplier and the two fixed points on the #RiemannSphere, remembering that additional #ControlPoints are needed and that the curve passes through only every third point. The additional control points are generated from approximated derivatives at the points the curve passes through. Animation speed is normalized: the parameter is found by binary search in a precomputed array of approximate arc lengths.
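Here is a sketch of the construction as I read it (my reconstruction, hypothetical names, plain doubles standing in for the slerped sphere parameters): keyframes sit at every third point of a piecewise cubic Bézier, the two inner control points per segment come from finite-difference derivative estimates, and playback speed is normalized by binary search in a precomputed cumulative arc-length table.

```c
/* evaluate one cubic Bezier segment at parameter t in [0,1] */
double bezier(double p0, double p1, double p2, double p3, double t) {
    double s = 1.0 - t;
    return s*s*s*p0 + 3.0*s*s*t*p1 + 3.0*s*t*t*p2 + t*t*t*p3;
}

/* inner control points for the segment between keyframes k[i] and k[i+1]
   (cyclic, n keyframes), from central-difference derivative estimates */
void controls(const double *k, int i, int n, double *c1, double *c2) {
    double d0 = (k[(i+1) % n] - k[(i-1+n) % n]) * 0.5; /* derivative at k[i]   */
    double d1 = (k[(i+2) % n] - k[i]) * 0.5;           /* derivative at k[i+1] */
    *c1 = k[i] + d0 / 3.0;
    *c2 = k[(i+1) % n] - d1 / 3.0;
}

/* find the curve parameter at arc length s, given a precomputed increasing
   table cum[0..n-1] of approximate arc lengths at parameters i/(n-1) */
double param_at_length(const double *cum, int n, double s) {
    int lo = 0, hi = n - 1;
    while (hi - lo > 1) {              /* binary search for the bracket */
        int mid = (lo + hi) / 2;
        if (cum[mid] <= s) lo = mid; else hi = mid;
    }
    double t = (s - cum[lo]) / (cum[hi] - cum[lo]); /* interpolate in bracket */
    return (lo + t) / (n - 1);
}
```

The `/ 3.0` comes from the fact that a cubic Bézier's endpoint derivative is 3 times the vector from the endpoint to its adjacent control point.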
Also #AutoWhiteBalance copy/pasted from GIMP: only the first frame is analysed and the resulting bounds are applied to all frames, to avoid the strobing you get from analysing each frame independently (better would be to analyse the whole video, but storage is probably a bit of an issue for that).
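A sketch of how I understand the approach (hypothetical names; the "discard a small fraction at each end of the sorted samples" rule is my assumption about what the GIMP code does): compute per-channel bounds from the first frame only, then apply the same linear stretch to every frame.

```c
#include <stdlib.h>
#include <string.h>

static int cmp_double(const void *a, const void *b) {
    double x = *(const double *)a, y = *(const double *)b;
    return (x > y) - (x < y);
}

/* bounds for one channel of the FIRST frame: sort the samples and
   discard a fraction `frac` at each end (e.g. 0.0005 for 0.05%) */
void balance_bounds(const double *chan, int n, double frac,
                    double *lo, double *hi) {
    double *s = malloc(n * sizeof *s);
    memcpy(s, chan, n * sizeof *s);
    qsort(s, n, sizeof *s, cmp_double);
    int cut = (int)(frac * n);
    *lo = s[cut];
    *hi = s[n - 1 - cut];
    free(s);
}

/* the same bounds are then applied to EVERY frame,
   so the stretch cannot strobe from frame to frame */
double stretch(double v, double lo, double hi) {
    double t = (v - lo) / (hi - lo);
    return t < 0.0 ? 0.0 : t > 1.0 ? 1.0 : t;
}
```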
Also #MotionBlur by accumulating discrete subframes of 1 sample per pixel each into the histogram; I think the video has 256 samples per pixel total.
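The accumulation itself is simple; a sketch with hypothetical names (a real renderer would splat whole 1-sample-per-pixel subframes into the histogram rather than average scalars, but the shape is the same):

```c
/* time at which to render subframe i of `subframes`,
   within the frame's shutter interval [t0, t0 + dt) */
double subframe_time(double t0, double dt, int i, int subframes) {
    return t0 + dt * (i + 0.5) / subframes;
}

/* accumulate the 1-spp subframe values for one pixel; with 256 subframes
   this gives 256 samples per pixel total */
double motion_blur_pixel(const double *subframe_values, int subframes) {
    double acc = 0.0;
    for (int i = 0; i < subframes; ++i)
        acc += subframe_values[i];
    return acc / subframes;
}
```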
The previous video has bad artifacts at the poles when viewed in 360 (after injecting metadata with Google's spatial-media Python tool). I tried jittering the histogram accumulation to blur them, but the artifacts remain. I guess I'll have to do the blur in the density estimation pass.
Think I fixed it. Formula for the horizontal blur radius hx in terms of the vertical blur radius hy (previously a single omnidirectional blur radius h):
height2f = 0.5f * height;
z = (y + 0.5f) / height2f - 1.0f;
r2 = 1.0f - z * z;
hx = hy / r2;
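As a self-contained sketch (my reading of the geometry, hypothetical function name): for an equirectangular frame, z is the sine of the latitude of scanline y, so r2 = 1 - z*z vanishes at the poles and the horizontal radius must grow there; the clamp is my addition to avoid dividing by zero on the pole rows.

```c
/* horizontal blur radius for scanline y of an equirectangular frame of
   `height` rows, given vertical blur radius hy */
float blur_radius_x(int y, int height, float hy) {
    float height2f = 0.5f * (float)height;
    float z  = ((float)y + 0.5f) / height2f - 1.0f; /* -1..1 down the frame */
    float r2 = 1.0f - z * z;                        /* 0 at the poles */
    if (r2 < 1e-6f) r2 = 1e-6f;  /* clamp: my addition, not in the post */
    return hy / r2;              /* blur widens toward the poles */
}
```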
@mathr looks incredible
@mathr it looks crazy indeed ... I could not resist trying it in a 360 video template in AFrame ... and it's indeed pretty wild ;)