
link to stroboscopic HTML

I hope I've fixed all the Jacobian issues (there was some i,j vs j,i confusion, as well as the U vs V typos...)

Now I'm re-detecting the optimum loop period at every iteration in case it changes; it's a little slower, but it seems to give smoother loops...

links to full-screen stroboscopic HTML

Found a typo in my earlier Jacobian code, no wonder it didn't work properly.

But these are solutions of the fixed-point iteration, not necessarily solutions of the reaction-diffusion time-crystal problem...

strobing gif

Trying simple fixed-point iteration now: x_i <- mix(x_i, F(x_{(i-1) mod n}), g). g = 1/2 in my current test; the convergence metric is fluctuating around 0.005, and I want it around 1000x smaller before considering it ready...
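The damped cyclic update above can be sketched on a toy 1-D map instead of the reaction-diffusion step. Everything below (the logistic map `F`, the starting points, the iteration count) is a hypothetical stand-in; only the update rule x_i <- mix(x_i, F(x_{(i-1) mod n}), g) comes from the post. A fixed point of this iteration is an n-cycle of F:

```python
def F(x, r=3.2):
    return r * x * (1.0 - x)          # toy map: logistic, has a stable 2-cycle at r = 3.2

def mix(a, b, g):
    return (1.0 - g) * a + g * b      # linear blend with factor g

def solve_cycle(x, g=0.5, iters=1000):
    n = len(x)
    for _ in range(iters):
        # synchronous sweep: every x_i pulled toward F of its predecessor in the loop
        x = [mix(x[i], F(x[(i - 1) % n]), g) for i in range(n)]
    # convergence metric: worst deviation from an exact n-cycle
    residual = max(abs(x[i] - F(x[(i - 1) % n])) for i in range(n))
    return x, residual

cycle, res = solve_cycle([0.5, 0.8])
print(cycle, res)   # on this toy problem the residual drops far below 0.005
```

On a contracting cycle the damping g = 1/2 just slows the geometric convergence; a metric stuck near 0.005 suggests the underlying orbit is only marginally stable.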

in earlier tests I ended up with fixed points, scoring as low as 0.0001

stroboscopic gif

stroboscopic gif

Each iteration is faster, but the global convergence seems slower; maybe I made a mistake in the Jacobian calculations.

Perhaps it would be better to try to solve:

x_1 = F(x_n)
x_2 = F(x_1)
x_3 = F(x_2)
...
x_n = F(x_{n-1})

but doing it naively would give 10TB of dense Jacobian matrix data. It needs sparse methods.
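The Jacobian of the full cycle system G_i = x_i - F(x_{i-1 mod n}) is in fact very sparse: identity blocks on the diagonal and -F' blocks on the (cyclic) subdiagonal. A sketch of Newton's method exploiting that structure, on a hypothetical scalar map rather than the real PDE step (where the blocks would be matrices and the corner coupling needs a block version of the same trick):

```python
def F(x, r=3.2):  return r * x * (1.0 - x)   # toy stand-in for the time step
def dF(x, r=3.2): return r * (1.0 - 2.0 * x)

def newton_cycle(x, iters=20):
    n = len(x)
    for _ in range(iters):
        # residuals G_i = x_i - F(x_{i-1}); Newton step solves J d = -G where
        # J is cyclic lower-bidiagonal: 1 on the diagonal, -F'(x_{i-1}) below.
        g = [x[i] - F(x[(i - 1) % n]) for i in range(n)]
        a = [dF(x[(i - 1) % n]) for i in range(n)]   # negated subdiagonal entries
        b = [-gi for gi in g]
        # O(n) solve: write d_i = p_i*t + q_i with unknown t = d_0,
        # sweep forward with d_i = a_i*d_{i-1} + b_i, then close the loop.
        ps, qs = [1.0], [0.0]
        for i in range(1, n):
            ps.append(a[i] * ps[-1])
            qs.append(a[i] * qs[-1] + b[i])
        t = (a[0] * qs[-1] + b[0]) / (1.0 - a[0] * ps[-1])
        d = [ps[i] * t + qs[i] for i in range(n)]
        x = [x[i] + d[i] for i in range(n)]
    res = max(abs(x[i] - F(x[(i - 1) % n])) for i in range(n))
    return x, res
```

The closing denominator 1 - prod(F') is one minus the cycle's stability multiplier, so the solve only degenerates exactly when the orbit is neutrally stable.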

The analytic derivatives make it 2-3x faster than the finite-differences version, so it was worth the trouble of working out the formulas.

I added a Jacobian matrix of analytic derivatives for the Newton's method solver, but the first time I forgot to take into account the derivatives of the max-min division, so it exploded to infinity. After dinner I got it right; it seems to be converging a bit better now.
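The missing term is just the quotient rule: with spread s(x) = max(x) - min(x) and normalized residual r_k = g_k/s, the Jacobian picks up a rank-one correction, dr_k/dx_j = (dg_k/dx_j)/s - (g_k/s^2)(ds/dx_j), where ds/dx_j is +1 at the argmax, -1 at the argmin, 0 elsewhere. A sketch with a hypothetical stand-in residual `g` (not the real reaction-diffusion map), checked against finite differences:

```python
def g(x):                      # stand-in residual, not the real RD map
    return [xi * xi for xi in x]

def dg(x):                     # its (diagonal) Jacobian: d(x_i^2)/dx_i = 2*x_i
    n = len(x)
    return [[2.0 * x[i] if i == j else 0.0 for j in range(n)] for i in range(n)]

def r(x):
    s = max(x) - min(x)
    return [gi / s for gi in g(x)]

def jac_r(x):
    n = len(x)
    s = max(x) - min(x)
    imax, imin = x.index(max(x)), x.index(min(x))
    G, dG = g(x), dg(x)
    J = [[dG[k][j] / s for j in range(n)] for k in range(n)]
    for k in range(n):
        J[k][imax] -= G[k] / (s * s)   # ds/dx is +1 at the argmax
        J[k][imin] += G[k] / (s * s)   # ds/dx is -1 at the argmin
    return J

# finite-difference check of the analytic Jacobian
x, h = [0.2, 0.7, 0.4], 1e-6
J = jac_r(x)
for j in range(3):
    xp = list(x); xp[j] += h
    fd = [(a - b) / h for a, b in zip(r(xp), r(x))]
    assert all(abs(fd[k] - J[k][j]) < 1e-4 for k in range(3))
```

Dropping the rank-one correction leaves the solver with a Jacobian that ignores how steps change the spread, which is exactly the kind of error that sends Newton off to infinity near flat states where s is tiny.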

Meanwhile I left the earlier version (with finite-difference numerical derivatives) running; the best output it gave was not quite seamlessly looping, and was mostly static anyway. I guess I have to figure out another hack to stop it converging on a fixed point and make it converge on an interesting cycle instead...

I'm trying to find repeating patterns in a reaction-diffusion system using GSL's multidimensional root solver.

I figured out a hack to try to stop it converging on a constant featureless image (divide the target by the max minus min of the variables).
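The idea behind the hack, sketched with made-up numbers: as the field flattens, the spread max - min goes to zero, so the same raw residual scores much worse on a flat state, and a constant image stops looking like a solution to the root solver:

```python
def spread(x):
    return max(x) - min(x)

def residual(g, x):
    # normalized residual: raw residual divided by the spread of the state
    return [abs(v) / spread(x) for v in g]

g = [1e-3, -1e-3, 5e-4]               # small raw residual, near "convergence"
textured = [0.1, 0.9, 0.5]            # spread 0.8: normalized residual stays small
nearly_flat = [0.500, 0.502, 0.501]   # spread 0.002: same g now scores ~400x worse

print(max(residual(g, textured)))
print(max(residual(g, nearly_flat)))
```

The flip side, as noted below, is that nothing in the normalization keeps the state inside the 0..1 validity range.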

It tends to give a starting image outside the 0..1 validity range, which evolves to a featureless image despite my efforts.

So far only failure to report on this one...

I saw a call: Vector Festival (Inter/Access Toronto) 2019 submissions open until 1st February 2019

I saw a call: Nanyang Technological University (Singapore) Global Digital Art Prize "Fourth Industrial Revolution" submissions open until 15th February 2019

. - training neural networks

o - training generative adversarial neural networks

O - training generative adversarial neural networks to recognize their own weights

I dramatically miscalculated the number of weights: the true figure is 2920

Or maybe I used a different momentum value: with 0.95 it overshoots badly, while with 0.5 I get to 3% error in around 8 minutes consistently.
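The overshoot behaviour shows up even on a toy quadratic (the real network loss is of course different; the learning rate and step counts here are invented). Heavy-ball update: v <- m*v - lr*grad; w <- w + v:

```python
def run(momentum, lr=0.1, steps=100, w0=1.0):
    # minimize f(w) = w^2 / 2 (gradient is w) with momentum SGD
    w, v = w0, 0.0
    tail = []
    for _ in range(steps):
        v = momentum * v - lr * w
        w = w + v
        tail.append(abs(w))
    return max(tail[-10:])      # worst error over the last 10 steps

print(run(0.5))     # settles to near machine precision
print(run(0.95))    # still ringing after the same number of steps
```

With momentum 0.95 the effective decay per step is much closer to 1, so the iterate oscillates around the minimum for a long time, which matches the "overshoots badly" observation.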

