This abstraction is in the examples/abstractions directory. I just realised that my local osc_abs path is prepended to the abstraction name. Just remove that and it should load.

Ah yes - but then the omniFilter_abs~ abstraction is also not present. Also, the [command] object doesn't load - which library does it come from?

Strangely, in the 06 and 07 examples, [neuralnet~ models/audio_autoencoder~ encoder] doesn't create, but [neuralnet models/audio_autoencoder~ encoder] does. (I'm on Windows.)

Well, using a neural network boils down to the training dataset that you'll assemble. Get as many input/output combinations as you can. Then you'll have to choose the right structure and the activation and loss functions, plus an optimizer (although the latter is usually Adam).
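To make those steps a bit more concrete, here is a rough sketch in PyTorch - not the [neuralnet] object's message syntax, and the input/output sizes are invented - of the same decisions: network structure, activation, loss function, and the Adam optimizer.

    # Illustration only: PyTorch, not Pd/[neuralnet]. Sizes are assumptions.
    import torch
    import torch.nn as nn

    N_INPUTS = 15      # assumed: 5 fingers x 3 velocity components per frame
    N_CLASSES = 4      # assumed: number of gestures to recognize

    model = nn.Sequential(
        nn.Linear(N_INPUTS, 32),   # structure: two hidden layers, 32 and 16 units
        nn.ReLU(),                 # activation function
        nn.Linear(32, 16),
        nn.ReLU(),
        nn.Linear(16, N_CLASSES),  # one output per gesture class
    )

    loss_fn = nn.CrossEntropyLoss()                             # loss for classification
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)   # Adam optimizer

    def train_step(x, y):
        # one training step on a batch of (input, target) pairs from your dataset
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        optimizer.step()
        return loss.item()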

Your question is a bit vague, and explaining how to set up a neural network in an email is not an easy task, especially for me, since I'm not an expert (even though I coded this library).

That's true. In this context, it would involve getting 1- (or 2-)dimensional data and detecting a pattern over time (probably between 0.1 and 1.5 seconds).
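As a sketch of what that could look like (the frame rate and window length below are assumptions, not something from this thread): you'd cut the stream of sensor frames into fixed-length windows covering the longest gesture, and flatten each window into one input vector for a network like the one above.

    # Sketch of turning a stream of Leap Motion frames into fixed-length examples.
    # Assumed: frames arrive at ~100 Hz; a gesture lasts roughly 0.1-1.5 s.
    import numpy as np

    FRAME_RATE = 100                  # assumed sensor rate in Hz
    WINDOW = int(1.5 * FRAME_RATE)    # 150 frames, covering the longest gesture

    def make_example(frames):
        """frames: (n_frames, n_features) array for one gesture; returns a flat vector."""
        frames = np.asarray(frames, dtype=np.float32)
        if len(frames) >= WINDOW:
            frames = frames[:WINDOW]                  # crop long gestures
        else:
            pad = np.zeros((WINDOW - len(frames), frames.shape[1]), np.float32)
            frames = np.vstack([frames, pad])         # zero-pad short ones
        return frames.reshape(-1)                     # 1-dimensional input vector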


Cheers

On 9/15/24 23:37, João Pais wrote:

Hi, is the patch osc_abs/fm_3 missing from the package?

I'm looking for a way to make a model for the Leap Motion to recognize gestures (coming from combinations of x/y/z or velocity vectors for each finger, for example). Would you advise using this library for this?

Best,

João


[neuralnet] update! Version 0.3 has just been released!

  • New activation functions added
  • Access to the internal structure of a trained network (e.g. the latent space)
  • Storing weights and biases during training for visualization
  • Save models during training
  • Signal-rate version of the object!
  • Audio autoencoder example added!

Binaries for Linux, Raspberry Pi 3, 4, and 5, macOS, and Windows (thanks to Ben Wesch for macOS and Windows) are available through deken.
Sources are available on GitHub: https://github.com/alexdrymonitis/neuralnet
Thanks to Ben Wesch, Dan Wilcox, IOhannes m zmoelnig, Christof Ressi, and others!

Enjoy!