I find the best way to think of scaling is as a linear mapping (see attached bmp)...
then you just work out the scale factor of the mapping, ie (range 2)/(range 1)
{or ((max 2)-(min 2))/((max 1)-(min 1)) }
and then the offset. a linear mapping is just the linear function y = mx + c, so you've worked out m from the scale factor, and c from the offset.
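(same thing as a quick python sketch, if that's any clearer - the function and variable names are just mine:)

    def linear_map(x, min1, max1, min2, max2):
        # scale factor m = (range 2)/(range 1)
        m = (max2 - min2) / (max1 - min1)
        # offset c chosen so that min1 lands on min2
        c = min2 - m * min1
        # y = mx + c
        return m * x + c

    # e.g. map a -1..1 signal value onto 0..127
    print(linear_map(0.0, -1.0, 1.0, 0.0, 127.0))   # -> 63.5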
BTW eric, I'll probably get those modified rhythm estimators to you soon, just that my home pooter is absolutely up sh1t creek...
cheers!
matt
-=-=-=-=-=-=-=-=-=-=-=-=- http://www.loopit.org/ -=-=-=-=-=-=-=-=-=-=-=-=-
good morning,
instead of writing to an array, you can get the value of an audio signal more directly, and convert it to a control signal. see [snapshot~] or [avg~] and [unsig~] (which i think are in the zexy library)
here's one idea on how you could go about it. it creates a sinewave LFO using [osc~] based on BPM, [snapshot~]'s it every 5 msecs, scales the value to something [ctlout] can use (0 - 127), and outputs it.
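(in case the patch doesn't come through, here's the same logic as a rough python sketch - treating the LFO as one cycle per beat is my assumption, since the post only says "based on BPM"; [osc~] itself is a cosine whose output runs -1 to 1:)

    import math

    def lfo_to_ctl(bpm=120.0, interval_ms=5.0, n_steps=10):
        # one LFO cycle per beat (assumed), so frequency in Hz is bpm/60
        freq = bpm / 60.0
        for i in range(n_steps):
            t = i * interval_ms / 1000.0             # seconds, like [snapshot~] every 5 ms
            x = math.cos(2.0 * math.pi * freq * t)   # [osc~]-style value in -1..1
            ctl = round((x + 1.0) * 127.0 / 2.0)     # scale -1..1 onto 0..127 for [ctlout]
            print(ctl)

    lfo_to_ctl()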
you might want to check my math on the scaling. ugh. algebra. i should have paid more attention in junior high.
and i thought i'd never say it. algebra serves a purpose? i guess you win this time, junior high school math teacher.
--eric