Hi all,
Looking into this once again (I've had this problem for 10 years or more now) I just found out that gcc has a -ffast-math flag that prevents denormals from slowing the code down, as long as the CPU has SSE instructions. I don't know whether the Geode does or not, though!
On Linux, at any rate, you can type CFLAGS="-ffast-math -O6" ./configure
at the appropriate moment when compiling Pd (gcc treats any optimization level above -O3 as -O3, so -O3 is the practical maximum). I'm not sure how this will play out on Windows, though.
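For what it's worth, the way -ffast-math accomplishes this on gcc is by linking in startup code that sets the flush-to-zero (FTZ) and denormals-are-zero (DAZ) bits in the SSE control register. You can set those two bits by hand without taking on the rest of -ffast-math's liberties. A minimal sketch using the standard Intel intrinsics headers (this assumes an SSE-capable CPU; the DAZ bit is missing on the very earliest SSE chips, and neither bit affects x87 code):

    #include <xmmintrin.h>   /* _MM_SET_FLUSH_ZERO_MODE (SSE) */
    #include <pmmintrin.h>   /* _MM_SET_DENORMALS_ZERO_MODE */

    /* Call once per thread, before any DSP runs.  FTZ flushes denormal
       results to zero; DAZ treats denormal inputs as zero. */
    static void disable_denormals(void)
    {
        _MM_SET_FLUSH_ZERO_MODE(_MM_FLUSH_ZERO_ON);
        _MM_SET_DENORMALS_ZERO_MODE(_MM_DENORMALS_ZERO_ON);
    }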
Pd's code is shot through with special tests that try to catch floating point values and keep them from turning into denormals, but apparently I haven't found every possible way they can come up.
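The tests in question are along the lines of the PD_BIGORSMALL macro in m_pd.h: look only at the high exponent bits of the IEEE-754 single, and if the value is headed for denormal (or overflow) territory, zero it out. A sketch of the idea, not the verbatim macro:

    #include <stdint.h>
    #include <string.h>

    /* If the two high exponent bits are both 0, the magnitude is below
       roughly 1e-19 (well on its way to denormal); if both are 1, it is
       above roughly 1e19, infinite, or NaN.  Either way it is safe for
       audio purposes to replace the value with zero. */
    static int big_or_small(float f)
    {
        uint32_t bits;
        memcpy(&bits, &f, sizeof bits);        /* portable type-pun */
        uint32_t top = bits & 0x60000000u;
        return top == 0u || top == 0x60000000u;
    }

    /* typical use in a perform routine: if (big_or_small(x)) x = 0; */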
cheers Miller
On Sun, Nov 02, 2008 at 10:07:13PM +0000, errordeveloper@gmail.com wrote:
On Fri, Oct 31, 2008 at 04:27:08PM -0400, Bill Gribble wrote:
I have a patch of medium complexity, with a handful of instruments~ and a bunch of sequencing and arranging-type message handling. On my speedy Intel laptop it has no problem and barely notches the CPU usage. However, when I run this patch on my teeny Geode-based UMPC it pegs CPU at 100%.
i wouldn't expect a lot from a geode machine ;))
I'm pretty sure this is a denormal issue. There are a grand total of maybe 5 noise~, 5 osc~, 10 vline~, 5 lop~, and 1 delay line in the whole patch, and not much else besides message processing... I wouldn't expect this to run me out of compute power.
pardon, but what does the word 'denormals' mean? ..never heard it
Any hints on how to isolate where the denormals might be popping up? I have looked for signal-processing loops, and the only ones I create are around the delay line (feedback) and, I suppose, in the IIR implementation of lop~.
Any help appreciated, Bill Gribble
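To make "denormals" concrete: they are IEEE-754 floats too small for the normal exponent range (below FLT_MIN, about 1.2e-38 for singles), and many CPUs handle them in microcode at a small fraction of normal floating-point speed. Feedback loops like the delay line and lop~'s one-pole IIR are exactly where they show up: as a sound decays toward silence, the recirculating values shrink into denormal range and then churn there indefinitely. A minimal sketch of that decay (the multiply stands in for an IIR tail; the names are illustrative):

    #include <stdio.h>
    #include <float.h>   /* FLT_MIN: smallest normal positive float */

    int main(void)
    {
        float y = 1.0f;
        const float a = 0.5f;    /* feedback coefficient */
        for (int i = 1; i <= 200; i++)
        {
            y *= a;              /* one "sample" of the decaying tail */
            if (y != 0.0f && y < FLT_MIN)
            {
                printf("denormal after %d samples: %g\n", i, y);
                return 0;
            }
        }
        return 0;
    }

With flush-to-zero enabled (or with Pd's exponent tests in place) the value would simply become 0 instead of lingering as a slow denormal.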
Pd-list@iem.at mailing list UNSUBSCRIBE and account-management -> http://lists.puredata.info/listinfo/pd-list