Hi list~
I was wondering if anyone had used Processing (http://processing.org/)
for interface work or algorithm programming. It is made by some
people at MIT and is based on Java. It has libraries for UDP
and OSC communication, as well as a lot of video functionality. I have just
grown curious about it, so any comments from experienced users would be
appreciated.
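From what I can tell (I haven't tried it myself), receiving OSC in a Processing sketch looks roughly like the following. It assumes the third-party oscP5/netP5 libraries, and the port and address pattern are made up, so treat it as a sketch rather than a recipe:

// minimal OSC-receive sketch for Processing, assuming the oscP5/netP5 libraries
import oscP5.*;
import netP5.*;

OscP5 osc;

void setup() {
  size(320, 240);
  osc = new OscP5(this, 9001);     // listen for OSC over UDP on port 9001
}

void draw() {
  background(0);                   // nothing visual yet, just keeps the sketch running
}

// oscP5 calls this whenever an OSC message arrives
void oscEvent(OscMessage m) {
  if (m.checkAddrPattern("/pd")) {
    println("got /pd: " + m.get(0).floatValue());
  }
}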
~Kyle
--
http://perhapsidid.blogspot.org
Hi list,
applying textures to a newWave geo is a bit like a
game of chance: it sometimes works okay, but most of
the time it doesn't. The object tangles up Gem in such a way
that it no longer even applies the same texture to a
simple geo (like a cube) in the right way anymore;
the whole render chain gets twisted somehow ... It
looks like the texture is being blown up by an
enormous amount.
I fooled around with different texture sizes (power of
2 / non-power-of-2), newWave's creation arguments and the
grid size, but their influence on this behaviour seems
to be erratic at best.
It would be really great if someone could come up with
a workaround that cures this behaviour - I really like
newWave's look ...
thanks in advance,
Thoralf.
P.S.: Strangely enough, if this happens with one render chain
(say, DV cam -> newWave), activating another one (USB ->
a different newWave) makes everything straight again.
hi list...
here is the question:
for quite some time I have been working on a project which simply generates b/w lines
in DV resolution and higher. Those lines are MIDI-controlled from a Logic
environment. Now that the stuff finally looks the way I want, I am looking
for a way to capture it. Getting it on tape in DV resolution is an easy task,
but most parts of it only work out in higher resolutions (HD). If I
want to capture at resolutions like 1280 x 720 or even 1920 x 1080, I need
to use pix_write -- and this, of course, does not work in realtime.
I thought of a little workaround which could work out:
in a first run I run the stuff without any video capturing; only the MIDI
data is captured to a kind of timeline which is rasterized to 25
fps. In a second run the captured MIDI data is played back by a player
which advances one frame each time pix_write has finished rendering one
frame. I think this would make for a nice off-line rendering
system, somehow... Has anyone ever tried to build a system like this? Does
anybody have a tip or a clue how to do it?
I think something like that might be useful for a lot of users anyway.
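To make the idea concrete, here is a rough sketch of the second pass in plain Java (nothing Pd-specific; loadTimeline(), applyMidi(), renderFrame() and writeFrame() are hypothetical stand-ins for the captured MIDI data, the patch, the Gem chain and pix_write):

import java.util.*;

public class OfflineRender {
    public static void main(String[] args) {
        // frame index (at 25 fps) -> MIDI events recorded into that slot in pass one
        Map<Integer, List<int[]>> timeline = loadTimeline();
        int lastFrame = Collections.max(timeline.keySet());

        for (int frame = 0; frame <= lastFrame; frame++) {
            for (int[] midi : timeline.getOrDefault(frame, List.of())) {
                applyMidi(midi);                // set the line parameters for this frame
            }
            byte[] pixels = renderFrame(frame); // render at full HD resolution, as slowly as needed
            writeFrame(frame, pixels);          // returns only when the frame is on disk
            // only now does the "player" step one frame forward
        }
    }

    static Map<Integer, List<int[]>> loadTimeline() {
        Map<Integer, List<int[]>> t = new HashMap<>();
        t.put(0, List.of(new int[]{176, 1, 64}));   // e.g. one controller value captured at frame 0
        return t;
    }
    static void applyMidi(int[] msg)         { /* update the drawing state */ }
    static byte[] renderFrame(int frame)     { return new byte[1920 * 1080 * 3]; }
    static void writeFrame(int f, byte[] px) { /* write e.g. one numbered image file */ }
}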
regards
wolfgang
hello list!
I'd like to be able to downsample to 8000 Hz in a sub-patch in order to
save a file at that rate and encode it in the "speex" format (Speex is
optimised to deal with 8k files). Because block~ and switch~ specify the
downsampling factor in reciprocal form (e.g. 0.5 is a factor of 2, 0.25 a
factor of 4, etc.), I find I'm stuck specifying a factor of 6 (running Pd
at 48k, a 6-fold downsample gives me 8k). Because the reciprocal of 6
is a recurring number and there's no quantising of values in the source
to find the nearest multiple of 2, errors result when trying to specify
a 6-fold downsample (e.g. with a number like 0.16666).
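To spell out the arithmetic with a toy example (plain Java, nothing Pd-specific; it only shows how snapping the implied factor to the nearest multiple of 2 would recover an exact 6):

public class Factors {
    public static void main(String[] args) {
        double typed = 0.16666;                           // the kind of value I end up typing
        System.out.println(1.0 / typed);                  // roughly 6.00024, not exactly 6
        long snapped = 2 * Math.round(1.0 / typed / 2.0); // quantise to the nearest multiple of 2
        System.out.println(snapped);                      // 6
        System.out.println(48000.0 / snapped);            // 8000.0, the rate I'm after
    }
}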
Can anyone see a solution to this, besides altering the source or using
a utility such as sox to do the conversion to 8k (which I'm doing
presently)? Would it be useful to anyone else if block~ and switch~
could quantise to multiples of 2? I.e., could this be included in
subsequent Pd releases?
cheers, iain
_________
Iain Mott
www.reverberant.com
Hi list.
I want to do a bit of chroma keying with Gem, but the pix_chroma_key object
is not what I need.
pix_chroma_key mixes two pixes together based on a chroma value.
I need to transform a certain chroma range into transparent pixels so
that certain parts of the texture are see-through.
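To illustrate, the per-pixel operation I'm after is roughly this (plain Java over an ARGB buffer, not Gem code; the key colour and tolerance values are just placeholders):

public class ChromaToAlpha {
    public static void main(String[] args) {
        int[] pixels = new int[320 * 240];      // pretend this is the texture
        int keyR = 0, keyG = 255, keyB = 0;     // placeholder key colour (green)
        int tolerance = 40;                     // placeholder range around the key

        for (int i = 0; i < pixels.length; i++) {
            int r = (pixels[i] >> 16) & 0xff;
            int g = (pixels[i] >> 8) & 0xff;
            int b = pixels[i] & 0xff;
            boolean inRange = Math.abs(r - keyR) < tolerance
                           && Math.abs(g - keyG) < tolerance
                           && Math.abs(b - keyB) < tolerance;
            int alpha = inRange ? 0 : 255;      // transparent inside the chroma range
            pixels[i] = (alpha << 24) | (r << 16) | (g << 8) | b;
        }
    }
}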
Any suggestions?
Tom
No luck with Motion JPEG A or B on OS X, as Yves said.
Photo JPEG loads the movie (frames are read) and then it crashes.
This is on two separate machines.
HTH
On Apr 10, 2005, at 7:02 PM, Tati de la O wrote:
> Hi
> I think Yves said that you should load .mov movies encoded with the Motion
> JPEG codec instead of the Sorenson codec, which libquicktime often can't
> handle....
>
> transcode -Z 320x240 -f 25 -y mov -F jpeg -i original-movie.avi -o
> ready-for-pd-movie.mov
>
>
> :::t
>
>
>
> On Sun, 10-04-2005 at 12:34 -0400, Patrick Pagano wrote:
>> I can't seem to get a Photo JPEG QuickTime to load;
>> it segfaults after about ten seconds.
>> Is there any sample I could try?
>>
>>
yeah, that's a classic...
only Photo JPEG works
(it's not our fault, it's what libquicktime supports)
cheers,
sevy
Patrick Pagano wrote:
> I am getting this error when loading QuickTime movies on OS X
>
> I have libquicktime
>
> libquicktime cannot load plugins out of the sourcetree
> pdp_yqt: unsupported video codec
>
> quicktime_delete_vcodec_stub called
>
>
I am getting this error when loading QuickTime movies on OS X.
I have libquicktime.
libquicktime cannot load plugins out of the sourcetree
pdp_yqt: unsupported video codec
quicktime_delete_vcodec_stub called
Hi, does anyone know if it is possible to turn the menu bar at
the top on/off?
I'm running Pd/Gem on OS X and I want to output a stereo screen to two
projectors. I can't seem to get rid of the menu bar on window 1. A bit
annoying.
Thanks,
Timon.