Hi,

it is possible to use GLSL; I would recommend looking at [glsl_vertex], [glsl_fragment] and [glsl_program]. You don't have to bind any variables or textures yourself; that is done automatically by [glsl_program]. If you use more than one texture, you have to send the texture a message like [texunit 1(. At the moment there is a problem with multitexturing: you cannot use gl_MultiTexCoord1 or higher, but I am sure that will be fixed in the future. For now you would use only texcoord0.

If you want to chain several shaders, you have to use [gemframebuffer] and pass the resulting texture on to the next piece in the chain: the right outlet of [pix_texture] gives you the ID of the texture on the GPU, and you can feed that into another texture. I think nobody has done that before, and there is no documentation, but I hope to figure it out soon myself.

cyrille has some shader examples online, and I put some online too, but they are only tests: http://www.parasitaere-kapazitaeten.net/Pd/4shaders

marius.
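To make the texcoord0 limitation concrete, here is a minimal vertex/fragment shader pair; it is only a sketch, not taken from GEM's examples, and the uniform name "tex0" is my own choice (in GEM you would set a uniform by sending a message with its name to [glsl_program]):

    // vertex shader: pass only texture coordinate set 0 through,
    // since gl_MultiTexCoord1 and higher are not usable at the moment
    void main()
    {
        gl_TexCoord[0] = gl_MultiTexCoord0;
        gl_Position    = gl_ModelViewProjectionMatrix * gl_Vertex;
    }

    // fragment shader: sample the texture bound to unit 0
    uniform sampler2D tex0;   // assumed name; set it to the texture unit number from Pd
    void main()
    {
        gl_FragColor = texture2D(tex0, gl_TexCoord[0].st);
    }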
Julian Villegas wrote:
Hi,
I'm interested in reimplementing some externals that I've made for Pd. The idea is to take advantage of the graphics card's power to deal with parallel data and boost the performance of the externals. Have you guys done that before? I'm planning to use GLSL, the OpenGL Shading Language, for compatibility. My problem is that I don't know how to do it, and I'm looking for some help or orientation from those among you who have experience in this matter.
Thanks in advance.
PS: I'm posting this to pd-list and pd-dev-list, and I'm sorry for multiple instances of this email.
Julian Villegas
I asked myself pensively: what does it mean to be Colombian? I don't know, I answered. It is an act of faith. JLB.