You would copy the old frame into a different texture and then re-input it into the shader.
Shaders address textures via texture units. You would have your new frame coming in from texture unit 0, and the old frame (the output of the shader one step earlier) copied to texture unit 1. You would then have two samplers in your shader and do with them whatever you wanted. You bind the texture to a texture unit prior to calling the shader. In GEM you do this with the texunit message; in Jitter you do it in the XML file that describes all of your variables and the vertex/fragment program.
If you wanted to take the average it would be:
< setup code cut out>
vec4 currentframe = texture2DRect(texture0sampler, texcoord0);
vec4 previousframe = texture2DRect(texture1sampler, texcoord1);
gl_FragColor = (currentframe + previousframe)/2.0;
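For reference, here is a fuller sketch with the cut setup spelled out (the sampler and varying names are assumptions, not the original setup code):

// sketch only: assumed uniform/varying names
// (may need: #extension GL_ARB_texture_rectangle : enable)
uniform sampler2DRect texture0sampler; // new frame, bound to texture unit 0
uniform sampler2DRect texture1sampler; // previous output, bound to texture unit 1
varying vec2 texcoord0;
varying vec2 texcoord1;

void main (void)
{
    vec4 currentframe  = texture2DRect(texture0sampler, texcoord0);
    vec4 previousframe = texture2DRect(texture1sampler, texcoord1);
    // simple two-frame average
    gl_FragColor = (currentframe + previousframe) / 2.0;
}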
Handling the copying of the output of this to a texture buffer happens outside of GLSL, in your application, be it Jitter, GEM, VVVV or whatnot.
And btw, fragment shaders run once for every texel or fragment (a
texture mapped pixel), not once per "frame" :)
On Jan 7, 2008, at 1:37 AM, Batuhan Bozkurt wrote:
Thanks vade, your help is greatly appreciated. I'm happy to see that things like this are possible with shader programming. Looking through the OpenGL shader book and various shader tutorials online, I've actually found out how to sample a texture and access the r, g, b, a values with dot notation. But as I'm not very familiar with the process yet, I've been questioning how to buffer and access past values in shader programming.
I'm not very bright when it comes to programming, so it appeared to me that a shader program is simply a piece of code that is iterated for every new frame(?) from beginning to end and produces an output based on a given functional structure. I was thinking that the state of the variables is reinitialised each time (except for uniforms that can be passed as arguments?), so I'm curious how I would store values to some depth (like maybe a static variable in C functions, so its value is kept in a new iteration), or maybe hold the past five values in an array or something (like the Bucket object in Max or the cyclone lib). Without being able to keep values I can't achieve processes that have a past memory. So
vec4 mytexture = texture2DRect(textureSampler, textureCoordinate);
would sample the current frame, but when the new frame comes into play, it would contain the new frame, right? How would I go about accessing the old data? Sorry if this is a fundamental thing about shader programming, but I think it is a very simple process if it is really possible. So maybe you have something to say about it.
Thanks!
BB
vade wrote:
Hi
You can completely do what you want on the GPU with a shader or shaders. In fact, the very shader you describe exists from Cycling74 and is included with Jitter.
As for dealing with color planes, the RGBA values, it's VERY easy to do with shaders. You simply sample a texture with:
vec4 mytexture = texture2DRect(textureSampler, textureCoordinate);
and then you can address your individual vector components with
mytexture.a, mytexture.r, mytexture.g, mytexture.b
each as a floating point value.
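For instance, a hypothetical one-liner using those components (just an illustration, not from any of the shaders above):

// e.g. collapse the sampled color to greyscale using the usual luma weights
float luma = 0.299 * mytexture.r + 0.587 * mytexture.g + 0.114 * mytexture.b;
gl_FragColor = vec4(luma, luma, luma, mytexture.a);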
:)
On Jan 7, 2008, at 12:28 AM, Batuhan Bozkurt wrote:
This is a very elegant solution for a delay effect, I should say. I never knew something called a "3D texture" exists, but it completely makes sense. I've searched through the documentation and the list for GEM support for such a thing, but no luck yet. Thanks for the shader and the idea anyway!
The problem here is that I actually need a more generalised framework for working with time and past frames (or pixels) when dealing with time-based effects. This vertex shader approach with a 3D texture and a 2D map probably would not help, let's say, if I wanted to apply an averaging lowpass filter (depending on past pixel values) to the R, G and B values of each pixel separately.
I'd like to know if there is any way to hold values back in buffers (like arrays) at the shader level and operate on that data in shader programming, so maybe I could say "get the current value, average it with the previous value and set it as the new current value" or something like that. If there is a way, then I'd like to dig more deeply into it; but if this question looks very stupid, then I'm probably getting the concept of the graphics pipeline and the purpose of the shading language completely wrong, so maybe I should stop wasting time thinking like this and try to find other approaches.
BB
vade wrote:
This is entirely possible; however, you would want to use a 3D texture, something on the order of dimensions 320 by 240 by x (where x is how many frames of time 'back' you want to go). This will be relatively heavy video-memory-wise, I would imagine. However, I have a shader for you that does this.
Absolutely no idea whatsoever if GEM can deal with 3D textures. I'd assume, since it has low-level OpenGL support, that it should be able to, but how to get 2D video frames into that 3D texture via pix_xxx I have no idea.
Here is the shader, written by Andrew Benson from Cycling74 iirc.
This is sans vertex shader, but the vertex shader is basically a passthrough, so it's very simple.
texa is a 2D lookup table (greyscale), where the luminance of the pixel at that point determines how far back in time to go. This shader assumes a 512x512 map, but you could make that dynamic by passing a varying variable from the vertex shader that looks at the dims of the 3D texture.
HTH.
varying vec2 texcoord0;
varying vec2 texcoord1;
uniform float slice;
uniform sampler3D texo;
uniform sampler2DRect texa;
const vec4 coeff = vec4(0.299, 0.587, 0.114, 0.);

void main (void)
{
    // assumes a 512 x 512 slice map. Pretty arbitrary...
    float v1 = dot(texture2DRect(texa, texcoord0 * vec2(512.)), coeff);
    vec4 v0 = texture3D(texo, vec3(texcoord0.xy, v1));
    gl_FragColor = v0;
}
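A passthrough vertex shader for it might look roughly like this (a sketch under the assumption of standard texture matrices, not Andrew's original):

// minimal passthrough vertex shader (sketch)
varying vec2 texcoord0;
varying vec2 texcoord1;

void main (void)
{
    // hand the texture coordinates through unchanged
    texcoord0 = vec2(gl_TextureMatrix[0] * gl_MultiTexCoord0);
    texcoord1 = vec2(gl_TextureMatrix[1] * gl_MultiTexCoord1);
    // standard vertex transform
    gl_Position = ftransform();
}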
On Jan 6, 2008, at 4:21 AM, Batuhan Bozkurt wrote:
Hello,
Let me start off by saying that I don't have much experience with computer graphics, so my knowledge is very limited in this area; I'm more of an audio guy, but I have some ideas in mind that would apply well to graphics. I just want to experiment.
I want to give a simple example to show my question. Suppose I have a video running and I want to delay each pixel separately, between 0 and some maximum time. This is the most basic time-based effect I can think of. So for a 320x200 video, for example, I would need 64000 delaying units running separately, and to be able to do this in realtime, I think I'd need to use the GPU for computation. So I'm thinking of using pixel shaders, but I'm not really sure whether this is possible with them. I grabbed the orange book to get an idea of the process; I did not have much time to look at it in detail, but I could not find any references to such an operation, which made me think that I'm on the wrong track. So before going any further learning GLSL, I'd like to have your ideas on this.
Is GLSL along with GEM a nice way to do such an operation? Is it even possible to do it in realtime? If GPU-powered realtime operation is not possible, is there any tool that you know of that is capable of doing such things (realtime or offline)? I have many ideas for processes that modify pixels depending on the states of pixels before them (i.e. with memory), and I'm trying to find a way to implement them. Any help is appreciated.
Thanks
BB