Dear list!
Is it possible to run two pixel shaders right after each other? How? With the first I would like to do some background subtraction, and with the second some erosion filtering. Or can it be done in one? Has any of you implemented a morphology filter as a shader in Pd?
best,
jonas
hello,
you can render the 1st shader into a framebuffer and use the result as a texture when rendering the 2nd shader. Have a look at the examples in the 10.glsl folder for more information.
cheers C
On 07/07/2017 at 01:09, hi via Pd-list wrote: [...]
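A minimal sketch of what the second pass could look like under that approach, assuming the first shader's output has been rendered into a framebuffer and is bound as a rectangle texture for the second pass (the uniform name firstPass and the fixed 3x3 neighbourhood are placeholders for illustration, not names taken from the 10.glsl examples):

#extension GL_ARB_texture_rectangle : enable
uniform sampler2DRect firstPass; // result of the 1st shader, via the framebuffer

void main (void)
{
    vec2 pos = (gl_TextureMatrix[0] * gl_TexCoord[0]).st;

    // 3x3 erosion: keep the darkest value found in the neighbourhood
    vec3 minValue = vec3(1.0);
    for (int i = -1; i <= 1; i++) {
        for (int j = -1; j <= 1; j++) {
            // rectangle textures are addressed in texels, so the step is 1.0
            vec3 tmp = texture2DRect(firstPass, pos + vec2(float(i), float(j))).rgb;
            minValue = min(tmp, minValue);
        }
    }
    gl_FragColor = vec4(minValue, 1.0);
}

On the patch side this corresponds to what the reply above describes: render the first shader inside the framebuffer and texture the second pass's geometry with that framebuffer's output, as in the 10.glsl examples.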
I made a shader, and I don't understand why it does not work. I used Hias' background-subtraction example provided with the Kinect examples and modified it with an erosion part.
I will post my shader here; maybe somebody can easily spot what's wrong.
#extension GL_ARB_texture_rectangle : enable

uniform float xmin;
uniform float xmax;
uniform float ymin;
uniform float ymax;
uniform float zmin;
uniform float zmax;
uniform int N; // size of the eroding element, can be scaled from outside

uniform sampler2DRect sourceTexture; // DEPTHMAP

void main (void)
{
    vec2 pos = (gl_TextureMatrix[0] * gl_TexCoord[0]).st;
    vec4 color = texture2DRect(sourceTexture, pos);
    vec3 real = vec3(0.0, 0.0, 0.0);

    // COMPUTE REAL COORDINATES (in mm)
    // z component
    real.z = color.r * 65536.0 + color.g * 256.0; // depth_mode 4 or 5

    // x component
    float FovH = 1.0144686707507438;
    float XtoZ = tan(FovH / 2.0) * 2.0;
    real.x = ((pos.x / 640.0) - 0.5) * real.z * XtoZ;

    // y component
    float FovV = 0.78980943449644714;
    float YtoZ = tan(FovV / 2.0) * 2.0;
    real.y = (0.5 - (pos.y / 480.0)) * real.z * YtoZ;

    // I used the erosion example from:
    // https://sourceforge.net/p/glmixer/Source/197/tree/trunk/shaders/imageProcessing_fragment.glsl
    int i = 0;
    int j = 0;
    vec3 minValue = vec3(1.0);
    vec3 tmp = vec3(0.0);
    float step_w = 1.0 / 640.0;
    float step_h = 1.0 / 480.0;

    for (i = -(N-1)/2; i < (N+1)/2; i++) {
        for (j = -(N-1)/2; j < (N+1)/2; j++) {
            tmp = texture2DRect(sourceTexture, vec2(pos.s + float(i)*step_w, pos.t + float(j)*step_h)).rgb;
            real.z = tmp.r * 65536.0 + tmp.g * 256.0;

            // take the smaller of the sampled value and minValue;
            // if minValue is 0 for any pixel of the erosion element,
            // it will always stay the smallest through the whole erosion step,
            // and therefore the whole block should be 0
            // -> DOES NOT WORK! Why?
            minValue = min(tmp, minValue);

            // compare values against the depth boundaries
            if ((real.z <= zmin) || (real.z >= zmax))
            {
                minValue = vec3(0.0, 0.0, 0.0);
            }
        }
    }

    if (minValue == vec3(0.0)) {
        gl_FragColor = vec4(minValue, 1.0);
    }
    else {
        gl_FragColor = vec4(color.r, color.g, 1.0, 1.0);
    }
}
On 07/07/2017 at 12:51, cyrille henry ch@chnry.net wrote: [...]
On 07/07/17 12:00, hi via Pd-list wrote: [...]
step_w = 1.0 / 640.0; step_h = 1.0 / 480.0;
[...]
tmp = texture2DRect(sourceTexture, vec2 ( pos.s + float(i)*step_w, pos.t + float(j)*step_h ) ).rgb;
texture2DRect() expects unnormalized texel coordinates in the range [0..size), so your step_w and step_h should be 1.0 to reach neighbouring texels.
See Matias Valdenegro's comment here: https://stackoverflow.com/questions/6736531/where-can-i-find-documentation-o...
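Applied to the shader above, only the step values change; a sketch of the corrected sampling loop, with everything else left as posted:

float step_w = 1.0; // one texel per step: rectangle textures use texel coordinates
float step_h = 1.0;

for (i = -(N-1)/2; i < (N+1)/2; i++) {
    for (j = -(N-1)/2; j < (N+1)/2; j++) {
        tmp = texture2DRect(sourceTexture, vec2(pos.s + float(i)*step_w, pos.t + float(j)*step_h)).rgb;
        // ... depth test and min() as in the original shader
    }
}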
Works now, yeah! Thanks, Claude!
On 07/07/2017 at 13:06, Claude Heiland-Allen claude@mathr.co.uk wrote: [...]