Getting int32 into shader

Hi.

I have an array of 32-bit integers that I want to use to draw objects (one object per value).

I basically do it like in the textureBufferInstancedExample, except that instead of GL_RGBA32F I use GL_RGBA8 as the texture format and reassemble the 32-bit value from the RGBA components in the shader.
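
On the CPU side my setup looks roughly like this (written from memory rather than copied from my project, so names like values, buffer, tex, shader and numInstances are just placeholders, and the calls follow what textureBufferInstancedExample does):

// one 32-bit value per instance
std::vector<uint32_t> values(numInstances);

// upload the raw ints into a buffer object, like in textureBufferInstancedExample
ofBufferObject buffer;
buffer.allocate();
buffer.bind(GL_TEXTURE_BUFFER);
buffer.setData(values, GL_STATIC_DRAW);

// expose the buffer to the shader as a buffer texture;
// with GL_RGBA8 every 32-bit value shows up in texelFetch as 4 normalized bytes
ofTexture tex;
tex.allocateAsBufferTexture(buffer, GL_RGBA8);

shader.begin();
shader.setUniformTexture("tex", tex, 0);
shader.end();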

Here is the vertex shader I use:

#version 150

uniform mat4 modelViewProjectionMatrix;
in vec4 position;
uniform samplerBuffer tex;
out vec4 color;

void main(){
    int id = gl_InstanceID;

    // fetch the four bytes of this instance's value as normalized floats
    vec4 val = texelFetch(tex, id);

    // convert them back to 0..255 byte values
    int r = int(val.x * 255.0);
    int g = int(val.y * 255.0);
    int b = int(val.z * 255.0);
    int a = int(val.w * 255.0);

    // reassemble the 32-bit value from the four bytes and use it as an offset
    float x = float((a << 24) + (b << 16) + (g << 8) + r);

    vec4 vPos = position;
    vPos.x += x;

    color = vec4(1.0, 1.0, 0.5, 1.0);
    gl_Position = modelViewProjectionMatrix * vPos;
}

That looks too complicated, though.

I also tried GL_LUMINANCE32UI_EXT as the format, but then I don't get any values:

vec4 val = texelFetch(tex, id);

int r = val.x;
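
My guess is that with an integer format the sampler in the shader would also have to be an integer sampler, something like this (just a guess, I haven't gotten it to work):

uniform usamplerBuffer tex;  // integer sampler instead of samplerBuffer

// and then in main():
uint val = texelFetch(tex, gl_InstanceID).x;  // the raw 32-bit value, no unpacking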

How would I get those 32-bit integers into the shader efficiently?

Thanks for any advice.
cheers
inx

