
ofBufferObject and async camera/video texture upload

The idea is pretty much the same: when you get a new frame from the camera, bind the buffer as GL_PIXEL_UNPACK_BUFFER, map it into memory as GL_WRITE_ONLY and memcpy the frame's pixels into the buffer. Then, to finally send the pixels to a texture, load the buffer contents into it using ofTexture::loadData as you would normally do with pixels, but passing the ofBufferObject instead. I haven't tested it, but it would be something like:

//.h
ofTexture tex;
ofBufferObject pbo;
ofThreadChannel<unsigned char*> channel;
ofThreadChannel<bool> channelReady;


//setup
tex.allocate(w,h,GL_RGB);
pbo.allocate(w*h*3, GL_STREAM_DRAW); // bytes for an RGB frame
auto dstBuffer = pbo.map<unsigned char>(GL_WRITE_ONLY);
channel.send(dstBuffer);


//video callback
unsigned char * dstPixels;
channel.receive(dstPixels);
memcpy(dstPixels, rawCamPixels, w*h*3);
channelReady.send(true);

//draw
bool ready;
if(channelReady.tryReceive(ready)){
    pbo.unmap();
    tex.loadData(pbo,GL_RGB,GL_UNSIGNED_BYTE);
    auto dstBuffer = pbo.map<unsigned char>(GL_WRITE_ONLY);
    channel.send(dstBuffer);
}

I'm using a channel to send the pixels because, when using a camera or video library, the frame callback usually happens on a different thread. You can also use more than one PBO, in case a single one isn't fast enough, to help hide the latency.

Also, I just checked and a small fix was required to make this work: you will need the latest ofBufferObject from git master, or just apply the changes from https://github.com/openframeworks/openFrameworks/commit/93334123637dbc50aa8ca85cfabde2af8a21b13e

