Okay, so the problem is with 'ofSetColor': it copies the color variable into a new, char-based color. That is where the downscaling to 8 bits happens, i.e. before anything is drawn:
ofFloatColor c = ofFloatColor(r, g, b, a); // four floats in [0, 1]
ofSetColor(c); // c gets converted to an 8-bit char color here!
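A quick sketch of the precision loss this causes, assuming the usual ofFloatColor-to-ofColor conversion (scale by 255, store as unsigned char):

ofFloatColor f(0.00522185, 0.00522185, 0.00522185, 1.0);
ofColor c = f;            // 0.00522185 * 255 ≈ 1.33 is stored as 1
float back = c.r / 255.f; // ≈ 0.00392157 -- the requested value is gone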
Then, further down in the openFrameworks renderer, I found only int-based methods for setting the color! For example, following the setColor call, this is what you get in the base renderer, and even in the programmable renderer:
void ofGLRenderer::setColor(int r, int g, int b, int a){
    currentStyle.color.set(r, g, b, a);
    glColor4f(r/255.f, g/255.f, b/255.f, a/255.f);
    // so they end up as floats here anyway -> the round trip through char is pure precision loss
}
Anyway, adding a float-based method (ofSetColorFloat) throughout the renderers, ofGraphics and the abstract classes, down to the OpenGL calls, gives me the correct result!
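For reference, this is roughly what the new method boils down to at the GL level (my reconstruction, not the actual patch; the real change also has to touch ofGraphics, the base renderer interface and the programmable renderer):

void ofGLRenderer::setColorFloat(float r, float g, float b, float a){
    glColor4f(r, g, b, a); // floats pass straight through, no 8-bit round trip
}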
fbo.begin();
ofFloatColor col = ofFloatColor(requested, requested, requested, 1.0);
ofSetColorFloat(col.r, col.g, col.b, col.a);
ofFill();
ofRect(0, 0, 640, 480);
fbo.end();
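One assumption worth spelling out: for such tiny values to survive readback, the FBO itself has to be allocated with a float internal format, e.g.:

fbo.allocate(640, 480, GL_RGBA32F); // a default 8-bit GL_RGBA target would quantize again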
It's a very ugly hack, but it works; the log confirms the requested values now survive into the FBO:
[notice ] Requested: 0.00522185 received in fbo pixels: 0.00522185
[notice ] Requested: 0.00542421 received in fbo pixels: 0.00542421
So now I finally end up with an FBO holding the correct float values, which means I can dither when converting down to 8 bits to avoid color banding.
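For completeness, a minimal sketch of how that dithering step could look (my own sketch, not from the original code): read the float pixels back, add noise of up to half an 8-bit step, then quantize.

ofFloatPixels src;
fbo.readToPixels(src); // full float precision out of the FBO
ofPixels dst;
dst.allocate(src.getWidth(), src.getHeight(), src.getNumChannels());
for(size_t i = 0; i < src.size(); i++){
    float noise = ofRandom(-0.5, 0.5) / 255.f; // +/- half an 8-bit step
    dst[i] = ofClamp((src[i] + noise) * 255.f + 0.5f, 0, 255); // round & clamp
}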
PS Sorry for spamming!