Sorry, I can only test with a Catalyst driver on my notebook at the moment:
I created a texture with:
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, width, height, 0, GL_RGB_INTEGER, GL_UNSIGNED_INT, NULL);
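For completeness, the full texture setup looks roughly like this (a minimal sketch; the name tex is a placeholder, width and height are as in the call above, and the filtering parameters matter because integer textures are not filterable, so GL_NEAREST is required for the texture to be complete):

GLuint tex;
glGenTextures(1, &tex);
glBindTexture(GL_TEXTURE_2D, tex);
/* Integer formats cannot be linearly filtered; NEAREST keeps the texture complete. */
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGB32UI, width, height, 0,
             GL_RGB_INTEGER, GL_UNSIGNED_INT, NULL);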
A shader writes into it with:
out uvec3 FragColor;
FragColor = uvec3(...);
In this case it writes the values (100, 0, 0) into the color channels.
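The complete fragment shader is essentially this (a minimal GLSL sketch; the #version line is an assumption, and the literal values come from the (100, 0, 0) case above):

#version 150

out uvec3 FragColor; // picks up color attachment 0 with a single output

void main()
{
    // Write plain unsigned integers into the GL_RGB32UI attachment.
    FragColor = uvec3(100u, 0u, 0u);
}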
Then I use an FBO and read the pixels with:
GLuint Pixel[3];
glReadPixels(x, y, 1, 1, GL_RGB_INTEGER, GL_UNSIGNED_INT, Pixel);
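For context, the readback including the surrounding FBO state looks roughly like this (a sketch; the name fbo and the GL_COLOR_ATTACHMENT0 attachment are placeholders for whatever the real setup uses):

GLuint Pixel[3] = {0, 0, 0};
glBindFramebuffer(GL_READ_FRAMEBUFFER, fbo);  /* FBO with the RGB32UI texture attached */
glReadBuffer(GL_COLOR_ATTACHMENT0);           /* the attachment the shader wrote to */
glReadPixels(x, y, 1, 1, GL_RGB_INTEGER, GL_UNSIGNED_INT, Pixel);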
The output is unexpected and looks like this:
printf("%u %d\n", Pixel[0], Pixel[0]);
4294967196 -100

Note that 4294967196 is 0xFFFFFF9C, i.e. the bit pattern of -100 reinterpreted as unsigned, so the driver appears to have written -100 instead of 100.
The same source works flawlessly with the GeForce and OS X drivers.