Channel: AMD Developer Forums: Message List - OpenGL & Vulkan

Bugs on Linux with GL state queries

There are a couple of GL state queries that fail on Linux, namely GL_VERTEX_BINDING_BUFFER and GL_POLYGON_MODE. The former is rejected as an unrecognised enum by glGetIntegeri_v; GL_POLYGON_MODE is recognised, but the driver claims it was removed from the Core profile (it works fine on Compatibility).

This happens for me on the latest drivers available at the time of writing (GL_VERSION is 4.4.13084 Core Profile/Debug Context 14.301.1001), and on various core profile versions; I tested 3.2, 4.2, 4.3 and 4.4.

Some code to show the problem. I just initialised a 4.4 (for example) context with SDL, but I don't think it's specific to that, as I can also repro this in a codebase that calls glXCreateContextAttribs directly:

  printf("GL_VENDOR: %s\n", glGetString(GL_VENDOR));
  printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
  printf("GL_VERSION: %s\n", glGetString(GL_VERSION));

  GLint maj = 0, minr = 0;
  glGetIntegerv(GL_MAJOR_VERSION, &maj);
  glGetIntegerv(GL_MINOR_VERSION, &minr);
  printf("GL_MAJOR/MINOR: %d.%d\n", maj, minr);

  GLuint vertex_array = 0;
  GLuint vertex_buffer = 0;

  glGenVertexArrays(1, &vertex_array);
  glBindVertexArray(vertex_array);

  glGenBuffers(1, &vertex_buffer);
  glBindBuffer(GL_ARRAY_BUFFER, vertex_buffer);
  glBufferData(GL_ARRAY_BUFFER, 128, NULL, GL_STATIC_DRAW);
  glBindBuffer(GL_ARRAY_BUFFER, 0);

  glVertexAttribBinding(0, 0);
  glBindVertexBuffer(0, vertex_buffer, 32, 16);

  printf("vertex_array: %d\n", vertex_array);
  printf("vertex_buffer: %d\n", vertex_buffer);

  GLuint vao_binding = 0;
  GLuint vb_binding = 0;
  GLuint64 vb_offset = 0;
  GLuint64 vb_stride = 0;

  glGetIntegerv(GL_VERTEX_ARRAY_BINDING, (GLint *)&vao_binding);
  glGetIntegeri_v(GL_VERTEX_BINDING_OFFSET, 0, (GLint *)&vb_offset);
  glGetIntegeri_v(GL_VERTEX_BINDING_STRIDE, 0, (GLint *)&vb_stride);
  glGetIntegeri_v(GL_VERTEX_BINDING_BUFFER, 0, (GLint *)&vb_binding);
  printf("A: %d %d %d %d\n", vao_binding, vb_binding, (int)vb_offset, (int)vb_stride);

  // this works, fetching the buffer binding 'through' the VAO.
  glGetVertexAttribiv(0, GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING, (GLint *)&vb_binding);
  printf("B: %d %d %d %d\n", vao_binding, vb_binding, (int)vb_offset, (int)vb_stride);

  GLint data[2] = { 0, 0 };
  glGetIntegerv(GL_POLYGON_MODE, data);
  printf("%d %d\n", data[0], data[1]);

The output from the above code, including the output from DebugMessageCallback:

GL_VENDOR: ATI Technologies Inc.
GL_RENDERER: AMD Radeon R9 200 Series
GL_VERSION: 4.4.13084 Core Profile/Debug Context 14.301.1001
GL_MAJOR/MINOR: 4.4
vertex_array: 1
vertex_buffer: 1
Got a Debug message from 33350, type 33356, ID 1001, severity 37190:
'glGetIntegerIndexedv parameter <pname> has an invalid enum '0x8f4f' (GL_INVALID_ENUM)'
A: 1 0 32 16
B: 1 1 32 16
Got a Debug message from 33350, type 33356, ID 3200, severity 37190:
'Using glGetIntegerv in a Core context with parameter <pname> and enum '0xb40' which was removed from Core OpenGL (GL_INVALID_ENUM)'
1337 1337 1337 1337
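
For reference, the raw numbers in that log decode as follows. This is just a decoding aid (constants taken from glext.h and the KHR_debug spec), not part of the repro, and it doesn't touch a GL context:

#include <stdio.h>

/* Enum values copied from glext.h / the KHR_debug spec, so the raw
 * numbers in the debug output above can be read by eye. */
#define GL_DEBUG_SOURCE_API      0x8246 /* 33350: raised by the GL API */
#define GL_DEBUG_TYPE_ERROR      0x824C /* 33356: an error, not a warning */
#define GL_DEBUG_SEVERITY_HIGH   0x9146 /* 37190: high severity */
#define GL_VERTEX_BINDING_BUFFER 0x8F4F /* the enum rejected in the first message */
#define GL_POLYGON_MODE          0x0B40 /* the enum rejected in the second message */
#define GL_FILL                  0x1B02 /* 6914: the expected polygon mode */

int main(void)
{
    /* Print each constant next to the decimal form seen in the log. */
    printf("GL_DEBUG_SOURCE_API      = %d\n", GL_DEBUG_SOURCE_API);
    printf("GL_DEBUG_TYPE_ERROR      = %d\n", GL_DEBUG_TYPE_ERROR);
    printf("GL_DEBUG_SEVERITY_HIGH   = %d\n", GL_DEBUG_SEVERITY_HIGH);
    printf("GL_VERTEX_BINDING_BUFFER = 0x%X\n", GL_VERTEX_BINDING_BUFFER);
    printf("GL_POLYGON_MODE          = 0x%X\n", GL_POLYGON_MODE);
    printf("GL_FILL                  = %d\n", GL_FILL);
    return 0;
}

So both messages are GL_DEBUG_SOURCE_API / GL_DEBUG_TYPE_ERROR at GL_DEBUG_SEVERITY_HIGH, and 0x8f4f and 0xb40 are exactly the two enums named at the top of this post.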

I would expect the A: and B: lines to both print "1 1 32 16", and the last line should print "6914 6914" (GL_FILL). Please let me know if you need any more information.
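
In case it helps anyone hitting the same thing, the "through the VAO" query from the repro can be wrapped as a stopgap until GL_VERTEX_BINDING_BUFFER works. A self-contained sketch; the stub below stands in for the real driver entry point (its return value of 1 just mirrors the "B" line of the repro, it is not driver behaviour):

#include <stdio.h>

typedef unsigned int GLuint;
typedef unsigned int GLenum;
typedef int GLint;

#define GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING 0x889F

/* Stub so this sketch compiles on its own; in real code this is the
 * driver's glGetVertexAttribiv, which does work in the repro above. */
static void glGetVertexAttribiv(GLuint index, GLenum pname, GLint *params)
{
    (void)index; (void)pname;
    *params = 1; /* placeholder: vertex_buffer from the repro's "B" line */
}

/* Fallback: recover the buffer behind a binding point via an attribute
 * that sources from it, instead of the rejected GL_VERTEX_BINDING_BUFFER. */
static GLuint binding_buffer_fallback(GLuint attrib)
{
    GLint buf = 0;
    glGetVertexAttribiv(attrib, GL_VERTEX_ATTRIB_ARRAY_BUFFER_BINDING, &buf);
    return (GLuint)buf;
}

int main(void)
{
    printf("fallback buffer: %u\n", binding_buffer_fallback(0));
    return 0;
}

The obvious caveat is that this only works for binding points that some enabled attribute actually sources from, so it's a workaround, not a fix.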

Baldur