OpenGL version guarantee
phildo
Ok, so apparently with SDL_GL_CONTEXT_PROFILE_CORE you can only get OpenGL 4.1 on OSX, and with SDL_GL_CONTEXT_PROFILE_COMPATIBILITY you should get 2.1 (which is what I want!).

However, now when getting the version with glGetIntegerv(GL_MAJOR_VERSION, ...), which I had used before to correctly deduce that I was being given an OpenGL 4 profile, I am getting 32767 (the maximum value of a signed 16-bit integer). That response doesn't seem to be documented anywhere in either the SDL or the OpenGL docs, so I'm at a loss. What's weirder still is that glGetString(GL_VERSION) IS returning "2.1 INTEL-10.6.31". Which is weird. Any ideas?
djkarstenv
I have had similar questions bugging me. I would also like to know exactly what OpenGL context I have once I have set the flags.

I tried playing around on both my PCs at home, and I added some legacy OpenGL code to my project to test the context. I set my OpenGL context with the following code:

SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, ...);
SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, ...);

On both PCs, if I set it to 3.2 or higher the legacy code won't render (which I expected, as I presume I am now in a modern forward-compatible OpenGL context). If I set it to 3.1 or lower, the legacy code renders. Okay, so far so good. There are two issues though:

1. If I set the context to 3.1 or lower and call glGetString(GL_VERSION), the output is always 4.5 on one PC and 4.4 on the other (one PC is better than the other). The fact remains that my legacy code runs, so I know I am in a 3.1-or-below context, yet the driver tells me I'm in a 4.4 / 4.5 context. If I set it to 3.2 or higher and call glGetString(GL_VERSION), I get exactly the context I asked for, e.g. 3.3 or 4.2. But why? Does the graphics card not recognise anything less than 3.2?

2. The other burning question: if I leave out SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, ...) and SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, ...) entirely and query OpenGL for which context I am in, it reports 4.5 or 4.4 (depending on the PC), BUT my legacy code still renders. So my guess is that I am in an older OpenGL context, but again the graphics card doesn't report older versions through glGetString(GL_VERSION) and just reports the latest one instead.

If anyone knows any better, please share your thoughts, as this is a little confusing to say the least.