SDL_RenderBegin and SDL_RenderEnd
Sik
I know people want to mix the renderer API with direct access to OpenGL and such, but the problem is that doing so clobbers the renderer's expected state, resulting in things not working at all. So I had an idea: let's have two functions called SDL_RenderBegin and SDL_RenderEnd. The former sets the GPU state to what the renderer wants, the latter undoes that change. It would be used something like this:

    // ... do stuff with OpenGL ...
    SDL_RenderBegin(renderer);
    // ... do stuff with the renderer ...
    SDL_RenderEnd(renderer);
    // ... do stuff with OpenGL ...

As for backwards compatibility:

- Programs that only use the renderer API without anything else don't need to worry about these functions, so they will work as-is (when the renderer initializes it will set the GPU state to what it wants, just like SDL_RenderBegin does).
- Programs that mix the renderer API and something else are already in undefined behavior territory, but these functions will give those programs a chance to fix that issue (once modified).

Two more remarks:

- Given this is bound to be used by third-party libraries to draw graphics, maybe we'll want to give these functions some sort of "stack" behavior, where SDL_RenderEnd only undoes the changes once it has been called as many times as SDL_RenderBegin was.
- Maybe we'll want to specify explicitly what each backend expects from the GPU state, sorta like backend-specific ABIs. No idea to what extent this would be required, but I suppose it could matter for programs that only partially alter the state.

Anyway, does anybody think this could help fix the biggest issue with the renderer API right now? I'll see if I can make a bug report later.

_______________________________________________
SDL mailing list
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org
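The suggested "stack" behavior could be sketched as a simple nesting counter. Everything below is hypothetical (the _sketch suffix is a reminder that none of this is actual SDL API), and the two static helpers stand in for the real GPU state changes:

```c
#include <assert.h>

/* Hypothetical sketch of the nesting ("stack") behavior: only the
   outermost Begin applies the renderer's GPU state, and only the
   matching outermost End restores the native state. */
static int begin_depth = 0;
static int renderer_state_active = 0;

static void apply_renderer_state(void)  { renderer_state_active = 1; }
static void restore_native_state(void)  { renderer_state_active = 0; }

void SDL_RenderBegin_sketch(void)
{
    if (begin_depth++ == 0)     /* only the outermost Begin touches the GPU */
        apply_renderer_state();
}

void SDL_RenderEnd_sketch(void)
{
    assert(begin_depth > 0);    /* an unbalanced End is a caller bug */
    if (--begin_depth == 0)     /* only the matching outermost End restores */
        restore_native_state();
}
```

A third-party drawing library could then wrap its own calls in a Begin/End pair without clobbering an outer pair issued by the application.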
SDL_RenderBegin and SDL_RenderEnd
Alex Szpakowski
Guest
That would involve many glGet* function calls in the OpenGL backends, which isn’t very efficient. I’m not sure if efficiency is something people care about when using SDL_Render though.
On Jan 8, 2015, at 10:16 AM, Sik the hedgehog wrote:
SDL_RenderBegin and SDL_RenderEnd
Joseph Carter
I think Sik may have jumped the gun ever so slightly on bringing this discussion up, because he and I were discussing it off-list and it's going to take _slightly_ more than these two functions to accomplish what we're talking about.

First, when creating a renderer, you're going to have to ask for a renderer that can SDL_RENDER_WITHNATIVE3D. That flag informs the renderer that certain things it typically does for you in older versions of SDL2 are now your responsibility, such as deciding what to do when a resize event happens. The renderer presently tries to do something smart for you because it can, but it cannot do that if you're going to mix 2D and 3D. If a renderer cannot be created that supports this flag, your call to create one will fail.

Second, to support this on fixed-pipeline OpenGL, someone (hi!) is going to have to audit the fixed-pipeline renderers to see what states they change and what states they assume. This is not 100% trivial, but fortunately at this point fixed-pipeline OpenGL is more or less a finite problem. Certain things you enable are going to need to be glDisable'd before calling SDL_RenderBegin and the renderer functions. And should any renderer addition in the future enable other pipeline features it doesn't already use (somewhat unlikely, but possible), it will need to disable them in SDL_RenderEnd. This means some state thrash to avoid the glGet calls, but frankly state thrashing is faster than glGetANYTHING, and if you're worried about speed loss because of thrashing state, why are you using the SDL renderer at all? It's _bad_ at this and no promise was ever made that it'd ever get any better. The restrictions on shader-based render targets are far fewer because there's far less actual state; in fact, it's quite possible that RenderEnd and RenderBegin could be empty functions there.

Third, it might be necessary to add some ability to query the SDL render state, and possibly to make changes to it from your own code. You'll have to get along with the renderer, essentially.

The idea of using some helper library in OpenGL predates the use of shaders. Fixed-pipeline GL being a state machine, however, the use of those library functions required observance of preconditions and expectation of postconditions. This is a concept still somewhat in its infancy, because while it's certainly possible, it'll take some effort to do it in an abstract way whether you're using GL or GLES, fixed or programmable pipelines, etc. Probably your code has (or will have) certain assumptions about the 3D context it wants to work with.

And presently, Direct3D in SDL is something you either do completely without SDL's intervention or that gets done by the renderer; you can't ask for a Direct3D window like you can an OpenGL one. In order to make this work with non-OpenGL 3D APIs like Direct3D or Metal (should that ever get support), we're going to have to answer some serious questions. And likely the answer will be best implementation wins. ;)

Joseph

On Thu, Jan 08, 2015 at 08:17:24PM -0400, Alex Szpakowski wrote:
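The "state thrash instead of glGet" point can be illustrated with a small sketch. The int flags and the function below are stand-ins for real GL enable/disable server state, not SDL internals:

```c
#include <assert.h>

/* Sketch: a RenderBegin-style function forces every fixed-pipeline
   state the renderer relies on, writing blindly rather than reading
   anything back with glGet*. The flags stand in for GL server state. */
static int depth_test = 1;   /* whatever the app's 3D code left behind */
static int lighting   = 1;
static int texture_2d = 0;
static int blend      = 0;

static void force_renderer_state(void)
{
    /* Unconditional writes: cheaper than glGet round-trips, and
       correct no matter what the app changed beforehand. */
    depth_test = 0;   /* stands in for glDisable(GL_DEPTH_TEST) */
    lighting   = 0;   /* stands in for glDisable(GL_LIGHTING)   */
    texture_2d = 1;   /* stands in for glEnable(GL_TEXTURE_2D)  */
    blend      = 1;   /* stands in for glEnable(GL_BLEND)       */
}
```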
SDL_RenderBegin and SDL_RenderEnd
Jonny D
Here's my point of view from already implementing these things in SDL_gpu...

I'd rather the rendering system did not do smart things behind my back, so SDL_gpu has explicit calls that you need to perform in response to certain events (e.g. GPU_SetWindowResolution() upon SDL_WINDOWEVENT_RESIZED, and GPU_GetVirtualCoords() to scale device coordinates from mouse events to your virtual/logical resolution). It makes sense that you may want those things done for you to ease the porting of games, but that is no concern when the aim is a modern rendering system. I think something like SDL_RENDER_WITHNATIVE3D would be a hint to disable these smart things.

SDL_gpu now has GPU_ResetRendererState(), which would be the equivalent of SDL_RenderEnd(). SDL_gpu stores the state that it expects to have and can reset that state in this function (no glGet*() at all). For reference (SDL_gpu has renderers for fixed-function and shader-based OpenGL), it resets the current shader program, the current GL context, the glColor, the enabling of GL_TEXTURE_2D, the enabling of GL_BLEND, the blend function and equation, the viewport, the bound texture, and the bound framebuffer. Everything else is set as needed when flushing the vertex buffer, because a simple VBO buffered rendering optimization is already in SDL_gpu. Some of those settings (like blend modes) would likely have to happen in SDL_RenderEnd() even in a shader-based renderer.

Your code certainly will need to know what 3D context it gets. SDL_gpu presents that information in its renderer structure, so you can check which backend you got (only OpenGL and OpenGL ES so far), which backend version, which shading language, which shading language version, and which features SDL_gpu uses that it has detected as available.

Jonny D

On Thu, Jan 8, 2015 at 8:41 PM, T. Joseph Carter wrote:
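The no-glGet*() reset described here can be modeled as the renderer re-applying a recorded copy of the state it expects. The struct, fields, and values below are illustrative, not SDL_gpu's actual internals:

```c
#include <assert.h>

/* Sketch: the renderer keeps its own record of the pipeline state it
   expects, and resetting means pushing that record back down wholesale.
   "driver" stands in for live GPU state; "expected" is the renderer's
   bookkeeping. */
typedef struct {
    unsigned shader_program;
    unsigned bound_texture;
    int      blend_enabled;
} PipelineState;

static PipelineState driver = { 0, 0, 0 };          /* what the GPU has now */
static const PipelineState expected = { 3, 7, 1 };  /* what the renderer wants */

static void reset_renderer_state(void)
{
    driver = expected;   /* no queries: just re-apply known-good state */
}
```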
SDL_RenderBegin and SDL_RenderEnd
Joseph Carter
On Thu, Jan 08, 2015 at 11:02:46PM -0500, Jonathan Dearborn wrote:
The renderer has always done smart things behind your back; that's actually the greatest impediment to mixing the SDL renderer and native 3D. And for ABI reasons, it can't STOP doing those things without being told to do otherwise.

But there are alternatives, and I just suggested one to Sik tonight: since 2002 at least, we've been seeing video cards that support DX9 and therefore support render to texture. By 2005 they were common. By 2007, it started to become difficult to buy a PC that didn't have that hardware built in, because Vista needed it for a core feature (Aero). You see where I'm going with this.

Right now the GL renderers work on a window. What if it was possible to create a renderer from an SDL_GLContext instead, provided that the context in question provides a means to render to texture? You'd still have to lay some ground rules for fixed-pipeline OpenGL, but as already noted, that's a given any time you mix that kind of OpenGL with a library you didn't write. Likewise, it should be possible to create a renderer targeting any SDL_Surface, since the software renderer already does that; or you could decide that you don't support this configuration. It's up to you at that point.

Direct3D doesn't really enter into this discussion because of how SDL implements it. You cannot create a Direct3D context for native rendering in SDL in any way that integrates with SDL. You can do it, but you're on your own. If we decide we want SDL to be able to support this for MS platforms, that'd have to change. We talk about the D3D renderers (both 9 and 11) needing to be kept working alongside other renderers, but SDL has never really supported using DirectX as a 3D API. If that changes, it can be done the same way as OpenGL targets would be done.

I'd say that pretty much sums up how to deal with mixing the 2D renderer and native code. I think the better solution is to get what you want for a backend and then ask SDL for a renderer that can draw to it.

Joseph
SDL_RenderBegin and SDL_RenderEnd
Aidan Dodds
Guest
Having Begin() and End() functions introduces strong API coupling.

A similar approach would be to provide a 'type_t *SaveState()' function and a 'RestoreState(type_t *)', where 'type_t' could be some opaque type that captures the current rendering state. This would give the user more control over how the rendering state is being managed, and it may also provide certain benefits for caching state.

On 09/01/2015 07:04, T. Joseph Carter wrote:
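A minimal sketch of that save/restore shape, with a heap-allocated opaque snapshot (the type name and fields are hypothetical, and "current" stands in for live, shadowed GPU state):

```c
#include <assert.h>
#include <stdlib.h>

/* Opaque snapshot of the rendering state: callers only ever hold a
   RenderState pointer; the fields are private to the implementation. */
typedef struct RenderState {
    int      blend_enabled;
    unsigned bound_texture;
} RenderState;

static RenderState current = { 0, 0 };   /* stand-in for live GPU state */

RenderState *SaveState(void)
{
    RenderState *s = malloc(sizeof *s);
    if (s)
        *s = current;        /* capture from shadowed state, no glGet */
    return s;
}

void RestoreState(RenderState *s)
{
    if (s) {
        current = *s;        /* re-apply the snapshot */
        free(s);
    }
}
```

One design question this raises is ownership: here RestoreState frees the snapshot, but a caching scheme would want Save/Restore to be reusable and add an explicit free function.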
SDL_RenderBegin and SDL_RenderEnd
Jonny D
How specific should one be when requesting a renderer that supports your backend?

In SDL_gpu, you can request single specific backends (and even set up your preferred order of backends/renderers to try automatically). The request is limited to backend API and version. You can make init fail if it doesn't meet one of those. You can also force a failed init if there is an SDL_gpu feature that you mandate support for. You cannot (currently), however, cause init to fail because of something more specific, like the GLSL version being too low; you would handle that after init.

I feel there are a lot of details that you might want to be picky about when requesting a renderer, but it might be a bit much to specify them all up front.

Jonny D

On Fri, Jan 9, 2015 at 2:04 AM, T. Joseph Carter wrote:
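The "preferred order of backends to try" behavior could be sketched as a first-that-initializes loop. The backend names and the try_init stub are made up for illustration; a real implementation would attempt context creation there:

```c
#include <assert.h>
#include <stddef.h>
#include <string.h>

/* Stub: pretend only these two backends can actually initialize.
   Real code would try to create the backend's context here. */
static int try_init(const char *backend)
{
    return strcmp(backend, "opengles2") == 0 ||
           strcmp(backend, "software") == 0;
}

/* Walk the caller's preference list and return the first backend
   that initializes, or NULL if none of them do. */
const char *init_first_available(const char *const *prefs, int n)
{
    for (int i = 0; i < n; i++)
        if (try_init(prefs[i]))
            return prefs[i];
    return NULL;   /* overall init failure */
}
```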
SDL_RenderBegin and SDL_RenderEnd
Sik
2015-01-09 11:16 GMT-03:00, Aidan Dodds:
To be fair, the only real purpose is for Begin to indicate that the program will render with the renderer, and for End to indicate that the program will render on its own. The whole state-restoring thing is mostly for convenience.
SDL_RenderBegin and SDL_RenderEnd
Sik
2015-01-09 11:44 GMT-03:00, Jonathan Dearborn:
You can already force SDL to use a specific backend:

    SDL_SetHintWithPriority(SDL_HINT_RENDER_DRIVER, "opengl", SDL_HINT_OVERRIDE);
SDL_RenderBegin and SDL_RenderEnd
i8degrees
Sik,

Does SDL_SetHintWithPriority actually force the use of a specific renderer on your system? On my OS X system, this does not terminate the program as I expected it to:

    if( SDL_SetHintWithPriority(SDL_HINT_RENDER_DRIVER, "direct3d", SDL_HINT_OVERRIDE) == SDL_FALSE ) {
        exit(-1);
    }

I believe that these are called "hints" precisely because the requests may not be honored by SDL in certain circumstances, such as this very one (the renderer simply moves on to using what is available on my system, which would be OpenGL). I wish that it were that simple, though!

In any case, this issue is why I use a helper method that checks the available rendering drivers for the one that I specify and returns -1 in the case that it is not found, so that I can handle the issue in the game. In my particular case, that means displaying an error message and terminating, because I haven't yet dealt with writing a Direct3D back-end for the GUI library I use, and the game is pretty much useless without a GUI...

Cheers,
Jeffrey Carpenter
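Such a helper can be sketched as a linear scan over the available drivers. The two stub functions below mimic SDL2's SDL_GetNumRenderDrivers()/SDL_GetRenderDriverInfo() so the sketch is self-contained; with real SDL2 you would call those directly and read SDL_RendererInfo's name field:

```c
#include <assert.h>
#include <string.h>

/* Stand-in for SDL_RendererInfo (only the field we need). */
typedef struct { const char *name; } RendererInfo;

static const RendererInfo stub_drivers[] = {
    { "opengl" }, { "opengles2" }, { "software" }
};

/* Stubs mimicking SDL_GetNumRenderDrivers / SDL_GetRenderDriverInfo. */
static int stub_GetNumRenderDrivers(void)
{ return (int)(sizeof stub_drivers / sizeof stub_drivers[0]); }

static int stub_GetRenderDriverInfo(int i, RendererInfo *info)
{ *info = stub_drivers[i]; return 0; }

/* Return the index of the named render driver, or -1 if unavailable,
   so the caller can fail gracefully instead of getting a fallback. */
int find_render_driver(const char *name)
{
    int n = stub_GetNumRenderDrivers();
    for (int i = 0; i < n; i++) {
        RendererInfo info;
        if (stub_GetRenderDriverInfo(i, &info) == 0 &&
            strcmp(info.name, name) == 0)
            return i;
    }
    return -1;
}
```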
SDL_RenderBegin and SDL_RenderEnd
Sik
Bleh, you're right, though that renders the hint completely pointless then (since the whole point of it is to allow the user to override the chosen renderer).

That said, if it were to fail, you'd get the error when initializing the renderer, not when setting the hint: SDL_SetHintWithPriority just replaces the string, it doesn't know what the hints do.

2015-01-11 5:12 GMT-03:00, Jeffrey Carpenter:
SDL_RenderBegin and SDL_RenderEnd
i8degrees
Ah well. Thanks for verifying that!

I would be happy to see it fail during renderer initialization when the hint was set with SDL_HINT_OVERRIDE or some other explicit flag. It would probably be a very simple thing to fix up in the source, but unfortunately I have much bigger problems to worry about first :-)

Cheers,
Jeffrey Carpenter