SDL_TEXTUREACCESS_TARGET
MrTAToad
I've been looking into this a bit more, and found the following results depending on which renderer is set:

"directx" - the SDL_TEXTUREACCESS_TARGET texture is no longer displayed (the texture handle is still present, so its contents must have been cleared or point to invalid data).
"opengl" - the texture is displayed on the return to windowed mode, but less of it is drawn than there should be (and than there was originally).
"software" - nothing is displayed.
Naith
I have tried what you've written and for me it works. I've created an SDL_Window and then an SDL_Renderer with SDL_RENDERER_TARGETTEXTURE specified.
Then I created an empty texture ('TargetTexture') that can be used as a render target. I change the current render target to TargetTexture, render some textures onto it, reset the current render target, and then render TargetTexture itself. To test it all out I did this:

1. Started the program in fullscreen mode, switched to windowed mode, waited a few seconds, and then switched back to fullscreen mode. TargetTexture gets rendered fine in all three states.
2. Started the program in windowed mode, switched to fullscreen mode, waited a few seconds, and then switched back to windowed mode. TargetTexture gets rendered fine in all three states.

These two tests were made with the SDL_Window created both with and without the SDL_WINDOW_OPENGL flag set (so two tests with OpenGL and two without), and everything worked fine. I should note that I set up the SDL_Renderer with the SDL_RENDERER_ACCELERATED flag in all the above tests and everything went fine. But when I used SDL_RENDERER_SOFTWARE instead, everything worked except that the TargetTexture flickered. I think I've tried out all the combinations. I am using SDL version 2.0.3, btw.
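The setup Naith describes can be sketched roughly as below. This is a minimal illustration, not anyone's actual code: window title, sizes, and the red clear color are arbitrary, and error handling is abbreviated.

```c
/* Sketch: render-to-texture with SDL_TEXTUREACCESS_TARGET (SDL 2.0.x). */
#include <SDL.h>

int main(int argc, char *argv[])
{
    SDL_Init(SDL_INIT_VIDEO);
    SDL_Window *window = SDL_CreateWindow("target demo",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 640, 480, 0);
    SDL_Renderer *renderer = SDL_CreateRenderer(window, -1,
        SDL_RENDERER_ACCELERATED | SDL_RENDERER_TARGETTEXTURE);

    /* An empty texture that can be used as a render target. */
    SDL_Texture *target = SDL_CreateTexture(renderer,
        SDL_PIXELFORMAT_RGBA8888, SDL_TEXTUREACCESS_TARGET, 640, 480);

    /* Redirect rendering into the texture and draw onto it. */
    SDL_SetRenderTarget(renderer, target);
    SDL_SetRenderDrawColor(renderer, 255, 0, 0, 255);
    SDL_RenderClear(renderer);
    /* ...render other textures onto the target here... */

    /* Reset to the default target and draw the texture to the window. */
    SDL_SetRenderTarget(renderer, NULL);
    SDL_RenderCopy(renderer, target, NULL, NULL);
    SDL_RenderPresent(renderer);
    SDL_Delay(2000);

    SDL_DestroyTexture(target);
    SDL_DestroyRenderer(renderer);
    SDL_DestroyWindow(window);
    SDL_Quit();
    return 0;
}
```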
MrTAToad
Thanks for that - I'm beginning to suspect it's an AMD graphics problem. I need to test on a few NVIDIA machines here...
Eric Wing
Guest
On 12/10/14, MrTAToad wrote:
If this is Windows-only, I think it's related to this thread (I had the same issue): https://forums.libsdl.org/viewtopic.php?t=10850&sid=aea687fd1020c67f0d4123e01e09dc6c I confirmed it doesn't happen with the OpenGL renderer for me, only the Direct3D renderer. Since it only happens in Direct3D, I'm assuming it's a bug in SDL's Direct3D/Windows backend.

Thanks,
Eric
--
Beginning iPhone Games Development
http://playcontrol.net/iphonegamebook/
_______________________________________________
SDL mailing list
http://lists.libsdl.org/listinfo.cgi/sdl-libsdl.org
DLudwig
If I remember correctly (someone please correct me if any of this sounds wrong!), the behavior of SDL's Direct3D renderer here is by design, and mirrors what Direct3D does. I.e. texture memory will get lost on occasion (details at http://msdn.microsoft.com/en-us/library/windows/desktop/bb174714(v=vs.85).aspx ).

Some textures can be restored, as copies of their data are backed up in system/main memory (any textures in D3DPOOL_MANAGED). Direct3D generally tries to avoid potentially ill-performing data copying from video/texture memory to system memory, and as such, render-to-texture contents do not get backed up into system memory. Apps are expected to refresh these textures themselves.

Because of lost render-to-texture content in D3D9, SDL2 emits an event, SDL_RENDER_TARGETS_RESET, whenever render-target memory gets lost. Apps can listen for this and recreate these textures. This event was added relatively recently, within the past year and after SDL 2.0.0 was released. Here's its changeset, for those interested: https://hg.libsdl.org/SDL/rev/f06add42160c

None of this really explains why OpenGL doesn't lose render-to-texture memory whereas D3D9 does. Perhaps, on the backend, OpenGL requires drivers to keep that memory around. I'm not certain of that, though.

Cheers,
-- David L.
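Listening for the event David describes might look something like the following sketch. `repaint_target()` is a hypothetical helper standing in for whatever drawing originally filled the render target; an SDL version new enough to define SDL_RENDER_TARGETS_RESET is assumed.

```c
/* Sketch: reacting to SDL_RENDER_TARGETS_RESET by redrawing a
 * render-target texture's contents. */
#include <SDL.h>

/* Hypothetical helper: re-renders the target texture from scratch. */
void repaint_target(SDL_Renderer *renderer, SDL_Texture *target);

void pump_events(SDL_Renderer *renderer, SDL_Texture *target)
{
    SDL_Event event;
    while (SDL_PollEvent(&event)) {
        if (event.type == SDL_RENDER_TARGETS_RESET) {
            /* The SDL_Texture handle is still valid, but its pixel
             * contents are gone; the app must redraw them. */
            repaint_target(renderer, target);
        }
    }
}
```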
MrTAToad
Yes, that sounds like a DirectX lost-device problem, although OpenGL does tend to chop bits off after returning from fullscreen.
It looks like streaming textures need to be re-created after returning from fullscreen. This streaming texture is used for creating tiles - it's a shame that mode is needed for rendering to a texture.
Jonas Kulla
Guest
2014-12-17 17:14 GMT+01:00 DLudwig:
The concept of "lost devices" doesn't exist in OpenGL, so the driver will transparently take care of restoring resources for you on mode switches etc.
MrTAToad
Yes, it is emitting that event. All I need to do now is detect it and re-create the texture.
Eric Wing
Guest
Does D3D have a notification before the context is lost? (Or perhaps we could do this inside SDL's fullscreen-switching call.) I'm wondering if SDL could brute-force handle this case for the user by reading back from video memory to CPU memory and storing the current textures. Then, after the context is recreated, resubmit the missing textures from CPU memory. Not exactly fast, but fullscreen toggling on some other platforms is kind of slow anyway... probably for this exact reason.

Thanks,
Eric
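The read-back half of that idea can be sketched with SDL_RenderReadPixels, which copies pixels from the current render target into caller-supplied memory. This is only an illustration: the pixel format, ownership convention, and function name here are choices made for the example, not an SDL API for device-loss recovery.

```c
/* Sketch: back up a render target's pixels to CPU memory. Slow, but it
 * would only need to run around fullscreen toggles. Caller frees the
 * returned buffer; returns NULL on failure. */
#include <SDL.h>
#include <stdlib.h>

Uint32 *backup_render_target(SDL_Renderer *renderer, SDL_Texture *target,
                             int w, int h)
{
    Uint32 *pixels = malloc((size_t)w * (size_t)h * sizeof *pixels);
    if (!pixels)
        return NULL;

    /* Bind the target so the read-back sees its contents. */
    SDL_SetRenderTarget(renderer, target);
    if (SDL_RenderReadPixels(renderer, NULL, SDL_PIXELFORMAT_RGBA8888,
                             pixels, w * (int)sizeof *pixels) != 0) {
        free(pixels);
        pixels = NULL;
    }
    SDL_SetRenderTarget(renderer, NULL);
    return pixels;
}
```

After the device is restored, the saved pixels could be uploaded into an ordinary texture (e.g. via SDL_UpdateTexture) and rendered back onto a freshly created target texture.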
Jonas Kulla
Guest
2014-12-18 2:27 GMT+01:00 Eric Wing:
No: http://msdn.microsoft.com/en-us/library/windows/desktop/bb174714%28v=vs.85%29.aspx