

New SDL 1.3 event type proposal for discussion
Mike Gorchak
Guest

Hello, All!

I'm going to add a new SDL event type to notify the application that all
video memory content has been lost; for example, we could call it
SDL_TEXTURES_LOST. This event would mean that all textures must be
reloaded and that all windows are exposed and have to be redrawn.

I don't think this event can belong to the SDL_WINDOWEVENT_XXXX group, because
it is global and concerns all created windows and the SDL application itself.

Right now this event could be implemented through the SYSWM event, but I
think it would be useful on all platforms, since most GUIs have this kind of
event.

There are a few cases where this event could be very helpful:
1) When the user switches the video mode. On some systems this means that the
video memory content is lost (damaged, contains garbage, etc.) and all
offscreen surfaces are no longer valid.
2) When an SDL application has two or more windows and one of the windows goes
fullscreen with a video mode switch. After the video mode switch, all
textures become invalid on some systems.
3) When two or more SDL applications are running without
interacting with each other, and one of them switches to a
fullscreen video mode. On some systems this also means that all textures
must be reloaded.
4) On a multi video card system (with one or more video output ports on
each card), when a window is dragged across displays that belong to
different video cards. In this case some systems issue an event saying the
application has to re-create its textures.

Sam, what are your comments on this?
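
For illustration, here is a minimal sketch of how an application might consume
such an event. SDL_TEXTURES_LOST and the two helper functions are assumptions
taken from this proposal, not existing SDL API:

#include "SDL.h"

/* SDL_TEXTURES_LOST is the event proposed above; it does not exist in any
 * SDL header, so a placeholder value is defined here just to keep the
 * sketch self-contained. */
#ifndef SDL_TEXTURES_LOST
#define SDL_TEXTURES_LOST 0x2000
#endif

extern void reload_all_textures(void);  /* application code: re-upload from source */
extern void redraw_all_windows(void);   /* application code: every window is exposed */

static void handle_events(void)
{
    SDL_Event event;

    while (SDL_PollEvent(&event)) {
        switch (event.type) {
        case SDL_TEXTURES_LOST:     /* global: not tied to a single window */
            reload_all_textures();
            redraw_all_windows();
            break;
        default:
            break;
        }
    }
}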

With best regards, Mike Gorchak. E-mail:



New SDL 1.3 event type proposal for discussion
Bob


Joined: 19 Sep 2009
Posts: 185
On Mon, Nov 23, 2009 at 1:51 AM, Mike Gorchak wrote:
Quote:
Hello, All!

I'm going to add a new SDL event type to notify the application that all
video memory content has been lost; for example, we could call it
SDL_TEXTURES_LOST. This event would mean that all textures must be
reloaded and that all windows are exposed and have to be redrawn.

I don't think this event can belong to the SDL_WINDOWEVENT_XXXX group, because
it is global and concerns all created windows and the SDL application itself.

Right now this event could be implemented through the SYSWM event, but I
think it would be useful on all platforms, since most GUIs have this kind of
event.

There are a few cases where this event could be very helpful:
1) When the user switches the video mode. On some systems this means that the
video memory content is lost (damaged, contains garbage, etc.) and all
offscreen surfaces are no longer valid.
2) When an SDL application has two or more windows and one of the windows goes
fullscreen with a video mode switch. After the video mode switch, all
textures become invalid on some systems.
3) When two or more SDL applications are running without
interacting with each other, and one of them switches to a
fullscreen video mode. On some systems this also means that all textures
must be reloaded.
4) On a multi video card system (with one or more video output ports on
each card), when a window is dragged across displays that belong to
different video cards. In this case some systems issue an event saying the
application has to re-create its textures.

Sam, what are your comments on this?

Well, I'm not Sam, but I'm going to reply anyway. Personally, I like
the idea of doing something about the lost textures problem. Over the
years we've seen many postings on this list by people who wrote an
application on an

As I understand it, this is actually a lost context problem. To me
that means the event should tell me that I have lost my complete
context, not just my textures. The context contains a lot more than
just textures.

What I'd really like to see is a graphics API that does lazy resource
loading that uses this event to mark all resources as unloaded.
Something like that would help avoid the problem of suddenly having
the application freeze while it madly uploads a few hundred megabytes
of textures to the video card. :-) If we had that then we wouldn't
need to expose the event to programmers.
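
As a rough sketch of that idea (none of these names are SDL API; the backend
upload call is hypothetical): each texture wrapper keeps a system-RAM master
copy and re-uploads it only when the texture is actually used again, so a
"lost" notification just clears flags instead of forcing an immediate mass
re-upload.

/* Sketch of the lazy-loading idea.  None of these names are SDL API;
 * backend_upload() stands in for whatever the renderer backend uses to
 * create a texture in video memory. */
typedef struct {
    void *gpu_handle;   /* backend texture handle; stale after a "loss" */
    void *pixels;       /* system-RAM master copy (or data to regenerate it) */
    int   loaded;       /* cleared for every wrapper when textures are lost */
} LazyTexture;

extern void *backend_upload(const void *pixels);   /* hypothetical */

/* Returns a handle that is valid right now, re-uploading only on demand. */
static void *lazy_texture_get(LazyTexture *lt)
{
    if (!lt->loaded) {
        lt->gpu_handle = backend_upload(lt->pixels);
        lt->loaded = (lt->gpu_handle != NULL);
    }
    return lt->gpu_handle;
}

/* On a "textures lost" notification the library would simply walk all
 * wrappers and clear 'loaded'; uploads then happen one texture at a time,
 * as each is used, instead of all at once. */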

Bob Pendleton



Quote:

With best regards, Mike Gorchak.  E-mail:







--
+-----------------------------------------------------------
+ Bob Pendleton: writer and programmer
+ email:
+ web: www.TheGrumpyProgrammer.com
New SDL 1.3 event type proposal for discussion
Mason Wheeler
Guest

What I'd really like to see is a graphics backend that didn't delete
all your textures in the first place just because you switched to
fullscreen mode or minimized the window or something. I've
never seen the point of trashing that much memory, and then
making the program waste time recreating an exact copy of
it. (It's not like you're going to use different textures for
fullscreen mode, after all!)

Does anyone know why it does it that way? Because I've
never understood it.


From: Bob Pendleton
Subject: Re: [SDL] New SDL 1.3 event type proposal for discussion

What I'd really like to see is a graphics API that does lazy resource
loading that uses this event to mark all resources as unloaded.
Something like that would help avoid the problem of suddenly having
the application freeze while it madly uploads a few hundred megabytes
of textures to the video card. :-) If we had that then we wouldn't
need to expose the event to programmers.
New SDL 1.3 event type proposal for discussion
Bob


Joined: 19 Sep 2009
Posts: 185
On Mon, Nov 23, 2009 at 3:44 PM, Mason Wheeler wrote:
Quote:
What I'd really like to see is a graphics backend that didn't delete
all your textures in the first place just because you switched to
fullscreen mode or minimized the window or something.

Well, yeah, that is what I would like too. I'm not likely to ever see it, though.

Quote:
I've
never seen the point of trashing that much memory, and then
making the program waste time recreating an exact copy of
it.  (It's not like you're going to use different textures for
fullscreen mode, after all!)

It is a resource allocation problem. You have to deal with video memory
fragmentation, and you have to allocate resources to the application
that can make the best use of them.

Think back to the bad old days when a couple of megabytes was a huge
amount of video memory. Now think back to when 65 KILObytes was a huge
amount of memory and you are back to the time when GL was being
invented. OpenGL came along later but it had all the baggage left over
from the way back bad old days. (OpenGL was created from GL mostly to
keep the world from moving to PEX because PEX was open and GL was not.
Ever hear of PEX? Hey, it worked!)

Ok, so back when memory was small and expensive you had to make the
most of what you had. Not to mention that back in those days a
microprocessor that could do a million instructions per second was
still science fiction. Memory was expensive and cycles were expensive.

When you set the video mode, some amount of video memory is used up by
the display buffers. If you are allocating memory you have to have a
strategy for doing it. The easiest way to allocate it is to start at
address 0 and work your way through memory. It was never quite that easy,
because graphics buffers are rectangles, not just long strings of
bytes, but that was the idea. (On some systems you had to do two-dimensional
allocation because the video hardware had a fixed stride.)
Your display buffer (or buffers) usually took up most of the memory.
When you added textures, extra buffers, or display lists, you just
filled them into the memory following the display buffers, starting at the
bottom and working toward the top. Now you change the video mode so that
you need bigger display buffers. Where do you find the memory for them?

You can start by compacting memory and then dropping items one at a
time, all the while sending a stream of information about what has been
dropped back to the programmer, until you eventually get enough free space
for the new buffers. Or, you can just dump everything out of memory
and start over. That second approach doesn't need to give any feedback
to the programmer because everything always gets trashed after certain
function calls. Code space and cycles were also very expensive, so a
solution that doesn't require any extra memory or code is a real winner. Not
to mention that if one application goes fullscreen, why shouldn't it
get all the video memory too? No other application can use it, because
none of them is visible.

That is the simple version of the problem; it was actually a lot more
complex than that.
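
For concreteness, here is a toy sketch of that "fill from the bottom, never
free" scheme. The heap size, the 256-byte stride alignment and the function
names are all made up for illustration; no real driver is this simple.

#include <stddef.h>

#define VRAM_SIZE   (8u * 1024u * 1024u)  /* pretend the card has 8 MB */
#define PITCH_ALIGN 256u                  /* per-scanline stride alignment */

static size_t vram_top = 0;               /* next free byte, grows upward */

/* Round x up to the next multiple of a (a must be a power of two). */
static size_t align_up(size_t x, size_t a) { return (x + a - 1) & ~(a - 1); }

/* Allocate a w x h surface of bpp bytes per pixel.  Returns the offset into
 * video memory, or (size_t)-1 when the heap is exhausted.  There is no
 * free(): a mode switch simply resets vram_top to 0, and everything that
 * was allocated is "lost". */
static size_t vram_alloc_surface(size_t w, size_t h, size_t bpp)
{
    size_t pitch = align_up(w * bpp, PITCH_ALIGN);
    size_t bytes = pitch * h;
    size_t offset;

    if (vram_top + bytes > VRAM_SIZE)
        return (size_t)-1;

    offset = vram_top;
    vram_top += bytes;
    return offset;
}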

Nowadays it is not as much of a problem, because the size of graphics
memory has grown and the way it is accessed by both the rendering and
display hardware has changed. But even now, graphics memory can
wind up fragmented to the point where there are no free blocks large
enough to hold the new display buffers. And it is still the case that
if another application goes fullscreen, there is no good reason not to
give all the video memory to the only application that can be seen.

Quote:

Does anyone know why it does it that way? Because I've
never understood it.

Hope I've helped; I actually worked on memory allocators for graphics
hardware back in the bad old days.

Bob Pendleton

Quote:

________________________________
From: Bob Pendleton
Subject: Re: [SDL] New SDL 1.3 event type proposal for discussion

What I'd really like to see is a graphics API that does lazy resource
loading that uses this event to mark all resources as unloaded.
Something like that would help avoid the problem of suddenly having
the application freeze while it madly uploads a few hundred megabytes
of textures to the video card. :-) If we had that then we wouldn't
need to expose the event to programmers.






--
+-----------------------------------------------------------
+ Bob Pendleton: writer and programmer
+ email:
+ web: www.TheGrumpyProgrammer.com
New SDL 1.3 event type proposal for discussion
Mike Gorchak
Guest

Thanks for the help, Bob.

I can describe why this happens.

Easy case: the video card's memory can be accessed as a large linear space. When the driver is initialized and the video mode is set, the driver allocates the primary framebuffer in video memory (for example, 1680x1050 at 32bpp occupies a minimum of 7,056,000 bytes of video memory; but as Bob said, stride requirements can differ, for example 256-byte alignment for each scanline). Then the driver allocates video memory for the hardware cursor, display/command lists and other needs (approximately from 64 KB up to 16 MB of video RAM). The rest of the RAM is filled with the color backbuffer, z-buffer + stencil and textures. Even good old Quake I in GL mode could eat up to 96 MB of video RAM (tested :) ).

Now imagine that the user switches to a 1920x1200 video mode. This mode occupies a minimum of 9,216,000 bytes (roughly 2 MB more video RAM than 1680x1050), and the rest of the video RAM is already allocated. The only way to allocate the new framebuffer is to trash all video memory content, or to pretend that it is no longer allocated. The driver reinitializes all video heaps and notifies the upper level that video RAM has been trashed.

It is possible to divide video RAM into a few regions, for example one for the CRTC area (the displayable framebuffer), one for the cursor, one for display/command lists, and the rest for textures. But in this case you must reserve a lot of video memory for the CRTC area. For example, if the video chip supports a maximum of 2048x1556 on two video heads, you have to set aside a minimum of 25,493,504 bytes of video memory for the CRTC area for both video outputs (heads). As for command/display lists: in 2D mode 512 KB of video RAM is enough, but in 3D mode that is not enough for maximum performance, so you have to allocate 1-4 MB of RAM for them. If the video chip supports it, it could even allocate an exclusive display/command list for each process. This scheme of pre-allocated regions wastes video memory for no good reason :)
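
For reference, the byte counts quoted above are just width x height x 4 bytes
per pixel (32bpp), ignoring stride padding; a trivial check:

#include <stdio.h>

int main(void)
{
    printf("1680x1050x32bpp:     %u bytes\n", 1680u * 1050u * 4u);        /* 7,056,000  */
    printf("1920x1200x32bpp:     %u bytes\n", 1920u * 1200u * 4u);        /* 9,216,000  */
    printf("2048x1556x32bpp x 2: %u bytes\n", 2048u * 1556u * 4u * 2u);   /* 25,493,504 */
    return 0;
}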

Hope it helps :)
Quote:
"Mason Wheeler" wrote in message [url=news:]news:[/url]...
What I'd really like to see is a graphics backend that didn't delete
all your textures in the first place just because you switched to
fullscreen mode or minimized the window or something. I've
never seen the point of trashing that much memory, and then
making the program waste time recreating an exact copy of
it. (It's not like you're going to use different textures for
fullscreen mode, after all!)

Does anyone know why it does it that way? Because I've
never understood it.


From: Bob Pendleton
Subject: Re: [SDL] New SDL 1.3 event type proposal for discussion

What I'd really like to see is a graphics API that does lazy resource
loading that uses this event to mark all resources as unloaded.
Something like that would help avoid the problem of suddenly having
the application freeze while it madly uploads a few hundred megabytes
of textures to the video card. :-) If we had that then we wouldn't
need to expose the event to programmers.





New SDL 1.3 event type proposal for discussion
Mike Gorchak
Guest

Hello, Bob!

BP> As I understand it, this is actually a lost context problem. To me
BP> that means the event should tell me that I have lost my complete
BP> context, not just my textures. The context contains a lot more than
BP> just textures.

Yes, you are right. It's more than just textures, but everything except the
textures could be handled by SDL itself (context/device reinitialization, etc.).

In DirectX it is called "Device Lost".

In OpenGL/OpenGL ES this problem can be hidden by the OpenGL implementation itself.

BP> What I'd really like to see is a graphics API that does lazy resource
BP> loading that uses this event to mark all resources as unloaded.
BP> Something like that would help avoid the problem of suddenly having
BP> the application freeze while it madly uploads a few hundred megabytes
BP> of textures to the video card. :-) If we had that then we wouldn't
BP> need to expose the event to programmers.

But in this case SDL must keep a copy of all textures in system RAM, as
happens in OpenGL.

With best regards, Mike Gorchak. E-mail:



New SDL 1.3 event type proposal for discussion
Sam Lantinga


Joined: 10 Sep 2009
Posts: 1765
In general I've been writing the SDL 1.3 video drivers such that there
is a copy of textures in memory (either in SDL or the driver level) so
that trashing video memory doesn't have to be exposed to the
application.

On Sun, Nov 22, 2009 at 11:51 PM, Mike Gorchak wrote:
Quote:
Hello, All!

I'm going to add a new SDL event type to notify the application that all
video memory content has been lost; for example, we could call it
SDL_TEXTURES_LOST. This event would mean that all textures must be
reloaded and that all windows are exposed and have to be redrawn.

I don't think this event can belong to the SDL_WINDOWEVENT_XXXX group, because
it is global and concerns all created windows and the SDL application itself.

Right now this event could be implemented through the SYSWM event, but I
think it would be useful on all platforms, since most GUIs have this kind of
event.

There are a few cases where this event could be very helpful:
1) When the user switches the video mode. On some systems this means that the
video memory content is lost (damaged, contains garbage, etc.) and all
offscreen surfaces are no longer valid.
2) When an SDL application has two or more windows and one of the windows goes
fullscreen with a video mode switch. After the video mode switch, all
textures become invalid on some systems.
3) When two or more SDL applications are running without
interacting with each other, and one of them switches to a
fullscreen video mode. On some systems this also means that all textures
must be reloaded.
4) On a multi video card system (with one or more video output ports on
each card), when a window is dragged across displays that belong to
different video cards. In this case some systems issue an event saying the
application has to re-create its textures.

Sam, what are your comments on this?

With best regards, Mike Gorchak.  E-mail:







--
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC
New SDL 1.3 event type proposal for discussion
Kamos


Joined: 16 Oct 2009
Posts: 4
Location: Porto Alegre, Brazil
Thank you all for your answers. I've been wondering about this for a
while now. ;)

BP> But, even now, the graphics memory can wind up fragmented to
BP> the point where there are no free blocks large enough to hold the
BP> new display buffers.

Maybe this sounds stupid, but... shouldn't the driver warn you that
there is not enough memory to change the resolution, and then let you do
whatever you like with that information (e.g., trash the memory
yourself)?

--
Daniel Camozzato
New SDL 1.3 event type proposal for discussion
Mike Gorchak
Guest

Hello, Sam!

SL> In general I've been writing the SDL 1.3 video drivers such that there
SL> is a copy of textures in memory (either in SDL or the driver level) so
SL> that trashing video memory doesn't have to be exposed to the
SL> application.

The main problem is that the Photon GUI in QNX and the GF interface return an
error to the application level saying that it has to destroy the old offscreen
areas and re-upload them all. I could implement temporary storage for each
uploaded texture in the SDL_TextureData structure, but that would increase
both the complexity of the driver and the system memory consumption. For
example, I would have to track where each offscreen area is allocated (system
memory, shared memory, video memory of adapter number X, etc.).

I'm not even sure that OpenGL/OpenGL ES preserves texture memory
consistency; it is implementation dependent. On embedded systems such a waste
of system memory is not acceptable.

What's more, it's fine when the texture re-uploading is performed by the same
application that set the new video mode, but the video mode switch could be
triggered, for example, by another SDL application, and the first application
would never be notified that all its textures are lost.

With best regards, Mike Gorchak. E-mail:



New SDL 1.3 event type proposal for discussion
Brian
Guest

Sounds like a good feature, Sam. I wonder, though, could this be made
optional? Maybe specified during renderer creation.

For simple applications it would make sense for SDL to handle this
transparently. But more complicated programs might have enough metadata
around to restore the textures (etc.) from disk, or regenerate any
procedural textures. My thinking is that complicated SDL programs
might want tight control of memory; they might be willing to trade off
time during a resolution switch (which is an exceedingly rare event)
in order to have a lower memory footprint.

In any case, having such an event would still be useful for
applications that use raw OpenGL for rendering, as there are OpenGL
objects, like shaders, that are lost with the context and that SDL
cannot have knowledge of.

-- Brian

On Tue, Nov 24, 2009 at 7:49 AM, Sam Lantinga wrote:
Quote:
In general I've been writing the SDL 1.3 video drivers such that there
is a copy of textures in memory (either in SDL or the driver level) so
that trashing video memory doesn't have to be exposed to the
application.
New SDL 1.3 event type proposal for discussion
David Olofson
Guest

On Tuesday 24 November 2009, at 15.11.10, Brian
wrote:
Quote:
Sounds like a good feature, Sam. I wonder, though, could this be made
optional? Maybe specified during renderer creation.

For simple applications it would make sense for SDL to handle this
transparently. But more complicated programs might have enough metadata
around to restore the textures (etc.) from disk, or regenerate any
procedural textures. My thinking is that complicated SDL programs
might want tight control of memory; they might be willing to trade off
time during a resolution switch (which is an exceedingly rare event)
in order to have a lower memory footprint.
[...]

Maybe a "please_regenerate_this_texture()" callback, as an alternative to
internal buffering or events? The point with this is that it gets called
"automatically" when SDL (or rather, the backend) actually *needs* the data.

Of course, the application could handle this (keep "wrapper" objects for
textures, buffers, etc., and mark them as "needs refresh" as needed), but since
SDL needs to have the objects and logic in place anyway, it seems a little
nonsensical to duplicate it in the application.
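
For illustration, a rough sketch of what such a callback registration could
look like. These names are invented for this discussion; nothing like this
exists in SDL.

/* Rough sketch of the callback idea, using invented names -- nothing here
 * is an actual SDL 1.3 API. */
typedef struct SDL_Texture SDL_Texture;    /* opaque, as in SDL */

/* Called by the backend when it actually *needs* the texture data again. */
typedef int (*SDL_RegenerateTextureFn)(SDL_Texture *texture, void *userdata);

/* Hypothetical registration: instead of SDL keeping a system-RAM copy, the
 * application promises to refill the texture's pixels on demand. */
extern int SDL_SetTextureRegenerateCallback(SDL_Texture *texture,
                                            SDL_RegenerateTextureFn regenerate,
                                            void *userdata);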


--
//David Olofson - Developer, Artist, Open Source Advocate

.--- Games, examples, libraries, scripting, sound, music, graphics ---.
| http://olofson.net http://kobodeluxe.com http://audiality.org |
| http://eel.olofson.net http://zeespace.net http://reologica.se |
'---------------------------------------------------------------------'
New SDL 1.3 event type proposal for discussion
Bob


Joined: 19 Sep 2009
Posts: 185
On Tue, Nov 24, 2009 at 6:09 AM, Daniel Camozzato
wrote:
Quote:
Thank you all for your answers. I've been wondering about this for a
while now. ;)

BP> But, even now, the graphics memory can wind up fragmented to
BP> the point where there are no free blocks large enough to hold the
BP> new display buffers.

Maybe this sounds stupid, but... shouldn't the driver warn you that
there is not enough memory to change the resolution, and then let you do
whatever you like with that information (e.g., trash the memory
yourself)?

You'd think so, but it is complicated. Let's say the driver tells the
programmer he has to delete 20 megs worth of stuff, and the programmer
goes through and deletes 20 individual items of 1 meg each. But the
memory that needs to be freed is the memory at the end of the display
buffer, not just any old memory. He could have deleted 20 megs and
left 20 individual 1 meg "holes" of free memory. Ok, so you can't just
tell him he has to free X amount of memory. You have to tell him to
delete specific items from memory. That sounds good, but what if he
doesn't *own* those items? Now the driver has to tell some other
application that it has to delete a list of items. What if that
application refuses to delete them? Or what if it deletes them, and
then immediately reloads them? What if the application is blocked
waiting on a timer or an I/O and won't notice the event until next
Tuesday? What if *your* application doesn't delete them properly? What
if your program is freeing memory while another application is
allocating memory?

Asking applications to give things back just doesn't work. You have to
take them away and tell them about it later.
The one way that always works is to just trash all the memory and let
the applications recover from it.

Bob Pendleton

Quote:

--
Daniel Camozzato






--
+-----------------------------------------------------------
+ Bob Pendleton: writer and programmer
+ email:
+ web: www.TheGrumpyProgrammer.com
New SDL 1.3 event type proposal for discussion
Sam Lantinga


Joined: 10 Sep 2009
Posts: 1765
These are really good points. Okay, Mike, want to submit a Bugzilla
patch for review?

Thanks!

On Tue, Nov 24, 2009 at 6:11 AM, Brian wrote:
Quote:
Sounds like a good feature, Sam. I wonder, though, could this be made
optional? Maybe specified during renderer creation.

For simple applications it would make sense for SDL to handle this
transparently. But more complicated programs might have enough metadata
around to restore the textures (etc.) from disk, or regenerate any
procedural textures. My thinking is that complicated SDL programs
might want tight control of memory; they might be willing to trade off
time during a resolution switch (which is an exceedingly rare event)
in order to have a lower memory footprint.

In any case, having such an event would still be useful for
applications that use raw OpenGL for rendering, as there are OpenGL
objects, like shaders, that are lost with the context and that SDL
cannot have knowledge of.

-- Brian

On Tue, Nov 24, 2009 at 7:49 AM, Sam Lantinga wrote:
Quote:
In general I've been writing the SDL 1.3 video drivers such that there
is a copy of textures in memory (either in SDL or the driver level) so
that trashing video memory doesn't have to be exposed to the
application.




--
-Sam Lantinga, Founder and President, Galaxy Gameworks LLC
New SDL 1.3 event type proposal for discussion
Mike Gorchak
Guest

Hello, Sam!

SL> These are really good points. Okay Mike, want to submit a bugzilla
SL> patch for review?

OK, I will post a patch to Bugzilla soon.

With best regards, Mike Gorchak. E-mail:


