OSX and Retina
Trevor Powell


Joined: 11 Mar 2012
Posts: 12
With the release of the MacBook Pro Retina, Apple has separated "points" from "pixels" on the OS X platform, the same way that they have on iOS. The mechanism is a little bit different, but the concept seems to be the same: a window which is reported as being 800x600 points may actually contain 1600x1200 pixels (or more or less, depending on a whole range of factors).

Under OS X, by default OpenGL works in a sort of compatibility mode in which you get one pixel per "point", which matches the pixel density you get on normal displays, but results in blocky-looking graphics on retina displays. To use the full resolution of a high-density display, you need to call "[view setWantsBestResolutionOpenGLSurface:YES];" to tell OSX that you want to run with something other than the standard "one pixel per 'point'" resolution. In my copy of SDL 1.2, I've done this in SDL_QuartzVideo.m, around lines 830 and 1080, where the OpenGL view is being created. My added code in both places looks like this:

Code:
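/* Opt in to the full backing (Retina) resolution where the selector exists;
   on older OS X releases the message is simply skipped and the default
   one-pixel-per-point behaviour remains. */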
if ( [window_view respondsToSelector:@selector(setWantsBestResolutionOpenGLSurface:) ] )
{
  [ window_view setWantsBestResolutionOpenGLSurface:YES ];
}


This sets up the view to be retina-compatible (and remains backwards-compatible with earlier OS X versions where the option isn't available), but it still doesn't expose the actual pixel dimensions to client code; we're left using "points" instead of "pixels" almost everywhere. SDL_SetVideoMode takes a size in points, SDL_ListModes(NULL, SDL_FULLSCREEN|SDL_HWSURFACE) returns screen sizes in points, SDL_VIDEORESIZE events report resizes in points, and so on. So when client code passes those dimensions to glViewport() (which expects its arguments in pixels), it ends up drawing to only a small portion of the OpenGL surface, and I haven't found an SDL interface intended to let client code convert between these "point" values and the actual underlying number of pixels.
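
For reference, the conversion itself looks straightforward on systems that have the newer NSView API: convertRectToBacking: maps a rect expressed in points into backing-store pixels. A minimal sketch (window_view stands in for whatever NSView the SDL video code created, and the helper itself is hypothetical, not an existing SDL interface):

Code:
/* Return the view's size in actual device pixels rather than points.
   Falls back to the point size on systems without convertRectToBacking:. */
static NSSize GetViewSizeInPixels(NSView *window_view)
{
    NSRect bounds = [window_view bounds];                    /* in points */
    if ([window_view respondsToSelector:@selector(convertRectToBacking:)]) {
        bounds = [window_view convertRectToBacking:bounds];  /* now in pixels */
    }
    return bounds.size;
}

/* e.g.  NSSize px = GetViewSizeInPixels(window_view);
         glViewport(0, 0, (GLsizei)px.width, (GLsizei)px.height); */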

Half the time, I suspect that the easiest thing to do would simply be to make all of these functions use pixels rather than points. The only real downside would be that if someone asks for an 800x600 video mode, they'll get a very small window on a retina display and a much larger one on a non-retina display. Which maybe isn't such a terrible thing.

But I imagine that this "convert from points to pixels" side of things has already been worked out for SDL's iOS retina support, so maybe it's just a matter of porting the general strategy in use there over to the OS X side of SDL. Has anybody else already started in this direction? (And should I be doing this experimentation in SDL 2 instead of 1.2? I haven't really gotten my feet wet with version 2 yet.)
Trevor Powell


Joined: 11 Mar 2012
Posts: 12
After further research, it appears that SDL's Retina behaviour on iOS is to ignore the issue, and simply to always deal with pixels, ignoring the "points" concept that underpins Cocoa layouts. That's very reasonable on a device where you have a single physical size for the display.

And I'm sort of leaning toward doing the same thing under OS X; patch SDL to convert everything it receives from Cocoa from points into pixels, and then work entirely on pixels internally. This would make it so that every number the client code sees is expressed in terms of pixels, which would drastically simplify working with a high-density display.

This would result in a few potential issues:
- Client code would have no way to know whether or not it's running on a retina display (except by making educated guesses based upon the resolution it's using). This might make it awkward to (for example) scale buttons or other interactive UI elements to an appropriate physical size on the screen.
- Low-resolution SDL games played in a window would appear in an extremely small window on retina screens, since resolution (and therefore window size) would be specified in pixels. (They should be fine when played fullscreen, though)

Questions:
- Would it be worth adding an API function for exposing the density of a high-density display? (This could be used by client code to address issue #1 above; a sketch of what such a call might look like follows this list.) Alternatively, adding this sort of API function would make it possible for us to keep points as the standard unit in OS X SDL and have client code convert into real pixels when it needs to, which would save us from patching unit-conversion code into the mouse/window/etc. code within SDL.
- Would it be worthwhile to add a video flag (SDL_HIGHDENSITYSURFACE, or some such) to enable/disable Retina support, so that everything continues to run using the usual "normal-density display" logic and retina support is only enabled if that flag is provided? (This would address issue #2 above.)
- High-density displays are actually a bit more complicated on a desktop device than on a mobile one, since we've got some nastier use cases: multiple monitors with different densities, windows spanning two monitors, etc. I'm not sure to what extent we want/need to worry about these corner cases, as long as they don't result in instability.
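
Purely as a sketch of the first question: a hypothetical SDL_GetDisplayScale() (the name and placement are my own invention, not an existing SDL API) could be little more than a wrapper around NSScreen's backingScaleFactor:

Code:
/* Hypothetical helper -- not an existing SDL API.
   Reports the pixels-per-point scale of the main screen. */
float SDL_GetDisplayScale(void)
{
    NSScreen *screen = [NSScreen mainScreen];
    if ([screen respondsToSelector:@selector(backingScaleFactor)]) {
        return (float)[screen backingScaleFactor];  /* 2.0 on a Retina panel */
    }
    return 1.0f;  /* older systems: always one pixel per point */
}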
OSX and Retina
Brian Barnes
Guest

Trevor Powell wrote:

Quote:
After further research, it appears that SDL's Retina behaviour on iOS is to ignore the issue, and simply to always deal with pixels, ignoring the "points" concept that underpins Cocoa layouts. That's very reasonable on a device where you have a single physical size for the display.

And I'm sort of leaning toward doing the same thing under OS X; patch SDL to convert everything it receives from Cocoa from points into pixels, and then work entirely on pixels internally. This would make it so that every number the client code sees is expressed in terms of pixels, which would drastically simplify working with a high-density display.

This would result in a few potential issues:
- Client code would have no way to know whether or not it's running on a retina display (except by making educated guesses based upon the resolution it's using). This might make it awkward to (for example) scale buttons or other interactive UI elements to an appropriate physical size on the screen.
- Low-resolution SDL games played in a window would appear in an extremely small window on retina screens, since resolution (and therefore window size) would be specified in pixels. (They should be fine when played fullscreen, though)

I'd vote for this, as I already depend on it for iOS. :)

In dim3, every interface element is defined by the project creator as an absolute size within an absolute field that is scaled to the screen size, so the elements work at any resolution and always fit the scale of any art behind them (i.e., the elements are set in a virtual window that is translated to the real window).

If people deal with absolute sizes regardless of screen resolution, then they will run into very small elements, but they probably shouldn't be doing that in the first place.

[>] Brian
Re: OSX and Retina
Trevor Powell


Joined: 11 Mar 2012
Posts: 12
Brian Barnes wrote:

I'd vote for [everything in pixels], as I already depend on it for iOS. :)

In dim3, every interface element is defined by the project creator as an absolute size within an absolute field that is scaled to the screen size, so the elements work at any resolution and always fit the scale of any art behind them (i.e., the elements are set in a virtual window that is translated to the real window).


I do exactly the same thing in my own projects; it does seem like the sensible way to work. But I can imagine other people wanting to maintain interface elements at a particular physical size regardless of the size of the screen. I just didn't want to completely discount that use case (or at least, I want to make my eventual patch's non-support for it explicit).

More experimentation shows that we don't need to add a flag to enable/disable retina mode: OS X applications must include a special key in their Info.plist ("NSHighResolutionCapable") to declare that they support high-density rendering, so anyone who doesn't set that key won't get high-density rendering even if SDL asks for it. Additionally, end-users can disable high-resolution rendering for any program by turning it off in the program's "Get Info" window. If the developer doesn't include that key in the Info.plist, or the user explicitly disables high-resolution rendering, then the program runs in 1-pixel-per-point compatibility mode anyway. So no flag required.
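
(For reference, the Info.plist entry being described is the standard key/value pair below; the key name is Apple's, the rest is just ordinary plist structure.)

Code:
<key>NSHighResolutionCapable</key>
<true/>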

The trickier part is that on a desktop machine we can wind up with different scaling factors on different monitors, or even different scaling factors for a single window in the case of display mirroring. The net result is that it's turning out to be a little tricky to generate a realistic list of available resolutions, as measured in pixels -- especially before the window has even been created. But I'm working on it. :)
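
(To illustrate the multi-monitor wrinkle: each attached NSScreen reports its own backingScaleFactor, so any point-to-pixel conversion has to decide which screen it's talking about. A minimal sketch of the enumeration, just for reference:)

Code:
/* Log the pixels-per-point scale of every attached display. */
for (NSScreen *screen in [NSScreen screens]) {
    CGFloat scale = 1.0;
    if ([screen respondsToSelector:@selector(backingScaleFactor)]) {
        scale = [screen backingScaleFactor];   /* e.g. 2.0 on a Retina panel */
    }
    NSRect frame = [screen frame];             /* reported in points */
    NSLog(@"display: %.0fx%.0f points, scale %.1f -> %.0fx%.0f pixels",
          frame.size.width, frame.size.height, scale,
          frame.size.width * scale, frame.size.height * scale);
}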
Re: OSX and Retina
RodrigoCard


Joined: 23 Apr 2011
Posts: 113
Location: Brazil
I vote for setting an opt-in HINT for a Retina size, since this is the default behavior in OS X.

something like:
SDL_SetHint(SDL_APPLE_RETINA_DISPLAY)

Also, it would be great if it worked the same way on iOS.
I needed to patch SDL when trying to make a non-Retina-sized game; handling it manually is so much harder.
So an opt-in hint would be the better option, IMHO.
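
(For what it's worth, SDL2's SDL_SetHint takes a hint name and a value string, so the proposal above would presumably end up looking something like this; SDL_APPLE_RETINA_DISPLAY is the name suggested above, not an existing SDL define.)

Code:
/* Hypothetical hint name -- not an existing SDL definition. */
#define SDL_APPLE_RETINA_DISPLAY "SDL_APPLE_RETINA_DISPLAY"

/* Opt in to high-density (Retina) rendering before creating the window. */
SDL_SetHint(SDL_APPLE_RETINA_DISPLAY, "1");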
Re: OSX and Retina
Trevor Powell


Joined: 11 Mar 2012
Posts: 12
So I now have SDL working in retina-with-retina-enabled and non-retina modes (OpenGL only; I don't currently have test code to exercise the 2D interface). This includes window and mouse handling (both captured and non-captured). I haven't yet fixed resize events.

And retina-with-retina-disabled is mostly working, too, except for one little gotcha:

SDL_ListModes() finds the list of available resolutions by calling CGDisplayCopyAllDisplayModes(), and those results are all expressed in raw device pixels, regardless of whether the application has been marked as not supporting high-resolution rendering. Which means that in non-high-resolution-rendering mode, a MacBook Pro Retina reports supporting a 2880x1800 resolution, when in fact we can only create a window holding a 1440x900 resolution.
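
(For anyone following along, the enumeration in question looks roughly like this; the sizes it logs are the ones SDL_ListModes() ends up reporting.)

Code:
/* Enumerate the display modes that SDL_ListModes() is drawing from. */
CGDirectDisplayID display = CGMainDisplayID();
CFArrayRef modes = CGDisplayCopyAllDisplayModes(display, NULL);
CFIndex count = CFArrayGetCount(modes);
for (CFIndex i = 0; i < count; i++) {
    CGDisplayModeRef mode = (CGDisplayModeRef)CFArrayGetValueAtIndex(modes, i);
    /* Per the observation above, these sizes come back as raw device
       pixels (e.g. 2880x1800) even when the app is limited to 1440x900. */
    NSLog(@"mode %ld: %zux%zu", (long)i,
          CGDisplayModeGetWidth(mode), CGDisplayModeGetHeight(mode));
}
CFRelease(modes);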

In theory, I can talk to the NSScreen to find the scaling factor between pixels and points, but in non-high-resolution-rendering mode, that interface returns a 1:1 mapping between pixels and points, so that doesn't help us in this situation.

There's also a Core Graphics interface (CGContextConvertRectToUserSpace()) which should be able to convert from device pixels to Cocoa points for the purpose of determining what size fullscreen windows we can make in non-high-resolution mode, but that call requires a CGContext object, and CGDisplayGetDrawingContext() is returning NULL for me.
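
(One guess, based on how CGDisplayGetDrawingContext() is documented, is that it only returns a context for a display that has already been captured; I haven't verified that the resulting conversion is actually useful here, but the call sequence would be roughly:)

Code:
CGDirectDisplayID display = CGMainDisplayID();
if (CGDisplayCapture(display) == kCGErrorSuccess) {
    CGContextRef ctx = CGDisplayGetDrawingContext(display);
    if (ctx != NULL) {
        /* Convert a rect expressed in device pixels into user-space units. */
        CGRect pixelRect = CGRectMake(0, 0, 2880, 1800);
        CGRect userRect = CGContextConvertRectToUserSpace(ctx, pixelRect);
        NSLog(@"%.0fx%.0f pixels -> %.0fx%.0f user-space units",
              pixelRect.size.width, pixelRect.size.height,
              userRect.size.width, userRect.size.height);
    }
    CGDisplayRelease(display);
}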

I'm a neophyte with Quartz, so I assume that I just need to dig deeper into the documentation.
Trevor Powell


Joined: 11 Mar 2012
Posts: 12
This is really kicking my butt.

I'm having no luck at all figuring out how to convert from the display resolutions returned by CGDisplayCopyAllDisplayModes() into the window sizes to be created through Cocoa, when retina support is disabled. Quartz calls don't adjust for the scaling factor, NSScreen pretends that there is no scaling factor (when the application's retina support is disabled), etc.

Incidentally, this issue is already present in the current stable release of SDL 1.2, which has no retina support (and probably also in SDL 2, if it's implemented in a similar manner): it gets the available screen sizes by asking Quartz, and then tries to use one by creating a window through Cocoa. It just blindly assumes that the two OS X libraries are using the same units, which isn't a valid assumption on any device with a high-density display. This means that SDL_ListModes() reports resolutions which then don't work correctly when selected via SDL_SetVideoMode(). Maybe I should file this as a bug, to get more eyes on the problem? It presumably already affects every SDL-based program running on an OS X machine with a retina display.