SDL 2.0 and PNG alpha
MrTAToad
Do you set the blend mode before rendering?
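For reference, setting the blend mode usually means a call to SDL_SetTextureBlendMode() before the texture is copied. A minimal sketch, assuming a renderer and texture already exist (the function name here is illustrative):

```c
#include <SDL.h>

/* Sketch: enable alpha blending on a texture before rendering it.
   `renderer` and `texture` are assumed to have been created elsewhere. */
void draw_with_alpha(SDL_Renderer *renderer, SDL_Texture *texture)
{
    /* SDL_BLENDMODE_BLEND: dst = src * srcA + dst * (1 - srcA) */
    SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);
    SDL_RenderCopy(renderer, texture, NULL, NULL);
}
```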
Naith
You can use the SDL_ConvertSurface() function to convert a surface into the same pixel format as the screen surface.
Example code:
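(The original code did not survive. A sketch of the conversion described above, assuming SDL_image for loading; the function name and path handling are illustrative. Note that the rest of the thread discusses whether this conversion keeps the alpha channel.)

```c
#include <SDL.h>
#include <SDL_image.h>

/* Sketch: load a PNG, then convert it to the window surface's pixel
   format so blits are cheap. `window` is assumed to exist. */
SDL_Surface *load_optimized(SDL_Window *window, const char *path)
{
    SDL_Surface *screen = SDL_GetWindowSurface(window);
    SDL_Surface *loaded = IMG_Load(path);  /* PNG alpha survives the load */
    if (!loaded)
        return NULL;

    SDL_Surface *converted = SDL_ConvertSurface(loaded, screen->format, 0);
    SDL_FreeSurface(loaded);
    return converted;  /* blit with SDL_BlitSurface() as usual */
}
```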
Then you just render the converted image surface as usual; the alpha channel in the PNG image should be preserved. When you want to switch to using SDL_Textures only, you can do it like this. Example code:
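(Again the original code is missing. A sketch of the surface-to-texture route via SDL_CreateTextureFromSurface(); the function name is illustrative.)

```c
#include <SDL.h>
#include <SDL_image.h>

/* Sketch: load a PNG into a surface, then turn it into a texture
   for the given renderer. */
SDL_Texture *texture_from_png(SDL_Renderer *renderer, const char *path)
{
    SDL_Surface *loaded = IMG_Load(path);
    if (!loaded)
        return NULL;
    SDL_Texture *texture = SDL_CreateTextureFromSurface(renderer, loaded);
    SDL_FreeSurface(loaded);  /* the surface is no longer needed */
    return texture;
}
```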
The IMG_LoadTexture() function, from the SDL_image library, creates an SDL_Texture directly from an image file. The alpha channel in the PNG image is preserved when using this function as well.
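A one-step sketch of that direct route (the file name is illustrative):

```c
#include <SDL.h>
#include <SDL_image.h>

/* Sketch: IMG_LoadTexture() goes straight from file to texture,
   keeping the PNG's alpha channel. `renderer` is assumed to exist. */
SDL_Texture *load_png_texture(SDL_Renderer *renderer)
{
    return IMG_LoadTexture(renderer, "sprite.png");
}
```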
Jonny D
Hmm, that is unfortunate. So the fake display surface in SDL 2 does not have an alpha channel, so images converted to its format lose their alpha...
Well, maybe you can try constructing your own format to see if the performance is still affected? I would try making a copy of the screen surface's format, then adding an alpha mask (which might be a guess for now). Something like:

    SDL_PixelFormat my_format = *Screen->format;
    my_format.Amask = 0x000000ff;  // A guess... Try 0xff000000 too?
    optimizedSurface = SDL_ConvertSurface( loadedSurface, &my_format, NULL );

On Windows, SDL 2 always uses SDL_PIXELFORMAT_RGB888 for the display surface (see src/video/windows/SDL_windowsframebuffer.c:WIN_CreateWindowFramebuffer). So if you want to do a little more work to get the mask right without guessing, you can try copying and adjusting the logic in src/video/SDL_pixels.c:SDL_PixelFormatEnumToMasks to produce a reasonable Amask from that format enum.

Of course, the most obvious way forward is to stop using SDL_GetWindowSurface(). I personally have an older project for which I use my own constructed "screen" surface, upload that surface each frame, and then render it. Or you can use the render API or something else.

Good luck!
Jonny D
Gary Grant
Guest
Manually setting Format.Amask does work, but, as with not converting at all, it kills the performance. The thing is, reading the code in SDL 1.2, this is exactly what the Display_Format_Alpha function did, without impacting performance.
On Sun, Oct 12, 2014 at 9:26 AM, Jonathan Dearborn wrote: