The SDL forums have moved to discourse.libsdl.org.
This is just a read-only archive of the previous forums, to keep old links working.


SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

Let's assume I have an array of bytes with pixel data, like "unsigned
char* data;", e.g. from stb_image's stbi_load().
The first byte is for red, the second for green, the third for blue, the
fourth for alpha.
I think it's sane to call this RGBA, right?

So I wanna create a SDL_Surface* with this, e.g. to set the window icon.
I do:
Uint32 rmask, gmask, bmask, amask;
int bpp;
SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_RGBA8888,
&bpp, &rmask, &gmask, &bmask, &amask);
SDL_Surface* surf = SDL_CreateRGBSurfaceFrom((void*)data, w, h, 4*8,
4*w, rmask, gmask, bmask, amask);

(Then I display that surface).
On a little endian machine, this looks wrong, because the rmask is
0xff000000, which masks the last (fourth) byte instead of the first one
(similar for other masks).

Using SDL_PIXELFORMAT_ABGR8888 looks correct, but will (most probably)
look wrong on big endian machines...
Furthermore, using it seems just wrong when I really have RGBA data.
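
For illustration, a minimal standalone sketch (an added example, not from the original mail: it assumes a little-endian host and uses made-up channel values 0x11/0x22/0x33/0x44) of why the RGBA8888 rmask grabs the wrong byte here:

#include <stdint.h>
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* four bytes in memory order R, G, B, A */
    unsigned char rgba[4] = { 0x11, 0x22, 0x33, 0x44 };
    uint32_t pixel;
    memcpy(&pixel, rgba, 4);  /* reinterpret them as one 32-bit pixel */

    /* On little endian this prints pixel = 0x44332211, so the RGBA8888
       rmask 0xff000000 extracts 0x44 - the alpha byte, not the red one. */
    printf("pixel = 0x%08x, rmask extracts 0x%02x\n",
           (unsigned)pixel, (unsigned)((pixel & 0xff000000u) >> 24));
    return 0;
}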

Anyway, I find it kinda surprising at first that the data passed to
SDL_CreateRGBSurfaceFrom() seems to be interpreted as 32-bit ints (and
with 16-bit or 24-bit color depth this is even stranger), especially as
it's a void* pointer and not a Uint32* pointer. (Only at first, because
something like this must be done internally, otherwise the masks
wouldn't make sense.)
And then it's even more surprising that the masks generated by
SDL_PixelFormatEnumToMasks() with SDL_PIXELFORMAT_RGB888 and
SDL_PIXELFORMAT_RGBA8888 don't seem to work correctly with bytestreams -
the very name "RGBA8888" (in contrast to the nonexistent "RGBA32")
sounds like "there's 8 bits/1 byte of red, then 1 byte of green, etc.", not
"we really assume a 32-bit integer value".

So is there a portable way to set the masks in the (platform-specific)
correct way for bytestreams?
I'd even assume that this is a more common use case than transforming an
array of 32-bit ints with red in the least significant byte (instead of
at the smallest address).

Cheers,
Daniel
SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Eirik Byrkjeflot Anonsen
Guest

Daniel Gibson writes:

Quote:
Let's assume I have an array of bytes with pixel data, like "unsigned
char* data;", e.g. from stb_image's stbi_load().
The first byte is for red, the second for green, the third for blue, the
fourth for alpha.
I think it's sane to call this RGBA, right?

So I wanna create a SDL_Surface* with this, e.g. to set the window icon.
I do:
Uint32 rmask, gmask, bmask, amask;
int bpp;
SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_RGBA8888,
&bpp, &rmask, &gmask, &bmask, &amask);
SDL_Surface* surf = SDL_CreateRGBSurfaceFrom((void*)data, w, h, 4*8,
4*w, rmask, gmask, bmask, amask);

(Then I display that surface).
On a little endian machine, this looks wrong, because the rmask is
0xff000000, which masks the last (fourth) byte instead of the first one
(similar for other masks).

Using SDL_PIXELFORMAT_ABGR8888 looks correct, but will (most probably)
look wrong on big endian machines...
Furthermore, using it seems just wrong when I really have RGBA data.

Anyway, I find it kinda surprising at first that the data passed to
SDL_CreateRGBSurfaceFrom() seems to be interpreted as 32-bit ints (and
with 16-bit or 24-bit color depth this is even stranger), especially as
it's a void* pointer and not a Uint32* pointer. (Only at first, because
something like this must be done internally, otherwise the masks
wouldn't make sense.)
And then it's even more surprising that the masks generated by
SDL_PixelFormatEnumToMasks() with SDL_PIXELFORMAT_RGB888 and
SDL_PIXELFORMAT_RGBA8888 don't seem to work correctly with bytestreams -
the very name "RGBA8888" (in contrast to the nonexistent "RGBA32")
sounds like "there's 8 bits/1 byte of red, then 1 byte of green, etc.", not
"we really assume a 32-bit integer value".

Actually, I believe it is pretty common to have n-bit pixels split into
R, G, B and A components. I believe I have run into this with several
graphics APIs. Probably more than the opposite. I wouldn't be surprised
if this is the result of historical accident. Although straight 4x8
RGBA, 4x8 RGBX and 3x8 RGB are pretty ubiquitous today, things used to
be very different. And will hopefully be different again soon, when use
of more than 8 bits per component becomes more common.

As far as naming is concerned, RGBA32 would not really work well, as it
could have several possible sub-component sizes. 10-10-10-2 isn't
entirely unknown. For an example of the resulting confusion, look at how
the pixel format naming schemes in OpenGL have evolved.
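
For example (a sketch added against the existing SDL2 API; the mask values in the comment are simply what the packed ARGB2101010 layout implies), a 10-10-10-2 format already needs masks that no byte-oriented name could describe:

#include <SDL.h>

static void PrintArgb2101010Masks(void)
{
    int bpp;
    Uint32 rmask, gmask, bmask, amask;
    if (SDL_PixelFormatEnumToMasks(SDL_PIXELFORMAT_ARGB2101010,
                                   &bpp, &rmask, &gmask, &bmask, &amask)) {
        /* expected: bpp = 32, amask = 0xC0000000, rmask = 0x3FF00000,
           gmask = 0x000FFC00, bmask = 0x000003FF */
        SDL_Log("bpp=%d r=%#010x g=%#010x b=%#010x a=%#010x",
                bpp, rmask, gmask, bmask, amask);
    }
}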

Quote:
So is there a portable way to set the masks in the (platform-specific)
correct way for bytestreams?
I'd even assume that this is a more common use case than transforming an
array of 32-bit ints with red in the least significant byte (instead of
at the smallest address).

This I have no idea about.

eirik
SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

On 19.03.2015 18:14, Eirik Byrkjeflot Anonsen wrote:
Quote:
Daniel Gibson writes:
Quote:
And then it's even more surprising that the masks generated by
SDL_PixelFormatEnumToMasks() with SDL_PIXELFORMAT_RGB888 and
SDL_PIXELFORMAT_RGBA8888 don't seem to work correctly with bytestreams -
the very name "RGBA8888" (in contrast to the nonexistent "RGBA32")
sounds like "there's 8 bits/1 byte of red, then 1 byte of green, etc.", not
"we really assume a 32-bit integer value".

Actually, I believe it is pretty common to have n-bit pixels split into
R, G, B and A components. I believe I have run into this with several
graphics APIs. Probably more than the opposite. I wouldn't be surprised
if this is the result of historical accident. Although straight 4x8
RGBA, 4x8 RGBX and 3x8 RGB are pretty ubiquitous today, things used to
be very different. And will hopefully be different again soon, when use
of more than 8 bits per component becomes more common.

As far as naming is concerned, RGBA32 would not really work well, as it
could have several possible sub-component sizes. 10-10-10-2 isn't
entirely unknown. For an example of the resulting confusion, look at how
the pixel format naming schemes in OpenGL have evolved.

Yeah, I was thinking about SDL_PIXELFORMAT_RGB24, which seems to set the
right masks for RGB data regardless of platform endianness.
I don't care about the name as long as it has RGBA in it (so it's easy
to find).

Cheers,
Daniel
SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

I guess this was a bit long and potentially confusing, so let's try a shorter version:

Having byte-wise RGBA (first a red byte, then a green one, ...) pixel
data is a very common use case, as it's what many image-decoding libs
output and what OpenGL expects as input.

But according to https://wiki.libsdl.org/SDL_CreateRGBSurface and
https://wiki.libsdl.org/SDL_CreateTextureFromSurface
I'm supposed to do something like this to get bytewise RGBA pixel data
into a SDL_Surface:

Uint32 rmask, gmask, bmask, amask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
rmask = 0xff000000;
gmask = 0x00ff0000;
bmask = 0x0000ff00;
amask = 0x000000ff;
#else
rmask = 0x000000ff;
gmask = 0x0000ff00;
bmask = 0x00ff0000;
amask = 0xff000000;
#endif

SDL_Surface *surface = SDL_CreateRGBSurface(0, 640, 480, 32,
rmask, gmask, bmask, amask);

There should be an easier way to obtain those masks that does not
require the SDL user to use obscure hex numbers and worry about endianness.
SDL_PixelFormatEnumToMasks() would be a good candidate to obtain the
masks, if there were a suitable SDL_PIXELFORMAT_* for bytewise RGBA (and
not just for "a whole Uint32 contains RGBA with R in the least
significant byte, whatever that may be on your platform").
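
In the meantime, this is roughly the helper every project ends up writing; a sketch only, and the function name is made up, not an SDL API:

#include <SDL.h>

/* Wrap the SDL_BYTEORDER check once: create a surface from byte-wise
   R,G,B,A pixel data (4 bytes per pixel, pitch = 4*w). */
static SDL_Surface *CreateByteWiseRGBASurfaceFrom(void *pixels, int w, int h)
{
    Uint32 rmask, gmask, bmask, amask;
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
    rmask = 0xff000000; gmask = 0x00ff0000; bmask = 0x0000ff00; amask = 0x000000ff;
#else
    rmask = 0x000000ff; gmask = 0x0000ff00; bmask = 0x00ff0000; amask = 0xff000000;
#endif
    return SDL_CreateRGBSurfaceFrom(pixels, w, h, 32, 4 * w,
                                    rmask, gmask, bmask, amask);
}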

Cheers,
Daniel

SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Driedfruit
Guest

I *did* goof, and my last e-mail did indeed go only to Daniel instead
of the whole list, but, long story short, I concur: this is odd
behavior for SDL_PixelFormatEnumToMasks. I would really like to hear
from someone (Sam/Ryan?) whether it's the expected behavior or an oversight.
I'm willing to clarify the wiki on this aspect, as long as someone can
confirm one way or another.

If we don't want to break the API, let's introduce
SDL_PixelFormatEnumToMasks2 (for lack of a better name), which *does*
the byte flipping according to endianness.
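
A rough sketch (added here, hypothetical code built on the existing API) of what the intended behavior could look like; it is only sensible for formats where every channel occupies a whole byte, like the 8888 ones:

#include <SDL.h>

static SDL_bool PixelFormatEnumToByteWiseMasks(Uint32 format, int *bpp,
                                               Uint32 *r, Uint32 *g,
                                               Uint32 *b, Uint32 *a)
{
    if (!SDL_PixelFormatEnumToMasks(format, bpp, r, g, b, a)) {
        return SDL_FALSE;
    }
#if SDL_BYTEORDER == SDL_LIL_ENDIAN
    if (*bpp == 32) {
        /* reinterpret the packed-Uint32 masks byte-wise:
           0xff000000 -> 0x000000ff, 0x00ff0000 -> 0x0000ff00, etc. */
        *r = SDL_Swap32(*r);
        *g = SDL_Swap32(*g);
        *b = SDL_Swap32(*b);
        *a = SDL_Swap32(*a);
    }
#endif
    return SDL_TRUE;
}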

Although the SDL_BYTEORDER hack is not drastic or hard to implement,
and is copy-pastable from the docs, it's still a little weird that
*every* SDL project has to implement it.

--
driedfruit
SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

On 03/20/2015 06:24 PM, Driedfruit wrote:
Quote:
I *did* goof, and my last e-mail did indeed go only to Daniel instead
of the whole list, but, long story short, I concur: this is odd
behavior for SDL_PixelFormatEnumToMasks. I would really like to hear
from someone (Sam/Ryan?) whether it's the expected behavior or an oversight.
I'm willing to clarify the wiki on this aspect, as long as someone can
confirm one way or another.

If we don't want to break the API, let's introduce
SDL_PixelFormatEnumToMasks2 (for lack of a better name), which *does*
the byte flipping according to endianness.

I think there are indeed some SDL_PIXELFORMATs that imply byte-wise
instead of Uint32-wise encoding, and are defined differently depending
on SDL_BYTEORDER, e.g. SDL_PIXELFORMAT_RGB24.
So I think we don't need a new API, just a new
SDL_PIXELFORMAT_RGBA8888_BYTEWISE_AS_YOU_WOULD_EXPECT or something like
that ;)

Also: Having a pixelformat enum for bytewise RGBA would be really handy
for converting other strange (already supported) formats to RGBA for
OpenGL with SDL_ConvertPixels(), even when not using SDL_Surfaces.
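
For example, something along these lines is needed today (an added sketch; the helper name is made up, and the #if just picks whichever existing enum happens to mean byte-wise R,G,B,A on the current platform):

#include <SDL.h>

/* Convert w*h pixels from src_format (pitch src_pitch) into a freshly
   allocated byte-wise RGBA buffer suitable for glTexImage2D(..., GL_RGBA,
   GL_UNSIGNED_BYTE, ...). Caller frees the result with SDL_free(). */
static Uint8 *ConvertToByteWiseRGBA(const void *src, int src_pitch,
                                    Uint32 src_format, int w, int h)
{
    Uint8 *dst = (Uint8 *)SDL_malloc((size_t)w * h * 4);
    if (dst == NULL) {
        return NULL;
    }
    if (SDL_ConvertPixels(w, h, src_format, src, src_pitch,
#if SDL_BYTEORDER == SDL_BIG_ENDIAN
                          SDL_PIXELFORMAT_RGBA8888,
#else
                          SDL_PIXELFORMAT_ABGR8888,
#endif
                          dst, w * 4) != 0) {
        SDL_free(dst);
        return NULL;
    }
    return dst;
}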

BTW, those APIs are all kinda strange...
SDL_PixelFormatEnumToMasks() sets your masks for an SDL_PIXELFORMAT
enum, and then SDL_CreateRGBSurface() only uses those masks to guess the
right enum again and use that internally... (if I haven't missed something).
So I guess
SDL_CreateRGBSurface2(flags, w, h, depth, pixelFormatEnum)
with a better name would be cool? (Same for SDL_CreateRGBSurfaceFrom())
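
A sketch (added, hypothetical; the name is a placeholder for the suggested SDL_CreateRGBSurface2, and depth is simply taken from the enum) of how such a wrapper could be built from the two existing calls, making the enum-to-masks-to-enum roundtrip explicit:

#include <SDL.h>

static SDL_Surface *CreateRGBSurfaceFromEnum(Uint32 flags, int w, int h,
                                             Uint32 pixelFormatEnum)
{
    int bpp;
    Uint32 rmask, gmask, bmask, amask;
    if (!SDL_PixelFormatEnumToMasks(pixelFormatEnum, &bpp,
                                    &rmask, &gmask, &bmask, &amask)) {
        return NULL;
    }
    /* SDL_CreateRGBSurface() will map these masks right back to a format enum */
    return SDL_CreateRGBSurface(flags, w, h, bpp, rmask, gmask, bmask, amask);
}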


Cheers,
Daniel

SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

Ok, I somehow felt like implementing my suggestions, so attached you'll
find two patches:

* sdl2_SDL_PIXELFORMAT_RGBA32.diff adds SDL_PIXELFORMAT_RGBA32, which
assumes bytewise RGBA. The name is analogous to SDL_PIXELFORMAT_RGB24.
I adjusted SDL_PixelFormatEnumToMasks() to use it, but not
SDL_MasksToPixelFormatEnum(), to retain backwards compatibility, i.e.
SDL_MasksToPixelFormatEnum() will still return SDL_PIXELFORMAT_RGBA8888
or SDL_PIXELFORMAT_ABGR8888 depending on the system's endianness.
SDL_GetPixelFormatName() also recognizes the new format. I hope I didn't
forget any further places that might need adaptation for the new format.

* sdl2_CreateRGBSurfaceWithFormat.diff adds
SDL_CreateRGBSurfaceWithFormat() and
SDL_CreateRGBSurfaceWithFormatFrom(), which work like the versions without
"WithFormat" but take a Uint32 format for SDL_PIXELFORMAT_* instead of
the RGBA masks.
I adjusted SDL_CreateRGBSurface() and SDL_CreateRGBSurfaceFrom() to use
these new functions.
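
A usage sketch with the enum and functions from these patches (names and signatures as described above; they are not part of a released SDL at the time of writing), e.g. for stbi_load() output:

#include <SDL.h>

/* data points to w*h byte-wise R,G,B,A pixels, e.g. from stbi_load() */
SDL_Surface *SurfaceFromRGBAData(unsigned char *data, int w, int h)
{
    return SDL_CreateRGBSurfaceWithFormatFrom(data, w, h, 32, 4 * w,
                                              SDL_PIXELFORMAT_RGBA32);
}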

I don't care much about the names of either the new pixel format or the
new functions; I just chose the ones that came to mind first.

I tested using a hacked-together crappy image viewer that uses stb_image
and SDL2; it's also attached.

Cheers,
Daniel

SDL_PixelFormatEnumToMasks, SDL_CreateRGBSurfaceFrom, RGBA8888
Daniel Gibson
Guest

I created bug reports for this so the patches don't get lost:
https://bugzilla.libsdl.org/show_bug.cgi?id=2923
https://bugzilla.libsdl.org/show_bug.cgi?id=2924

Cheers,
Daniel

