Alpha blending bug?
Daniel F Moisset
Guest
Hi,
I'm seeing strange behaviour with this code:

#include <SDL.h>

int main (void)
{
    SDL_Surface *screen, *s1, *s2;
    SDL_Event e;
    SDL_Rect r;

    SDL_Init (SDL_INIT_VIDEO);
    screen = SDL_SetVideoMode (300, 300, 16, 0);
    s1 = SDL_CreateRGBSurface (0, 50, 50, 24,
                               0x00f80000, 0x0007e000, 0x00001f00, 0x000000ff);
    s2 = SDL_CreateRGBSurface (0, 50, 50, 24,
                               0x00f80000, 0x0007e000, 0x00001f00, 0x000000ff);
    SDL_FillRect (s1, NULL, 0x00ffffff);
    SDL_FillRect (s2, NULL, 0x004000ff);
    r.x = 100; r.y = 100;
    SDL_BlitSurface (s1, NULL, screen, &r);
    r.x = 125; r.y = 125;
    SDL_BlitSurface (s2, NULL, screen, &r);
    SDL_UpdateRect (screen, 0, 0, 0, 0);
    while (SDL_WaitEvent (&e)) {
        if (e.type == SDL_QUIT)
            break;
    }
    SDL_Quit ();
    return 0;
}

This creates two 16-bit + alpha surfaces and blits them to the screen, overlapping them. The problem is that the surfaces are opaque, but SDL is alpha blending them a little. That is, I expect something like (incoming ASCII art):

****
****
**++++
**++++
  ++++
  ++++

but SDL shows something like:

****
****
**xx++
**xx++
  ++++
  ++++

where 'x' is a blend of '*' and '+' (very close to '+').

Why does that happen? I'm guessing it has to do with using 16-bit surfaces; I have not seen it happen at 24 bpp. However, I have this problem in an app where I have a lot of bitmaps in memory, and the memory usage difference is important.

Thanks for your time,
Daniel
--
"Why program by hand in five days what you can spend five years of your life automating." -- Terrence Parr, ANTLR author
--
Except - Free Software developers for hire - http://except.com.ar