The SDL forums have moved to discourse.libsdl.org.
This is just a read-only archive of the previous forums, to keep old links working.


Sprite sheet overlap when using hardware textures
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
I'm using the SDL2 2D graphics API with SDL_Textures. When in hardware mode (tested with both the direct3d and opengl renderers), but NOT when in software mode, I have a problem:

I'm making a game with a tiny resolution (256x144). I have two fonts. Each font has one image (.png) that contains all of its characters. There are 256 characters. The standard font's characters are 8x8, so the image is 2048x8. The small font's characters are 3x5, so the image is 768x5.
I use 'SDL_RenderSetLogicalSize(renderer,SCREEN_WIDTH,SCREEN_HEIGHT);' to set the logical render size to 256x144. I make the window a larger resolution (my tests are with the same aspect ratio, though).
On certain resolutions (640x360, 1920x1080) my small font (but not my standard one) is messed up. It's fine at 256x144 obviously, and it also looks fine at 1280x720.
The weird look appears to be that at certain positions in the window, small font characters will show parts of other characters that are adjacent to them on the sprite sheet image itself.
I remembered something from the ancient past about non-power of 2 textures, so I tried padding my small font image, but that didn't seem to make a difference. I'm using "nearest" render scale quality.
It seems to be some kind of rounding issue (move a messed-up character over 1 logical pixel, and suddenly it looks fine). The best lead I've found from Googling is to put empty padding between sprites on the sprite sheet, but that's a terribly painful solution, and I'm hoping there is something I'm missing.
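For reference, the relevant setup looks roughly like this (SCREEN_WIDTH/SCREEN_HEIGHT are 256/144; small_font_texture, c, x, and y are placeholders, and the texture loading code is omitted):

Code:
    /* "nearest" (or "0") = nearest-neighbor sampling; set before textures are created */
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "nearest");

    SDL_Window* window = SDL_CreateWindow("game",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED, 1280, 720, 0);
    SDL_Renderer* renderer = SDL_CreateRenderer(window, -1, SDL_RENDERER_ACCELERATED);

    /* Everything is drawn at 256x144; SDL scales it up to the window size */
    SDL_RenderSetLogicalSize(renderer, SCREEN_WIDTH, SCREEN_HEIGHT);

    /* Drawing glyph number 'c' from the 3x5 small font sheet (768x5, 256 glyphs in a row) */
    SDL_Rect src = { c * 3, 0, 3, 5 };
    SDL_Rect dst = { x, y, 3, 5 };
    SDL_RenderCopy(renderer, small_font_texture, &src, &dst);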
Can anyone tell what my problem is?
Sprite sheet overlap when using hardware textures
Sik


Joined: 26 Nov 2011
Posts: 905
Yeah, that sounds like a rounding issue. No idea whether it's within
SDL or your hardware's fault (since I know some low-end GPUs are
rather lackluster when it comes to it).
Re: Sprite sheet overlap when using hardware textures
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
Sik wrote:
Yeah, that sounds like a rounding issue. No idea whether it's within
SDL or your hardware's fault (since I know some low-end GPUs are
rather lackluster when it comes to it).


I've got a Geforce GTX 760 in this machine. I also tested it with the same results on a box with a Geforce GTS 250. I just tested it on Android (a Galaxy S4), and 1920x1080 actual resolution on there looks fine! Hrm...
mr_tawan


Joined: 13 Jan 2014
Posts: 161
Have you tried setting the SDL_HINT_RENDER_SCALE_QUALITY hint to 0 (nearest neighbor)?
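That is, something like this before any textures are created ("0" and "nearest" are equivalent values for this hint):

Code:
    SDL_SetHint(SDL_HINT_RENDER_SCALE_QUALITY, "0");  /* nearest-neighbor sampling */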
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
mr_tawan wrote:
Have you tried setting the SDL_HINT_RENDER_SCALE_QUALITY hint to 0 (nearest neighborhood) ?


Yes, I'm sorry, I mentioned in my original post that I was using nearest render scale quality, but I should have made it more clear. Setting that hint to "nearest" is what I'm currently doing.
Sprite sheet overlap when using hardware textures
Joseph Carter


Joined: 20 Sep 2013
Posts: 279
http://www.gamedev.net/topic/627546-solved-spritesheet-bleeding/

Note this thread is from 2012, and the OP's use of OpenGL was less than optimal even in 2002 (glBegin(GL_QUADS), really?!), but it discusses some of the problems and is worth a look.

You should not be having this problem with nearest neighbor pixel
filtering, but hardware sometimes optimizes fast in place of correct.

Is your text MOVING when the artifacts show up, and what's your blend
function?




Sprite sheet overlap when using hardware textures
Sik


Joined: 26 Nov 2011
Posts: 905
2015-01-08 2:48 GMT-03:00, Dark_Oppressor:
Quote:
I've got a Geforce GTX 760 in this machine. I also tested it with the same
results on a box with a Geforce GTS 250. I just tested it on Android (a
Galaxy S4), and 1920x1080 actual resolution on there looks fine! Hrm...

OK, that definitely looks like something to look into in SDL then; I doubt *those* cards have that kind of bug.
Re: Sprite sheet overlap when using hardware textures
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
Joseph Carter wrote:
http://www.gamedev.net/topic/627546-solved-spritesheet-bleeding/

Note this thread is from 2012, and the OP's use of OpenGL was less than optimal even in 2002 (glBegin(GL_QUADS), really?!), but it discusses some of the problems and is worth a look.

You should not be having this problem with nearest neighbor pixel
filtering, but hardware sometimes optimizes fast in place of correct.

Is your text MOVING when the artifacts show up, and what's your blend
function?


The text is not moving when the artifacts show up. I can put a letter of my bitmap font in the same place in the window each time I test, and it always shows up messed up. By blend function do you mean: 'SDL_SetTextureBlendMode(texture,SDL_BLENDMODE_BLEND);' ?
I use that whenever I create a texture from an image file, including my bitmap fonts. I'm just using SDL's 2D accelerated rendering stuff.
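Roughly, every texture created from an image file goes through something like this (loading with SDL_image's IMG_LoadTexture here is just for the sketch; the filename is a placeholder):

Code:
    SDL_Texture* texture = IMG_LoadTexture(renderer, "small_font.png");
    SDL_SetTextureBlendMode(texture, SDL_BLENDMODE_BLEND);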

I'm going to go read that link now, thanks!
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
OK, I read that link, and it sounds like the kind of thing I think might be happening: some sort of hardware texture filtering issue, happening below the level I can access through SDL's 2D rendering API. Does that sound reasonable?

I ran a test on my laptop with a Radeon HD 7520G (integrated graphics), and it works fine there in 640x360 (one of the resolutions that fails on the machines that have issues). It also works fine in 1600x900 (my laptop's max resolution), just for the record, as well as 1280x720 (which has worked on all machines so far).

So, I know I have only tested on 4 machines so far, but I've seen it:
Work in software mode (albeit slowly!) on the initial machine where I noted a problem
Fail to work in either OpenGL or DirectX hardware accelerated modes on the initial machine
Fail to work in OpenGL (did not test DirectX) on a second machine
Work on a Samsung Galaxy S4
Work on a laptop with a Radeon HD 7520G

Where "work" is defined as no weird sprite display issues cropping up. The two machines that are having issues are both using Nvidia cards. Not sure if that means something, but it might be a direction to look in...
Dark_Oppressor


Joined: 15 Aug 2013
Posts: 47
Is this perhaps something I could/should submit a bug report about?
MikeyPro


Joined: 14 Feb 2015
Posts: 15
I'm running into the same problem of sprite frame bleeding.

I can say with 99% certainty that this is on SDL's end. I still need to look at SDL's source code to verify where.

The frame information you pass into SDL (via SDL_RenderCopy) is an SDL_Rect, which is all integers, so it's impossible to deviate from those values. As long as those numbers haven't been corrupted by some float calculation you did beforehand, the width and height will always be correct and shouldn't cause frame bleed. In my case, I have verified my data is always correct, and yes, my scale quality is set to nearest for pixel-perfect drawing, or as close to it as the library offers.
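For illustration, the source rectangle I end up passing looks something like this (FRAME_W, FRAME_H, frame, sheet, and dst are placeholders for whatever layout your sheet uses), so there is no float math anywhere on the caller's side:

Code:
    SDL_Rect src;
    src.x = frame * FRAME_W;  /* all-integer math, nothing to round */
    src.y = 0;
    src.w = FRAME_W;
    src.h = FRAME_H;
    SDL_RenderCopy(renderer, sheet, &src, &dst);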

One thing about this bug is I only see it occur in windowed mode. When I go fullscreen the bleeding does not occur. I always use the desktop resolution and then set a logical resolution of 320x240, with windowed mode using an 800x600 window.

So until I look at the source, I can only assume SDL itself is using floats somewhere in the texture lookup for the source rectangle and messing it up. It's a common mistake, and with SDL2 being a new iteration of SDL, it's not that big a shocker that it has some bugs.

This is being done on a Windows 7 x64 machine.

If anybody has anything else to add, that'd be great.
MikeyPro


Joined: 14 Feb 2015
Posts: 15
I believe the problems for the opengl renderer (which is the one I use, and which I assume is the OP's renderer as well) might be coming from the source file SDL_render_gl.c, starting at line 1209, where it uses floats to calculate the rectangles for the texture:

Code:
   
    minx = dstrect->x;
    miny = dstrect->y;
    maxx = dstrect->x + dstrect->w;
    maxy = dstrect->y + dstrect->h;

    minu = (GLfloat) srcrect->x / texture->w;
    minu *= texturedata->texw;
    maxu = (GLfloat) (srcrect->x + srcrect->w) / texture->w;
    maxu *= texturedata->texw;
    minv = (GLfloat) srcrect->y / texture->h;
    minv *= texturedata->texh;
    maxv = (GLfloat) (srcrect->y + srcrect->h) / texture->h;
    maxv *= texturedata->texh;
MikeyPro


Joined: 14 Feb 2015
Posts: 15
Maybe some kind soul could check it out and compile to see? I believe parentheses surrounding the integer calculations before the cast to GLfloat might be what's needed to prevent the issue. I may be wrong, but I don't think the GLfloat cast applies to the entire expression, only to the left-hand operand of the divisions, which would make sense considering the issue.
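For what it's worth, a tiny standalone sketch of the two readings I'm talking about (plain C with made-up numbers, not the actual SDL code):

Code:
    int sx = 3, tw = 768;
    float a = (float) sx / tw;    /* cast binds to sx only: ((float) sx) / tw == 0.00390625f */
    float b = (float) (sx / tw);  /* integer division happens first: 3 / 768 == 0, so b == 0.0f */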
MikeyPro


Joined: 14 Feb 2015
Posts: 15
Or SDL could use the integer form of the GL functions instead. Why use floats anyway? Anyone who calls RenderCopy can only pass integers to it, in the form of an SDL_Rect....
MikeyPro


Joined: 14 Feb 2015
Posts: 15
So after all that and a bug report, it comes down to the GPU's handling of texels. The way around it is to just add an empty pixel border around every single image and sprite frame you use. It's not fun, but it's worth it... grumble... grumble..
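If anyone else ends up going this route, here's a minimal sketch of the source rectangle once every frame has a 1-pixel empty border around it on the sheet (FRAME_W, FRAME_H, and frame are placeholders again):

Code:
    /* Sheet cells are (FRAME_W + 2) x (FRAME_H + 2); the real pixels sit inside a 1px gutter */
    SDL_Rect src;
    src.x = frame * (FRAME_W + 2) + 1;  /* skip this cell's left border */
    src.y = 1;                          /* skip the top border */
    src.w = FRAME_W;
    src.h = FRAME_H;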