
(OSX) Rendering problems #37

Closed
DanielGibson opened this issue Apr 21, 2016 · 9 comments

Comments

DanielGibson commented Apr 21, 2016

There are rendering problems on OSX, both 32-bit and 64-bit, in both the first and second encounter.
It's OSX 10.11.4 on an i7-4570R with Iris Pro graphics.

I also get a failed assertion (on startup in TFE, in the first level on TSE) from Grx_wrapper.cpp:154: pglGetError()==0.
Unfortunately I suck at using OSX and OpenGL, so it'd be cool if someone else could reproduce and debug the problem ;-)

[Screenshots attached: tfe-bug, tse-bug]

icculus (Owner) commented Apr 21, 2016

I think this is a missing depth buffer; I've seen this on Linux when changing from fullscreen to windowed.

rohit-n commented Apr 21, 2016

I can reproduce the second screenshot on Linux when switching resolution before starting a new game.

DanielGibson (Author) commented:

The problem is that, for some stupid reason, in CGfxLibrary::SetupPixelFormat_OGL() gap_iDepthBits is < 12, so SDL_GL_DEPTH_SIZE is set to 0.
If I set gap_iDepthBits = 16; in the < 12 case, it works.

I haven't really figured out why it's < 12 in the first place, though.

rohit-n commented Apr 23, 2016

gap_iDepthBits is first set to zero in GfxLibrary.cpp:197. However, I've noticed that the game reads from Scripts/PersistentSymbols.ini later on (I can't find exactly where in the code this happens). If it doesn't already exist, one can add

persistent extern user INDEX gap_iDepthBits=(INDEX)16;

to the file and never see this issue again.

DanielGibson (Author) commented Apr 24, 2016

OK, maybe gap_iDepthBits == 0 is supposed to mean "automatically choose suitable depth buffer bits" or "use the color depth" (the declaration INDEX gap_iDepthBits = 0; // 0 (as color depth), 16, 24 or 32 suggests the latter).

If I understand the documentation of Windows' ChoosePixelFormat() function correctly, it always creates a depth buffer unless explicitly told otherwise with a flag; see https://msdn.microsoft.com/en-us/library/windows/desktop/dd368826%28v=vs.85%29.aspx :

PFD_DEPTH_DONTCARE - The requested pixel format can either have or not have a depth buffer. To select a pixel format without a depth buffer, you must specify this flag. The requested pixel format can be with or without a depth buffer. Otherwise, only pixel formats with a depth buffer are considered.

So, what should we do when it's 0?
- Not call SDL_GL_SetAttribute(SDL_GL_DEPTH_SIZE, gap_iDepthBits); at all and assume the driver (or SDL, or whoever) chooses a sane default?
- Just set 16?
- Set it depending on gl_dmCurrentDisplayMode.dm_ddDepth? If so, should we use 24 bit or 32 bit in the case of DD_24BIT?

icculus (Owner) commented Apr 24, 2016

Just set it to 16 if it's zero. There's no way this game needs more than 16.

(dm_ddDepth is the color depth of the display, not the depth buffer, and is probably always 32 in modern times.)

icculus (Owner) commented Apr 24, 2016

Figuring out why this is zero in PersistentSymbols.ini is a worthwhile endeavor, too... has this always just been falling back to 0 on resolution change and no one ever noticed, or did we break it?

yamgent commented Apr 24, 2016

I don't think we broke anything; people have spotted this bug before:
https://bugs.winehq.org/show_bug.cgi?id=11970#c43

It could be a deliberate (albeit bad) decision to set it to 0. Since ChoosePixelFormat() is not obligated to give the exact pixel format, but just returns a format that is as close as possible to the request, that could be the intention of setting it to 0 [see also this, although I cannot find the NVIDIA source that it is citing. In this article, Serious Sam "2" = TSE].

DanielGibson added a commit to DanielGibson/Serious-Engine that referenced this issue Apr 24, 2016:
Seems like on Windows 0 bits is handled as "Let Windows/Driver decide", so handle that case by setting a reasonable value of 16.
DanielGibson (Author) commented:

Added a fix for this to PR #44.
