Re: TurboSphere
Reply #358 –
TS uses whatever SDL2 uses. On my laptop with two dissimilar GPUs, it uses whichever one the drivers use for display 0 in GLX. I assume it's similar on Windows: whichever GPU WGL gives it as a default. SDL2 does have more advanced support for choosing the OpenGL version, which I use to request a 3.1 context. I don't know if it will try other devices if the first one can't provide that.
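For reference, this is roughly how that context request goes through SDL2. It's a minimal sketch; the window size and title are placeholders rather than TS's actual setup, and it assumes SDL_Init(SDL_INIT_VIDEO) already ran:

[code]
/* Request a 3.1 context. The attributes must be set before the window
 * is created, and SDL2 hands back whatever GPU the driver picks for
 * the default display. */
#include <SDL2/SDL.h>

SDL_Window *CreateGLWindow(void){
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 1);

    SDL_Window *window = SDL_CreateWindow("TurboSphere",
        SDL_WINDOWPOS_UNDEFINED, SDL_WINDOWPOS_UNDEFINED,
        640, 480, SDL_WINDOW_OPENGL);

    if(window && !SDL_GL_CreateContext(window)){
        /* The default device couldn't give us 3.1; SDL2 won't go
         * looking for another GPU on its own. */
        SDL_DestroyWindow(window);
        window = NULL;
    }
    return window;
}
[/code]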
All a file needs in order to be successfully loaded by TurboSphere as a plugin is a valid executable library header that specifies the symbols Init, GetFunctions, GetFunctionNames, GetVariables, and GetVariableNames. It will, of course, fail quite spectacularly if those symbol definitions are lies.
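Just to illustrate, a plugin's exported surface could look something like this. The signatures and table layouts here are my own guesses for the sake of the example; the loader only really cares that those five symbols exist:

[code]
/* Sketch of a plugin exporting the five required symbols. Only the
 * symbol names are the real requirement; the return types and table
 * contents are assumptions for illustration. */
#include <stddef.h>

#ifdef _WIN32
  #define PLUGIN_EXPORT __declspec(dllexport)
#else
  #define PLUGIN_EXPORT __attribute__((visibility("default")))
#endif

static const char *func_names[] = { "ExampleFunction" };
static void       *funcs[]      = { NULL }; /* real callback pointers go here */
static const char *var_names[]  = { "EXAMPLE_CONSTANT" };
static void       *vars[]       = { NULL }; /* real variable pointers go here */

PLUGIN_EXPORT void         Init(void)             { /* one-time setup */ }
PLUGIN_EXPORT void       **GetFunctions(void)     { return funcs; }
PLUGIN_EXPORT const char **GetFunctionNames(void) { return func_names; }
PLUGIN_EXPORT void       **GetVariables(void)     { return vars; }
PLUGIN_EXPORT const char **GetVariableNames(void) { return var_names; }
[/code]

If the symbols are there but point at garbage, you get exactly the spectacular failure mentioned above.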
This is too perfect of an error for it to be corruption. I'm going to assume I made some typo in the GL function loading code somewhere, and it's saying that glGenBuffers is missing when it's really something else completely. It would still have to be something a 550 has that a 560 doesn't. Quite possibly the NVidia texture cloning extension I use, which apparently has been superseded by a core function, so an older card would have it but a newer card might not. Originally, I assumed that, since my AMD 8750, 8550, NVidia 8800 and 550 Ti all had the NVidia variant, and only the 8550 had the GL variant, I could safely use the NVidia version. I recently changed that to try the NV function first, then the GL function if that fails, and fall back to software if both fail. I don't think that change made it into 0.3.5c.
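Assuming the extension in question is NV_copy_image and its core counterpart glCopyImageSubData (my best guess here, not something I've re-checked against the source), the fallback chain looks roughly like this:

[code]
/* Sketch of the NV -> core GL -> software fallback. The function and
 * variable names are illustrative, not the actual TS code. */
#include <SDL2/SDL.h>
#include <SDL2/SDL_opengl.h>

#ifndef APIENTRY
#define APIENTRY
#endif

typedef void (APIENTRY *CopyImageFunc)(
    GLuint srcName, GLenum srcTarget, GLint srcLevel,
    GLint srcX, GLint srcY, GLint srcZ,
    GLuint dstName, GLenum dstTarget, GLint dstLevel,
    GLint dstX, GLint dstY, GLint dstZ,
    GLsizei width, GLsizei height, GLsizei depth);

static CopyImageFunc copy_image = NULL;
static int software_copy = 0;

/* Call after the GL context exists. */
void LoadTextureCopy(void){
    /* Try the NVidia extension first... */
    copy_image = (CopyImageFunc)SDL_GL_GetProcAddress("glCopyImageSubDataNV");
    /* ...then the core/ARB function... */
    if(!copy_image)
        copy_image = (CopyImageFunc)SDL_GL_GetProcAddress("glCopyImageSubData");
    /* ...and if neither is there, do the copy on the CPU instead. */
    if(!copy_image)
        software_copy = 1;
}
[/code]

Checking in that order keeps the older cards on the path they're known to support while still covering newer ones that only expose the core function.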