Shaders & disabling shaders
Posted: Wed Oct 16, 2013 6:15 pm
by FluffyFreak
Does it make sense to keep the non-shader code path around?
Does anyone use it?
Why are you/they using it?
I've just realised that I hadn't even thought about the non-shader option for the shields work. I could keep the existing solution around for when shaders are disabled, or I could set things up so the new meshes go through the old non-shader pipeline, but either way it means writing and testing more code for a path that I just don't think is in use.
I'd rather we removed the non-shader fallback entirely and cleaned up the code to be fully shader-based OpenGL/GLSL.
At least to the level that Mac OS X supports, anyway: we couldn't, for example, start requiring OpenGL 4.3, but we could go to OpenGL 3.2 and GLSL 1.50 (
https://developer.apple.com/graphicsima ... abilities/).
That would make it easier to port the game to other platforms in future, and might even fix some of the issues we have with shaders refusing to compile: what works fine on nVidia often dies horribly on AMD/Intel drivers, because those drivers enforce the limitations of the specific version you request, whereas nVidia lets you get away with a lot :(
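For what it's worth, asking for a core context up front is only a few lines with SDL2 - a rough sketch, not our actual init code (the window title and size here are placeholders):

    #include <SDL.h>

    // Ask for a 3.2 core profile before creating the window; if the
    // driver can't provide it, window creation fails and we could fall
    // back to a lower version instead of dying later on a missing feature.
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MAJOR_VERSION, 3);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_MINOR_VERSION, 2);
    SDL_GL_SetAttribute(SDL_GL_CONTEXT_PROFILE_MASK, SDL_GL_CONTEXT_PROFILE_CORE);

    SDL_Window *win = SDL_CreateWindow("Pioneer",
        SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
        1280, 720, SDL_WINDOW_OPENGL);
    SDL_GLContext ctx = win ? SDL_GL_CreateContext(win) : nullptr;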
I've asked this before, at least once a year probably, so has anything changed? Do people still play the game with shaders disabled?
Re: Shaders & disabling shaders
Posted: Wed Oct 16, 2013 8:11 pm
by Tichy
Never played without shaders, except for testing purposes.
Re: Shaders & disabling shaders
Posted: Wed Oct 16, 2013 8:16 pm
by Zordey
I asked about this on GitHub because I turn shaders off on my work machine, since it runs much faster without them. I've also noticed that patterns don't work with shaders turned off.
The reason I turn shaders off is that I keep my work laptop in Windows power-saving mode, mainly to stop the fans from spinning up (I work in a very quiet office and it's quite noticeable when they kick in).
Overall though, it's no big deal to turn the power up on my machine to play Pioneer, so it wouldn't annoy me if the option was removed.
Re: Shaders & disabling shaders
Posted: Wed Oct 16, 2013 8:29 pm
by lwho
I just did a quick search for Linux: the open-source AMD, NVidia and Intel drivers seem to support only OpenGL 3.1 (with Mesa 9.2), and Mesa 10.0 is announced to support OpenGL 3.2 or 3.3. The proprietary AMD and NVidia drivers support OpenGL 4.3.
So, requiring anything newer than OpenGL 3.1 could be problematic for Linux at the moment.
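If we do require something newer, it would at least be worth logging what the driver actually reports at runtime, so failures are obvious. A minimal sketch (function name invented; glGetString works on any context):

    #include <cstdio>
    #include <GL/glew.h>

    // Log what the driver actually gives us, so we can pick a rendering
    // path (or bail out with a useful message) instead of crashing later.
    void ReportGLCapabilities()
    {
        printf("GL %s, GLSL %s, renderer %s\n",
            (const char *)glGetString(GL_VERSION),
            (const char *)glGetString(GL_SHADING_LANGUAGE_VERSION),
            (const char *)glGetString(GL_RENDERER));
    }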
Re: Shaders & disabling shaders
Posted: Wed Oct 16, 2013 9:32 pm
by robn
Our hope has always been to remove the shaderless path eventually. Every time we think about it, there's always a reason to keep it. Perhaps it's time to think again.
I think we'll need to have levels of support though - graphics quality, if you like. Merely having shader support available doesn't mean the hardware can handle all manner of complex things quickly. Right now I do UI work with shaders off because it's the only way to get a decent frame rate on my Intel chip. With shaders on I either have to accept a choppy mess or fire up the NVidia chip, and both of those destroy my battery.
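Roughly what I mean, sketched out (all names invented, nothing like this exists yet): a single quality setting that decides which features get compiled into the shaders at all, so "low" is a genuinely cheaper shader rather than the full one with features switched off by uniforms.

    #include <string>

    enum GraphicsQuality { QUALITY_LOW, QUALITY_MEDIUM, QUALITY_HIGH };

    // Build the #define block that gets prepended to the GLSL source;
    // features absent from it are compiled out of the shader entirely.
    std::string QualityDefines(GraphicsQuality q)
    {
        std::string defines;
        if (q >= QUALITY_MEDIUM) defines += "#define USE_ECLIPSE 1\n";
        if (q >= QUALITY_HIGH)   defines += "#define USE_DETAIL_MAPS 1\n";
        return defines;
    }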
Re: Shaders & disabling shaders
Posted: Wed Oct 16, 2013 10:06 pm
by jpab
I develop on two machines: my desktop, which has a modern NVidia card, and an old Toshiba laptop with integrated Intel graphics. The laptop can only do GL 2.1. Last time I checked (a while ago) it could run with shaders, but only slowly, so on there I usually run with shaders turned off.
My position remains the same as it was the last time I remember this being discussed: I'm OK with removing the fixed-function path, but I think we need to continue to support low-end machines. So we need a set of simplified shaders that drop features but give acceptable performance on Intel graphics (something like the sketch below).
It wouldn't be the end of the world if I lost the ability to run on the old Toshiba, but it would be a little inconvenient for me.
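Concretely, the simplified variants could just be the same source compiled with different #defines prepended - a sketch with a hypothetical helper, error checking omitted:

    #include <string>
    #include <GL/glew.h>

    // Compile one variant of a shader; "defines" selects the feature set.
    // The #version directive has to stay the very first line.
    GLuint CompileShaderVariant(GLenum type, const std::string &defines,
                                const std::string &body)
    {
        const std::string src = "#version 120\n" + defines + body;
        const char *ptr = src.c_str();
        GLuint shader = glCreateShader(type);
        glShaderSource(shader, 1, &ptr, 0);
        glCompileShader(shader);
        return shader; // compile status/log checks omitted for brevity
    }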
Re: Shaders & disabling shaders
Posted: Thu Oct 17, 2013 9:04 am
by FluffyFreak
Good god I hate Intel GPUs (
http://en.wikipedia.org/wiki/Comparison ... sing_units) and their drivers! Wtf!?!
They've had Direct3D 9 and 10 compatibility for that long, but never enabled anything above OpenGL 2.0/2.1 on Windows until recently?
It'd be good to know what's so slow about using shaders with them, or more specifically which shaders are so slow on those GPUs.
Does disabling the Eclipse help?
I wonder if it's falling back to software emulation for some shaders because they're too complicated for Intel's drivers.
We might be able to avoid that by automatically optimising the shader source using something like the glsl-optimizer (
https://github.com/aras-p/glsl-optimizer) before feeding it to the OpenGL shader compilers.
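Something like this is all it would take to try - a sketch based on glsl-optimizer's README, so treat the exact API names with a little suspicion:

    #include <string>
    #include "glsl_optimizer.h"

    // Run a fragment shader through glsl-optimizer; if optimisation
    // fails we just fall back to the original source.
    std::string OptimizeFragmentShader(const std::string &source)
    {
        glslopt_ctx *ctx = glslopt_initialize(kGlslTargetOpenGL);
        glslopt_shader *shader =
            glslopt_optimize(ctx, kGlslOptShaderFragment, source.c_str(), 0);
        std::string result =
            glslopt_get_status(shader) ? glslopt_get_output(shader) : source;
        glslopt_shader_delete(shader);
        glslopt_cleanup(ctx);
        return result; // feed this to glShaderSource as usual
    }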
Really, there's nothing we're doing that shouldn't run reasonably on anything from the last 10 years :/ unless it's from Intel, it seems.
Re: Shaders & disabling shaders
Posted: Thu Oct 17, 2013 9:15 am
by FluffyFreak
Either way, the not-really-that-much-older Intel GPUs do rather throw a spanner in the works for moving to even OpenGL 3.0+.
Re: Shaders & disabling shaders
Posted: Thu Oct 17, 2013 11:53 am
by robn
FluffyFreak wrote: It'd be good to know what's so slow about using shaders with them, or more specifically which shaders are so slow on those GPUs.
Are there any proper development tools available for shaders? A profiler would be lovely.
We might be able to avoid that by automatically optimising the shader source using something like the glsl-optimizer (
https://github.com/aras-p/glsl-optimizer) before feeding it to the OpenGL shader compilers.
The thing is, glsl-optimizer is just the Mesa shader compiler extracted into a standalone utility, so it's not going to make anything better on Linux at least.
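Failing a proper profiler, we could at least time individual passes ourselves with GL timer queries - that needs GL 3.3 or ARB_timer_query, so probably no help on the GL 2.1 Intel parts, but as a rough sketch:

    #include <cstdio>
    #include <GL/glew.h>

    // Measure GPU time for one pass with a timer query.
    GLuint query;
    glGenQueries(1, &query);

    glBeginQuery(GL_TIME_ELAPSED, query);
    // ... draw the pass we want to measure ...
    glEndQuery(GL_TIME_ELAPSED);

    GLuint64 ns = 0;
    glGetQueryObjectui64v(query, GL_QUERY_RESULT, &ns); // blocks until ready
    printf("pass took %.3f ms\n", ns / 1.0e6);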
Re: Shaders & disabling shaders
Posted: Thu Oct 17, 2013 8:05 pm
by FluffyFreak
I was thinking of it helping with the official Intel drivers.
I don't expect good performance from the open-source ones; there's just too little information available for them to exploit the architecture, and too little experience and resources to compete with the manufacturers on those terms. So I'm happy as long as we're able to run comfortably on them.
However, when it comes to Intel and their older GPU drivers (hell, even their new ones), every little helps.
They're about as bad as some of the early mobile GPU drivers, so if you can optimise the shaders in advance it can mean the difference between running well and falling back to software emulation.
Nothing we can do about the OpenGL 2.0/2.1 versioning, of course.