There's a PR which aims to address some of the issues known to affect performance under OpenGL, but I've run into a bit of a wall dealing with the actual stalls.
By "stalls" I mean the moments when the framerate stops being smooth and drops from a constant 60fps for one to several seconds before recovering.
I think I've tracked it down to a single function, but the function in question just leaves me with more questions than answers:
    void __cdecl LuaEvent::Emit(void)    1    9782.8490 (MCycles)    (100%)

What that means is that it gets called just once in the problem frame, but consumed 9782.8490 million CPU cycles, or 100% of the measured frame time.
The rest of the game's processing for that frame didn't use up enough time to register as even 1%.
Now, LuaEvent::Emit is quite a simple function in and of itself, but what I need to know is which events it's emitting, so that I can track down the real culprit, which I suspect is something nasty in the trade script or something similar.
Is there any way to know which events it's going to fire? I could just go through all of the registered Lua functions, add profiling code to each of them, and try to work it out manually, but that might take a while.
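If it comes to it, I suppose I could do something like the following on the C++ side of Emit instead, just to log which emits are slow. This is only a sketch of what I mean: ScopedEmitTimer is my own invention, and the `name`/`handlers_` members in the commented-out usage are guesses at LuaEvent's internals, not the actual engine code.

```cpp
// Sketch only: ScopedEmitTimer is hypothetical, and the commented-out usage
// below guesses at LuaEvent's internals (an event name and a handler list).
#include <chrono>
#include <cstdio>
#include <string>

// RAII timer that logs the event name and elapsed time, but only when the
// emit exceeds a threshold, so the log captures just the stall-sized calls.
class ScopedEmitTimer {
public:
    explicit ScopedEmitTimer(std::string eventName, double thresholdMs = 5.0)
        : name_(std::move(eventName)),
          thresholdMs_(thresholdMs),
          start_(std::chrono::steady_clock::now()) {}

    ~ScopedEmitTimer() {
        const auto end = std::chrono::steady_clock::now();
        const double ms =
            std::chrono::duration<double, std::milli>(end - start_).count();
        if (ms >= thresholdMs_) {
            // stderr keeps it out of the normal game output; swap in the
            // engine's own logger if there is one.
            std::fprintf(stderr, "[LuaEvent] %s took %.3f ms\n", name_.c_str(), ms);
        }
    }

private:
    std::string name_;
    double thresholdMs_;
    std::chrono::steady_clock::time_point start_;
};

// Guessed usage inside Emit -- 'name' and 'handlers_' are assumptions:
//
// void LuaEvent::Emit() {
//     ScopedEmitTimer timer(name);
//     for (auto& handler : handlers_) {
//         handler();  // any handler that stalls the frame shows up in the log
//     }
// }
```

Even just logging every event name unconditionally for a few of the problem frames would probably be enough to narrow down which script is responsible, but if there's already a cleaner way to see what Emit is about to fire, I'd rather use that.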
Andy