My laptop from 2009 has an NVIDIA card in it. I built my desktop in 2012 and put an AMD Radeon HD 7770 in it – a very capable card. However, I’ve been having trouble getting the OpenGL programs that I run on my laptop to work properly on my desktop, which has struck me as weird.
I have been concerned about this for a while, because it seems to be an open secret in the graphics hardware world that AMD’s OpenGL support isn’t quite as good as NVIDIA’s. But on the other hand, I don’t want to just jump to a “this company sucks” kind of conclusion. Odds are there’s something I’m not doing quite right, seeing as I have a little less experience than the millions of man-hours that have gone into NVIDIA’s and AMD’s drivers.
I realized this morning as I got to work (away from my desktop, so I can’t test anything just yet) that freeglut has a glutInitContextVersion() function that lets you request a specific OpenGL context version. I haven’t been using it, and while OpenGL defaults to 3.3 on my laptop, that’s also the highest version it supports. Perhaps my desktop is defaulting to 4.2, which doesn’t compile my shaders right or something?
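To test that theory, something like the following should pin both machines to the same kind of context. This is a minimal sketch assuming freeglut (plain old GLUT doesn’t have glutInitContextVersion()); the window title and the choice of 3.3 core are just my first guesses at what to try:

```c
/* Minimal sketch: request a specific OpenGL context version via freeglut,
 * then print what the driver actually gave us. */
#include <GL/freeglut.h>
#include <stdio.h>

static void display(void)
{
    glClear(GL_COLOR_BUFFER_BIT);
    glutSwapBuffers();
}

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGBA);

    /* Request exactly 3.3 instead of whatever the driver defaults to. */
    glutInitContextVersion(3, 3);
    glutInitContextProfile(GLUT_CORE_PROFILE);

    glutCreateWindow("context test");

    /* Must be called after the context exists; NVIDIA and AMD
     * may report different defaults here. */
    printf("GL_VERSION: %s\n", (const char *)glGetString(GL_VERSION));

    glutDisplayFunc(display);
    glutMainLoop();
    return 0;
}
```

If the laptop and the desktop print different GL_VERSION strings without the glutInitContextVersion() call, that would at least confirm where the divergence starts.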
I have a new video to put together to show off the basics of a GUI that I’ve been building for my engine, but while everything looks great on my laptop, it doesn’t work at all on my desktop, which is the whole basis for this post. It’s annoying, and makes me wish our school were using DirectX rather than OpenGL for its work. Maybe I should maintain two versions of my engine: one in OpenGL for classes and one in DirectX for my ‘professional’ portfolio. I don’t really intend to work outside the Windows world anyway, unless I’m going to consoles, and the Xbox uses DirectX too, whereas the PS3 and PS4 use proprietary graphics libraries that I’ll have to learn anyway.
This is mostly just me thinking out loud, but it’s useful to get things down in writing, in a linear fashion. Helps organize my thoughts. Time to play around with the engine some more.