So this needs a little bit of backstory —
A few months ago my laptop died, so I borrowed my boyfriend’s old MacBook Pro for a few weeks.
I loved it. Not the MacBook Pro itself so much, but OS X. The developer tools for my kind of work (web development) are frankly ten years ahead of their Windows counterparts, and it looks far more visually appealing than Windows or even Linux.
So, when I fixed my laptop, I couldn’t settle for Windows. I didn’t want to go out and buy a MacBook Pro either, as the hardware I already have is future-proof for the next three years.
I investigated the Hackintosh community and, to my amazement, almost all of my hardware was supported out of the box and well documented. But absolutely everyone said that AMD switchable graphics do not work, and that there is no way to get them working.
For me to use a system full-time, it has to have full hardware support. This is the main reason I don’t use Linux: I always have to settle for “oh, it works, but not correctly”.
I got a copy of the AMD card’s VBIOS by using the following tool to dump the system BIOS. From reading around, I knew the VBIOS should be roughly 63 KB in size, so I located it within the dump through trial and error.
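In hindsight, there is a less painful way than pure trial and error: a VBIOS is just a PCI option ROM, and option ROMs have a well-defined header you can scan for. Here is a minimal sketch (the function name is mine; the offsets come from the PCI firmware spec, not from any Hackintosh tooling):

```c
#include <stdint.h>
#include <stdio.h>
#include <string.h>

/* Scan a raw BIOS dump for a PCI option ROM image (which is what a
 * VBIOS is). Per the PCI firmware spec, every image starts with the
 * signature bytes 0x55 0xAA, the little-endian word at offset 0x18
 * points to a "PCIR" data structure, and the word at offset 0x10 of
 * that structure gives the image length in 512-byte units.
 * Returns the byte offset of the first image found at or after
 * `start`, or -1 if none is found. */
long find_option_rom(const uint8_t *buf, size_t len, size_t start)
{
    for (size_t off = start; off + 0x1A < len; off += 512) {
        if (buf[off] != 0x55 || buf[off + 1] != 0xAA)
            continue;
        size_t pcir = off + (buf[off + 0x18] | (buf[off + 0x19] << 8));
        if (pcir + 0x12 < len && memcmp(buf + pcir, "PCIR", 4) == 0) {
            size_t units = buf[pcir + 0x10] | (buf[pcir + 0x11] << 8);
            printf("candidate image at 0x%zx, %zu bytes\n",
                   off, units * 512);
            return (long)off;
        }
    }
    return -1;
}
```

Running this over the dump and picking the candidate whose reported size is around 63 KB narrows things down immediately.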
I then gave this to Clover (the Hackintosh UEFI boot utility).
This seemed to work. Although I could tell the graphics card had actually been initialized by OS X, I couldn’t use it at all. This made me think there must be a way of tricking Apple’s mux control into activating it.
Some users had reported success tricking the system into using it through EDID forcing, but that cannot work on my hardware: I know the display is wired through the Intel card, and there is no way to switch it.
Getting curious, I looked around for a tool that would show more information about the graphics environment.
I found that the card was there, and was available! I was even able to select it and render through it. The FPS was significantly higher than with the Intel card, which confirmed that rendering really was happening on the dGPU.
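A tool like that is almost certainly reading this information through CGL. Here’s a minimal sketch that prints the same thing from the command line, using only documented CGL calls; on a setup like mine the AMD card should show up as an extra accelerated renderer with online=0:

```c
#include <stdio.h>
#include <OpenGL/OpenGL.h>

/* Walk every renderer CGL knows about and print whether it is
 * hardware-accelerated and whether it is currently driving a
 * display ("online"). An accelerated-but-offline entry next to the
 * online Intel one is the hidden dGPU. */
int main(void)
{
    CGLRendererInfoObj info;
    GLint nrend = 0;

    /* 0xFFFFFFFF = query renderers for all displays */
    if (CGLQueryRendererInfo(0xFFFFFFFF, &info, &nrend) != kCGLNoError)
        return 1;

    for (GLint i = 0; i < nrend; i++) {
        GLint id = 0, accel = 0, online = 0;
        CGLDescribeRenderer(info, i, kCGLRPRendererID, &id);
        CGLDescribeRenderer(info, i, kCGLRPAccelerated, &accel);
        CGLDescribeRenderer(info, i, kCGLRPOnline, &online);
        printf("renderer %d: id 0x%05x accelerated=%d online=%d\n",
               i, id, accel, online);
    }
    CGLDestroyRendererInfo(info);
    return 0;
}
```

Compile with `clang renderers.c -framework OpenGL` (OS X only, obviously).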
This made me think of the following problems:
- This card will only EVER work with OpenGL applications that allow offline renderers, since this system has no mux that would let the display be switched over entirely.
- I will need some way of forcing EVERY OpenGL application to use the AMD renderer. This will probably mean a low-level library hook.
Problem 1 can’t be solved easily, as far as I’m aware, and I’m not willing to waste any time on it.
Problem 2 however, was slightly easier and could be accomplished.
Digging around the Apple OpenGL documentation, I found a function that does exactly what I need: CGLSetVirtualScreen().
The problem is calling it at the right time, inside each targeted application.
Reading further into the Apple developer documentation, I found that every OpenGL application has to call CGLCreateContext() to get an OpenGL context. The sensible approach is to hook that function: call the original, set the virtual screen to the one we want, then return to the application.
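Concretely, that hook can be done as a dylib that interposes CGLCreateContext via dyld’s `__DATA,__interpose` mechanism and gets injected with DYLD_INSERT_LIBRARIES. This is a sketch, not the exact code in the zip; in particular the virtual-screen index 1 is just a placeholder — you’d enumerate the context’s virtual screens on your own machine to find the AMD one:

```c
#include <OpenGL/OpenGL.h>

/* Hypothetical index of the AMD virtual screen. On a real machine
 * you would enumerate the context's virtual screens first; 1 is
 * only a placeholder. */
#define AMD_VIRTUAL_SCREEN 1

/* Replacement for CGLCreateContext(): create the context as normal,
 * then steer it onto the offline renderer. Interposing only rebinds
 * calls made from *other* images, so the call below still reaches
 * the real CGLCreateContext. */
static CGLError hooked_CGLCreateContext(CGLPixelFormatObj pix,
                                        CGLContextObj share,
                                        CGLContextObj *ctx)
{
    CGLError err = CGLCreateContext(pix, share, ctx);
    if (err == kCGLNoError && ctx != NULL && *ctx != NULL)
        CGLSetVirtualScreen(*ctx, AMD_VIRTUAL_SCREEN);
    return err;
}

/* dyld reads this section when the dylib is loaded and rebinds
 * every call to the second symbol so it lands on the first. */
__attribute__((used, section("__DATA,__interpose")))
static const struct {
    const void *replacement;
    const void *replacee;
} interposers[] = {
    { (const void *)hooked_CGLCreateContext,
      (const void *)CGLCreateContext },
};
```

Build it with `clang -dynamiclib hook.c -framework OpenGL -o libhook.dylib` and launch the target binary directly (not through `open`, which drops DYLD variables) with `DYLD_INSERT_LIBRARIES=./libhook.dylib` in the environment. Note this only helps contexts whose pixel format allows offline renderers (kCGLPFAAllowOfflineRenderers); for apps that don’t request that, you would also need to interpose CGLChoosePixelFormat.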
And… it worked!
I haven’t been able to get anything but Chrome and Chess working with this (mostly because I haven’t tried anything else), and I can’t be bothered writing some system-level thing to do this for every application; someone else can do that if they can be bothered.
The full source code is here in zip form, and includes a build script and an example to run in Chrome. You’ll probably have to modify both the launcher script and the source for your system.
If you cannot do this yourself, then contact me and I can do it for a price.