Myronazz

Distinguished
Sep 5, 2016
Hello...

I have been having an absolute nightmare trying to get my friend's Radeon HD 7670M to work with games, but no matter what we do, only the Intel HD Graphics 4000 seems to be used. An important caveat: outside of Minecraft we don't actually know which GPU the system is using. In Dota 2 the game runs at 21 fps at most, so we assume the Intel GPU is doing the work, but we have no way of knowing for sure.

When we are in Minecraft, the game runs at 80 fps at best (the actual average is around 50-60) and it reports the Intel card as being used. In the latest AMD Catalyst we have set Minecraft to High Performance so the Radeon GPU is used instead of the Intel one, but for whatever reason that does not work, even though the drivers recognize everything. If I go to DirectX Diagnostics, it says the display is the Intel HD Graphics but the render device is the AMD card. Does that mean the system outputs to the screen through the Intel card but renders graphics on the AMD card, which is why Minecraft says Intel is being used? Is this deduction correct, or am I dumb?
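One way to settle it is to ask OpenGL directly which card it is rendering on. Below is a minimal sketch, assuming LWJGL 3 is on the classpath (the class name RendererCheck is just my example); it prints the same vendor/renderer strings Minecraft reads, so if it says AMD here, the Radeon really is doing the rendering even if the Intel card drives the screen.

// Minimal sketch: print which GPU the OpenGL context actually lands on.
// Assumes LWJGL 3 is on the classpath.
import org.lwjgl.glfw.GLFW;
import org.lwjgl.opengl.GL;
import org.lwjgl.opengl.GL11;

public class RendererCheck {
    public static void main(String[] args) {
        if (!GLFW.glfwInit()) {
            throw new IllegalStateException("GLFW init failed");
        }
        // An invisible 1x1 window is enough to get a real context from the driver.
        GLFW.glfwWindowHint(GLFW.GLFW_VISIBLE, GLFW.GLFW_FALSE);
        long window = GLFW.glfwCreateWindow(1, 1, "gpu-check", 0L, 0L);
        if (window == 0L) {
            throw new IllegalStateException("Window/context creation failed");
        }
        GLFW.glfwMakeContextCurrent(window);
        GL.createCapabilities();

        // GL_VENDOR / GL_RENDERER name the card that does the rendering,
        // regardless of which GPU owns the display output.
        System.out.println("Vendor:   " + GL11.glGetString(GL11.GL_VENDOR));
        System.out.println("Renderer: " + GL11.glGetString(GL11.GL_RENDERER));

        GLFW.glfwDestroyWindow(window);
        GLFW.glfwTerminate();
    }
}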

It's also important to note that HP doesn't officially support Windows 10 on this laptop. Still, there are Windows 10 drivers for every single device, including the two GPUs, just not directly from HP. Could it be that the Radeon HD 7670M is a specially designed OEM part that needs drivers directly from HP? Is that a possibility?

The laptop model is an HP Pavilion G6-2220ev.

Thank you... I appreciate any help

Things I've tried so far:
  1. DDU the AMD drivers in Safe Mode and install them back.
  2. Set High Performance for everything, including javaw.exe, because Minecraft runs via Java (see the sketch after this list).
  3. Try the beta Crimson drivers; they don't appear to work. Catalyst appears to work better.
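On point 2: if there is more than one Java install on the machine (a 32-bit and a 64-bit JRE side by side is common), the javaw.exe whitelisted in Catalyst may not be the one the launcher actually uses. A tiny hypothetical check (the class name WhichJava is mine), run with the Java installation the Minecraft launcher is configured to use; java.home tells you exactly which folder's javaw.exe needs the High Performance profile:

// Prints which Java installation is running, so the Catalyst profile
// can be pointed at the matching javaw.exe.
public class WhichJava {
    public static void main(String[] args) {
        System.out.println("java.home: " + System.getProperty("java.home"));
        System.out.println("arch:      " + System.getProperty("os.arch"));
        System.out.println("version:   " + System.getProperty("java.version"));
    }
}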
 

Urzu1000

Distinguished
Dec 24, 2013
Honestly, it's not going to be that huge of a performance jump even if you do get it working. Think along the lines of maybe 30% better FPS. 15 FPS might turn into 19 FPS or something similar.

That being said, there is a setting buried somewhere in the new Windows 10 menus that lets you set your default GPU. I can't remember what it is though.

The next piece of advice I'm giving you is "try at your own risk" advice, because I've never done it and I have no idea if it will work out favorably. You could go into the motherboard BIOS and disable the Intel iGPU there, forcing everything to use the AMD GPU instead (since there would effectively be no alternative). It should work fine, but in a worst-case scenario (assuming some really god-awful BIOS design) you may not be able to get back into the BIOS if it doesn't fall back to the AMD GPU for output. Like I said, there's a very slim chance of that being an issue, but if it happens you'd have to either find a CMOS reset switch or (more likely) open the laptop up and pop the CMOS battery on the motherboard out and back in.
 

Myronazz

Distinguished
Sep 5, 2016
I did a lot of research into this, and apparently the Radeon graphics depend on the presence of the Intel HD Graphics to work. I'm not 100% sure why that is, but I've seen it happen: if the Intel GPU is not present, AMD Catalyst stops working and the driver for the Radeon card itself stops functioning as well, resulting in a laptop that still displays an image but performs badly, as if the drivers had disappeared. Not sure why all that happens, but it does.
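That would match how these muxless "switchable graphics" laptops are generally wired: the Radeon renders frames and hands them to the Intel GPU, which owns the actual display outputs, so the Intel driver can never really be taken out of the loop. If you want to confirm that both adapters and their drivers are still present after experimenting, a small sketch like the one below (my own ListAdapters example, assuming Windows' stock wmic tool is available) just dumps what WMI reports for each video controller:

// Minimal sketch: list both display adapters and their driver state via wmic,
// to confirm the AMD driver is still loaded alongside the Intel one.
import java.io.BufferedReader;
import java.io.InputStreamReader;

public class ListAdapters {
    public static void main(String[] args) throws Exception {
        Process p = new ProcessBuilder(
                "wmic", "path", "win32_VideoController",
                "get", "Name,DriverVersion,Status").start();
        try (BufferedReader r = new BufferedReader(
                new InputStreamReader(p.getInputStream()))) {
            String line;
            while ((line = r.readLine()) != null) {
                System.out.println(line);
            }
        }
        p.waitFor();
    }
}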

So yeah.....
 
