Hybrid Graphics for AMD+AMD systems

So, as you guys might have figured out, I installed this distro for the first time. Everything seems to be working just fine except for two things that are nagging me, and I am going to write about one of them here.

Everybody seems to talk about Intel+Nvidia, Intel+AMD, or AMD+Nvidia. Are there really no users who run an AMD APU + AMD GPU? Maybe I am not searching for the right thing, but so far I have found nothing that works. :pensive:

#inxi -Ga
Device-1: AMD Baffin [Radeon RX 460/560D / Pro
vendor: ASUSTeK driver: amdgpu v: kernel bus-ID: 01:00.0
chip-ID: 1002:67ef class-ID: 0380
Device-2: AMD Picasso vendor: ASUSTeK driver: amdgpu v: kernel
bus-ID: 04:00.0 chip-ID: 1002:15d8 class-ID: 0300
Device-3: IMC Networks USB2.0 HD UVC WebCam type: USB driver: uvcvideo
bus-ID: 1-4:3 chip-ID: 13d3:56a2 class-ID: 0e02 serial: 0x0001
Display: x11 server: X.Org 1.20.13 driver: loaded: amdgpu,ati
unloaded: fbdev,modesetting,vesa display-ID: :0 screens: 1
Screen-1: 0 s-res: 1920x1080 s-dpi: 96 s-size: 508x285mm (20.0x11.2")
s-diag: 582mm (22.9")
Monitor-1: eDP res: 1920x1080 hz: 60 dpi: 142 size: 344x194mm (13.5x7.6")
diag: 395mm (15.5")
OpenGL: renderer: AMD Radeon Vega 8 Graphics (RAVEN DRM 3.42.0
5.14.9-arch2-1 LLVM 12.0.1)
v: 4.6 Mesa 21.2.3 direct render: Yes

Let me know if you guys need to know anything more about the system. Oh, and it's an Asus TUF FX505DY laptop. :grinning_face_with_smiling_eyes:
I think I am using my integrated GPU to render the desktop environment, but I would love for you guys to cross-check that too. The problem is that CoreCtrl says my dGPU (GPU0) is chugging 6 W continuously. So how do I switch it off when the integrated GPU is in use?

I don't actually know the full answer to the question, but I think you would have to use PRIME or reverse PRIME depending on which GPU you want to use.


I found this on Reddit. Not sure if it's any help?

You use the environment variable DRI_PRIME to tell a program which GPU to use. Generally, 0 is the integrated GPU and 1 is the discrete one. So to test, run:

DRI_PRIME=0 glxinfo | grep "OpenGL renderer"

DRI_PRIME=1 glxinfo | grep "OpenGL renderer"


Edit: I’m not sure if you can set it up with optimus-manager like Intel/Nvidia?

Edit2: Here is a video link related to prime render

Something is not right here.

Running xrandr --listproviders gives this result:

Provider 0: id: 0x55 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 4 outputs: 2 associated providers: 1 name:AMD Radeon(TM) Vega 8 Graphics @ pci:0000:04:00.0
Provider 1: id: 0x84 cap: 0xf, Source Output, Sink Output, Source Offload, Sink Offload crtcs: 5 outputs: 0 associated providers: 1 name:Radeon RX 560 Series @ pci:0000:01:00.0

In the video, you can see his providers exclusively mention one of the display cards as Source Output and the other one as Sink. But in my case, it lists everything, lol. I don't think the PRIME command is doing anything either. How do we determine which of the two GPUs is GPU0 and which is GPU1? Can different programs report different results?
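One way to sidestep the GPU0/GPU1 guessing: recent Mesa also accepts a PCI tag for DRI_PRIME, so you can name the exact card by the bus-ID that xrandr printed. A sketch, using the bus-IDs from the output above (the tag format swaps colons and dots for underscores):

```shell
# xrandr says the RX 560 sits at pci:0000:01:00.0; build Mesa's tag form
# (pci-0000_01_00_0) from that address.
DGPU="pci-$(echo '0000:01:00.0' | tr ':.' '__')"
echo "$DGPU"

# Only run the real check if glxinfo is installed (mesa-utils / mesa-demos):
if command -v glxinfo >/dev/null 2>&1; then
    DRI_PRIME="$DGPU" glxinfo | grep "OpenGL renderer"
fi
```

If the renderer line then names the RX 560, you know exactly which card that tag selects, regardless of how 0 and 1 happen to be assigned.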

Edit - So you can't run Optimus Manager, because it is specifically for Nvidia dGPUs. The pages you sent about PRIME also send me to a Gentoo page for setting up vga_switcheroo (I think?), which can be done after going through the kernel and changing config files, and it goes on and on. I can change config files, but there is no way I'm handling anything relating to the kernel. Hands up. Why does no one have a Linux setup on AMD APU + Radeon dGPU :sob:
I have also found this but I am not sure what to do with it: https://gitlab.freedesktop.org/hadess/switcheroo-control :man_shrugging:
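For what it's worth, switcheroo-control is a small D-Bus service that desktops use to pick a GPU (it's what backs GNOME's "Launch using Discrete Graphics Card" menu entry), and it ships a CLI called switcherooctl. A hedged sketch of typical usage, guarded so it's a no-op where the tool isn't installed (I haven't verified this on this exact laptop; the service also needs to be enabled, e.g. `systemctl enable --now switcheroo-control` on Arch):

```shell
# Query switcheroo-control if its CLI is present; otherwise just report that.
if command -v switcherooctl >/dev/null 2>&1; then
    switcherooctl list              # lists each GPU and marks the default
    # switcherooctl launch glxgears # would run a program on the discrete GPU
    status="available"
else
    status="not installed"
fi
echo "switcheroo-control: $status"
```

Note that switcheroo-control only routes programs to a GPU; it does not power the unused card off by itself.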

Edit - more research (googling) sent me here: https://www.reddit.com/r/linux_gaming/comments/aoh5be/guide_hybrid_graphics_on_linux_nvidia_optimus/
The second part of that post has some ideas about Intel+AMD configuration. Reading through it suggests that AMD+Radeon setups can inherently use PRIME. But then my question is - does the dGPU not turn off completely?

The command shows your graphics cards properly. I don't know if it will work or not; I don't have the hardware to experiment with. The Gentoo page does give a lot of information.


Have you tried vga_switcheroo?


To test the GPU, the article says -

First make sure that the kernel was compiled with the following settings

I don't know the first thing about compiling. I have never touched a kernel before, and it tells me to make sure the kernel was compiled with certain settings. I am totally out of my depth here. I don't know how or where you check the settings the kernel was compiled with. It's not like we compile the kernel when we download a distro, right?
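You don't actually need to compile anything to check that part of the Gentoo article: most distros expose the running kernel's config. Arch-based kernels typically provide /proc/config.gz, and Debian-style distros drop a file in /boot instead. A sketch that looks for the vga_switcheroo and amdgpu options in whichever one exists:

```shell
# Grep the running kernel's config for the options the article mentions.
cfg=""
if [ -r /proc/config.gz ]; then
    cfg=$(zgrep -E 'CONFIG_(VGA_SWITCHEROO|DRM_AMDGPU)=' /proc/config.gz)
elif [ -r "/boot/config-$(uname -r)" ]; then
    cfg=$(grep -E 'CONFIG_(VGA_SWITCHEROO|DRM_AMDGPU)=' "/boot/config-$(uname -r)")
fi
echo "${cfg:-no readable kernel config found}"
```

On a stock distro kernel you should see something like `CONFIG_VGA_SWITCHEROO=y`; if it's already set, the "compile the kernel" step has effectively been done for you.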
I feel like the GPUs are going to stay as they are until someone who is very, very comfortable in Linux buys a laptop with dual AMD GPUs and makes a video about configuring it. lol

I can't experiment because I don't even know where to start with all the information in the Gentoo article, which admittedly is plenty, but it's pretty advanced-level stuff.

But I can confirm DRI_PRIME is not working as intended, because I ran this command:

DRI_PRIME=1 /usr/bin/firefox

This should result in some kind of activity on the dGPU, right? But CoreCtrl reports absolutely no change no matter what (or how many) videos I play. There are some changes on the iGPU, but the spikes in clocks don't look the way I expect them to. Also, the activity reports as 100% all the time. Perhaps CoreCtrl has not hooked into my system properly (which I doubt, because it reports my CPU clocks perfectly). Is there any other useful tool to record GPU clocks and temps?
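If CoreCtrl seems unreliable, the amdgpu driver itself exposes a per-card load figure through sysfs that you can watch while a 3D app runs; no extra tool needed. A sketch (which card index maps to which GPU varies per machine, so it prints all of them):

```shell
# Print load (and temperature, where a hwmon node exists) for every amdgpu card.
for dev in /sys/class/drm/card[0-9]/device; do
    [ -r "$dev/gpu_busy_percent" ] || continue
    busy=$(cat "$dev/gpu_busy_percent")
    temp_file=$(ls "$dev"/hwmon/hwmon*/temp1_input 2>/dev/null | head -n 1)
    # temp1_input is in millidegrees Celsius, so divide by 1000.
    temp=${temp_file:+"$(($(cat "$temp_file") / 1000))C"}
    echo "$dev: ${busy}% busy $temp"
done
```

Run it in a loop (e.g. under `watch -n1`) while a benchmark is going: the card whose busy percentage jumps is the one DRI_PRIME actually selected.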

To use dGPU, run this

DRI_PRIME=1 firefox

As for it draining 6 W, that's because it's in hybrid mode. I haven't found any way of disabling the dGPU when it's not in use.

Yeah, I found out about that a while back. The dGPU never fully turns off; it just goes into standby, sipping single-digit watts (to think ARM processors can run at full speed on that much power, haha). DRI_PRIME does run things on the dGPU, but this can only be seen with 3D applications. Using that command on Firefox resulted in nothing for some reason, but the Valley benchmark started as intended and ran on the dGPU.
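One way to see whether the dGPU really reaches standby is the kernel's runtime power-management status for that PCI device. These are standard PCI runtime-PM sysfs attributes; the 01:00.0 address is taken from the inxi output earlier in the thread, so adjust it for other machines:

```shell
# Check the runtime-PM state of the discrete GPU's PCI device.
DEV=/sys/bus/pci/devices/0000:01:00.0
if [ -d "$DEV" ]; then
    cat "$DEV/power/runtime_status"   # "suspended" = powered down, "active" = awake
    cat "$DEV/power/control"          # "auto" lets the kernel suspend it when idle
else
    echo "no device at $DEV"
fi
```

If runtime_status stays "active" even when nothing is using the card, that matches the constant 6 W draw CoreCtrl reports.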


This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.