I fell into the Nvidia trap, but about two years ago I discovered EOS, spent more time learning about Linux, and via this forum learned the hard way that Nvidia simply is not great on Linux. To put it mildly… their hardware support sucks.
I've never had AMD graphics. My question for y'all: is it really worth paying attention to AMD vs Nvidia on a new laptop purchase? Did you have a seamless experience with AMD and Linux? Is that because they open-source their drivers, so the hardware is supported directly in the Linux kernel? Does hybrid graphics work well too?
Or, if you need performance, some top-tier Dells like the Precision line have a BIOS setting to use only the dedicated GPU, like in a normal PC, to avoid the Optimus crap altogether.
Ok, so even though AMD has open-source drivers, the only options for hybrid graphics are either Optimus-style switching or dedicated-only. At least in my mind. There are other options, but it seems hybrid is just bad on Linux. But I assume it may work better on AMD because of the in-kernel driver.
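For what it's worth, a quick way to see what the kernel actually exposes on a hybrid machine is to peek at /sys/class/drm. The little Python sketch below is just illustrative (the vendor ID table is my own assumption about which chips you're likely to have); it prints one line per GPU so you can tell whether you're looking at AMD+AMD, Intel+Nvidia, or whatever combo.

```python
#!/usr/bin/env python3
# Rough sketch: list the GPUs the kernel exposes via /sys/class/drm.
from pathlib import Path

# Common PCI vendor IDs (assumption: these three cover the usual laptop combos)
VENDORS = {"0x10de": "Nvidia", "0x1002": "AMD", "0x8086": "Intel"}

for card in sorted(Path("/sys/class/drm").glob("card[0-9]")):
    vendor_file = card / "device" / "vendor"
    if vendor_file.exists():
        vid = vendor_file.read_text().strip()
        print(f"{card.name}: {VENDORS.get(vid, vid)}")
```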
Problem is, almost everything is on laptops nowadays, so what's the better solution, a heavier laptop with only a dedicated GPU? But that's hardware bloat.
I know I live on another planet, but can one even get a laptop with only dedicated graphics? I assume switching to dedicated graphics in the BIOS is a must then.
Yes, that is exactly the way to do it… unfortunately, such models are rare.
The other option, back in the day when there were simpler BIOSes instead of this UEFI garbage: you could modify or reverse engineer the BIOS on some laptop models (like some Sony VAIOs) to get a proper switcher inside the BIOS…
P.S. Not to say it's impossible to make an Optimus laptop work decently (in fact, on Linux it's simpler in my view), it just won't be flawless because of the absolute insanity of the whole on-the-fly GPU-switching concept.
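To be fair, on a current setup you can at least verify whether render offload is doing anything, using Nvidia's documented __NV_PRIME_RENDER_OFFLOAD variables. Rough Python sketch below; it only assumes glxinfo (mesa-utils) and the proprietary driver are installed.

```python
#!/usr/bin/env python3
# Sketch: compare the default OpenGL renderer with the PRIME render-offload one.
import os
import subprocess

def renderer(extra_env=None):
    env = {**os.environ, **(extra_env or {})}
    out = subprocess.run(["glxinfo", "-B"], env=env,
                         capture_output=True, text=True).stdout
    for line in out.splitlines():
        if "OpenGL renderer string" in line:
            return line.split(":", 1)[1].strip()
    return "unknown"

print("default:", renderer())          # usually the iGPU
print("offload:", renderer({           # should be the dGPU if offload works
    "__NV_PRIME_RENDER_OFFLOAD": "1",
    "__GLX_VENDOR_LIBRARY_NAME": "nvidia",
}))
```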
You do live on another planet. Ryzen, man… Ryzen! First of all, I would never buy a laptop to do gaming on. I'm a desktop user, and if I were a gamer, that's what I would be using. No hybrid GPU crap for me. On laptops I'll stick with onboard graphics.
My advice would be to just get a gaming desktop with top-tier AMD graphics. You would pay less for it and have a much better time than on any laptop.
I find laptops to be more trouble than they're worth, especially high-end laptops you'll pay a fortune for.
Optimus graphics is just rubbish, I wouldn't even consider that. And NoVidea tends to drop support for older hardware, so the proprietary drivers become too much trouble to maintain. After a while, they get dropped from the repos and soon you're stuck with Nouveau – which, while a valiant effort, is also utter rubbish. This is a much worse problem on a laptop than on a desktop, because you can't give your old NoVidea graphics card to Tim Cook's giraffe to step into, and just get a newer one (or, even better, AMD, as you'll probably be pretty pissed off at NoVidea when that happens).
The only advantage a laptop has over a desktop is its mobility – having a laptop is nice when you're travelling or to bring with you to work, but for anything else, a desktop is far superior. In fact, the money you'd save by getting a really good desktop, compared to a much crappier laptop, would be enough for you to get a cheaper laptop as well, which you can use to visit this forum while you're away from home and discuss how bad NoVidea is (that's pretty much the only thing I use my laptop for).
You can dislike its proprietary nature, but RTX is far from garbage and much more sophisticated than SSRTGI.
Nvidia also absolutely trashes AMD when it comes to accelerated encoding/decoding.
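If anyone wants to check what their own build exposes, here's a throwaway sketch that just filters ffmpeg's encoder list for the usual hardware backends (nvenc, vaapi, amf, qsv). It only assumes ffmpeg is on the PATH, nothing else.

```python
#!/usr/bin/env python3
# Sketch: list the hardware encoders this ffmpeg build was compiled with.
import subprocess

out = subprocess.run(["ffmpeg", "-hide_banner", "-encoders"],
                     capture_output=True, text=True).stdout
for line in out.splitlines():
    if any(tag in line for tag in ("nvenc", "vaapi", "amf", "qsv")):
        print(line.strip())
```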
Nvidia has better OpenCL support, and obviously CUDA. You can't even get OpenCL support that matters with Flatpak on AMD, but with Nvidia it's easy.
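Easy to verify on your own box, by the way: pyopencl will print exactly which platforms and devices the installed drivers expose. The sketch below only assumes pyopencl is installed; run it inside and outside a Flatpak sandbox to see the difference I'm talking about.

```python
#!/usr/bin/env python3
# Sketch: list whatever OpenCL platforms and devices the installed drivers expose.
import pyopencl as cl

for platform in cl.get_platforms():
    print(platform.name, platform.version)
    for device in platform.get_devices():
        print("  ", device.name)
```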
The render path for Blender with Nvidia is much better. Even with HIP, the 6900 XT is only at about 3060 level in Blender.
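For context on the "render path" bit: Cycles picks its backend (CUDA/OptiX on Nvidia, HIP on AMD) from the preferences, and you can poke at it from Blender's Python console. Rough sketch, only meant to show where that switch lives, nothing more.

```python
# Run inside Blender's Python console -- rough sketch of switching the Cycles backend.
import bpy

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "CUDA"   # "OPTIX" on Nvidia, "HIP" on AMD, "NONE" for CPU only
prefs.get_devices()                  # refresh the detected device list
for dev in prefs.devices:
    dev.use = True
    print(dev.name, dev.type)

bpy.context.scene.cycles.device = "GPU"
```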
Nvidia-based laptops are generally better, with more options than AMD, even when paired with an AMD CPU.
The only scenario where Nvidia loses to AMD on Linux is the OSS drivers, and gaming generally being better on AMD because of that; on literally any other metric, Nvidia flat out wins.
Not far at all: it's utter garbage compared to engine-based path tracing (SSRTGI is a different story, but it works everywhere).
The implementation really sucks AND it's proprietary.
Screen space effects by nature are less accurate, prone to issues, and rather noisy.
Again, you can dislike the nature of RTX, but to say it's garbage is a plain lie. RTX is far beyond SSRTGI, though I prefer universal options for gaming like SSRTGI.
Software, engine-based implementations like the CryEngine example are not screen-space, if you actually read up on it.
Basically, if you use the power of the human brain to implement calculations similar to SSRTGI while using all the available data around you (hence you need the game engine to do it), there are no problems and even some benefits to such technologies, like being much more performant (since you can use not only the CPU but the GPU) and being cross-platform, without the insane brute-force proprietary approach of NoVydia RTX.
The only areas where you might need that stupid, insane RTX stuff are 3D modeling (and that's only because of how 3D modeling software currently implements ray tracing) and maybe rendering full-scene ray-traced games, but we're far away from even considering that, performance-wise.
Besides, it’s very easy to see why it’s garbage, if you know anything about path-tracing.
Even if they had made exactly the same stupid decisions they did, but gone with path tracing instead of ray tracing, we would already have full-scene-traced games playable worldwide at 60 FPS, even in 4K.
It doesn’t matter, because under the hood it’s not really.
I mean this, if you’re too lazy
It's similar: they use voxels for light calculations, for example, which is indistinguishable from RTX ray tracing.
Fire up a CryEngine scene in the engine, compare, and see for yourself.
You've got some serious bias to be performing those mental gymnastics, friend.
Both Crytek and ReShade perform the ray tracing in viewport/screen space; part of the performance improvement comes from only considering things in screen space, and part from being approximations. This is how Crytek originally handled ambient occlusion and brought it to the masses in the form of SSAO. Much like SSAO, they also operate at lower resolutions, generally 1/2 or 1/4, so objects at distance lose clarity and can suffer from some bleeding, but it's generally not a problem.
RTX uses ray marching, casting (at the moment) a finite number of rays, which is the only reason it suffers from the noise it does; as the hardware improves, it is from a technical perspective far superior to SS effects, since it's more accurate, higher detail, and able to consider things outside of screen space. While I don't feel RTX or hardware-based RT is the way to go (much like when tessellation became a thing, I thought it was eh), it is the technically superior method from a quality standpoint.
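The "things outside of screen space" point is the key one, and it's easy to show with a toy example. The Python sketch below is purely illustrative, nothing from the actual engines: a 1-D "depth buffer" only knows about columns that are on screen, so a reflected ray heading off screen simply finds nothing, while a world-space trace still gets the hit.

```python
#!/usr/bin/env python3
# Toy illustration: screen-space tracing can't hit geometry that isn't on screen.

# Full scene: (column, depth) pairs, one of them outside the visible columns 0..9
scene = {(-2, 3.0), (4, 5.0), (7, 2.0)}

# "G-buffer": depth per visible screen column, built only from what the camera sees
depth_buffer = {x: d for (x, d) in scene if 0 <= x < 10}

def screen_space_hit(x, depth):
    # Screen-space "trace": can only compare against depths stored on screen
    return x in depth_buffer and depth >= depth_buffer[x]

def full_scene_hit(x, depth):
    # World-space trace: tests every object, on screen or not
    return any(ox == x and depth >= od for (ox, od) in scene)

# A reflected ray lands at column -2, depth 4.0 (behind the off-screen object)
print("screen space:", screen_space_hit(-2, 4.0))  # False -> reflection silently missing
print("full trace:  ", full_scene_hit(-2, 4.0))    # True  -> a full-scene trace finds it
```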
I much prefer what CryEngine, ReShade, and Unreal Engine do, which allows for RT-style effects without needing specific hardware to accomplish it.
Here is an analysis DF did a while ago of this sort of technique; it's really a fun way to accomplish it and is incredibly clever. It has its limitations, though.