Thanks! I appreciate the update, will have to give it a try myself after work.
So I am enabling HDR and Wayland (because I need it for HDR). Out of curiosity what are your other commands doing there and how are they working for you in Hunt?
Yeah, I also rolled back to 10-12. Haven't had a chance to test 10-14.
I am skeptical due to the changelog. It seems he is applying game-specific fixes on top of 10-13. I imagine whatever was making Hunt slow was in a downstream project.
There were major performance regressions for Hunt Showdown in 10-13. I wonder if that is fixed in this release.
I don't know for sure, but I run MangoHud with a Wine version display, and I know the version string for 10-13 was messed up. It was something along the lines of 10-12-gitsha. So maybe he was fixing that?


It's a little confusing because of things shifting around, but my understanding is that this is the launch of Microsoft's debloated, handheld-gaming-targeted version of Windows. Basically, they saw what a better experience SteamOS was and realized it was a problem.


I am kind of shocked about the 7900 xtx. I have the same GPU and I am getting good performance under Linux.
I did some just-for-fun benchmarking on Doom: The Dark Ages last night. I expected Linux to be slightly slower due to the built-in ray tracing, but I actually got better averages under Linux. The max frame rate was slightly higher under Windows, but the lows were way better under Linux. Overall fairly close performance, with a slight edge to Linux.
Maybe Bazzite is doing some magic here. What distro was he using?
Edit: I watched a bit of it. He is running Bazzite, so I have no idea why he is seeing such wildly different numbers. I typically run Proton GE, and I assume he is running stable Proton, so that would make a dent. People are mentioning a low-power mode in the comments, but I have never had any issue with that on my 7900 XTX. I haven't had to do anything weird or out of the ordinary.
I think it's most likely because I am not playing the same games he is. Stalker 2 is basically the only game he is playing that I have played in the past, and I haven't done a comparison of that game on Linux vs Windows.


Interesting - is that a kernel-level thing? Could other distros use that on the right hardware, or is it too much to maintain multiple kernels that are that hardware-specific?


What is Clear Linux doing that is unique, and what are the trade-offs?
Why isn't this mainstreamed into other distros?



What the fucking fuck is this? Websites are comically bad now.


It's had a few security issues in the past, and last I heard they introduce a lag between packages landing in the Arch repos and becoming available in Manjaro - even for critical security updates.
Bummer - sucks to lose a good server
What happened? Did the instance maintainer just get sick of running it?


For posterity's sake, here is what I ended up doing.
HDR is working well under KDE; the Wayland mode in GE-Proton introduces a big performance hit in some games, unfortunately.
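For anyone landing here later, GE-Proton's experimental Wayland and HDR toggles are typically set as per-game Steam launch options via environment variables. A sketch, assuming a recent GE-Proton release where `PROTON_ENABLE_WAYLAND` and `PROTON_ENABLE_HDR` are the relevant variables - check the release notes for your version before relying on these names:

```shell
# Hypothetical Steam launch options (game Properties -> Launch Options).
# Variable names vary between GE-Proton releases; verify against the
# release notes for the version you are running.
PROTON_ENABLE_WAYLAND=1 PROTON_ENABLE_HDR=1 %command%
```

`%command%` is Steam's placeholder for the game's real command line, so the variables apply only to that game.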


Yeah, I was exploring KDE on a Fedora live disc and guessed that is what Automatic VRR was doing. Turning it to Always introduced more flicker, but still seemingly less than GNOME.


It’s way worse if I run games under the experimental Wayland mode that you enable with GE.
What distro are you using? I am on Bazzite.


https://www.reddit.com/r/XboxSeriesX/comments/t3fn6l/can_someone_explain_vrr_like_im_5_what_it_does/
Ok, so let's say your TV is a typical 60 Hz TV; that means it updates 60 times a second, regardless of the game's frame rate. A 60 fps game will be in perfect sync with your TV, as will 30 fps, because each frame will just be displayed twice. When your game is running at a frame rate in between, it's not in sync with the display anymore, and you end up with screen tearing, as the image being sent to the TV changes partway through the image being displayed.
VRR stands for Variable Refresh Rate. It basically means the display's refresh rate can vary to match the source of the image, so that it always stays in sync.
This is a pretty good explanation of what VRR is doing. It basically makes it so you can drop frames and it still feels smooth.
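The sync logic above can be sketched as a toy model: on a fixed-rate display, a frame time that isn't a multiple of the refresh interval drifts out of phase and tears, while VRR just matches the refresh to the frame rate. This is a simplification (real tearing depends on vsync and scanout timing), with `tears_on_fixed` being a made-up helper name:

```python
# Toy model: which frame rates stay in phase with a fixed 60 Hz display?
# Assumption: without vsync, a frame rate "tears" when its frame time
# is not a whole multiple (or divisor) of the refresh interval.

REFRESH_HZ = 60
SCANOUT_MS = 1000 / REFRESH_HZ  # ~16.67 ms per refresh cycle

def tears_on_fixed(frame_time_ms: float, tol: float = 0.01) -> bool:
    """True if frame delivery drifts out of phase with the fixed refresh."""
    remainder = frame_time_ms % SCANOUT_MS
    return min(remainder, SCANOUT_MS - remainder) > tol

for fps in (30, 45, 60):
    frame_time = 1000 / fps
    fixed = "tears" if tears_on_fixed(frame_time) else "in sync"
    # With VRR the display simply refreshes at `fps` Hz, so it always syncs.
    print(f"{fps} fps ({frame_time:.2f} ms/frame): {fixed} on fixed 60 Hz, in sync with VRR")
```

30 and 60 fps divide evenly into 60 Hz and stay in sync; 45 fps lands mid-refresh, which is exactly the in-between case the quoted explanation describes.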


Kind of a bummer to hear - I was hoping KDE's VRR implementation might avoid the issue. It may be a Wayland problem, in which case it would be unavoidable.
Edit: did some testing with a live image tonight - at least on my machine KDE seems much better when it comes to flicker


I had the name wrong initially - I just edited to correct it, but under Windows, "dynamic refresh rate" is distinct from VRR. The setting reads "To help save power, Windows adjusts the refresh rate up to the rate selected above". See https://www.theverge.com/2021/6/29/22555295/microsoft-windows-11-dynamic-refresh-rate-laptops.
I can turn it off and still have VRR enabled.
Trust me when I say the amount of OLED flicker is much, much higher in GNOME than under Windows for the exact same games. Like it gives you eye strain and a headache super fast. I still see a little flicker under Windows, but it's not comparable.


Oh man, that looks bad. Hopefully that's something going wrong in the beta version, because I can't see them shipping something that looks like that.
What reviews? People seem almost universally hyped on this one. There was a preview that IGN did during the beta that was pretty maligned because of how little the author seemed to understand the game.
I played the server slam and it was really well polished and put together.