Any of you feel like we’ve become so fixated on graphics and performance that the actual game part of a video game is often overlooked, or at least underemphasized? I don’t know about the rest of you, but all I come across on social media regarding gaming is resolution, ray tracing, DLSS/FSR, frame rates, frame time, CPU and GPU utilization, and all of that stuff, and I’m honestly sick of it! Performance metrics have always been discussed when it comes to PC gaming, but now even console gaming is getting this treatment! Don’t you miss the days when you just installed a game and played it? I know I do. What do you think?
I think realistic graphics in 3D games got good enough by 2011 (Skyrim) that further improvement doesn’t really matter anymore, though I can see an argument for putting it as late as 2015 (The Witcher 3).
I feel like I might get a ton of downvotes for this, but I kind of disagree. Maybe that’s true for things like texture detail; we certainly don’t need every single hair on Roach modeled with full physics or anything.
That’s only a subset of what constitutes graphics in a game though. I think that while it is computationally expensive, the improvements in lighting that we’re seeing contribute to making graphics more realistic and do matter.
I get that people meme on ray tracing and the whole RTX On thing, but lighting techniques like path tracing, global illumination, and dynamic lighting are just as much a generational shift as physics was in HL2. Output resolution and texture resolution got pushed to a point where any further gains are marginal improvements at best. Physics is getting to that point too, although there’s still room for improvement. Look at how well The Finals handles destruction physics, or the ballistics models used in Arma 3. Lighting is the next thing being refined, and it has a ways to go. I’d bet that in 10 years full, real-time, dynamic, ray-traced lighting will be taken for granted, and we’ll be arguing whether there’s any value or added realism in increasing the number of individual rays cast by each light source, or how many bounces they take. I’d also not be surprised if people were memeing about RTX Sound On at that point and saying that game audio peaked with HRTF or spatial audio.
I definitely think you’re in a AAA-games bubble. We’re literally in the middle of an indie game renaissance.
Get off of consoles, and get a midrange gaming PC.
That’s just AAA games, I think. Indie games are varied and don’t necessarily focus on graphics.
all I come across on social media regarding gaming is resolution, ray tracing, DLSS/FSR, frame rates, frame time, CPU and GPU utilization, and all of that stuff,
That’s because those are things that can be objectively measured. “Fun” and “playability,” though, are subjective, so a journalist has a harder time telling you whether a game will work for you.
I don’t know what you’re talking about; old games were just as fucking janky on release, and most of them took years of modders fixing those issues before they got better.
Fallout 1 & 2 - janky on release
Baldur’s Gate 1 & 2 - janky on release
Morrowind - janky on release
S.T.A.L.K.E.R.: Shadow of Chernobyl - janky on release
S.T.A.L.K.E.R. 2 - janky on release
All of these were games you could install and “just play” on release. There were countless bugs and plenty of janky behavior; that’s normal, and we’re now spoiled by day 1 patches. STALKER 2 has been out a month and has already had three major patches for bug fixes. STALKER Shadow of Chernobyl probably could have used the same, but in 2007 the infrastructure to push quick updates just wasn’t there yet. Steam had only been released by Valve in late 2003, roughly three and a half years earlier.
I feel it’s a bit like any hobby. You see casual film enjoyers, and then those who refuse to watch anything unless it’s a Blu-ray on their 4K Dolby Vision TV with 1000 nits of OLED brightness. There are some who just enjoy listening to music on their AirPods knockoffs streaming from YouTube Music, and then there are those who buy $500 headphones with a high-quality gold-plated aux cable and a custom DAC and use some obscure format to really enjoy music. There are some who enjoy team sports, and then there are those who know each player’s personal routine, the wetness of the grass, or the year the ball was manufactured and its impact on the throw.
It’s a spectrum.
To me it sounds more like the social media algorithms put you into the “gaming tech” corner so that’s all you see. Indie gaming is huge and not at all about graphics. Look at the currently popular games on Steam and a ton of them are technologically very basic.
Also don’t forget retro games.
Even new games can be run on midrange hardware if you don’t crank up the settings.
People want big numbers and companies want to sell the latest stuff. No one gives a platform to advocates of low budgets, cheaper hardware, and patient gaming.
Exactly. My feeling is the opposite of what OP is saying. Gameplay and innovation are king. Just not in AAA games.
Valheim was one of the best-selling games and is still a huge success. Indies are getting better and more popular, to the point that even big companies like Nexon are indie-washing their studios and pretending Dave the Diver is an indie game with pixel art rather than the work of one of the biggest publishers there is. In my experience, most gamers nowadays are people who grew up on Minecraft, Terraria, or, probably more likely today, Roblox.
So basically no, I don’t think so. Maybe big studios want you to believe that, and it might be true for a casual FIFA or CoD gamer, but for anyone else there are more options than ever. The supply of good, smaller, simpler games is just overwhelming; the days are too short to even keep track of them anymore.
You may as well have typed this in 2009 or 2015.
It used to be that people argued it was worth getting the new game console because of “better graphics.” The console wars haven’t gone anywhere; they’ve just expanded.
In any case, as for just installing a game and playing it: no, not really. When I was playing games in college in 2012, it was still normal to open a game and go straight to the settings menu to adjust things.
Sometimes it was just turning off motion blur, but there were always settings to change to try to reach a stable 60 FPS.
Nothing changed, it just expanded. Now instead of 60 FPS it’s a variable 60-240 FPS. Instead of just 720p-1080p, it’s 1080p minimum (unless it’s a handheld) and variable up to 4K. Instead of “maxing out,” we now have ray tracing, which pushes software further than our hardware is capable of.
These aren’t bad things; they’re just now 1) marketed a bit, and 2) better known in the social sphere. There’s nothing stopping you from opening up the game and playing right away, and there’s nothing stopping other people from wondering about frame timings and other technical details.
Sure, focusing on little things like that can take away from the wider experience, but people pursue things for different reasons. When I got Cyberpunk 2077 I knew there were issues under the hood, but my experience with the game at launch was pretty much perfect because I was focused on different things. I personally don’t think a dip here and there is worth fretting over, but for some people it ruins the game. Other people just like knowing they’re taking full advantage of their hardware, hence figuring out the utilization of their components.
There’s one last aspect not mentioned: architectures. 10 years ago games would just boot up and run… but what about games from 10 years before then? Most players not on consoles had to do weird CPU timing shenanigans to boot up a game from (now 20) years ago. We’re in the same boat now with emulation; while emulation is faring better, X360/PS3-generation games that had PC ports are starting to have issues on modern Windows. Even just 5 or 6 years ago, games like Sleeping Dogs wouldn’t play nice on modern PCs, so there’s a whole extra aspect of tinkering on PC that hasn’t even been touched on.
All this to say, we are in the same boat we’ve always been in. The only difference is that social media now has more knowledge about these aspects of gaming so it’s being focused on more.
The one thing I do agree with, though, is that this is all part of software development. Making users need better hardware, intentional or not, is pretty crazy. The fact that consoles themselves now have Quality vs Performance modes is also crazy. But I will never say no to more options. I actually think it’s wrong that console versions of games are often missing settings adjustments when the PC counterpart has full control. I understand when it’s done to keep performance at an acceptable level, but it can be annoying.
There are a lot of phenomenal indie games. There are also still a few really good AAA games, but “AAA” doesn’t mean what it used to. In fact, I’d be careful with AAA by default unless reviews state that the game is actually good. Ubisoft even tried to establish an “AAAA quality” game with Skull and Bones, or whatever it’s called, and it was a total flop.
The real quality these days lies in indie games or (mostly) independent studios. I think it’s kind of safe at this point to assume by default that Bethesda, Microsoft, EA, Activision Blizzard, and so on simply cannot produce actually good games anymore. There may be some exceptions, but again, wait for independent reviews; unless it’s been independently verified, don’t trust them to produce a good game.
Another problem is the sheer mass of games flooding the market, which means true gems aren’t found as easily. But they exist. There’s no shortage of great games; you just have to look harder, and in the right places.
This makes me nostalgic… people were saying this about Fallout 3.
To be faaaaaaaaaaaiiiiir, a lot of that was tied up in the switch from overhead isometric view to first-person view.
Fallout 1/2 didn’t focus on graphics; they were in many ways point-and-click adventures. A lot of things you had to hover over for “flavor text,” and every once in a while there’s something only four pixels wide that you need to notice.
So the gameplay actively eschewed graphics in favor of things like flavor text and reading.
Further, the switch to first person broke the SPECIAL system, because how do you even manage a gun skill in a first-person shooter without it feeling absurd? It made sense in isometric, even if it was often frustrating to miss an enemy when you had a 79% chance to shoot them in the balls. Putting that in first person, where you mag dump into someone standing right in front of you and half your shots miss, feels a lot less realistic, and it can quickly become frustrating in a more fast-paced first-person-shooter environment. The SPECIAL system feels absolutely slapped on as an afterthought in Fallout 3.
Also, the writing in Fallout 3 was that shitty Bethesda writing, just subpar compared to the prior two installments. Especially the fucking stupid-ass ending of the game.
I’d say a lot of those complaints were driven more by the perspective switch than anything else.