Recommending a 2060 Super for Cyberpunk (PL) is utter nonsense.
THEzwerver
It uses Unreal Engine so yeah it’ll require a shitton of power.
Charb9
I still play Phantom Liberty with a GTX 1650 Super and I usually get 60 fps on high with some tweaking!
Geyer13
How did Tales of the Shire not get canceled? Wild.
Gervh
No wonder GPU prices are being set so high if people believe a 2060 Super can't run Cyberpunk
anima311
Gamers not knowing how these things are calculated will never not be funny.
ConsoleMaster0
YoUr SyStEm Is JuSt NoT pOwErFuL eNoUgH tO rUn OuR gAmE, iT’S nOt OuR fAlT
Seriously now, this has been happening since 2018. It took the masses 7 years to notice and start complaining. And with games getting more and more unoptimized over the years, the fact that GPU prices get higher with each generation makes things even worse.
Emulation ftw!!!
tacobellbandit
I had a 2070 before I upgraded, and Cyberpunk ran smooth on medium settings. High with some custom optimization ran great.
Ni_Ce_
imo “recommended” is a bullshit term and steam should get rid of it. recommended for what?? people have different demands.
Thomas_JCG
That shit looks worse than the original *Fable*, how is Unreal getting worse with every version?
Rossilaz
Unrealistic ≠ Unoptimised!
Though, yeah, that shouldn’t take a 3060
hardlyreadit
Recommended isn't minimum. The Shire's minimum is a GTX 770 while 2077's is a 1060
VVitcel
missed opportunity: Tales of Optimization
Remy0507
We just gonna ignore the fact that Cyberpunk came out almost 5 years ago, huh? When the RTX 20 series was still pretty new (the 30 series cards had JUST come out). And are we acting like a 3060 Ti is a new, high-end card and not a nearly 5 year old mid-range (more like entry-level at this point) card? I don't understand the point of this post…
Noamod
My integrated graphics, 4 GB RAM looking ass booting up Disco Elysium with a podcast on my smartphone, because those loads will take a while.
LeastInsaneKobold
But heccin cozy gamerino
Optimaximal
Cyberpunk was released in 2020, when the 20xx series was the best on the market. The recommended specs for a game rarely get updated, even when DLC is released.
Tales of the Shire hasn't even released yet, but it's 5 years later and they're still recommending a graphics card two generations behind current. Doesn't that suggest it's actually heavily optimised, since it runs absolutely fine targeting a four-year-old card?
Atlanos043
Not saying Tales of the Shire is well optimized, but I don't think Cyberpunk is a good example of "how to do it right" considering
1) Cyberpunk is 5 years older and
2) It was heavily criticized for being unoptimised at launch (personal experience: my PC version had constant crashes during the tutorial mission. I had to wait 2 days for it to actually be playable).
yimyum_
I'm over here with my 10-year-old 1060 running Cyberpunk with almost no issues. Sure it doesn't look great, but I really can't complain.
upq700hp
unreal engine and its consequences (lazy devs) have been a disaster for the human race (gamers)
Tomacz
I believe this is the developer’s first game. It seems they typically make statues/figures/props. So… Not software.
So yes the game should run better, but also they’re not a huge AAA developer with decades of experience like CDPR.
NeverNotOnceEver
Ppl mistake graphics that aren't realistic for graphics that aren't demanding. That being said, a five-year-old game that started development at least 10 years ago isn't a great comparison
stronkzer
That's the recommended spec for frickin' STALKER 2, a game that both looks ultra-realistic AND is to this day not properly optimized. Devs aren't even trying anymore smh.
WedTheMorallyGrey
I've played so many games on max settings but Palworld is the one where my PC likes to go into overdrive. I feel that meme.
Temulo
Cyberpunk can stfu, it runs like shit on Xbox
Potential_Let_6901
Cyberpunk is running on Switch 2 lol, unfair comparison.
crumbletasty
People be using the word “optimised” but I really don’t think they know what that means.
The sizes of the teams developing the games, the funding available to the studios, the development time (Cyberpunk had about a decade longer), and the software and architecture the games are built on are all totally different.
Also it’s “recommended”. Not “minimum”. The word “Recommended” is subjective as it’s based on an opinion of acceptable performance.
Aphala
Guess Pirate Software's coding is floating around
Gradash
This is caused by the engine itself. Unity 6 is very expensive, less so than Unreal, but it still requires a lot. If a developer uses an engine like Unity or Unreal, they end up having to list at least the minimum that the engine itself requires.
Why does Phantom Liberty run well on older GPUs? Because it uses its own engine, and now that CDPR has moved to Unreal, prepare for that to no longer be the case.
Yeah, there's more grass in the Shire than in Night City