miss_brainfart, (edited)

Would it really be unexpected? They’ve blatantly shown how they want to milk us for every little incremental improvement, even ones that barely qualify as a sidegrade.

Evil_Shrubbery,

The 5080 will have a 128-bit bussy and 6GB gram. The Ti version will just have it clocked higher and actually be able to address all of the RAM.
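
For scale, a narrow bus caps bandwidth no matter the clocks: peak memory bandwidth is just bus width times per-pin data rate. A minimal sketch, assuming a hypothetical GDDR7 speed (no 50-series memory specs are confirmed):

```python
# Peak memory bandwidth = (bus width in bits / 8) * per-pin data rate.
# 28 Gbps is an assumed GDDR7 speed for illustration, not a confirmed spec.
bus_width_bits = 128
data_rate_gbps = 28
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"{bandwidth_gb_s:.0f} GB/s")  # -> 448 GB/s
```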

0ops,

128bit bussy? Say less

OrderedChaos,

64bit?

umbrella,

will we be able to afford it?

Nomecks,

Lol no. For AI, research, and studio 3D rendering only.

*except Arnold. I don’t need render farm neckbeards downvoting me

yggstyle,

They still haven’t dumped all those chips they claimed on their last three earnings calls. Expect the 5000 series to have a familiar flavor.

Lojcs,

I don’t get this argument at all. Unless you’re buying GPUs based on their name (xx80, xx70, etc.), the 40 series has been consistently cheaper than the 30 series at any performance level. Nobody’s going to think “$850 for a 4070 Ti is too much” and buy a 3090 Ti for $1500 instead.
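
The rough math, taking the launch prices quoted above and assuming (as launch-era reviews broadly showed) the two cards land around the same raster performance:

```python
# Price comparison at roughly equal performance; the "roughly equal raster
# performance" premise is an assumption based on launch-era reviews.
price_4070ti = 850
price_3090ti = 1500
ratio = price_3090ti / price_4070ti
print(f"The 3090 Ti costs {ratio:.2f}x as much for similar performance")  # -> 1.76x
```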

umbrella,

$850 for a 4070ti IS too much.

It’s ridiculous for a -70 series card.

Lojcs,

And yet it’s still a better deal than the 30 series equivalent

Zahille7,

Bro I’m still using my 2060 Super

ThugJesus,

I’ve yet to encounter a game my 2080S couldn’t run at 144Hz max graphics… Why are we still pumping out $800-1200 GPUs when there’s nothing requiring that amount of power?

Transcendant,

Yeah but you’ve never played Alan Wake 2 with turbo mode ray tracing in 28000p /s

(proud owner of a 2070s still chugging along here)

lurker8008,

Must be nice hitting that 2010s backlog. I’ve still got so many unplayed Steam games. And don’t even get me started on all the free Epic games.

hips_and_nips,

“nothing requiring that amount of power”

My simulators, Pimax, and three 4K 144Hz monitors beg to differ.

MomoTimeToDie,

Congratulations on not playing anything intensive, I guess?

Geth,

Recently upgraded both CPU and GPU for better VR performance. Before this, my 3070 was struggling a bit with Cyberpunk on the 4K TV. There’s also productivity and AI stuff people deal with these days.

There’s plenty of needs that require more and more power.

AstralPath,

2060 user here as well. Could use a bit of an upgrade, but not much.

Cavemanfreak,

And I’m still here on my 1060 6GB rocking 40 fps on medium in Jedi: Survivor.

pennomi,

We don’t need faster GPUs as much as we need more VRAM. Double the memory instead of leaving it stagnant again.

BigDaddySlim,

It’s not just the lack of VRAM, it’s also Nvidia stupidly narrowing the memory bus on lower-tier cards compared to their last-gen counterparts.

9488fcea02a9,

I don’t understand the VRAM cuts… The RAM fabs have been cutting production because of low prices.

I would love more VRAM so that I can have a GPU that can do a bit of gaming and dabble in some AI stuff. 100% agree, I’d pay for more VRAM instead of horsepower.
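
To make the “dabble in some AI stuff” point concrete, here’s a rough sketch of the VRAM needed just to hold model weights for local inference; the model sizes are illustrative, and real usage adds activations, KV cache, and overhead:

```python
# Approximate VRAM to hold model weights: parameters * bytes per parameter.
# Figures are illustrative; runtime overhead is not included.
def weights_vram_gib(params_billions: float, bytes_per_param: float) -> float:
    return params_billions * 1e9 * bytes_per_param / 2**30

for params in (7, 13, 70):
    fp16 = weights_vram_gib(params, 2)   # 16-bit weights
    q4 = weights_vram_gib(params, 0.5)   # 4-bit quantized weights
    print(f"{params:>2}B model: {fp16:6.1f} GiB fp16, {q4:5.1f} GiB 4-bit")
```

A 13B model at fp16 already overflows even a 24GB card, which is why VRAM rather than raw horsepower is the gating spec for this crowd.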

CaptainProton,

More memory means you can do real work with it, and enterprise AI training is a money printer; if consumer cards were closer substitutes, people would be scavenging the shit out of that market with them.

tal,

Honestly, the gap between the server parallel compute cards and the home video cards isn’t that large. 24GB on video cards, 80GB for a compute card.

That’s not even two binary orders of magnitude. That’s a narrow window to try to make their money from. Plus, some tasks can be subdivided and run on multiple GPUs, and they can’t segment up the market for those.

Like, in general, my bet is that for most things that fit the above requirements (fitting in that window, and being a task that can’t be subdivided), there’s probably enough room for algorithmic improvements to get two binary orders of magnitude of reduction in memory requirements.
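
The arithmetic behind “not even two binary orders of magnitude”, using the figures above:

```python
import math

consumer_gb, server_gb = 24, 80  # VRAM figures from the comment above
gap = math.log2(server_gb / consumer_gb)
print(f"{gap:.2f} binary orders of magnitude")  # -> 1.74, under the 2.0 (4x) mark
```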

CaptainProton,

But then you can do work with it, and that’s where the real money is at.

They should all be shipping with 32GB now… AMD is at least seeing the light by releasing some 24GB cards under $1k.

Really hope Intel’s next generation of GPU silicon makes it a more realistic substitute; that would actually spice things up a lot. You basically won’t see real competition again until Nvidia’s AI training dominance is in someone’s crosshairs.
