No.2270
.flip it
No.2273
I might buy a used 4090 since everyone's dumping them on the secondary market locally and the 5090 is shit.
I'll only give the envious green AI snake oil company money indirectly at this point.
No.2274
why are you buying an nvidia card. nvidia is an AI-for-business company
No.2275
Couldn't you get like two 3090s or two 4090s (or some other combo) used for the price of a new 5090? That would probably be better for AI than a single 5090. Or maybe a Quadro and the like. I wouldn't trust the 50 series cards on principle, since they lied about the performance metrics of the 5070, saying it's better than the 4090 at half the price.
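Just to spell out the VRAM side of that comparison, here's a rough sketch (per-card capacities are the standard specs; used prices swing around too much to be worth including):

```python
# Total VRAM per hypothetical setup, using standard per-card capacities.
# Prices are deliberately left out since used prices vary so much.
setups = {
    "2x RTX 3090 (used)": 2 * 24,  # 24 GB each
    "2x RTX 4090 (used)": 2 * 24,  # 24 GB each
    "1x RTX 5090 (new)": 32,       # 32 GB
}

for name, total_gb in setups.items():
    print(f"{name}: {total_gb} GB of VRAM total")
```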
No.2277
>>2276
There's a kissu pc?
No.2278
>>2277
Yeah it's what the Palworld server is hosted on
No.2279
I believe there are ways to use it over LAN. The latency over the network would probably be pretty bad, though.
No.2280
Really it's about rendering rather than working as a studio. Rendering for 3D is only ever done on distributed systems
No.2281
Pathetic
No.2282
>>2275
Two GPUs unfortunately have a hidden cost to them: I'd need a new motherboard, and if I need a new motherboard then I need a new CPU. I might already need a new PSU if I'm getting a 5090, so the costs are absurd. Bleh!
>>2276
Not really, or at least I doubt it. That kind of setup is for render farms, where you have a complete scene and send the data over to be processed, like a Pixar movie. The performance I'd want is for rotating the camera around a complex scene without stuttering while other 3D programs, or even a game window, are open. You know, "live" stuff. Multi-monitor setups are less impressive when you can't make use of them for everything.
No.2285
I don't even know if multi-GPU is a thing that works. When I tried to do it for video games it always had issues, and the data had to be mirrored across both cards.
No.2286
>>2285
Don't know about gaming, but that's how they do it on the big AI training servers with hundreds of GBs of VRAM.
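For a rough idea of how that works, here's a minimal sketch (assuming PyTorch and two visible CUDA devices; the big clusters layer NCCL and distributed frameworks on top of the same idea) of splitting a model across two cards so the VRAM pools add up instead of mirroring the weights like SLI did:

```python
# Minimal sketch of naive model parallelism in PyTorch: half the layers
# live on one GPU, half on the other, so the two VRAM pools add up
# instead of duplicating the same weights on both cards.
import torch
import torch.nn as nn

class TwoGPUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.part1 = nn.Sequential(nn.Linear(1024, 4096), nn.ReLU()).to("cuda:0")
        self.part2 = nn.Sequential(nn.Linear(4096, 1024)).to("cuda:1")

    def forward(self, x):
        x = self.part1(x.to("cuda:0"))
        # Activations hop between the cards; the weights are never mirrored.
        return self.part2(x.to("cuda:1"))

if torch.cuda.device_count() >= 2:
    model = TwoGPUNet()
    out = model(torch.randn(8, 1024))
    print(out.shape)  # torch.Size([8, 1024])
```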
No.2309
The 5090 doesn't seem dramatically better than the 4090 in the way the 4090 was compared to the 3090/Ti; it's mostly just a generational improvement. Real-world performance seems to be about 30% higher. That increase is basically identical to the increase in TDP: 450W on the 4090 versus 575W on the 5090, or ~28% higher. The efficiency in terms of FPS per watt more or less bears this out, with the 5090 landing around the 4090's efficiency, or slightly below.
In Gamers Nexus' testing (https://www.youtube.com/watch?v=VWSlOC_jiLQ), for reference, the 5090 had an FPS/W of 0.34 versus the 4090's 0.35. The 5090 drew 538.4W, and the 4090 drew 391.7W.
Not exactly very impressive. It basically looks like the 50-series is going to be a refresh generation.
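If anyone wants to check that arithmetic, it's just this (numbers pulled from the GN review linked above):

```python
# Sanity check on the efficiency claim using the figures quoted above.
tdp_4090, tdp_5090 = 450, 575          # rated TDP in watts
power_4090, power_5090 = 391.7, 538.4  # measured draw (Gamers Nexus)
eff_4090, eff_5090 = 0.35, 0.34        # FPS per watt (Gamers Nexus)

print(f"TDP increase: {tdp_5090 / tdp_4090 - 1:.1%}")                 # ~27.8%
print(f"Measured power increase: {power_5090 / power_4090 - 1:.1%}")  # ~37.5%
print(f"Efficiency change: {eff_5090 / eff_4090 - 1:.1%}")            # ~-2.9%
```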
No.2310
>>2309
What about for AI, which is what most people have been touting it for?
No.2311
>>2310
Hard to say. AI compute could be anywhere from 1.3x to 2.5x that of the 4090. You'll likely have to wait until the release date on January 30th, when regular people get their hands on one.
If you believe Nvidia's marketing, it should be 2.5x the speed of the 4090, going by their reported Tensor core TOPS. But... well... they also said the 5090 would be twice the performance of the 4090, which it very obviously isn't (>>2309)...
No.2313
>>2311
Yeah, I'm on a 3080 with 12GB of VRAM at the moment. It'd be a massive upgrade for AI based on the VRAM alone. A lot of AI stuff needs a certain threshold of VRAM just to run at all: the video stuff I've mentioned before takes me 30 minutes to do what a 4090 does in a minute, and that's with making a lot of sacrifices. You can either store it in speedy VRAM or you can't.
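That threshold is easy to ballpark, by the way: the weights alone take roughly parameter count times bytes per parameter, before activations and overhead. Rough sketch with illustrative sizes (not any specific model):

```python
# Rough VRAM ballpark for model weights alone: parameters * bytes-per-parameter.
# Activations, caches, and framework overhead come on top of this.
bytes_per_param = {"fp32": 4, "fp16/bf16": 2, "int8": 1, "int4": 0.5}

for params_b in (7, 13, 30):  # model size in billions of parameters (illustrative)
    for prec, nbytes in bytes_per_param.items():
        gb = params_b * 1e9 * nbytes / 1024**3
        print(f"{params_b}B @ {prec}: ~{gb:.0f} GB just for the weights")
```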
>January 30th
Nice, I still have some time to think it over.
No.2316
>>2315
Nice, thanks. The two YouTubers I look at once in a while (Gamers Nexus and JayzTwoCents) only looked at the gaming side of things. I'm generally more interested in image/video AI than text, since text has such absurd VRAM requirements for the better models. I'm sure the efficiency there is improving over time, or at least I hope so.
I'm leaning towards getting the card, but I just learned that it's a PCIe 5 card while my motherboard only has PCIe 4. It's backwards compatible, but I won't get the best possible performance. I guess I could buy the card and get a new motherboard in the summer or so.
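If I do end up running a PCIe 5.0 card in a 4.0 board, it's at least possible to check what link it actually negotiated. A quick sketch using the pynvml bindings (from the nvidia-ml-py package; assumes an Nvidia GPU and driver are installed):

```python
# Check the PCIe link a GPU actually negotiated vs. its maximum.
# Assumes an Nvidia GPU with drivers and the pynvml bindings installed.
import pynvml

pynvml.nvmlInit()
handle = pynvml.nvmlDeviceGetHandleByIndex(0)

curr_gen = pynvml.nvmlDeviceGetCurrPcieLinkGeneration(handle)
max_gen = pynvml.nvmlDeviceGetMaxPcieLinkGeneration(handle)
curr_width = pynvml.nvmlDeviceGetCurrPcieLinkWidth(handle)
max_width = pynvml.nvmlDeviceGetMaxPcieLinkWidth(handle)

print(f"PCIe link: gen {curr_gen} x{curr_width} (supports gen {max_gen} x{max_width})")
pynvml.nvmlShutdown()
```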
No reports of its performance in 3D programs, though. I'm sure it's good, but I'd like to hear more about it.
No.2357
oh wait, now I see that I said something similar 2 days ago
being sick is fun