Intel and DreamWorks Working On Rendering Animation In Real-Time
category: offtopic [glöplog]
http://tech.slashdot.org/story/11/11/16/2035250/intel-and-dreamworks-working-on-rendering-animation-in-real-time
You wouldn't think so, but realtime is faster? I'm curious whether they're going to release any specs, or whether it's just vaporware.
hmm, what does that remind me of... hmmm...
I suspect that by 'realtime' they mean at least interactive speeds, on a hugely expensive workstation or server setup.
I saw something recently about Intel making a new 'coprocessor' with a huge number of Pentium-like cores (basically what became of Larrabee). I guess a rack or two full of these and maybe a few dozen high-end GPUs could give a decent realtime render. Stream the results back to the animator's workstation OnLive-style and you have what they're talking about.
Something like that would be seriously revolutionary for animators, especially those bits of the pipeline where you have to wait for a fairly high quality render to get some feedback on what you're doing. It won't mean much to us for quite a while though I reckon.
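Something like this, I suppose -- toy sketch only, threads standing in for rack nodes, and every name in it (Tile, render_tile, node_count) made up by me, nothing to do with whatever Intel/DreamWorks actually run:
Code:
// split a frame into tiles and hand them out to "nodes" (here: threads);
// in the real setup each node would be a box in the rack and the tiles
// would be streamed back over the network to the animator's workstation
#include <algorithm>
#include <cstddef>
#include <cstdint>
#include <thread>
#include <vector>

struct Tile { int x0, y0, w, h; std::vector<std::uint32_t> pixels; };

// stand-in for whatever a render node would really do (path tracing, REYES, ...)
static void render_tile(Tile& t) {
    t.pixels.assign(static_cast<std::size_t>(t.w) * t.h, 0xff808080u); // flat grey
}

int main() {
    const int W = 1920, H = 1080, TILE = 120;
    std::vector<Tile> tiles;
    for (int y = 0; y < H; y += TILE)
        for (int x = 0; x < W; x += TILE)
            tiles.push_back({x, y, std::min(TILE, W - x), std::min(TILE, H - y), {}});

    const unsigned node_count = std::max(1u, std::thread::hardware_concurrency());
    std::vector<std::thread> nodes;
    for (unsigned n = 0; n < node_count; ++n)
        nodes.emplace_back([&, n] {
            for (std::size_t i = n; i < tiles.size(); i += node_count)
                render_tile(tiles[i]);   // each node takes every node_count-th tile
        });
    for (auto& node : nodes) node.join();

    // here the workstation would composite the tiles and show the frame
    return 0;
}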
So, 3D rendering goes cloud?
imagine the fancy new Avatar in full HD 3D, realtime
Prediction: They're so used to their current workflow that they'll just use the really fast rendering to render offline in even finer detail with even crazier realism.
What Graga said.
Maybe they're using this F/CPU.
rare: that's the one I was thinking of. A rack full of them would churn out some pretty pixels at speed :)
Quote:
this F/CPU.
From the article: "However, developers need to code their software in order to take best advantage of GPUs."
Personally, I'm a bit concerned by this. Are we in the demoscene community ready to CODE our software and thereby take best advantage of the GPUs? Frankly, I'm not convinced that we are. People! You NEED to CODE your software. Otherwise there's a bunch of GPU capacity just sitting out there not having someone taking best advantage of it. Just because people were too lazy to CODE THEIR SOFTWARE. Sheeesh...
Yeah, back in the day people still wrote code. The games these youngsters play nowadays... just no code at all. Where did the glamour go? Bring back the code! Even OSes these days contain next to nothing code-wise.
Quote:
You NEED to CODE your software. Otherwise there's a bunch of GPU capacity just sitting out there not having someone taking best advantage of it.
On modern PC demos, GPU processing power is already heavily used by more and more complex shaders (not to mention geometry shaders) and ever denser geometry. To be able to use GPU processing power for GPGPU stuff you'd need an additional GPU, and that would raise the price of the demo-ready PC even higher.
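With OpenCL, say, you'd have to find that spare GPU yourself and push the heavy kernels onto it while the first card keeps drawing the demo -- sketch only, no error handling and the actual kernel left out:
Code:
// enumerate all GPUs across all OpenCL platforms and pick a second one
// to dedicate to compute work, so the renderer keeps the first to itself
#include <CL/cl.h>
#include <cstdio>
#include <vector>

int main() {
    cl_uint num_platforms = 0;
    clGetPlatformIDs(0, nullptr, &num_platforms);
    std::vector<cl_platform_id> platforms(num_platforms);
    clGetPlatformIDs(num_platforms, platforms.data(), nullptr);

    std::vector<cl_device_id> gpus;
    for (cl_platform_id p : platforms) {
        cl_uint n = 0;
        if (clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, 0, nullptr, &n) != CL_SUCCESS) continue;
        std::vector<cl_device_id> devs(n);
        clGetDeviceIDs(p, CL_DEVICE_TYPE_GPU, n, devs.data(), nullptr);
        gpus.insert(gpus.end(), devs.begin(), devs.end());
    }

    if (gpus.size() < 2) {
        std::printf("only %zu GPU(s) -- compute would fight the renderer for it\n", gpus.size());
        return 0;
    }

    char name[256] = {0};
    clGetDeviceInfo(gpus[1], CL_DEVICE_NAME, sizeof(name), name, nullptr);
    std::printf("second GPU '%s' free for the GPGPU work\n", name);
    // ...create a context/queue on gpus[1] and enqueue the heavy kernels there
    return 0;
}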