RTPU: The Next Step in Graphics Rendering

Status
Not open for further replies.
"Typically, ray-tracing requires that the PC or Mac process the entire scene, not just the visible areas"

I got lost when you said "PC or Mac", which made about as much sense to me as drinking "Soda or Pepsi".
 
it won't do anything if the games aren't programmed for it...

so it's just a giant money-sucking paperweight until the game devs stop sitting on their asses making the same game with a new game...

the software is well behind the hardware at present
 
meant to be "same game with a new name..."

sorry, Tom's doesn't have an edit feature... not much I can do about that, though
 
Unfortunately, the CausticRT platform isn't ready for the general public, costing around $4,000 for the CausticGL driver, the CausticOne processor and accelerator card, and one year of firmware and software updates. The company also said that developers may purchase a one-year support subscription for US $2,500 that covers up to 10 incidents. However, the general public may see the CausticRT platform in the not-too-distant future, as the company suggested that game consoles could achieve film-quality run-time visuals using the system. Could this processor be included in the next generation of Xbox and PlayStation consoles? Perhaps so. And like Ageia, it may be possible that the CausticRT system will become a part of the Nvidia--or AMD, for that matter--collective.

Well, let's hope it will be affordable sometime soon. Of course, they will need support from a company with deep pockets and a product line it could complement (all the usual suspects ... AMD, Nvidia, even Intel, though the latter may try to compete with Larrabee). On the other hand, the majority of PC gamers are frugal, and high-tech add-on cards costing several hundred bucks are strictly for enthusiasts (a small percentage of the market).
 
Yeah, I thought the "future" was moving the GPU onto the CPU die and eliminating the graphics card, not creating new cards.
This seems a little out of sync with where the industry is going.

I like the idea of having, say, a 16-core processor: four cores running graphics, eight running 16 threads of game code, and the other four dedicated to sound, physics, RAID, and networking. Or something like that...
 
I read about these guys here about a month ago. It seems they must have lost some venture capital or something to have such steep pricing for what really amounts to a research and development franchise. It would be welcome to have these guys absorbed into the Intel/Nvidia/AMD umbrella, because I seriously doubt they have the ability to take this venture out to the full spectrum of the public. Maybe Microsoft will be in line to buy them as well, as I think they are invested in ray-tracing.
 
I just typed out a huge explanation and a bit of insight into this tech, but the useless Tom's site and its terrible support for anything Internet Explorer just erased the whole thing. So screw it. Meh.
 
[citation][nom]thogrom[/nom]it won't do anything if the games aren't programmed for it...so its just a giant money sucker paperweight until the game devs stop sitting on their asses making the same game with a new game...the software is well behind the hardware in the present[/citation]Isn't that the same argument as with PhysX? But if this goes the way PhysX did, it could become ubiquitous on video cards in the future, making it worthwhile to program for, which in turn makes it worth having on the cards.
 
Another core in the computer?
Having a 100W CPU and a 200W graphics card is already pretty heavy.
I wonder how much power this chip will need... Soon we'll have to buy 1000W PSUs for just a home PC?
 
OK... AMD/ATI already screwed up once with PhysX, letting it go and saying it was worthless. They need to swoop in and purchase this before Nvidia gets out the checkbook and adds it to their lineup.
 
So were 3D graphics in general not too long ago. This seems to be just another piece in the puzzle for offloading CPU tasks. I'm all for offloading things from the CPU (regardless of cores/clock).

By the time the graphics, physics, and "lighting" are offloaded, the bulk of the CPU can be dedicated to much more complex AI and other core gaming bits (all that stuff behind the pretty pictures, like team-speak, movement calculations, and such).
 
[citation][nom]sublifer[/nom]I was gonna throw in the same comment. I knew it would be asked[/citation]


I bet it will still stutter with AA turned on.
 
[citation][nom]thogrom[/nom]it won't do anything if the games aren't programmed for it...so its just a giant money sucker paperweight until the game devs stop sitting on their asses making the same game with a new game...the software is well behind the hardware in the present[/citation]
Obviously, you don't run FSX, Crysis, or HD video editing software.
 
[citation][nom]TheFace[/nom]This isn't for gaming, it's for rendering. Think Hollywood, or commercials, or promotional videos.[/citation]
That may be true, but maybe some version for helping with rendering in games will come out if a GPU company acquires this technology... $4,000 is fair enough for CG makers... wait till it hits mass production, then prices would come down... maybe putting Radeon 4870s in CrossFire and popping this in as a third card would help run Crysis...

I think this would be the same as PhysX, CUDA, or ATI's Avivo... hope AMD gets first dibs, though... looking at how much Nvidia's marketing uses PhysX and CUDA to sell their products, ATI would be at a loss if they had nothing to bounce back with...

 
The CausticOne is a board built using FPGAs (field programmable gate arrays) and 4GB of RAM. Two of the FPGAs (the ones with heatsinks on them) make up SIMD processing units that handle evaluation of rays. We are told that the hardware provides about a 20x speedup over modern CPU based raytracing algorithms. And since this hardware can be combined with CPU based raytracing techniques, this extra speed is added on top of the speed current rendering systems already have. Potentially, we could integrate processing with CausticOne into GPU based raytracing techniques, but this has not yet been achieved. Certainly, if a single PC could make use of CPU, GPU and raytracing processor, we would see some incredible performance. - Anandtech

So... yeah... one day it could move to the consumer market alongside current GPUs...
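The Anandtech quote above says the SIMD units handle "evaluation of rays". For anyone unfamiliar with what that per-ray work looks like, here is a minimal illustrative sketch of the ray-sphere intersection test a raytracer evaluates millions of times per frame (purely an assumption for illustration; Caustic's actual hardware algorithm isn't public):

```python
import math

def intersect_sphere(origin, direction, center, radius):
    """Return the distance along the ray to the nearest sphere hit, or None.

    This is the kind of per-ray intersection test that makes up the bulk
    of a raytracer's workload -- illustrative only, not Caustic's algorithm.
    The ray direction is assumed to be normalized, so the quadratic
    coefficient a equals 1.
    """
    # Vector from ray origin to sphere center
    oc = [o - c for o, c in zip(origin, center)]
    b = 2.0 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4.0 * c  # discriminant of the quadratic
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / 2.0  # nearer of the two roots
    return t if t > 0 else None

# A ray at the origin pointing down +z at a unit sphere centered 5 units away
hit = intersect_sphere((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0)  # hit == 4.0
```

A real renderer runs a test like this (plus an acceleration structure to cull most candidates) for every ray against the scene, which is exactly the parallel, repetitive arithmetic that dedicated SIMD hardware is good at.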
 
[citation][nom]njalterio[/nom]But can it run Crysis?[/citation]

Don't negative-rate that man... you guys are just jealous you didn't post it first!! LOL! I want a "...but can it run Crysis" T-shirt!!!!
 
Those scenes don't look like a Hollywood movie... so who cares if you can do it at 4 frames per second? Mediocre is still mediocre even if done quickly.
 
Very promising-looking board they have.

One nice thing about true raytracing is that it effectively gives you an infinite number of polygons when it comes to making smooth surfaces -- thus why Pixar's movies look so great.
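To expand on the smooth-surfaces point: a raytracer can intersect implicit surfaces analytically, so a sphere needs no polygon mesh at all and the surface normal is mathematically exact at every hit point, with no faceting. A minimal sketch (illustrative only, not any renderer's actual code):

```python
import math

def sphere_normal(point, center):
    """Exact surface normal of a sphere at a hit point.

    Because the sphere is an implicit surface, the normal is computed
    analytically wherever the ray hits -- there is no mesh to facet,
    which is why raytraced curved surfaces can look perfectly smooth.
    """
    n = [p - c for p, c in zip(point, center)]
    length = math.sqrt(sum(x * x for x in n))
    return [x / length for x in n]

# Normal at the point (0, 0, 4) on a unit sphere centered at (0, 0, 5):
# exactly (0, 0, -1), straight back toward the viewer
n = sphere_normal((0, 0, 4), (0, 0, 5))
```

A rasterizer approximating the same sphere would need ever more triangles to hide the silhouette edges; the analytic form is exact at any resolution.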
 