Now that got your attention!
Before I start, I'd like to point out that I've always been a bit of an Nvidia fanboy, but things I've read and seen lately have loosened my die-hard loyalty.
When ray tracing takes off (probably in DirectX 12, by the looks of it), Intel will be making many-core graphics cards (with something like 128 cores on them) to handle the insane number of calculations required to render a scene in-game by simulating individual light rays.
Here's a little pic, originally from Intel, showing the difference between what we have now (rasterisation) and ray tracing:
http://www.cs.utah.edu/~jstratto/state_of_ray_tracing/teas_edited-1.jpg
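For anyone wondering what "simulating light rays" actually means in code: a ray tracer fires a ray through every pixel and tests it against the scene geometry, which is why the core count matters so much. Here's a rough toy sketch (my own example, not Intel's code) of the classic ray–sphere intersection test that a tracer runs millions of times per frame:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t along the ray to the nearest hit, or None on a miss.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = origin
    dx, dy, dz = direction
    # Vector from sphere centre to ray origin
    lx, ly, lz = ox - center[0], oy - center[1], oz - center[2]
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (lx * dx + ly * dy + lz * dz)
    c = lx * lx + ly * ly + lz * lz - radius * radius
    disc = b * b - 4.0 * a * c
    if disc < 0.0:
        return None            # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2.0 * a)  # nearer of the two roots
    return t if t > 0.0 else None           # hit must be in front of the origin

# A ray straight down the z-axis hits a unit sphere 5 units away at t = 4
print(ray_sphere_hit((0, 0, 0), (0, 0, 1), (0, 0, 5), 1.0))  # → 4.0
```

Each ray that hits something then spawns more rays (shadows, reflections), so the work multiplies fast, and the tests are independent of each other, which is exactly the kind of embarrassingly parallel workload a 48- or 128-core chip is built for.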
They plan on releasing a card (codenamed Larrabee) late this year, which has about 48 cores and is supposedly as powerful as the conventional rasterisation cards we have around now.
It's a long story, but I fear Nvidia may be on the way out unless they get their act together. It seems dedicated graphics cards were only ever an intermediate solution, and CPUs were always the better fit; it's just that we're now reaching the point where Intel can whack a tonne of cores onto one package (and for less than £20,000). ATI are going to be fine: they and AMD are the same company now, so AMD can build CPU-based graphics with ATI's graphical know-how. It's just Nvidia, in their Microsoft-style bully-boy rush to the top, that seems to have burnt all its bridges.
Here's a link to a vid of what ATI/AMD are up to, with a bit of ray tracing thrown in, I believe (bad-quality vid :S):
https://www.youtube.com/watch?v=0YjXCae4Gu0
Just wanted to share this eye-opening information with everyone, so you know to use a bit of caution before spending all your (or your parents') money on a £3,000 PC next year and hoping it lasts you three years perfectly.
Prepare to see Nvidia sideswiped!
Intel/AMD ftw!