

I give you a :O

Next question: How can you be so awesome?

That's Pretty O.o

Watching the GPU do all this crazy stuff makes me wonder about the future of graphics hardware. As we add things like branch logic to our GPUs, they become more and more like a generalized CPU. Will we get to a point where the GPU/CPU distinction doesn't even exist, and your "graphics card" is just used as another core? Or will the GPU continue to be distinct as a very specialized, parallel number cruncher? I'm not really a hardware guru, so maybe I'm way off base here.

I'm by no means an expert, but here goes:
The CPU is more like a project supervisor: it doesn't spend much time doing the actual work, but puts most of its effort into coordinating the team/system.
The GPU is like the highly skilled coder who doubles the team's productivity, the one the CPU turns to when it needs something important done, but who isn't really a good leader.

They will probably merge eventually, though. I think of the GPU as a modern-day math coprocessor (the old x87 FPU), a separate arithmetic chip that eventually got integrated into the processor. (In fact, as I understand it, that's part of how AMD ended up making processors: they built arithmetic coprocessors for Intel-based systems, and once Intel integrated that functionality on-die, that business dried up.)
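
To make the supervisor/number-cruncher split concrete, here's a minimal CUDA sketch (my own example, not something from the original discussion; the kernel name and sizes are just illustrative): the CPU allocates buffers, copies data, and launches the work, while the GPU runs the same tiny operation across about a million threads in parallel.

// Rough sketch of that division of labor (assumed example, not from the post):
// the CPU ("supervisor") sets everything up and hands off the work; the GPU
// ("number cruncher") runs one small operation across ~1M threads in parallel.
#include <cstdio>
#include <cstdlib>
#include <cuda_runtime.h>

// GPU kernel: each thread scales one element of the array.
__global__ void scale(float *data, float factor, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i < n)              // per-thread branch logic, like the question mentions
        data[i] *= factor;
}

int main()
{
    const int n = 1 << 20;              // ~1 million floats
    const size_t bytes = n * sizeof(float);

    // CPU side: prepare the input.
    float *host = (float *)malloc(bytes);
    for (int i = 0; i < n; ++i) host[i] = 1.0f;

    // CPU coordinates: allocate GPU memory and copy the data over.
    float *dev;
    cudaMalloc((void **)&dev, bytes);
    cudaMemcpy(dev, host, bytes, cudaMemcpyHostToDevice);

    // Hand the heavy lifting to the GPU, then wait for it to finish.
    scale<<<(n + 255) / 256, 256>>>(dev, 2.0f, n);
    cudaDeviceSynchronize();

    // Copy the result back and spot-check it.
    cudaMemcpy(host, dev, bytes, cudaMemcpyDeviceToHost);
    printf("host[0] = %f\n", host[0]);  // expect 2.0

    cudaFree(dev);
    free(host);
    return 0;
}

The point is just the shape of the interaction: all the coordination lives on the CPU side, and the GPU only ever sees the embarrassingly parallel part.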

-----
What do you mean? That is horrible; there are tons of JPEG-like artifacts in the image!</sarcasm>

Here's an interesting site that talks about having a GPU alongside a separate ray-tracing card.
"Raytracing to replace rasterization by 2020"
<a href="http://www.dacris.com/blog/2010/08/08/RayTracingToReplaceRasterizationBy2020.aspx">http://www.dacris.com/blog/2010/08/08 /RayTracingToReplaceRasterizationBy20 20.aspx</a>