Difference between 192-bit and 320-bit GDDR5?

Started by G_G, August 17, 2012, 09:03:24 am


G_G

August 17, 2012, 09:03:24 am Last Edit: August 17, 2012, 09:17:59 am by gameus
I found two nVidia-powered graphics cards, both manufactured by EVGA. One has a 192-bit memory bus and the other a 320-bit bus. I'm curious whether there's any real performance difference here. Any input on which card I should get would be nice as well.

The cards I was looking at:
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130687
http://www.newegg.com/Product/Product.aspx?Item=N82E16814130810

EDIT: A friend already answered my question.

Quote"In all reality, the bit count refers to the width of the card's memory bus, i.e. how many bits it can move to or from the video memory per clock. Generally, a wider bus is better because it allows more memory bandwidth, but you probably won't ever know the difference in use between the two."


With that answer, it'd be more logical to get the 320-bit one, but the 192-bit one comes with a free copy of Borderlands 2. Thus I'm getting that. ^3^

Blizzard

He's right that you probably won't notice a difference. Bus width alone doesn't determine performance; what matters is total memory bandwidth, which is the bus width multiplied by the effective memory clock. A narrower bus paired with faster memory can keep up with a wider bus running slower memory.
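To put some rough numbers on that, here's a minimal sketch of the bandwidth calculation. The clock speeds below are made up for illustration, not the actual specs of either card, so check the product pages for the real figures:

```python
def bandwidth_gbps(bus_width_bits, effective_clock_mhz):
    """Peak theoretical memory bandwidth in GB/s.

    bus_width_bits: memory interface width (e.g. 192 or 320)
    effective_clock_mhz: effective (data-rate) memory clock in MHz
    """
    # bits -> bytes per transfer, times transfers per second, scaled to GB/s
    return bus_width_bits / 8 * effective_clock_mhz * 1e6 / 1e9

# Hypothetical clocks: a 192-bit card with fast GDDR5 can roughly
# match a 320-bit card with slower memory.
print(bandwidth_gbps(192, 6000))  # 144.0 GB/s
print(bandwidth_gbps(320, 3800))  # 152.0 GB/s
```

So by itself, "192-bit vs 320-bit" tells you less than the bandwidth number on the spec sheet.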