blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 09:48 PM |
| I'm looking at parts for a custom-built computer, and I'm wondering if I should get Nvidia or Radeon. Radeon seems to be generally cheaper, but I've seen it has some weak points: Nvidia seems to be more reliable, and I've noticed Radeon cards tend to have slower memory clocks. Also, what about stream processors vs. CUDA cores? I would like some recommendations. |
NVI
| Joined: 11 Jan 2009 |
| Total Posts: 4744 |
Jaccob
| Joined: 19 Oct 2008 |
| Total Posts: 986 |
| 11 Jul 2012 09:52 PM |
| Nvidia, by my standards, is by far the best graphics card brand I've used. Their cards run a ton better and have far more compatibility than other companies' cards. They currently have the most powerful graphics cards in the world. They're quite a big company, and well worth the price. |
Garnished
| Joined: 09 Apr 2012 |
| Total Posts: 12695 |
NVI
| Joined: 11 Jan 2009 |
| Total Posts: 4744 |
| 11 Jul 2012 09:56 PM |
| Jaccob went full retard apparently... |
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 09:56 PM |
| I completely support AMD processors over Intel processors, but I don't know if they have that same edge in making GPUs. |
| 11 Jul 2012 09:58 PM |
| Wait, I thought we were talking about graphics cards... |
NVI
| Joined: 11 Jan 2009 |
| Total Posts: 4744 |
| 11 Jul 2012 09:59 PM |
| AMD is ahead of NVIDIA in GPUs, I think. Personally, my build is all AMD. |
LPGhatguy
| Joined: 27 Jun 2008 |
| Total Posts: 4725 |
| 11 Jul 2012 10:01 PM |
I see that you're already assuming certain things about cards.
Both brands have their downsides, and the whole thing is one big opinionated internet war. There is no "best" card as far as I'm concerned, though Nvidia cards are more popular.
Here's how I see it:
1.) AMD cards have something called Eyefinity (which, as far as I know, has no Nvidia equivalent) that allows multiple displays to act as one display.
2.) Nvidia cards are typically geared toward fewer displays, whereas AMD really emphasizes the ability to use six displays at once.
3.) Nvidia cards don't handle higher resolutions as well as AMD cards do at this point (though this could change with a driver update at any time). For example, in one comparison between an AMD Radeon HD 7970 and an Nvidia GeForce GTX 680 on a 4K monitor, the Nvidia card could only use half the display due to the lack of an Eyefinity equivalent (despite claims that the card supports 4K resolutions). Regardless, neither card could play BF3 above 30 FPS consistently.
4.) This is just a rumor, but I've heard that Nvidia cards handle anti-aliasing a smidge better than AMD cards.
5.) One thing I've noticed from various benchmarks is that AMD cards don't handle games at lower resolutions as well as an Nvidia card of equivalent caliber would, but as the resolution increases, the AMD cards tend to creep up on the Nvidia card and eventually pass it (usually at or around 4K resolutions).
Really, it comes down to whether you like red or green more. I've got an AMD rig because I've found their products to be reliable (as much as Nvidia's, I'm sure), but I like red, and I wanted a change from Nvidia. It also pairs nicely with my AMD processor and AMD-chipset motherboard.
AMD has more models in its lineup, to the best of my knowledge, but here are some examples of roughly equivalent cards, and some gaps on Nvidia's side (correct me if I'm wrong):
Radeon HD 7990 (unreleased) ~= GTX 690
Radeon HD 7970 ~= GTX 680
Radeon HD 7870 ~= GTX 670
Radeon HD 7970 (unreleased) ~= GTX 660 (unreleased)
Both the Radeon HD 7000 series and the GTX 600 series are still incomplete (not every card has been released yet).
That's my two cents. I'm an AMD person, so a lot of my Nvidia commentary is guesswork and secondhand. |
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 10:02 PM |
I think Nvidia has the best cards, but Radeon has better price vs. performance.
Proof: EVGA GeForce GTX 690 |
LPGhatguy
| Joined: 27 Jun 2008 |
| Total Posts: 4725 |
| 11 Jul 2012 10:02 PM |
@Blockoo AMD's Radeon HD 7990 will be priced at or below that. That's the price of a top-of-the-line card nowadays. Personally, I think you'd be better off with an overclocked GTX 680 or Radeon HD 7970. |
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 10:04 PM |
| I'm only interested in running a single 1080p monitor, btw. I also read somewhere that the human eye can only distinguish around 60 FPS, and that's in a panicked, adrenaline-rush situation. |
LPGhatguy
| Joined: 27 Jun 2008 |
| Total Posts: 4725 |
| 11 Jul 2012 10:06 PM |
@blockoo Well, that's entirely false, and it obviously came out of a bad study.
You can't tell the difference between 60 FPS and 120 FPS on a standard 60Hz monitor, because a 60Hz panel only ever shows 60 frames a second. (I have two 75Hz monitors myself.)
Try it sometime: get a 60Hz monitor and a 120Hz monitor, run a game on each at a steady 180 FPS (just so it doesn't dip below 120 on the 120Hz monitor), and tell me if you see a difference. You will. |
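(A quick back-of-the-envelope sketch of the refresh-rate arithmetic behind this, in plain Python. The 180 FPS figure is just LPGhatguy's suggested test; everything else is standard frame-time math, not something measured in the thread.)

# A panel can only display as many unique frames per second as its refresh rate,
# so rendering faster than the refresh rate adds no visible frames.

def frame_time_ms(fps: float) -> float:
    """Time between displayed frames, in milliseconds."""
    return 1000.0 / fps

def visible_fps(render_fps: float, refresh_hz: float) -> float:
    """Frames per second the panel can actually show."""
    return min(render_fps, refresh_hz)

for refresh_hz in (60, 75, 120):
    shown = visible_fps(180, refresh_hz)  # game held at a steady 180 FPS, as suggested above
    print(f"{refresh_hz:>3} Hz panel: ~{shown:.0f} visible fps, "
          f"{frame_time_ms(shown):.1f} ms per displayed frame")

# 60 Hz shows ~60 fps (16.7 ms/frame); 120 Hz shows ~120 fps (8.3 ms/frame),
# so the difference being argued about is roughly 8 ms per frame.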
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 10:08 PM |
| Well, I have a 240Hz LCD TV in my room, and I can definitely tell the difference between it and my old 60Hz one. |
LPGhatguy
| Joined: 27 Jun 2008 |
| Total Posts: 4725 |
| 11 Jul 2012 10:11 PM |
@blockoo Now, 240Hz might be getting a bit excessive unless it's a large screen, but 120Hz should still be enough. You'll also notice that most 3D screens are 120Hz, so that each eye gets (essentially) a 60Hz image.
Also, don't quote me on the "you can't see anything above 60 FPS" thing, because as far as I know it's still being studied. Just keep that in mind. |
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 10:16 PM |
| My TV is 40". Anyway, I don't really want a 3D monitor, but the only 120Hz monitors I could find were 3D. The fastest affordable 1080p monitor on Newegg was 75Hz. |
blockoo
| Joined: 08 Nov 2007 |
| Total Posts: 17202 |
| 11 Jul 2012 10:19 PM |
| Anyway, here's another question: what resolution can the human eye actually distinguish on an average 23" monitor? |
LPGhatguy
| Joined: 27 Jun 2008 |
| Total Posts: 4725 |
| 11 Jul 2012 10:31 PM |
@blockoo If you mean resolution: there's Apple's "retina" display, where Apple has some required pixel density for something to count as "retina," but I don't think it's strictly necessary. 1080p on a 23" monitor should be perfectly fine. I'd prefer a 21" though, because I'll be running three of them.
If you mean refresh rate, 75Hz is fine. When I said there would be a difference between 60 FPS and 120 FPS, it's not much of a noticeable one unless you *literally* put them side by side, and even then it's subtle. |
stravant
| Joined: 22 Oct 2007 |
| Total Posts: 2893 |
| 11 Jul 2012 11:02 PM |
"1080p on a 23" monitor should be perfectly fine"
While I agree that a 1080p should be "ggod enough", you can _easily_ tell the difference between a "retinal" or similar display ( > 150dpi ), and a typical display. There is a difference. |
NVI
| Joined: 11 Jan 2009 |
| Total Posts: 4744 |
| 11 Jul 2012 11:11 PM |
Late, but:
"4.) This is just a rumor, but I've heard that Nvidia cards handle anti-aliasing a smidge better than AMD cards."
As deferred rendering becomes more prevalent (and trust me, it is), this point becomes irrelevant: deferred engines can't use plain hardware MSAA cheaply, so they lean on post-process anti-aliasing like FXAA or MLAA, which doesn't depend on either vendor's MSAA hardware. |
| 12 Jul 2012 04:22 AM |
You can distinguish individual pixels when the density is below roughly 300 PPI (pixels per inch), at typical phone viewing distance.
Apple's Retina Display on the iPhone is about 326 PPI.
- Candymaniac, a highly reactive substance. |
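(To put numbers on those figures, here's the usual pixels-per-inch calculation in Python, applied to the 23" 1080p monitor discussed above and to the iPhone screen. The 3.5" diagonal is the iPhone 4's; Apple's own quoted figure for it is 326 PPI.)

import math

def ppi(width_px: int, height_px: int, diagonal_inches: float) -> float:
    """Pixels per inch: diagonal resolution divided by diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_inches

print(f'23-inch 1080p monitor : {ppi(1920, 1080, 23.0):.1f} PPI')  # ~95.8
print(f'iPhone 4 Retina (3.5"): {ppi(960, 640, 3.5):.1f} PPI')     # ~329.7

So a 23" 1080p screen works out to roughly 96 PPI, well under both the ~300 PPI phone figure and stravant's 150 DPI line, which fits the point that the difference from a "retina"-class display is easy to see (keeping in mind that a desktop monitor sits farther from your eyes than a phone does).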
| 12 Jul 2012 04:30 AM |
| remember kids, VRAM is important too. |
| 12 Jul 2012 06:13 AM |
*random googling*
Why does nobody actually know this stuff? D:
AMD looks like it groups some of its stream processors so that they share resources, while Nvidia seems to give each CUDA core its own.
I think that makes AMD's processors good for simpler work (since they run fine even while sharing resources?), while Nvidia's cores can handle more complex work more easily because each one has its own compute resources.
AMD also looks like it has faster memory.
So AMD has more cores because they're cheaper to make when they share resources.
And AMD seems good for pushing more pixels/polygons, while Nvidia might be better for that über-GPU-raycasting prototype of yours.
k. |
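(To put rough numbers on the "more cores" and "faster memory" points, here's the usual peak-throughput arithmetic for the two flagship cards discussed in this thread, in Python. The specs are quoted from memory and the results are theoretical peaks, not benchmarks.)

# Peak single-precision throughput: shaders * 2 ops (one fused multiply-add) * clock.
# Memory bandwidth: bus width in bytes * effective memory clock.

def peak_tflops(shaders: int, core_clock_ghz: float) -> float:
    return shaders * 2 * core_clock_ghz / 1000.0

def bandwidth_gb_s(bus_width_bits: int, mem_clock_ghz: float) -> float:
    return (bus_width_bits / 8) * mem_clock_ghz

cards = {
    # name: (shader count, core clock GHz, bus width bits, effective memory clock GHz)
    "Radeon HD 7970":  (2048, 0.925, 384, 5.5),
    "GeForce GTX 680": (1536, 1.006, 256, 6.0),
}

for name, (shaders, clock, bus, mem) in cards.items():
    print(f"{name}: {shaders} shaders, "
          f"~{peak_tflops(shaders, clock):.2f} TFLOPS peak, "
          f"~{bandwidth_gb_s(bus, mem):.0f} GB/s memory bandwidth")

# Radeon HD 7970:  ~3.79 TFLOPS, ~264 GB/s
# GeForce GTX 680: ~3.09 TFLOPS, ~192 GB/s

Peak figures like these are why AMD's "more, simpler shaders" approach looks strong on paper for pixel- and polygon-heavy work, while actual game performance depends on how well each workload keeps those units busy.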