1. Post #81
    BRQNCO's Avatar
    September 2009
    107 Posts
    Anything to fight back eyefinity? If not nvidia will still fail in my eyes.
    Nvidia are probably working on something similar to compete; since when have they ever let ATI do something without offering an alternative at some point?

    We probably won't see 3+ monitor capability in the first GT300 card to hit shelves, but rather in a later special model or mini-series.

    Though who honestly knows, your guess is as good as mine.

    Edited:

    also

    The GT300, the first Fermi-based GPU, will feature about three billion transistors; up to 512 CUDA cores, GDDR5 implementation, 6 channels of 64-bits, a total memory bus of 384-bits and support for up to 6GB of memory.
    Fuck yes!.. GTA4 will run properly with all those fucking massive modded textures. :D

  2. Post #82
    Gold Member
    Dennab
    December 2005
    2,126 Posts
    This GT300 is looking to be very impressive, but it's not at all aimed at the consumer/gamer demographic; it's pretty obviously going to be targeting very specialized high-performance applications. It'll be interesting to see how this plays out, because Fermi looks pretty amazing. However, it's almost definitely going to be far too expensive for the average consumer, since a die that large is going to be really fucking expensive to make; it seems Nvidia is really hoping to win big on their whole Tesla idea, because this isn't going to be much of a winner in the consumer market.

  3. Post #83
    Gold Member
    pebkac's Avatar
    January 2009
    2,645 Posts
    GTX 380: Vertex Shaders (512), ROPs (64), Pixel Shaders (512)
    Core Speed (700, 1.6), Pixel Fill Rate (44,800), Texture Fill Rate (89,600)
    Memory Bus Width/Type (512-bit GDDR5), Memory Speed (1100MHz), Memory Bandwidth (281.6 GB/sec)
    Just a note: the bandwidth for the 380 was calculated incorrectly. The Anandtech article clearly states it's going to have a 384-bit bus, not 512, so: (1100MHz*4)*(384bit/(8bit/byte)) = 211.2 GB/s, or about 196.7 GiB/s, but I think it's usually measured in GB. Over 200 GB/s is still a lot though.
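
    If anyone wants to check the numbers themselves, here's the calculation spelled out as a quick Python sketch. It uses the rumoured 1100MHz memory clock and the 384-bit bus from the Anandtech article, so treat it as back-of-the-envelope, not official figures:

    Code:
    # Theoretical GDDR5 bandwidth from the rumoured GT300 memory figures.
    memory_clock_mhz = 1100      # rumoured GDDR5 command clock
    data_rate = 4                # GDDR5 moves 4 bits per pin per clock at this rating
    bus_width_bits = 6 * 64      # 6 channels of 64 bits = 384-bit bus

    transfers_per_sec = memory_clock_mhz * 1e6 * data_rate
    bandwidth_bytes_per_sec = transfers_per_sec * bus_width_bits / 8

    print(bandwidth_bytes_per_sec / 1e9)    # ~211.2 GB/s (decimal gigabytes)
    print(bandwidth_bytes_per_sec / 2**30)  # ~196.7 GiB/s (binary gibibytes)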

    Nvidia's shaders are faster than ATI's, so ATI having a crapload more of them doesn't make that much of a difference. The 4890 has 800 of them while the GTX 275 only has 240, but both cards perform similarly. The GTX 380 has more than doubled them (512 vs 240) while ATI's have been exactly doubled (1600 vs 800), so that's already a sign that the GTX 380 might be faster.
    Yeah, Nvidia's shaders are a lot faster; they run at a MUCH higher clock speed than the GPU core clock, while in ATI cards they run at the core clock. Also, ATI's shaders are grouped into packs of five, with only one of the five able to perform all operations while the other four can only do simpler stuff. In Nvidia's cards, all shaders have full computational capabilities. ATI makes up for these differences by having a shitload more of them.
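
    To put rough numbers on that, here's a Python sketch. The clock speeds are the stock figures as far as I remember, and I'm only counting a multiply-add as 2 FLOPs per shader per clock, so it's purely back-of-the-envelope:

    Code:
    # Peak single-precision throughput from shader count and shader clock.
    # Assumes every shader issues one multiply-add (2 FLOPs) per clock.
    def peak_gflops(shaders, shader_clock_mhz, flops_per_clock=2):
        return shaders * shader_clock_mhz * flops_per_clock / 1000.0

    # HD 4890: shaders run at the 850MHz core clock   -> ~1360 GFLOPS
    print(peak_gflops(800, 850))
    # GTX 275: shaders run on their own 1404MHz clock -> ~674 GFLOPS
    print(peak_gflops(240, 1404))

    Even with roughly double the paper FLOPS, the 4890 only trades blows with the GTX 275 in games, which is exactly why you can't compare the two architectures shader-for-shader.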

    Looks like Nvidia has pretty much doubled everything up, the same as ATI did with the 5 series. I wonder if all the architectural optimizations are going to make any difference in gaming performance. If not, the GTX 3xx vs HD 5xxx performance difference should be around the same as GTX 2xx vs HD 4xxx. These new cards have the potential to be awesome; now we just have to wait for the prices to see if it's worth it.

  4. Post #84
    Gold Member
    hexpunK's Avatar
    August 2008
    15,639 Posts
    As powerful as it may seem, I would rather have full DirectX 11 support, and be able to see new features in game engines, than CUDA and PhysX, which I would most likely never actually use fully. The specs do make it seem pretty damn nice though; I sort of hope it performs great too, even if I won't buy one.

  5. Post #85
    Gold Member
    pebkac's Avatar
    January 2009
    2,645 Posts
    I would rather have full DirectX 11 support
    uhhh

    Fermi will support DirectX 11 and NVIDIA believes it'll be faster than the Radeon HD 5870 in 3D games.
    From the article.

  6. Post #86
    :smug:'s Avatar
    August 2009
    3,013 Posts
    I want pricing information.
    It's gonna be like 150-200 ($240-$320), I bet

  7. Post #87
    Gold Member
    johanz's Avatar
    February 2008
    7,578 Posts
    It's funny how people think the new Nvidia card won't support DX11 fully just because Nvidia said it's not the main reason to buy it.
    They never said they won't support DX11 or its features; it's just not their main focus.

  8. Post #88
    Gold Member
    Hatsen's Avatar
    January 2006
    1,042 Posts
    It really looks quite hardcore.


  9. Post #89
    TheMadness's Avatar
    June 2009
    867 Posts
    bleugh shiny

    Matte is what's cool now

  10. Post #90
    Gold Member
    hexpunK's Avatar
    August 2008
    15,639 Posts
    uhhh



    From the article.
    :saddowns: I'm sure that wasn't there when I read it the first time. Must. Stop. Skimming.

  11. Post #91
    Gold Member
    acds's Avatar
    October 2008
    14,971 Posts
    Guys, just because it's not only aimed at gamers doesn't mean it won't be good for games; for all we know it could beat the 5000-series by far. Let's wait for benchmarks before dismissing it as not being a GPU for gamers (not saying that anybody has done so, but I can see that coming sooner or later in this thread).

    Also, personally I don't think Nvidia will do too much to oppose Eyefinity; after all, Nvidia has the whole 3D thing, which I would greatly prefer over Eyefinity (two completely different things, I know, but if you had to choose between them, I'd choose 3D).
    By the way, my take on Eyefinity vs Nvidia 3D is, as I said, just an opinion. I don't prefer 3D because it's made by Nvidia; it's purely a matter of personal taste.

  12. Post #92
    hang you're self
    G71tc4's Avatar
    March 2008
    4,771 Posts
    I love my cards as expensive AND as big as a football field, so Nvidia is my favorite company.

  13. Post #93
    Gold Member
    Dennab
    December 2005
    2,126 Posts
    It's gonna be like 150-200 ($240-$320), I bet
    I'm pretty sure this is quite literally impossible. The yields on such an immensely huge and complicated die, together with the sheer size of the damn thing, are going to drive costs off the charts. I'm thinking this is going to be closer to the $500 range.
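
    If you want to see why die size hurts so badly, here's a toy Python sketch using a standard Poisson yield model. The ~530mm² Fermi and ~334mm² Cypress die areas are the figures floating around, and the defect density is a number I've made up purely for illustration:

    Code:
    import math

    # Rough good-dies-per-wafer estimate on a 300mm wafer.
    def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
        r = wafer_diameter_mm / 2.0
        # Common approximation: gross dies minus an edge-loss term.
        return (math.pi * r * r / die_area_mm2
                - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

    # Poisson yield model: yield falls off exponentially with die area.
    def yield_fraction(die_area_mm2, defects_per_cm2):
        return math.exp(-(die_area_mm2 / 100.0) * defects_per_cm2)

    for name, area_mm2 in [("Fermi (~530 mm^2)", 530), ("Cypress (~334 mm^2)", 334)]:
        good = dies_per_wafer(area_mm2) * yield_fraction(area_mm2, defects_per_cm2=0.4)
        print(name, round(good), "good dies per wafer")

    The exact numbers are guesses, but the shape of it isn't: gross dies per wafer only shrink linearly with area, while yield drops off exponentially, so a die this big gets disproportionately expensive per good chip.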

  14. Post #94
    Gold Member
    Quantuam VTX's Avatar
    July 2006
    1,079 Posts
    Edited:

    More info:

    Just hours before Nvidia decided to share the Fermi, GT300 specification with the world, we managed to get most of the spec and share it with you.

    The chip has three billion transistors, a 384-bit memory interface and 512 shader cores, something that Nvidia plans to rename to Cuda cores.

    The chip is made of clusters, so slower iterations with fewer shaders should not be that hard to spin off. Each cluster has 32 Cuda cores, meaning Fermi has 16 clusters.

    Another innovation is 1MB of L1 cache memory, divided into 16KB cache / shared memory per cluster, as well as 768KB of unified L2 cache memory. This is something many CPUs have today, and from now on Nvidia GPUs will go down that road.

    The chip supports GDDR5 memory and up to 6 GB of it, depending on the configuration as well as Half Speed IEEE 754 Double Precision. We also heard that the chip can execute C++ code directly and, of course, it's full DirectX 11 capable.

    Nvidia's next-gen GPU is surely starting to look like a CPU. It looks like Nvidia is doing Fusion in reverse: every generation they add some CPU-like parts to their GPU designs.
    Source: http://www.fudzilla.com/content/view/15756/1/
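
    The cluster math in that article works out neatly, by the way. Quick Python sketch (the per-cluster cache split is just my reading of the article, not something Nvidia has spelled out):

    Code:
    # Sanity-checking the Fudzilla figures.
    cuda_cores = 512
    cores_per_cluster = 32
    clusters = cuda_cores // cores_per_cluster
    print(clusters)                  # 16 clusters, matching the article

    l1_total_kb = 1024               # "1MB of L1 cache" across the whole chip
    print(l1_total_kb // clusters)   # 64KB of combined L1/shared memory per cluster (my reading)
    l2_unified_kb = 768              # single unified L2 shared by every cluster

    # "Half speed" IEEE 754 double precision: DP peak = SP peak / 2.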

  15. Post #95
    turbo buttes
    Zorlok's Avatar
    June 2007
    14,602 Posts
    It's gonna be like 150-200 ($240-$320), I bet
    Nope.

    Edited:

    It really looks quite hardcore.

    At least wipe the smudges off.

  16. Post #96
    Gold Member
    hexpunK's Avatar
    August 2008
    15,639 Posts
    It's gonna be like 150-200 ($240-$320), I bet
    The 5870s are still 300, and ATi are the cheaper of the two manufacturers. There is no way in hell this would be priced lower than the 5870 unless it was actually incredibly shit and nVidia were just lying about its awesome power.

  17. Post #97
    Gold Member
    ShaRose's Avatar
    April 2007
    1,240 Posts
    I'm pretty sure this is quite literally impossible. The yields on such an immensely huge and complicated die together with the sheer size of the damn thing are going to drive costs off the charts. I'm thinking this is going to be closer in the $500 range.
    $600. This is nvidia, remember?

  18. Post #98
    Gold Member
    cryticfarm's Avatar
    January 2008
    2,205 Posts
    Yeah, knowing Nvidia, this will beat the 5870 by a few FPS, and seeing that the 5870 costs $380, it will cost $500.
    (trubitar will get 4 anyways :\)

  19. Post #99
    Gold Member
    Murkrow's Avatar
    April 2005
    4,895 Posts
    It really looks quite hardcore.

    Seeing his nasty greasy thumb that close to the gold-plated PCI-E connector is almost heart-wrenching :frown:

  20. Post #100
    Gold Member
    Pandamobile's Avatar
    January 2009
    3,703 Posts
    Did you guys notice that it only has one 8 pin PCI-E power connector?

    Edited:

    It's funny how people think the new Nvidia card won't support DX11 fully just because Nvidia said it's not the main reason to buy it.
    They never said they won't support DX11 or its features; it's just not their main focus.
    Yet I get 21 disagrees on my post saying that. :saddowns:

  21. Post #101
    Hillo's Avatar
    November 2007
    401 Posts
    Did you guys notice that the Tesla is a fake?

  22. Post #102
    turbo buttes
    Zorlok's Avatar
    June 2007
    14,602 Posts
    Yeah, knowing Nvidia, this will beat the 5870 by a few FPS, and seeing that the 5870 costs $380, it will cost $500.
    (trubitar will get 4 anyways :\)


    Well that is why I still think I'll be going red again. I honestly will not pay another $50-100 for a few FPS here and there (just to own a specific brand). They're going to have to give me some pretty significant increases over what ATI has on the table right now (to justify that supposed cost increase), or match ATI price-wise.

    But I will try and hold out until I see some legitimate benchmarks on these cards before making a choice.

    Edited:

    But again, maybe they'll surprise us with their launch prices. I personally am not counting on it...but who knows. Things have changed quite a bit.

  23. Post #103

  24. Post #104
    turbo buttes
    Zorlok's Avatar
    June 2007
    14,602 Posts


    I personally think that's pretty damn sexy.

  25. Post #105
    Gold Member
    Pandamobile's Avatar
    January 2009
    3,703 Posts


    I personally think that's pretty damn sexy.
    What's the point of a mirror finish on a graphics card?

  26. Post #106
    Viking Chest hair simulator 2012
    TheDestroyerOfall's Avatar
    June 2009
    2,476 Posts
    It really looks quite hardcore.

    only one 8 pin too.

  27. Post #107
    turbo buttes
    Zorlok's Avatar
    June 2007
    14,602 Posts
    Well there's also that 6 pin on the top.
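
    Those two connectors actually tell you roughly how much the board is allowed to draw. Quick Python sketch using the PCI-E spec limits (this is the ceiling the spec allows, not a measured figure for this card):

    Code:
    # Maximum board power implied by the power connectors (PCI-E spec limits).
    pcie_slot_w = 75    # power delivered through the motherboard slot
    six_pin_w   = 75    # one 6-pin PCI-E connector
    eight_pin_w = 150   # one 8-pin PCI-E connector

    print(pcie_slot_w + eight_pin_w)              # 225W with only the 8-pin hooked up
    print(pcie_slot_w + six_pin_w + eight_pin_w)  # 300W with both connectors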

  28. Post #108
    Hunterbrute's Avatar
    March 2009
    3,725 Posts
    nVidia were just lying about its awesome power.
    You mean like their good old plan of rebranding their previous series as a new one with minor changes to make it just slightly better?

  29. Post #109
    Gold Member
    Pandamobile's Avatar
    January 2009
    3,703 Posts
    Well there's also that 6 pin on the top.
    That's stupid, why would they put the power connectors on different sides of the card...

  30. Post #110
    Hillo's Avatar
    November 2007
    401 Posts
    also
    http://translate.google.com/translat...ikin-vaarennos

    Edited:

    only one 8 pin too.
    And the best part about it is that it's not even soldered to the board.

  31. Post #111
    Gold Member
    Dennab
    February 2007
    5,353 Posts
    Full DX11 capable? That means OpenCL and tessellation too? If it was the right price, I might buy it for the benefit of having PhysX/CUDA, the current leaders in GPU powered physics in games along with OpenCL. Of course, if it was substantially more expensive, I wouldn't pay more just to see cloth and broken tiles. Plus tessellation is fucking awesome, and if it's being used in AvP, then those screenshots I saw in a magazine just might not be prerendered. But Nvidia aren't known as the kings of undercutting, so I'm gonna hold out until they've had a little price battle with ATi first.

  32. Post #112

  33. Post #113
    goatse
    Craptasket's Avatar
    January 2006
    33,016 Posts

  34. Post #114

  35. Post #115
    Glod Menber
    Amez's Avatar
    June 2008
    6,721 Posts
    Nvidia's shaders are faster than ATI's, so ATI having a crapload more of them doesn't make that much of a difference. The 4890 has 800 of them while the GTX 275 only has 240, but both cards perform similarly. The GTX 380 has more than doubled them (512 vs 240) while ATI's have been exactly doubled (1600 vs 800), so that's already a sign that the GTX 380 might be faster.

    Edited:



    http://forums.guru3d.com/showthread.php?t=305808
    Wait wait wait, don't the shader units differ from each other?

  36. Post #116
    Gold Member
    Dennab
    December 2005
    2,126 Posts
    Well shit, Nvidia seems to be having a tad bit of trouble.

  37. Post #117
    Gold Member
    Dennab
    February 2007
    5,353 Posts
    Oh jesus god, Nvidia, you dolts. And ATi has GPU-accelerated Havok, you say? I might buy ATi and hold out for Valve including that in an engine update.

    Why the dumbs? Imagine Gmod with that tech. I'll be using those boxes to store the debris from the thousands of exploding barrels I'm gonna be spawning.

  38. Post #118
    Gold Member
    Ajacks's Avatar
    August 2006
    4,452 Posts
    What's the point of a mirror finish on a graphics card?
    What's the point of a giant sticker of a half-naked, armor-clad woman on a video card? Both are pointless; it's inside your computer.

    NVIDIA is not going to release a card above their standard price bracket; that's just the rules of economics. NVIDIA knows what they are doing, and they are not abandoning their target demographics. Just because it's a bigger die with a better architecture doesn't mean it has to cost more; efficiency doesn't equal more expensive.

  39. Post #119
    Gold Member
    limulus54's Avatar
    August 2008
    3,826 Posts
    I'd only get one if:

    a) Nvidia lowers their prices (like that's gonna happen)

    b) this is way better than the 5870

  40. Post #120
    Daltacentauri's Avatar
    April 2009
    680 Posts


    I personally think that's pretty damn sexy.
    Indeed, it looks hot. I'm sure the mirror plating is there to show how it can make anything hot, even you.