1. Post #1
    Gold Member
    Coridan's Avatar
    April 2006
    1,193 Posts
    Hours before the estimated (22/3/12) release date:


    An official video from Nvidia, introducing the unannounced GeForce GTX 680, has leaked on YouTube. The video features Nvidia executive Ujesh Desai, who introduces the new hardware and says that the company "invented some new technology" that goes with the GTX 680 — including "GPU boost," which Desai says converts unused power into increased clock speeds. We're not sure on the specifics of the feature yet, but it sounds like an on-the-fly method of overclocking. Desai also introduces "TXAA," a new anti-aliasing algorithm that's rumored to significantly boost AA performance. The video also includes features (and the same demo clips) that Nvidia demonstrated at GDC 2012: a new method of fur simulation that it says can render 100,000 hairs in real time, and PhysX advancements that will allow for real-time object fracturing. Finally, Desai says that the GTX 680 will support 3D Vision Surround and up to four displays with a single card.
    Source: http://www.theverge.com/2012/3/21/28...l-video-leaked

    And also, if you're interested:

    Nvidia has been really tight-lipped on its next generation of PC video cards, the same video cards that are meant to combat AMD's head start with the HD7970. If an early leaked review from Tom's Hardware is any indication, the card is well worth the wait. The best part: the wait is only two days.
    WCCF Tech got a hold of an early review from Tom's Hardware for the GTX 680. The review has since been removed, as Nvidia is forcing everybody to remain pretty quiet on their new video card until it launches on March 22. Good thing there are people who grab these things before they're removed. The review gives us a good look at the performance of the new video card.
    The GeForce GTX 680 uses the Kepler architecture, built around a GK104 core that contains 1536 CUDA cores, a 2 GB GDDR5 memory buffer, a core clock of 1006 MHz, and memory clocked at 1.5 GHz. If you don't understand what any of that means, just know this: the 680 is a beast that completely dominates the AMD HD7970, and the review proves it.
    According to the review, the GTX 680 has up to 50 percent better performance than AMD's HD7970 when playing games at 1920×1080 resolution. That performance can fluctuate depending on the machine and the resolution. The leaked review doesn't make any mention of the other components, but we can assume based on similar reviews that the system was rockin' an Intel i7 and at least 8 GB of RAM.
    The funny thing is that the GTX 680 is only ever bested by the GTX 590. That makes sense, however, as the 590 is two GTX 500-series GPUs working together to create a powerful card. I'm sure Nvidia will release a dual-GPU card in the 600 line as well that will smoke the competition. For now though, the 680 appears to be an affordable powerhouse that should satisfy any PC gamer's needs.
    To see additional benchmarks at a 2560×1600 resolution, check out the leaked review.
    Source: http://www.webpronews.com/nvidia-gef...leaked-2012-03
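For anyone wondering what those spec numbers actually work out to, here's a quick back-of-the-envelope calc. Note the 256-bit memory bus is my assumption; the excerpt doesn't state it:

```python
cuda_cores = 1536
core_clock_mhz = 1006
# Each CUDA core can issue one fused multiply-add (2 FLOPs) per cycle
tflops = cuda_cores * 2 * core_clock_mhz * 1e6 / 1e12
print(f"Theoretical FP32 throughput: {tflops:.2f} TFLOPS")  # 3.09

mem_clock_ghz = 1.5     # base memory clock from the review
gddr5_pump = 4          # GDDR5 transfers 4x per clock (quad data rate)
bus_width_bits = 256    # assumption: bus width not given in the excerpt
bandwidth_gbs = mem_clock_ghz * gddr5_pump * bus_width_bits / 8
print(f"Memory bandwidth: {bandwidth_gbs:.0f} GB/s")  # 192
```

So "1.5 GHz memory" really means a 6 GT/s effective data rate once GDDR5's quad pumping is counted.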

    Latest news also states that the asking price for the card may have dropped from $549 to $499. Personally, I will wait until Hardware Canucks gets a hold of one and reviews it. They have the best unbiased reviews. Nonetheless, I'm excited.

    In depth Tom's Hardware review: http://www.tomshardware.com/reviews/...mark,3161.html

  2. Post #2
    Superwafflez's Avatar
    April 2010
    2,236 Posts
    According to the review, the GTX 680 has up to 50 percent better performance than AMD's HD7970 when playing games at 1920×1080 resolution.
    Damn son.

  3. Post #3
    insane taco's Avatar
    April 2009
    941 Posts
    I just got a 560 Ti and I'm already jealous.

  4. Post #4
    Gold Member
    superdinoman's Avatar
    July 2006
    8,008 Posts
    Been waiting for this chipset. Hopefully when the other reviews come in they're also good so I can decide between this or the new ATI series.

  5. Post #5
    Allstone's Avatar
    November 2010
    1,692 Posts
    It trades blows but generally comes out on top. It's looking like it's 500 dollars though, so I'm looking forward to the oncoming price wars.

  6. Post #6
    Mr Jazz
    Makol's Avatar
    January 2009
    38,787 Posts
    It's $509 at the lowest according to the Newegg leak.

  7. Post #7
    That Dog
    Ehmmett's Avatar
    March 2009
    13,206 Posts
    I'll take 2, thanks.

  8. Post #8
    Gold Member
    Ins4ne's Avatar
    November 2008
    910 Posts
    Anyone know if EVGA still does the "trade-up" program so long as you pay the difference? I can't seem to find it anywhere.

  9. Post #9
    Gold Member
    vexx21322's Avatar
    December 2008
    10,577 Posts
    That TXAA seems interesting.

  10. Post #10
    Gold Member
    GoDong-DK's Avatar
    November 2009
    14,465 Posts
    Way to get me excited while I'm on the toilet.

  11. Post #11
    Wait... so if I write anything here, it's going to show up under my name?
    B!N4RY's Avatar
    December 2009
    7,214 Posts
    I just got a 560 Ti and I'm already jealous.
    I hate it when people say "I just bought a X and I'm jealous that Y is going to be out" when X isn't even the flagship component.

  12. Post #12
    Gold Member
    reapaninja's Avatar
    November 2008
    8,118 Posts
    I hate it when people say "I just bought a X and I'm jealous that Y is going to be out" when X isn't even the flagship component.
    or the people who care enough to whine about it, but not enough to take 20 seconds to Google for information before they buy

  13. Post #13
    Civil's Avatar
    December 2009
    3,734 Posts
    I want to see the specs of the 680 compared to the 7970 and see if all of this is only about some magical driver from nvidia.

  14. Post #14
    Gold Member
    GoDong-DK's Avatar
    November 2009
    14,465 Posts
    From what we've seen, it's bull. Unless we're talking the difference between 10 and 15 FPS.

  15. Post #15

    June 2011
    492 Posts
    I want to see the specs of the 680 compared to the 7970 and see if all of this is only about some magical driver from nvidia.
    Most likely, remember, GPU BOOST, DAT FREE OVERCLOCKING!

  16. Post #16
    Gold Member
    GoDong-DK's Avatar
    November 2009
    14,465 Posts
    Most likely, remember, GPU BOOST, DAT FREE OVERCLOCKING!
    Yeah, headroom we had either way! And from what I understood of the video, it overclocks when you already have a fine FPS? Wouldn't it be better to just do it when the FPS is low? It doesn't really make all that much sense to me.

  17. Post #17
    Headphone doctor
    David Tennant's Avatar
    April 2010
    5,449 Posts
    Yeah, headroom we had either way! And from what I understood of the video, it overclocks when you already have a fine FPS? Wouldn't it be better to just do it when the FPS is low? It doesn't really make all that much sense to me.
    I think, but don't quote me on this, you define an FPS target (similar to V-sync) and the GPU will underclock/overclock itself to a certain degree to stay as close to that frame limit as possible. So if you choose 60fps and you're playing an intensive game and getting 50fps, it will overclock itself (with the maximum overclock hopefully being a manual setting) to try to get that 60fps, and if you're playing a less intensive game and getting 150fps, it'll underclock and be less power- and heat-intensive.

    I just made this shit up, but it sounds like that's what it does, and it sounds fairly great, especially when you're playing older games without V-sync.
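Since I'm speculating anyway, the kind of control loop I'm describing would look something like this. To be clear, this is a toy sketch of my guess, not Nvidia's actual algorithm (the `adjust_clock` function and all the clock limits are made up):

```python
def adjust_clock(clock_mhz, fps, target_fps=60,
                 min_mhz=700, max_mhz=1110, step_mhz=13):
    """Toy FPS-targeting boost controller: nudge the core clock
    toward a target frame rate, clamped to a min/max range.
    Pure speculation about how GPU Boost *might* behave."""
    if fps < target_fps:
        # Running slow: raise the clock, up to the boost ceiling
        return min(clock_mhz + step_mhz, max_mhz)
    if fps > target_fps:
        # Running fast: back off to save power and heat
        return max(clock_mhz - step_mhz, min_mhz)
    return clock_mhz

# Intensive game at 50 fps: clock ramps up
print(adjust_clock(1006, 50))   # 1019
# Old game at 150 fps: clock backs off
print(adjust_clock(1006, 150))  # 993
```

Run every frame (or every few milliseconds), small steps like this would converge on the target without the card ever burning power it doesn't need.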

  18. Post #18
    Gold Member
    vexx21322's Avatar
    December 2008
    10,577 Posts
    I think, but don't quote me on this, you define an FPS target (similar to V-sync) and the GPU will underclock/overclock itself to a certain degree to stay as close to that frame limit as possible. So if you choose 60fps and you're playing an intensive game and getting 50fps, it will overclock itself (with the maximum overclock hopefully being a manual setting) to try to get that 60fps, and if you're playing a less intensive game and getting 150fps, it'll underclock and be less power- and heat-intensive.

    I just made this shit up, but it sounds like that's what it does, and it sounds fairly great, especially when you're playing older games without V-sync.
    I personally think that this type of feature is great. I don't need 900 fps in Half-Life 1. Sure, you can just limit your framerate, but your card would still be pulling unnecessary power.

  19. Post #19
    matte3560's Avatar
    March 2009
    362 Posts
    Finally they got surround working on a single card. Now I actually have the option to go green without having to part with at least one limb.

  20. Post #20
    Allstone's Avatar
    November 2010
    1,692 Posts
    Why couldn't you have just gone for an AMD card if you actually needed surround?

  21. Post #21
    Mombasa's Hubby :3
    venn178's Avatar
    October 2008
    3,886 Posts
    Finally they got surround working on a single card. Now I actually have the option to go green without having to part with at least one limb.
    This post confused me. The only way it makes sense is if I assume you're talking about three monitors, and NVIDIA only.

    Because I've had three monitors on my HD 5770 for a year now.

  22. Post #22
    matte3560's Avatar
    March 2009
    362 Posts
    This post confused me. The only way it makes sense is if I assume you're talking about three monitors, and NVIDIA only.

    Because I've had three monitors on my HD 5770 for a year now.
    Well, Nvidia's tech is called Surround. I figured people would be able to understand I was talking about that specifically, since AMD's tech is called Eyefinity and already works on a single card. I guess not...

    Edited:

    Why couldn't you have just gone for an AMD card if you actually needed surround?
    I already have a 5870 (running eyefinity), but in a year or so it will probably need upgrading. I'm holding out for the next generation anyway, but it's nice that Nvidia finally has surround working on single card solutions. That way when the time comes, I can actually pick based on which card is best, and not based on which card can drive 3 monitors at once.

  23. Post #23
    Moderator
    AshMan55's Avatar
    July 2006
    10,207 Posts
    They put up an introductory article

    http://www.geforce.com/whats-new/art...u?sf3580958=1/

  24. Post #24

  25. Post #25
    Quite so, ol' chap.
    Dominicus's Avatar
    December 2011
    4,040 Posts
    Guess I won't be buying a new car just yet.

  26. Post #26
    Moderator
    AshMan55's Avatar
    July 2006
    10,207 Posts
    apparently it's $679 here in aus

    damnit :(

  27. Post #27
    Gold Member
    Coridan's Avatar
    April 2006
    1,193 Posts
    That's a horribly incompetent and painfully biased review. I'd wait for a more professional and/or unbiased review.

    Edited:

    Like this one: http://www.pcmag.com/article2/0,2817,2401953,00.asp

  28. Post #28
    metromod.net
    _Chewgum's Avatar
    April 2010
    2,216 Posts
    Gonna wait for the 660 Ti, or maybe 560 Ti will go down in price

  29. Post #29
    T3hGamerDK's Avatar
    January 2011
    2,551 Posts
    I guess I'll buy a budget AMD 7770 instead, or something like that.
    The cheapest I can get it here is 708 USD :(

  30. Post #30
    Gold Member

    August 2006
    1,003 Posts
    The 7970 was overpriced anyways; it should have been $449 at release. Seeing as it overclocks better, hopefully the 680 will push it down there, considering it's a little bit faster (well, depending on the game of course).

    I wasn't disappointed with the 7970's 25% gain over the 580, but seeing as the 2900 XT > 3870 > 4870 jumps each doubled the ALUs, while this was only about 25% more ALUs and 45% over the 6970, it's not surprising Nvidia could easily beat it.

    The only thing different about this round is that Nvidia doesn't have the massive power problems and can achieve the same performance. But still, selling a 294 mm² chip at $500 is crazy.

  31. Post #31
    Master Cheese Tactician
    The Decoy's Avatar
    August 2011
    947 Posts
    apparently it's $679 here in aus

    damnit :(
    at least it's not... $680 *cough* but yeah, I feel sad about that price too

  32. Post #32
    Moderator
    AshMan55's Avatar
    July 2006
    10,207 Posts
    I'm still buying one :D

  33. Post #33
    Gold Member
    Coridan's Avatar
    April 2006
    1,193 Posts

  34. Post #34
    Allstone's Avatar
    November 2010
    1,692 Posts
    That's a horribly incompetent and painfully biased review. I'd wait for a more professional and/or unbiased review.

    Edited:

    Like this one: http://www.pcmag.com/article2/0,2817,2401953,00.asp
    It was the only one up when I was writing the post.

  35. Post #35
    Live2becool's Avatar
    December 2009
    1,987 Posts
    I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious!

  36. Post #36
    metromod.net
    _Chewgum's Avatar
    April 2010
    2,216 Posts
    I received word that the GTX 680 is backwards compatible with PCIe 2.0 slots. Is this true... and if so, would it make a difference in performance if you used a 2.0 slot? Just curious!
    http://www.geforce.com/hardware/desk...specifications

    *GeForce GTX 680 supports PCI Express 3.0. The Intel X79/SNB-E PCI Express 2.0 platform is only currently supported up to 5GT/s (PCIE 2.0) bus speeds even though some motherboard manufacturers have enabled higher 8GT/s speeds.
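Those GT/s numbers translate to usable bandwidth like this; the encoding overheads (8b/10b for PCIe 2.0, 128b/130b for PCIe 3.0) come from the respective PCIe specs:

```python
def lane_bandwidth_gbs(gt_per_s, encoded_bits, payload_bits):
    """Usable bandwidth per PCIe lane in GB/s, after line-code overhead."""
    return gt_per_s * payload_bits / encoded_bits / 8  # bits -> bytes

pcie2 = lane_bandwidth_gbs(5, 10, 8)      # PCIe 2.0: 5 GT/s, 8b/10b encoding
pcie3 = lane_bandwidth_gbs(8, 130, 128)   # PCIe 3.0: 8 GT/s, 128b/130b encoding

print(f"PCIe 2.0 x16: {pcie2 * 16:.1f} GB/s")  # 8.0
print(f"PCIe 3.0 x16: {pcie3 * 16:.1f} GB/s")  # 15.8
```

So a 3.0 card dropped into a 2.0 x16 slot has roughly half the bus bandwidth available, though for a single GPU that rarely shows up as a meaningful FPS difference.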

  37. Post #37
    I heard that middle of next month we should see 4 GB cards out. If that's true, I'll be buying at least two.

    No way I'm going down to 2 GB from 3 GB.

  38. Post #38
    Dancing Member
    BlueYoshi's Avatar
    July 2009
    2,748 Posts
    I heard that middle of next month we should see 4 GB cards out. If that's true, I'll be buying at least two.

    No way I'm going down to 2 GB from 3 GB.
    Source?

  39. Post #39
    BMCHa's Avatar
    August 2007
    851 Posts
    I received word that the GTX 680 is backwards compatible with the 2.0 PCI slot. Is this true... and if so, would it make a difference in performance if you used the 2.0 slot? Just curious!
    PCI-E is designed to be backwards compatible. You should be able to use a 680 in a PCI-E 1.0 slot if you wanted to.

  40. Post #40
    Tucan Sam's Avatar
    May 2007
    858 Posts
    http://www.anandtech.com/show/5699/n...x-680-review/8

    What is this 50% better about now? All I see is 1-2%.