68 Comments

  • prophet001 - Monday, April 30, 2012 - link

    I was wondering about dual-gpu cards and how they handle micro-stutter.

    Do they suffer from micro-stutter in the same way that SLI setups do?

    If not, then why do GPU manufacturers not apply what they've learned to SLI setups?

    If so, then how do they sell these cards?

    Anyone know about this?
  • Ryan Smith - Monday, April 30, 2012 - link

    "Do they suffer from micro-stutter in the same way that SLI setups do?"

    Yes. It's exactly the same as having a pair of cards in SLI/CF.
  • imaheadcase - Monday, April 30, 2012 - link

    Assuming you HAVE micro-stutter. The majority of people do NOT.
  • LtGoonRush - Monday, April 30, 2012 - link

    All multi-GPU configurations experience micro-stutter; it just comes down to the individual how much you notice it. Much like how some people aren't bothered by aliasing or tearing artifacts, some people just don't care about micro-stutter.
  • haukionkannel - Monday, April 30, 2012 - link

    It seems that a 3-GPU setup suffers less micro-stutter than a 2-GPU setup... there is a test at Tom's Hardware, though the maximum performance does not go up by much.
    The first part is hard to explain... but the test seems to show that minimum frame rates do go up. Maybe that explains the reduced micro-stuttering?
    The latter is easier to explain: the scaling is not perfect, and driver optimization goes 1 GPU > 2 GPU > (3 GPU or more).
  • Murloc - Monday, April 30, 2012 - link

    Who cares anyway? SLI is useless if you just want to buy a computer for gaming.
    With the best single-GPU video card you already get what you need; if you want more, it's for benchmarks. Who cares about stuttering then?
  • prophet001 - Monday, April 30, 2012 - link

    This is not entirely true.

    Depending on your resolution, games, and in-game settings, there are a variety of situations where two GPUs are necessary to achieve the desired level of graphical detail while remaining playable.
  • tipoo - Monday, April 30, 2012 - link

    More accurately, a single card is usually better until you hit huge resolutions.
  • B3an - Tuesday, May 1, 2012 - link

    Exactly. ANYONE who buys a GPU like this should be using a very high res. In fact, I would say that anyone who buys a GPU like this and doesn't use it for higher than 1080p gaming is a moron. 2560x1600 and upwards is what these cards are for (like multiple monitors). Anything else is a complete waste. Talking from loads of experience here.

    All these pathetic console ports with seriously out-of-date graphics, because of ancient console hardware, really do not stress cards like this in the slightest at 1080p and under.
  • Digimonkey - Wednesday, May 2, 2012 - link

    Not exactly. If you play on a 120Hz monitor you are always aiming for 100-120fps. With details set to their highest and anti-aliasing on, you can still struggle to keep your fps at the desired rate even at 1080p.

    I've actually found myself turning the graphics down a bit in some games, as I prefer smoother gameplay over graphics quality as long as the sacrifice isn't too large.
  • CeriseCogburn - Monday, April 30, 2012 - link

    Except now the NVIDIA drivers address micro-stutter with the frame rate target feature, and SMOOTH is the result.
    The drivers are currently BETA 301.24 but support cards all the way back to the SERIES 8, the infamous "rebrand" release, breathing gigantic added value into NVIDIA's entire line sold over the years.
    At the same time, AMD dumped support for the 4000 series and earlier.
    Added value is NVIDIA's forte.
  • InsaneScientist - Monday, April 30, 2012 - link

    The drivers don't support cards all the way back to the GeForce 8 series because of rebranding; they go that far back because of NVIDIA's "unified driver architecture".
    Basically, because of the way they write their drivers, the driver they build should work with a card regardless of its architecture. I, unfortunately, can't find a decent explanation of HOW they accomplish that. If anyone knows, please speak up! :)
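
    (ed: Neither commenter has NVIDIA's source, of course, but the usual pattern behind a "unified driver architecture" is a single driver front end that dispatches to per-architecture backends behind a common interface. Below is a purely illustrative Python sketch of that pattern; the class names and device IDs are made up and this is not NVIDIA's actual code.)

        # Purely illustrative: one "unified" front end, per-architecture backends
        # behind a common interface. Names and device IDs are invented.
        class GpuBackend:
            def supports(self, device_id: int) -> bool:
                raise NotImplementedError
            def set_core_clock(self, mhz: int) -> None:
                raise NotImplementedError

        class OldArchBackend(GpuBackend):        # e.g. a GeForce 8-era part
            IDS = {0x0AAA}                       # hypothetical device ID
            def supports(self, device_id): return device_id in self.IDS
            def set_core_clock(self, mhz): print(f"old-arch clock path: {mhz} MHz")

        class NewArchBackend(GpuBackend):        # e.g. a GeForce 600-era part
            IDS = {0x0BBB}                       # hypothetical device ID
            def supports(self, device_id): return device_id in self.IDS
            def set_core_clock(self, mhz): print(f"new-arch clock path: {mhz} MHz")

        BACKENDS = [OldArchBackend(), NewArchBackend()]

        def set_core_clock(device_id: int, mhz: int) -> None:
            """Single entry point; the matching backend is picked at runtime."""
            for backend in BACKENDS:
                if backend.supports(device_id):
                    backend.set_core_clock(mhz)
                    return
            raise RuntimeError(f"unsupported device 0x{device_id:04x}")

        set_core_clock(0x0BBB, 915)   # same driver, architecture-specific path chosen
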
  • CeriseCogburn - Thursday, May 3, 2012 - link

    AMD hasn't a clue either, but of course "AMD drivers are just as good 'now', unlike 'in the past'"... blah blah blah blah.
    AMD drivers are a bad, sick joke in comparison.
  • Alexvrb - Tuesday, May 1, 2012 - link

    The 4000 series and earlier are just not on a rapid release cycle anymore. There's no more performance to be squeezed out of old cards, just occasional settings tweaks and bugfixes. So they didn't drop 4000-series support.

    :-/
  • CeriseCogburn - Thursday, May 3, 2012 - link

    amd apologist, did they pay you ?
  • RubberJohnny - Sunday, May 6, 2012 - link

    Nvidia Fanboi, did they pay you?
  • CeriseCogburn - Thursday, May 10, 2012 - link

    No need to be a fanboy when the facts are clear and available.
    No I didn't get paid.
    I find it likely that it's far too expensive a job to get AMD wackos to acknowledge simple basic facts and show a minimum of honesty when they spew forth their apologetics.
    They are, after all, living under their AMD Gamer's Manifesto activism PR mind-bend scheme and hate-filled rancor toward NVIDIA.
  • imaheadcase - Tuesday, May 1, 2012 - link

    I could go with that reasoning, except that I can only notice the ones you mentioned if I look REALLY hard. Micro-stutter I don't notice at all.
  • imaheadcase - Tuesday, May 1, 2012 - link

    To add, it most likely only occurs on certain system setups too.
  • skroh - Wednesday, May 2, 2012 - link

    Correction: all AMD/ATI multi-GPU configurations experience micro-stutter. NVIDIA SLI has had a frame-timing algorithm to prevent it for some time now, and that method has more recently been incorporated into hardware.
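
    (ed: "Frame metering" here just means the driver briefly holds completed frames so the on-screen frame-to-frame spacing stays even, instead of presenting each frame the instant a GPU finishes it. The Python below is a rough sketch of that idea only, not NVIDIA's actual algorithm.)

        # Rough sketch of frame metering: even out presentation when two GPUs in
        # AFR finish frames in a short/long, short/long pattern. Illustrative only.
        def meter(completion_s):
            """Given raw frame completion times (s), return evened-out present times."""
            if len(completion_s) < 2:
                return list(completion_s)
            target = (completion_s[-1] - completion_s[0]) / (len(completion_s) - 1)
            presented, next_slot = [], completion_s[0]
            for done in completion_s:
                slot = max(done, next_slot)   # never present before the frame exists
                presented.append(slot)
                next_slot = slot + target     # otherwise hold it for its even slot
            return presented

        raw = [0.000, 0.005, 0.033, 0.038, 0.066, 0.071]   # uneven AFR completions
        for r, p in zip(raw, meter(raw)):
            print(f"completed {r*1000:5.1f} ms -> presented {p*1000:5.1f} ms")
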
  • CeriseCogburn - Thursday, May 3, 2012 - link

    But "AMD drivers are just as good as NVIDIA's now", even though, granted, in the "past" they weren't up to par... but but but but but...
    Oh sorry, my 7970 just locked up, had to come back.
  • regularcomputer - Monday, April 30, 2012 - link

    Is micro-stutter noticeable and distracting enough that it affects gameplay, or is it only slightly noticeable when one is paying close attention to it?
  • Death666Angel - Monday, April 30, 2012 - link

    Comes down to individual susceptibility. Micro-stutter can make a 90fps average frame rate behave like a 30fps game, or it can make a 40fps game behave like 15fps. It can also have no effect at all. It is very dependent on the game, the engine, and of course the player. Having 180fps drop to 120fps because of micro-stutter isn't a big deal. But then again, why are you using two cards for that? If you want two cards or more, I hope you are using 1440/1600 vertical resolution, multi-monitor gaming and/or 3D gaming, all of which have current games at highest settings playing in sub-60fps scenarios.
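
    (ed: A back-of-the-envelope illustration of the point above, with made-up frame times: the average frame rate can look high while the longer of the alternating gaps is closer to what you actually feel.)

        # Made-up alternating frame times showing why a high average FPS can
        # still feel much slower when frame pacing is uneven.
        frame_times_ms = [5.0, 17.2] * 30     # short/long, short/long (AFR-style)

        avg_ms = sum(frame_times_ms) / len(frame_times_ms)
        slow_ms = max(frame_times_ms)

        print(f"reported average : {1000 / avg_ms:5.1f} FPS ({avg_ms:.1f} ms/frame)")
        print(f"felt pace (worst): {1000 / slow_ms:5.1f} FPS ({slow_ms:.1f} ms gaps)")
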
  • SlyNine - Tuesday, May 1, 2012 - link

    It's why I will NEVER go back to SLI or Crossfire. Although I think I had a bad case of that AND overheating.

    The 8800 GTs were really bad for heat, so my experience was VERY bad. But I had two 5770s (one 5770 and one 6770) in my dad's computer for a time and I didn't see the micro-stutter very much, though I wasn't actually playing the games myself.
  • Sabresiberian - Tuesday, May 1, 2012 - link

    I would say a majority of people overlook it; they have it, but don't notice it, or it doesn't bother them much.

    It's not just a multiple-card problem, either; while it tends to be worse on those rigs, and the actual cause may be different, I've seen microstutter on single card systems.

    ;)
  • softdrinkviking - Saturday, May 5, 2012 - link

    So what was the StC of the 690???
  • CeriseCogburn - Monday, April 30, 2012 - link

    I can't believe it. I thought the plastic machine gun case with the orange safety tip was as juvenile as amd could possibly get, but this is right up there with them.
    This is what the $999 is spent on, crap like this printed wooden weapons case, and the prior delivered crowbar.
    This is a bad joke on all of us.
    I can just hear the juvenile cheers of "that's cool!" just like with the plastic machine gun case replete with child safety glowing orange muzzle tip.
    *shakes head in utter disgust and shame at our would be adults of this world*
  • prophet001 - Wednesday, May 2, 2012 - link

    It's not a crowbar....

    It's a pry bar :D
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Did you mean PR bar...
  • Chaitanya - Monday, April 30, 2012 - link

    That is one cool way to package a video card.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    Do you know how much energy that wastes? Natural resources are dwindling, but just go ahead and make wooden crates with gigantic oil-derived foam so you weirdos can play guns and go bang bang.
  • Articuno - Monday, April 30, 2012 - link

    What a disgusting joke they pulled with the HL3 implications. I know for a fact they lost hundreds of potential customers, including myself.
  • RussianSensation - Monday, April 30, 2012 - link

    I guess from now on you'll be gaming on Intel's CPUs, since ATI bundled HL2 vouchers with 9600/9700 Pro cards but HL2 was delayed for a year after that, at which point those cards had been replaced by even better cards. So people who purchased them in hopes of playing HL2 "soon" got ripped off.

    Also, no offence, but HL2: EP3 isn't even out. Only an ***** would think HL3 would be launching in 2012!!
  • MamiyaOtaru - Tuesday, May 1, 2012 - link

    C'mon man, the common speculation is that there will be no Ep. 3, and that they are just going to call the next Half-Life release "Half-Life 3".
  • imaheadcase - Monday, April 30, 2012 - link

    You've got to be kidding. There never was an implication of it.
  • eddman - Monday, April 30, 2012 - link

    LOL, they didn't pull anything. I guess you pulled the joke on yourself, by trying your hardest to make yourself believe that there's a connection between nvidia's crowbar and HL3. Hell, it's not even a crowbar.
  • Solidstate89 - Monday, April 30, 2012 - link

    They didn't imply a damn thing. So not only is your whole belief completely your fault, but your reaction to it is pretty damn laughable as well.
  • CeriseCogburn - Monday, April 30, 2012 - link

    Oh good, hope more of you bail; that way, more cards for the rest of us not intent on moaning our way around every forum, forever, poorboy meat sandwich in hand.
  • VoidQ - Monday, April 30, 2012 - link

    ...trying to get that through customs.
  • ThaSpacePope - Monday, April 30, 2012 - link

    The GTX 690 comes out only a month after the 680, and there is nary a mention of the 7990.
  • CeriseCogburn - Monday, April 30, 2012 - link

    When AMD is late, it's easy to "forget about it" till the day after one expires, and thus it never matters nor means anything.
    Wouldn't want to send any bad vibes toward the extreme fanboy company.
  • 3DoubleD - Monday, April 30, 2012 - link

    "Applying the prybar in a slightly more civilized manner than we would in most video games, we find the GeForce GTX 690 inside. (ed: If this was a 90's video game, then according to the Crate Review System NVIDIA is already doing very well)"

    ...but doesn't having a crate at the get-go automatically make NVIDIA review badly? StC score would most certainly be zero or negative in this case. Higher StC scores are technically better according to your source.

    Still, hilarious if that was the context they were sending it in.
  • cknobman - Monday, April 30, 2012 - link

    Sorry this is less cool than it is gay.
  • NARC4457 - Monday, April 30, 2012 - link

    Can you clarify how this is gay? I just want to understand how much of an idiot you are based on your response. Thanks much.
  • umbrel - Monday, April 30, 2012 - link

    It made him laugh? I just want to know how much of an idiot I am based on your assessment. Thanks much.
  • Sivar - Monday, April 30, 2012 - link

    Hey, at least he used "than" right.
  • cknobman - Monday, April 30, 2012 - link

    LOL.

    While I found it funny, I still thought the whole "gag" of packaging the card up to simulate some type of military-grade equipment was a bit much/eccentric.
  • CeriseCogburn - Monday, April 30, 2012 - link

    Allow me to clarify.
    One opens the crate and doesn't find a gun.
    How *** is that?
    I guess you're supposed to pick it up and go "bang! bang! bang bang bang!" while "pointing it" at... errr, something, then start feeling all out there and manly... so long as you don't injure yourself or the delicate item your tender nerd fingers hold.
    Maybe run around the lab chasing your lab partner while giggling, and then flop on the couch laughing and saying "you're dead!" (be careful to hold the 680 away from possible woolen and static couch material)
  • eddman - Monday, April 30, 2012 - link

    Hmm, let me read that box again: "weapons GRADE, GAMING power". That doesn't sound like a gun to me. Care to try again?

    Since when are cheap plastic cases the same as nice-looking wooden boxes? Valuable things usually come in wooden boxes, you know, like good wine.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    That doesn't sound like a gun to you... what did you expect, a basketball ?
  • illusionslayer - Friday, May 4, 2012 - link

    Well, since it also has the nVidia logo on it and I just got a weird shipment from them a few days ago, I'd expect a graphics card.
  • Solidstate89 - Monday, April 30, 2012 - link

    Times like this I really wish we had a report button.

    I mean, what is this crap, the god damn YT comment section? Fuck off with your inane comments.
  • hschachtner - Monday, April 30, 2012 - link

    I would buy this just for the box. I use ATI/AMD for most of my machines, but a major vendor of graphics software recommends NVIDIA.

    Again, the box is way cool.
  • Dracusis - Monday, April 30, 2012 - link

    In that case, I'll sell you a wooden crate for $998.00; mine is bigger, cheaper, and I'll even throw in free delivery.
  • regularcomputer - Monday, April 30, 2012 - link

    So does the aluminum or metal shroud help with dissipating heat in any way, besides making it look awesome and perhaps durable?
  • CeriseCogburn - Thursday, May 3, 2012 - link

    No, of course not. Aluminum is a heat-retaining device in this implementation; that's why they used it.
  • _vor_ - Thursday, May 3, 2012 - link

    Ok, assclown. That's enough. If the aluminum isn't touching any of the chips/heat pipes etc., it ISN'T DOING A DAMN THING.

    Also, aluminum (at 68F) has a thermal conductivity of ~118 Btu/(hr·ft·°F) while copper (at 68F) has a thermal conductivity of ~223 Btu/(hr·ft·°F). Just to make things easy for you, that means copper conducts heat at roughly 189% of the rate of aluminum. *IF* the shroud were really there to conduct heat, it would be copper (oxidization issues notwithstanding); that's why the good heatsinks are copper or have a copper pad on the bottom.

    So, in conclusion, please do us all a favor and break your keyboard over your face and stop posting garbage.
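
    (ed: The copper-versus-aluminum ratio quoted above does check out arithmetically with those approximate handbook values; exact figures vary by alloy and source.)

        # Quick check of the quoted thermal conductivities (approximate, ~68 F).
        k_aluminum = 118.0   # Btu/(hr*ft*F)
        k_copper   = 223.0   # Btu/(hr*ft*F)
        print(f"copper/aluminum = {k_copper / k_aluminum:.0%}")   # ~189%
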
  • CeriseCogburn - Thursday, May 10, 2012 - link

    Well, you're wrong again, period, as usual; your score is 100% incorrect so far, along with your 100% name-calling stat.
    The aluminum has to touch NOTHING; the air passing by it transfers heat to it, and the shroud dissipates that heat, PERIOD.
    Of course you'd have to have basic 6th-grade science down to understand that, and not be an AMD flamer.
  • AtwaterFS27 - Monday, April 30, 2012 - link

    'Cause the joke is on YOU if you buy this thing at $1K.

    While the card itself is quite a looker, any more than $600 for this is obscene.
  • CeriseCogburn - Thursday, May 3, 2012 - link

    $401 for the fake gun box isn't bad.
  • InsaneScientist - Monday, April 30, 2012 - link

    So, I know their supply is incredibly constrained and the likelihood is that they only sent you one, but did they, by any chance, send you two so we can see some quad-SLI figures?

    Granted, that level of performance is insane, and the only thing likely to need that kind of horsepower would be a large Eyefinity rig, which you can't do with NVIDIA, but still...
  • Ryan Smith - Tuesday, May 1, 2012 - link

    Just the 1.
  • Sabresiberian - Tuesday, May 1, 2012 - link

    I get a chuckle out of the "crowbar and crate" thing. If I buy one, will it come with those things?

    :D

    This card looks so beautiful; I'm having a hard time fighting off the "coolz toyz" factor, heh. Maybe I'll be over it by the time they are available.

    Personally, I'd like a smidge more memory - I know FXAA is working wonders with memory usage, but still, in a multi-monitor setup, will 2GB be enough? (The 4GB is 2GB per GPU, and it doesn't stack, but is mirrored, for those who aren't familiar.)

    ;)
  • repoman27 - Tuesday, May 1, 2012 - link

    I noticed the 690 in binary on there right off the bat, but what is the meaning of the rest of the markings?
  • repoman27 - Tuesday, May 1, 2012 - link

    Never mind, I used the power of Google.

    0b1010110010 = 690
    BT-7.080 = 7.080 billion transistors
    G08-H86-A000 = 408-486-2000 (NVIDIA's phone number masked by GHA—Graphics Hardware Acceleration)
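
    (ed: The binary marking is easy to verify, e.g.:)

        # 0b1010110010 really is 690
        print(int("1010110010", 2))   # 690
        print(bin(690))               # 0b1010110010
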
  • justniz - Tuesday, May 1, 2012 - link

    When NVIDIA comes up with a new GPU architecture, they always do this:
    First, they release a single-GPU card called the x80 GTX. It costs around $500 and it works perfectly fine with all games for at least the next 2-3 years.
    Then they release down-tuned versions called x70 GTX, x70 GT, x60 GT, etc. for the budget market.

    Then they do a dual-GPU version with slightly lower clocks, called the x90 GTX, that nobody can really put to full use, and it costs around $1000.

    Then it's only a very short time until the next chip revision and its x80 come out, which blow away the old x90, totally devaluing the investment of anyone who actually bought an x90 only about two months earlier. More often than not, that's because Microsoft also releases a new version of Windows or Direct3D that coincidentally has 'features' that review sites say are to die for, but that supposedly can't be supported on GPUs older than the very latest generation of hardware.

    Furthermore, it takes several years to make a video game, and developers usually pitch the graphics around whatever the top-end hardware was when they started development, not least because they know that a large share of their customers will only have a budget card (x70 GT or lower).

    Consequently, assuming you don't have a large multi-monitor setup, as far as I can tell there are literally no PC games in existence or coming anytime soon (i.e. before the next NVIDIA chip revision) where you will actually need or use all the power of a GTX 690, even with all the graphics settings fully maxed out.

    So I ask: unless you have a massive multi-monitor system, why waste the money? There are no games to use it, and by the time any game comes out that can stretch even a GTX 680, the 6xx series will be at least a year out of date and probably won't support the latest DirectX anyway.

    My GTX 580 is still handling everything I can throw at it perfectly, even fully maxed out.
  • slikts - Thursday, May 3, 2012 - link

    One reason is 120Hz screens: for instance, a GTX 690 could run BF3 at ultra settings, full HD, and 120 FPS. It's the only single card that could do this currently (I assume, anyway), and maybe it would even drop below 120 when the action gets heavy enough.
  • stjoker69 - Wednesday, May 2, 2012 - link

    "(ed: If this was a 90's video game, then according to the Crate Review System NVIDIA is already doing very well)"

    Ryan Smith, you got this backwards. According to the CRS, the longer it takes you to find a crate, the better. Come on mang!
