AMD R9 390X and Nvidia GTX 980Ti performance benchmarks leak


AMD is on the verge of launching its new flagship GPU, while Nvidia is preparing a lower-spec version of its existing flagship card, the GTX Titan X. A few leaks regarding the specs of the AMD R9 390X have already surfaced on the web, so it’s not surprising to see benchmark figures for the card making the rounds as well. The source of these supposed benchmarks is Chiphell, the Chinese website known for leaks of this sort.

What’s interesting is that these benchmarks compare the R9 390X, or Fiji XT, with a cut-down version of Nvidia’s GM200 chip, which we assume is the unannounced GTX 980Ti. The full GM200 chip with all cores active is available in the form of Nvidia’s flagship GTX Titan X. A block diagram of the GTX Titan X can be seen below.

[Chiphell image: GTX Titan X block diagram]

Given that the GTX 980Ti is bound to be cheaper than the GTX Titan X, it will have to be toned down in terms of performance. At this point it is unknown how many CUDA cores will be disabled, but performance should logically fall somewhere between the GTX 980 and the GTX Titan X, ideally closer to the latter. It’s also likely that the VRAM will be halved from the Titan X’s 12GB to 6GB.
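To put the "somewhere between the GTX 980 and the GTX Titan X" expectation into rough numbers, here is a back-of-the-envelope sketch of peak FP32 throughput derived from core counts and base clocks. The cut-down GM200 configuration used here (2816 CUDA cores at the Titan X’s 1000MHz base clock) is purely an illustrative assumption; Nvidia has not confirmed any specs for the GTX 980Ti.

```python
# Back-of-the-envelope peak FP32 throughput: 2 ops per CUDA core per clock.
# The GTX 980Ti row is an assumed configuration, not a confirmed spec.

def peak_fp32_tflops(cuda_cores: int, clock_mhz: int) -> float:
    """Theoretical single-precision throughput in TFLOPS at the given clock."""
    return 2 * cuda_cores * clock_mhz / 1_000_000

cards = {
    "GTX 980 (full GM204)":      (2048, 1126),  # published base clock
    "GTX 980Ti? (cut GM200)":    (2816, 1000),  # assumption for illustration
    "GTX Titan X (full GM200)":  (3072, 1000),  # published base clock
}

for name, (cores, mhz) in cards.items():
    print(f"{name:28s} {peak_fp32_tflops(cores, mhz):.2f} TFLOPS peak")
```

On those assumptions the hypothetical 980Ti lands at roughly 5.6 TFLOPS, between the GTX 980’s ~4.6 TFLOPS and the Titan X’s ~6.1 TFLOPS, which is consistent with the expectation above.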


According to Chiphell, the tests were conducted across 19 games at both 4K and 1600p resolutions. If the benchmarks are legitimate, the tests should be thorough enough to give a fair idea of the performance of each of these unreleased GPUs.

Without further ado, here’s a look at the series of benchmarks.

[Chiphell images: 4K and 1600p game benchmark charts]

These tests suggest that AMD’s R9 390X (Fiji XT) not only beats Nvidia’s GTX Titan X, but also gives the Titan X’s cut-down sibling, the presumed GTX 980Ti, a run for its money. The margins aren’t large, but they are consistent enough to be noteworthy.

Power Consumption

The tables are turned when it comes to power consumption, however, with the GTX Titan X consuming 30 watts less power than the R9 390X (Fiji XT). Compared to AMD’s previous flagship GPU, the R9 290X, the R9 390X’s power draw still looks like a noteworthy improvement, especially given the significant boost in performance these benchmarks suggest.

[Chiphell image: power consumption chart]

In the coming months, it will be interesting to see how both of these unannounced cards are priced. None of this information is confirmed, however, and we urge readers to take it with a pinch of salt. That said, it does paint a highly competitive picture for the GPU market, so we’re quite hopeful it turns out to be true.

  • Wyat Eark

    Half a Titan for 3/4 the price, and split 970ish memory = AMD wins

  • tete

    GCN? I didn’t know AMD used the Nintendo GameCube in their hardware

  • paalo sordoni

    It shouldn’t be taken with a pinch of salt, it should be taken with a truck full.

  • ( )

    These benchmarks are based on the OBSOLETE DX11.

    The performance gains from DX12 and Mantle exceed 500%.

    If I am going to spend $1000 on a dGPU card, I would like to know how it will perform on the latest API, not obsolete software.

    AMD is also loaded with IP that is designed to fly on Mantle and DX12. nVidia doesn’t enjoy the same benefits that HSA and GCN provide to users of AMD silicon.

    I guess that is why the author Johan Zietsman has chosen to lie to the consumer by using obsolete benchmarks.

    • Lachlan Stewart

      uuum, you do know these are leaks, right? The author didn’t lie, he’s sharing something that people find interesting. Johan Zietsman didn’t say these were real, and he has no reason to "lie to the consumer"; he’s not the one selling the cards! DX11 is not obsolete, far from it. New engines need to be written to use DX12, so DX11 is still in for at least the next couple of years, most likely longer for legacy support.

      before you go throwing numbers around, get your facts straight, or you just end up looking stupid.

      • penis

        these leaks are fake. Half these cards can’t get over 60fps at 4K on low settings in heaps of games, let alone highest settings at 4K. Hmmm, this is on 19 games? Must be games from 5 years ago.

  • ReadySalted80 .

    390X 4GB? This has to be more with the newer cards.

    • Petar Posavec

      The 390X is supposed to use HBM.
      Generation 1 of HBM (which is what the 390X will be using) maxes out at 4GB (however, that doesn’t mean it will be insufficient, because HBM can shuffle much larger amounts of data through memory than GDDR5 can; see the rough bandwidth numbers after this thread).
      Apart from that, I think there was talk of a 390X2… that card will supposedly come with 8GB of HBM (type 1 also), but in a way where two GPUs probably share the same connector.

      • Thomas K.

        390X will have a dual link interposer to stack 2x 4-HI HBM components for a net total of 8GB HBM. Next generation will have type 2 and double 8 to 16
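Petar’s point about HBM moving more data than GDDR5 despite the 4GB cap can be put into rough numbers. The sketch below assumes four first-generation HBM stacks (1024-bit, 1Gbps per pin) for the R9 390X, which is what the leaks suggest but is not confirmed; the R9 290X and GTX Titan X rows use their published memory configurations.

```python
# Rough memory-bandwidth comparison: bandwidth (GB/s) = bus width (bits) * Gbps per pin / 8.
# The R9 390X entry assumes 4 stacks of first-generation HBM; this is not confirmed.

def bandwidth_gb_s(bus_width_bits: int, gbps_per_pin: float) -> float:
    return bus_width_bits * gbps_per_pin / 8

configs = {
    "R9 290X (512-bit GDDR5 @ 5Gbps)":       (512, 5.0),
    "GTX Titan X (384-bit GDDR5 @ 7Gbps)":   (384, 7.0),
    "R9 390X? (4 x 1024-bit HBM1 @ 1Gbps)":  (4 * 1024, 1.0),  # assumed
}

for name, (width, rate) in configs.items():
    print(f"{name:42s} {bandwidth_gb_s(width, rate):6.1f} GB/s")
```

Even on these conservative assumptions the HBM configuration comes out well ahead on raw bandwidth, which is the trade-off against the 4GB capacity ceiling.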

  • Zon

    Old benchmarks… Was hoping something newer

  • David Compart

    Keep in mind, the Titan X can overclock like a beast, whereas AMD’s cards tend to not have much room.

    • Petar Posavec

      These ‘benchmarks’ cannot be taken into account at all.
      They only estimate AMD’s capabilities from a hardware and supposed clocks point of view without taking into consideration architectural changes (which I think Fiji DID get).

      Also, overclocking can only take you so far… and is not necessarily a good way to keep yourself updated with the times.

    • BestJinjo

      The Titan X in those benches appears to be overclocked already, showing a 1189MHz base clock, which corresponds to a 1.4GHz+ boost. That actually matches nicely with the 8300-8400 FireStrike Extreme score the Titan X gets at those base/boost clocks.

      • David Compart

        That is the max boost clock of a stock Titan X SC. That’s not a full-fledged overclock at all; just take a look at the VRAM. Right now my two Titans run at 1379MHz Core CLK and 2000MHz (8K) Mem CLK. Each can hit 1450 on its own, but SLI doesn’t seem to push the cards together over 1379.

        • BestJinjo

          The problem with your explanation is that a Titan X (stock) with 1189MHz max boost can’t hit 8300-8400 FireStrike Extreme. It usually scores 7600-7800 at stock. Therefore, either their charts are completely made up, or that 1189MHz is the base clock they are showing, not the boost, meaning they overclocked the card.

    • Tim

      Keep in mind though that when overclocked the Titan X consumes 460+ watts of power!

      • David Compart

        Lol. I have two different voltage meters and the max my system has pulled from the wall is 650 watts with 2x OC Titan Xs. Then factor in that that figure is measured at the wall, so the power actually reaching the components is even lower (remember, PSUs only run at around 90% efficiency; the rough math is after this thread).

        • Tim

          I suggest that you get your voltmeters checked out. Titan X guzzles power just like all other cards when overclocked.

        • shadowhedgehogz

          Is that real FPS or BS FPS? In other words, micro stutter. That gives falsely inflated FPS numbers which are not always accurate to the actual experience; this is why SLI/Xfire is not perfect compared to running a single card.

          To make matters worse, it seems Nv will allow up to 8 cards to be connected to each other soon enough. Three is already diminishing returns, and more than one is not perfect because not all games are compatible, plus the fact that not all games are smooth. It seems they are more interested in people pouring money into buying their cards than in solving the issue.

      • John Strickland

        Enthusiast gamers don’t give a shit about that…who cares?
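The wall-power exchange above comes down to simple arithmetic: a meter at the wall measures what the PSU draws from the mains, and the components receive roughly that figure multiplied by the PSU’s efficiency. The 650W reading and the 90% efficiency figure below are the commenter’s numbers, used here only to illustrate the calculation.

```python
# Wall draw vs. power actually delivered to the components.
# Both input figures come from the comment thread above, not from measurements.

def delivered_watts(wall_watts: float, psu_efficiency: float) -> float:
    """Approximate DC power reaching the system for a given wall-side draw."""
    return wall_watts * psu_efficiency

wall_draw = 650.0    # watts measured at the wall (commenter's figure)
efficiency = 0.90    # assumed PSU efficiency at that load

print(f"~{delivered_watts(wall_draw, efficiency):.0f}W delivered to the whole system")
# -> ~585W total for the CPU, both GPUs and everything else combined
```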

  • Paul17041993

    why re-release news from months ago…?

  • Steve Smith

    These benchmarks are a few months old.

    • Beasthuntt

      I was about to say this.

    • penis

      they’re fake too. How can you get higher performance at a higher screen res with the same settings?

  • Robert Johnson

    Just guesswork, without any knowledge of the memory speeds or final clocks.