Xbox One already has an answer to Nvidia G-Sync

With the announcement of its G-Sync technology, many believe that Nvidia may have rendered the next generation of consoles obsolete before their launch. Sure enough, the GPU manufacturer makes a rather fascinating proposition, promising gaming without annoyances like input lag, stutter, and screen tearing.

You would, of course, require additional hardware in the form of a display monitor specially equipped with the G-Sync module. And that’s only if you’re already equipped with an Nvidia GPU capable of supporting the feature. Nvidia’s solution is aimed at PC gamers willing to go all the way in meeting the necessary hardware requirements. No such technology exists on the console side of things, but Microsoft’s engineers have incorporated a rather smart and efficient solution into their forthcoming console, the Xbox One, to tackle some of the issues Nvidia intends to eliminate with G-Sync. There is, however, no actual connection between the two techniques.

Mentioned briefly during a recent Digital Foundry interview with the Xbox One architects, the console’s scaler will allow developers to employ a dynamic resolution in games. In other words, the load on the GPU can be reduced by altering scaler parameters on a frame-by-frame basis in order to maintain the target frame rate of either 30 or 60 frames per second.

Theoretically, as mentioned before, this solution should tackle some of the issues Nvidia intends to eradicate with G-Sync. While G-Sync essentially allows a display to sync its refresh rate with the rate at which frames arrive from the GPU’s framebuffer, the Xbox One’s scaler can supposedly maintain a constant frame rate matching the display’s refresh rate by dynamically scaling back the resolution whenever the GPU load exceeds the performance budget. In turn, this could make screen tearing and frame stutter less of an issue in Xbox One games, much as Nvidia aims to do with G-Sync.
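As an illustration of how such a scheme could work, here is a minimal sketch of a frame-by-frame resolution feedback loop. All names, thresholds, and step sizes here are assumptions for illustration only, not details of the Xbox One’s actual scaler interface.

```python
# Hypothetical sketch of frame-by-frame dynamic resolution scaling.
# Everything here (names, thresholds, step sizes) is illustrative,
# not Microsoft's actual API or parameters.

TARGET_FRAME_TIME_MS = 1000.0 / 30.0  # budget for a 30 fps target

def choose_render_scale(last_gpu_time_ms, scale, min_scale=0.75, step=0.05):
    """Pick the next frame's resolution scale from the last frame's GPU time."""
    if last_gpu_time_ms > TARGET_FRAME_TIME_MS:
        # Over budget: shrink the render resolution so the frame fits in time.
        scale = max(min_scale, scale - step)
    elif last_gpu_time_ms < TARGET_FRAME_TIME_MS * 0.85:
        # Comfortably under budget: creep back toward native resolution.
        scale = min(1.0, scale + step)
    return scale
```

The hardware scaler would then upsample the reduced render target back to the display’s native resolution, so the output refresh rate never changes even as the internal resolution does.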

Dynamic resolutions have been implemented in the past in games like WipEout HD on the PS3 and, more recently, Killzone: Mercenary on the PS Vita. However, Microsoft’s solution appears to operate at a more fundamental hardware level, as opposed to being an entirely software-driven solution.

Of course, all of this is only in theory based on what the Xbox One architects have claimed. It remains to be seen if such favorable circumstances for game performance will actually be realized in games.

Stay tuned for further updates on the Xbox One’s hardware features.

Danial Arshad Khan

Founder of GearNuke.
Follow him on Twitter and Google+

  • Jack Slater

    My goal with my comment was to show that most of the PC guys who are always bashing consoles have spent tons of cash on their rigs, and they simply don’t accept that a console might manage to output some nice 1080p graphics. For them, a $400 system should only have blurred textures and 640*480. But as soon as a console game looks really nice, like Killzone Shadow Fall, they need to come and say they had that resolution on PC 6 years ago.

    One thing these guys will never understand is that console games are coded with optimization in mind.
    Take the 470MB of RAM on the PS3, give it to a PC dev, and ask him to make The Last of Us with it: he wouldn’t even manage the welcome screen or the selection menu. PC games are coded with the feet instead of smartly using the brain and optimizing everything.
    A console coder will be like ‘damn, I have 20MB to display this room, all the 3D objects, lights, textures’, and he manages to do it. A PC coder will be like ‘well, the guy has 3GB of video RAM and 8GB of system RAM, to hell with optimization’, and that same room will eat up 300MB with the same assets. Only, at higher resolutions, the textures will be less compressed than on consoles. But they’re still the same textures.

    For 4-5 years already, PC gamers have had big graphics cards, huge CPUs, and tons of RAM, while current-gen consoles have been working with 10-year-old DirectX 9. PC had DirectX 10, DirectX 11, etc. Even big titles barely use more than 10GB of HD space. PC devs should include huge, ultra-high-quality textures so there’s a real difference between playing at high and ultra. But no, they barely have more than the console textures, with less compression. Even today, where are the PC games that CAN use 5+GB of video RAM, 10+GB of system RAM, and 40-50GB on the HD? Where? The tech has been there for 4, 5, 6 years.

    Just give a console dev a 3770K with 8-16GB of RAM and a nice card with 2-5GB of video RAM, and they would make Crysis 3 run at 4K and 120fps on PC, if they used the same level of optimization as on consoles.

    PC gamers are proud to have rigs that can run games at ultra. Do you realize that with the tech devs have when making games, if they really optimized every MB of RAM and every CPU cycle, games could look 3 times better?

    And why is this, I ask you?
    Because if DirectX gave the same level of coding to the metal as consoles, and AMD/Nvidia drivers were perfectly optimized to use all the DirectX features, and coders optimized PC games like console games, what would happen?

    Well, an average CPU, a simple 4GB of system RAM, and a $200 graphics card would allow you guys to play all games at ultra, at 2560*1600 and 60+fps.
    And if that happened, would Intel sell $1000 CPUs? Would Nvidia still sell $700 graphics cards? What for? For the 20 PC gamers in the world who have a 4K display?

    That’s why games always seem to require the latest GPU and CPU to run. That keeps Intel and Nvidia/AMD alive.

    Tell me, why doesn’t Intel invest in 15nm, or 8nm, instead of waiting for 2017 or later to do so?
    In one year, they could release a CPU 10 times faster than a 4770K that would eat up just 20-30W, and they would definitely kill AMD’s CPUs. Why don’t they do it?
    Because with such a CPU, you wouldn’t need to upgrade your PC for 5-7 years. And meanwhile, Intel wouldn’t sell a single extra CPU; everybody would already have the ‘best’.

    No, they prefer releasing a new CPU or GPU each year and having people spend billions on upgrades.

    Again, ask me why Nvidia doesn’t make a graphics card with 10GB of RAM and 4 Titan chips and sell it for $2000. Or 16GB and 8 Titan chips for $4000. Why not? Let’s assume a PSU could provide enough juice to power it. Why don’t they do it? Once again, because the guy who bought it wouldn’t buy another card until 2018.
    They prefer slightly upgrading the hardware and having people spend $500 every single year.

    One really positive thing for PC gamers is the Xbox One and PS4. Instead of shitting on them, you should applaud with both hands and feet, and embrace them. With the PS3 and X360, no matter the PC game, it would always look miles better than on consoles, without comparison. But now, when games like Killzone Shadow Fall, NBA, Infamous Second Son, and many other awesome 1080p games are released, PC devs will have to move their butts and really work much harder to increase the gap.
    Of course, with a 4770, or an overclocked 2500 with a $500 card, games on PC will STILL look much better.
    But maybe only 10% of gamers actually have rigs like those. Most have much more modest configs and can barely play at 1080p on high settings. And IT’S these guys who will compare their PC games with an Uncharted 4 on PS4, for example, and start saying ‘crap, I have a $1000-1200 rig, and my graphics aren’t that much BETTER than Uncharted 4’. And don’t tell me I don’t know about PCs; I have like 10-12 PCs at home for different purposes (music with Cubase, video/photos, servers, testing, virtualization, etc). Don’t tell me you can buy a PC that will smash a PS4 for $500. You need a fast-response monitor for gaming; for 2560 res, a 27″ will already cost $400-500, and I’m not talking about no-name ones from eBay. Add a CPU, RAM, a graphics card, a nice silent PSU, a nice Noctua, a nice silent Antec-like P280 (I have 2 of these, they are amazing), a DVD player, etc, and you are far from $500.

    With this said, now that next-gen games will start looking amazing (by console standards) and quite good and respectable (by PC standards), PC devs can no longer sit and use 300MB of RAM to display a simple scene just because they know gamers have tons of it. To justify and convince people that PC is really the place where the best graphics are, they need to work harder and finally optimize.
    Because not everybody has Titans or $2000 rigs. Because if a guy watches a gameplay video of COD on PC, at the settings most gamers use, next to Killzone Shadow Fall running as well, will there be a huge difference? Not really.

    All this to say that next-gen consoles are actually REALLY good for PC games and gamers. The bar is being set quite high on next-gen consoles, and PCs need to do the same. And it’s only up to devs to optimize their PC games and use 1MB to display something instead of 10MB. Because a PC game using 10GB of system RAM doesn’t mean the game is so powerful and the engine so advanced that it requires all that power. Most of the time it just means nothing has been optimized.

    Hope you guys understand… instead of attacking with the same old arguments with no thinking behind them.

    Thanks for reading.

    • All this might be true if your assumption that all PC devs are lazy were true. But it’s not, and it’s foolish to think that. Games suffer due to consoles, but in ways other than graphics. Why do you think most games have shitty AI? Why aren’t there massively multiplayer games on consoles? Why do you never see more than 20 characters on screen? Console limitations with regard to CPU horsepower and memory.

      And you’re right! This next generation will help that! But for how long? A couple years down the road and PC is stuck in that same space where all these cool innovations are put on hold in order to have these games release on consoles. It’s frustrating as hell.


      Honestly, I don’t know why I even bother trying. This will probably go right over heads like it has every other time I try to correct someone.

      • Jeffrey Byers

        Jack Slater is completely misinformed about how game devs work.

        John Carmack just gave a speech talking about how great it was to have the extra processing and especially memory in the next-gen consoles because the COST TO OPTIMIZE games was too expensive. John said they’d do most of the coding then spend almost as long optimizing code to fit the memory/processing budget.

        He then said while it would be nice from an aesthetic sense to optimize every game to run at its best on next-gen (applies to PC) it just can’t be done from a fiscal standpoint especially with the spiralling cost of AAA game development.

        Jack, it’s got NOTHING to do with laziness and EVERYTHING to do with something called a budget.

  • Guest

    Just G-Sync that shit in the cloud!

  • John G

    I don’t think the author of this piece understands either the article or the technologies being discussed, because this has absolutely nothing to do with V-Sync or G-Sync.

  • JinOntario

    I’m glad for advancements in hardware, software etc., just so long as it’s a benefit to gamers. Eventually, the best that each platform has to offer is imitated, in one way or another, by the others.

  • Jack Slater

    Attention, attention. Intrusion detected. A ship full of PC elitists is approaching. They’re coming to defend Nvidia technology, which must be superior to anything ever made. If a console can reproduce the same effects via special chips or software, it’s simply forbidden. Only PCs should be able to do it. Consoles must be weak and display average graphics. If Sony’s first parties manage to create amazing graphics that would require ‘very high’ settings on PC to match, once again, it’s forbidden. Never should a $400 console output amazing graphics. Forbidden. People who spent $2500-3000 on a PC and monitor will sue Sony if they make something that looks really good compared with the PC.
    Once again, the PC community won’t allow studios to max out the consoles and release really good-looking games.
    A $400 system should only be authorized to output 640*480 graphics at 30fps, something like ‘very low’ settings on PC with all the graphic enhancements turned off. Anything above that, people who have spent $1200 on an SLI setup with 2 graphics cards simply won’t tolerate. Once again, it’s forbidden. These guys should make a petition and send it to Guerrilla Games, asking them to lower the resolution to 720p max and 20-30fps, with blurred textures. That’s all a PS4 should be allowed to do, because of its low price.

    Attention, attention. More ships approaching.

    • Guest

      More like a ship full of crybaby $0N¥ paupers.

    • John G

      Why are you so angry?

    • baz

      That was quite a rant. Most of the time you can equate cost to performance with quite a high degree of accuracy. PCs at the enthusiast level are very expensive; the GPU alone may cost double what an Xbox One or PS4 costs. It is not a case of brand loyalty or any such nonsense, but purely a function of paying more for more performance. Why not stop talking absolute shit on internet forums and start examining the technology at play?

      PCs are not ‘better’ than consoles on a dollar for dollar or pound for pound basis. They are simply the by product of spending more money for more performance. I have owned PCs and game consoles for decades and fanboyism, ignorance and blatant stupidity towards matters of hardware really annoys me.

      If you are an enthusiast, why not educate yourself a little on the underlying technology? There is no great mystery here. As a matter of fact, the new generation of consoles is completely PC-derived; do a little reading.

      • ChrisW

        “PCs are not ‘better’ than consoles on a dollar for dollar or pound for pound basis.”

        Absolutely true! Consoles give you way more bang for your buck concerning entertainment. And despite having a rather ‘high-end’ PC myself, I know very well that if one were to build a computer for around $400 right now (minus monitor, keyboard, mouse, and other accessories), it would more than likely perform worse than the next-gen consoles. However, 3 or 4 years into the next-gen consoles’ life cycle, PCs *might* be able to outperform them… but not by much.

        • Try it. Try building one with that budget. If you have any sort of chops for PC building you’ll find it’s actually pretty easy to match or exceed. Then take into consideration there’s no yearly fee for online… add +$50 to the budget. Game sales/humble bundles/AAA free to play? +$200 to the budget. Now you have something with a little longevity too.

          It’s not that hard. Just takes a little thought.

    • EZ

      WTF are you making stupid and inaccurate labels for? Who in their right mind thinks every PC gamer owns an Nvidia card, when the Steam Hardware Survey says otherwise? And why am I seeing strings of random resolutions and frame rates? It’s an attempt to look smart and talk about a topic you clearly have no understanding of.

  • Joseph Lan

    “…the console’s scaler will allow developers to employ a dynamic
    framebuffer in games without the need for additional software
    intervention, given that the scaler parameters can be altered on a
    frame-by-frame basis.”

    This “dynamic frame buffer” has absolutely nothing to do with frame-tearing, V-sync, or G-sync. What the developers basically mean is that they have the ability to use the built-in scaler of the Xbone to change the resolution (for example, dropping from 900p to 720p). They would drop the resolution in case the console is too slow to render a complex scene without major slowdown. This is NOT a fix for screen tearing or frame timing issues.

    Many current-gen PS3 and 360 games already have dynamic resolutions. For example, Riddick: Dark Athena, Wipeout HD, and Black Ops 2 on PS3.

  • MJA6

    HERE WE GO AGAIN! Sony fans unite!
    Despite this article coming off as incomprehensible gibberish to the majority of people about to read it (myself included), I feel some strong opinions about to form…

  • Level Zero

    Honest opinion? You shouldn’t write about hardware unless you know what it is. G-Sync is intended to ensure only complete frames are displayed, at a rate the supporting GPU can handle. It’s a technology to support the DISPLAY, not the graphics processor, allowing for smooth play and no torn frames even with V-sync turned off on the GPU, at any framerate you want, not locked to 30 or 60.

    Again, this is technology aimed at the DISPLAY, not the gpu.

    • SHU

      Arigatou gozaimashita (thank you very much).

      • American Joe

        Thank you, Microsoft, for outsourcing programming to India and production to China. Made in the USA? Yeah, right. Who do you think you are, Wal-Mart?

        • Megaman

          Yes… Xbox One… everything you need in one box. Don’t hate.

      • Erudito De Firenze

        MS uses Foxconn too, and Sony consoles typically do have better cooling than MS ones.

      • chimp

        LOL. Do you think any American company is purely American anymore? Most of them outsource to Asia.

    • Megaman

      What system are you picking up at launch or at all??? 🙂

    • Garrett McGinnis

      You’re right, and every idiot who reports on this so-called “story” tries to spin it like lowering the resolution when the framerate drops is somehow akin to matching the frames drawn by the GPU to the frames displayed by the monitor. They’re not the same thing. This sort of resolution-changing “technology” deserves to be laughed at, hard. What happened to that 1080p @ 60fps fairy tale? Guess that’s all it was.

  • Axe99

    Maintaining a constant frame rate by scaling resolution has been done in a few games this gen. Only one I can think of, off the top of my head, is Wipeout HD on the PS3, but I think a number of titles across both HD platforms this gen have given it a crack. It sounds like the XB1 hardware will make this easier to do, but it’s not something ‘new’ per se.

    • Usman Khan

      G-Sync will work only with monitors equipped with the G-Sync chip.

      • Axe99

        Aye, that’s what I meant – G-Sync sounds interesting, but given it’ll need a new GPU (for me) and a new monitor, I’ll wait until the tech’s matured a bit and the prices have dropped.

        • John G

          That’s not an unreasonable position.

          And as I mentioned below, this has zero to do with V-sync or screen tearing or G-Sync.

          It’s not even a particularly great feature. Unfortunately, some devs will simply use it instead of spending the time optimizing level design or the 3D renderer, leading to games that jump around in resolution, and therefore in clarity/detail/image quality. Not sure that’s necessarily a good thing.

          • Axe99

            Aye, agreed. It’s a confused and misleading article, this one.