Review of NVIDIA G-Sync technology and the ASUS ROG SWIFT PG278Q monitor

G-Sync technology overview | A Brief History of Fixed Refresh Rates

Once upon a time, monitors were bulky, built around cathode ray tubes and electron guns. The electron gun fires electrons at the screen to illuminate colored phosphor dots, which we call pixels, drawing each "scan" line from left to right, top to bottom. Varying the gun's speed from one full refresh to the next was never really practiced, and there was no particular need for it before the advent of three-dimensional games. So CRTs and the analog video standards built around them were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the associations responsible for standardizing video signals (led by VESA) never moved away from the fixed refresh rate. Movies and television still rely on a constant-frame-rate input, so once again, switching to a variable refresh rate did not seem necessary.

Variable frame rates and fixed refresh rates do not mix

Before the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. The problem arose with powerful GPUs: the rate at which a GPU renders individual frames (what we call the frame rate, usually expressed in FPS, frames per second) is not constant. It changes over time: in a graphically heavy scene the card may deliver 30 FPS, while looking at an empty sky it may hit 60 FPS.


Disabling sync causes tearing

It turns out that a variable GPU frame rate and a fixed-refresh LCD panel do not work well together. In this configuration we run into a graphical artifact called tearing: two or more partial frames are drawn during a single monitor refresh cycle. They are usually offset from each other, which produces a very unpleasant effect in motion.

The image above shows two well-known artifacts that are common but hard to capture. Since these are display artifacts, you won't see them in ordinary game screenshots; our pictures show what you actually see while playing. Capturing them requires a camera with a high-speed shooting mode, or, if you have a video capture card, you can record the uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the method we use for FCAT testing. Still, the effect is best observed with your own eyes.

The tearing effect is visible in both images. The top one was taken with the camera, the bottom one through the video capture function. The bottom image is "sliced" horizontally and looks misaligned. In the top two images, the left shot was taken on a Sharp screen at 60Hz, the right shot on an Asus display at 120Hz. The tearing on the 120Hz display is less pronounced because the refresh rate is twice as high; the effect is still visible, however, and appears in the same way as in the left image. This type of artifact is a clear indication that the images were taken with vertical sync (V-sync) disabled.


Battlefield 4 on GeForce GTX 770 with V-sync off

The second effect, seen in the BioShock: Infinite footage, is called ghosting. It is especially visible at the bottom of the left image and is related to slow pixel response. In short, individual pixels don't change color fast enough, resulting in this kind of smearing. A single frame cannot convey what ghosting does to the game itself. A panel with an 8 ms gray-to-gray response time, such as the Sharp, will blur the image whenever there is motion on screen. This is why these displays are generally not recommended for first-person shooters.

V-sync: trading one problem for another

Vertical sync, or V-sync, is a very old solution to tearing. When it is enabled, the graphics card synchronizes its output to the screen's refresh rate, which removes tearing completely. The problem is that if your graphics card cannot keep the frame rate above 60 FPS (on a 60Hz display), the effective frame rate jumps between integer divisors of the refresh rate (60, 30, 20, 15 FPS, and so on), which in turn produces noticeable stuttering.


When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, since V-sync makes the graphics card wait, and sometimes relies on an extra back buffer, it can add extra input latency to the render chain. So V-sync can be both a salvation and a curse, solving some problems while creating others. An informal survey of our staff found that gamers tend to leave V-sync off, turning it on only when tearing becomes unbearable.
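To make the divisor behaviour described above concrete, here is a minimal sketch. It is our own simplified model, not Nvidia's or any driver's actual logic, and it ignores buffering and pipelining:

```python
import math

# Simplified model: with V-sync on, a finished frame is held until the next
# refresh boundary, so each frame occupies a whole number of refresh periods.
# Real drivers pipeline work, so treat this as intuition only.
def vsync_effective_fps(render_fps: float, refresh_hz: float = 60.0) -> float:
    render_time = 1.0 / render_fps
    refresh_period = 1.0 / refresh_hz
    periods_per_frame = math.ceil(render_time / refresh_period)
    return 1.0 / (periods_per_frame * refresh_period)

for fps in (75, 59, 45, 25):
    print(f"GPU renders {fps:>2} FPS -> display shows {vsync_effective_fps(fps):.0f} FPS")
# 75 -> 60, 59 -> 30, 45 -> 30, 25 -> 20: the divisors of 60 described above.
```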

Get Creative: Nvidia Introduces G-Sync

With the launch of the GeForce GTX 680, Nvidia introduced a driver mode called Adaptive V-sync, which mitigates V-sync's drawbacks by enabling it while the frame rate stays above the monitor's refresh rate and quickly disabling it when performance drops below that level. While the technology did its job faithfully, it was only a workaround: it did not eliminate tearing when the frame rate fell below the monitor's refresh rate.

The G-Sync implementation is much more interesting. Broadly speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can make new monitors run at a variable one.


The GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The packetized data transfer mechanism of the DisplayPort connector opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor's scaler with a module that handles variable blanking, an LCD panel can operate at a variable refresh rate tied to the frame rate the video card outputs (within the monitor's refresh rate range). In practice, Nvidia has used the special features of the DisplayPort interface creatively, killing two birds with one stone.
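To illustrate the variable-blanking mechanism, here is a rough back-of-the-envelope sketch of why stretching the vertical blanking interval changes how long a frame stays on screen. The timing numbers are illustrative assumptions, not the actual parameters of any G-Sync panel:

```python
# Standard display-timing arithmetic: refresh rate = pixel clock / total pixels per frame.
# Keeping the pixel clock constant and padding the frame with extra blank lines
# lengthens the interval before the next frame is scanned out.
def refresh_hz(pixel_clock_hz: float, h_total: int, v_total: int) -> float:
    return pixel_clock_hz / (h_total * v_total)

PIXEL_CLOCK = 300e6      # assumed constant link pixel clock (illustrative)
H_TOTAL = 2200           # visible width plus horizontal blanking (illustrative)
V_VISIBLE = 1080         # visible lines of a 1080p panel

for extra_blank_lines in (45, 1200, 3400):
    hz = refresh_hz(PIXEL_CLOCK, H_TOTAL, V_VISIBLE + extra_blank_lines)
    print(f"{extra_blank_lines:>5} blanking lines -> ~{hz:5.1f} Hz refresh")
# A short blanking interval gives ~120 Hz; stretching it holds the same frame
# on screen down toward ~30 Hz, which is how a variable refresh rate is achieved.
```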

Even before the tests begin, I want to give the company credit for a creative approach to a real problem affecting PC gaming. This is innovation at its finest. But how does G-Sync perform in practice? Let's find out.

Nvidia sent us an engineering sample of the Asus VG248QE monitor with its scaler replaced by a G-Sync module. We are already familiar with this display: it was the subject of "Asus VG248QE review: $400 24" 144Hz gaming monitor", where it earned the Tom's Hardware Smart Buy award. Now it's time to find out how Nvidia's new technology affects the most popular games.

G-Sync technology overview | 3D LightBoost, built-in memory, standards and 4K

As we browsed Nvidia's press releases, we asked ourselves quite a few questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed is that Nvidia sent an Asus VG248QE monitor modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, pulsing the panel's backlight to reduce ghosting (motion blur). Naturally, we wanted to know whether this technology is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be the ideal solution, today strobing the backlight at a variable refresh rate causes flickering and brightness problems. Solving them is incredibly difficult, since the brightness has to be adjusted and the pulses tracked. As a result, for now you have to choose between the two technologies, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know, G-Sync eliminates the incremental input lag associated with V-sync, since there is no longer any need to wait for the panel scan to complete. However, we noticed that the G-Sync module has built-in memory. Can the module buffer frames on its own? If so, how long does a frame take to pass through the new pipeline?

According to Nvidia, frames are not buffered in the module's memory. Data is displayed on screen as it arrives, and the memory performs other functions. The processing time for G-Sync is noticeably less than one millisecond. In fact, it is almost the same delay we experience with V-sync turned off, and it depends on the game, video driver, mouse, and so on.

Will G-Sync be standardized?

This question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync. We wanted to ask the developer directly whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard that provides variable refresh rates; after all, Nvidia is a member of the VESA association.

However, no new specifications for DisplayPort, HDMI, or DVI are planned. G-Sync already works over DisplayPort 1.2, so the standard does not need to be changed.

As noted, Nvidia is working on G-Sync compatibility with the technology currently called 3D LightBoost (it will soon get a different name). In addition, the company is looking for a way to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD Resolutions

Nvidia promises monitors with G-Sync support and resolutions up to 3840x2160 pixels, but the Asus model we are reviewing today only supports 1920x1080. At the moment, Ultra HD monitors use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered whether the G-Sync module supports MST configurations.

Truth be told, 4K displays with variable refresh rates will have to wait. There is no standalone 4K scaler yet; the nearest one should appear in the first quarter of 2014, with monitors built around it arriving only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear after that point. Fortunately, the module natively supports Ultra HD.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. This is because at very low refresh rates the image on an LCD begins to degrade, producing visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel on its own, avoiding these problems. This means a single image may be shown more than once, but the lower threshold is 30 Hz, which preserves the best possible image quality.
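Here is a small sketch of that repeat logic. It is our own reading of Nvidia's description, not a documented algorithm, and the 30/144 Hz limits are simply the panel's stated range:

```python
import math

# Assumed behaviour: at or above 30 FPS the panel follows the GPU one-to-one
# (capped at its maximum); below 30 FPS the previous frame is re-shown enough
# times that the panel never refreshes slower than 30 Hz.
def panel_refresh_hz(source_fps: float, floor_hz: float = 30.0, ceiling_hz: float = 144.0) -> float:
    if source_fps >= floor_hz:
        return min(source_fps, ceiling_hz)
    repeats = math.ceil(floor_hz / source_fps)   # times each frame is displayed
    return source_fps * repeats

for fps in (120, 45, 24, 15):
    print(f"{fps:>3} FPS source -> panel runs at ~{panel_refresh_hz(fps):.0f} Hz")
# 120 -> 120 Hz, 45 -> 45 Hz, 24 -> 48 Hz (each frame shown twice), 15 -> 30 Hz.
```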

G-Sync technology overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to high refresh rate panels only?

You will notice that the first G-Sync monitor has a very high refresh rate (well above what the technology requires) and a resolution of 1920x1080 pixels. But the Asus display has its own limitations, such as a 6-bit TN panel. We were curious: is G-Sync planned only for high-refresh-rate displays, or will we also see it on the more common 60 Hz monitors? In addition, we would like access to 2560x1440 resolution as soon as possible.

Nvidia reiterated that the best G-Sync experience comes when your video card keeps the frame rate in the 30 - 60 FPS range. So ordinary 60 Hz monitors fitted with a G-Sync module really can benefit from the technology.

But why use a 144Hz monitor then? It seems that many monitor manufacturers have decided to implement a low-motion-blur feature (3D LightBoost), which requires a high refresh rate. But those who choose not to use that feature (and why not, since it is not yet compatible with G-Sync) can build a G-Sync panel for much less money.

Speaking of resolutions, it's shaping up like this: QHD screens with refresh rates above 120Hz could start shipping as early as the beginning of 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

Now, of course, you don't need to combine two graphics adapters to display a 1080p image; even a mid-range Kepler-based card can deliver the performance needed to play comfortably at this resolution. But there is also no way to run two cards in SLI driving three G-Sync monitors in Surround mode.

This limitation comes down to the display outputs on current Nvidia cards, which typically have two DVI ports, one HDMI and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect three monitors in Surround mode to three cards, i.e. one card per monitor. Naturally, we expect Nvidia's partners to start releasing "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Triple buffering used to be required to play comfortably with V-sync. Is it needed with G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls, it actually hurts G-Sync by adding an extra frame of latency with no performance gain. Unfortunately, triple buffering is often set by the game itself and cannot be overridden manually.
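As a rough illustration of that "extra frame of latency" (a simplification we are adding, not a figure from Nvidia): each additional queued back buffer costs about one refresh period before a finished frame reaches the screen.

```python
# Simplified arithmetic for the latency a queued back buffer adds: each finished
# frame waits behind the frames already queued, roughly one refresh per frame.
def added_queue_latency_ms(extra_queued_frames: int, refresh_hz: float = 60.0) -> float:
    return extra_queued_frames * 1000.0 / refresh_hz

print(f"{added_queue_latency_ms(1):.1f} ms")         # ~16.7 ms: one extra buffered frame at 60 Hz
print(f"{added_queue_latency_ms(1, 144.0):.1f} ms")  # ~6.9 ms at 144 Hz
```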

What about games that usually react badly when V-sync is disabled?

Games like Skyrim, which is part of our test suite, are designed to run with V-sync on a 60Hz panel (although this sometimes makes our life difficult because of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with games based on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they capped at 60 FPS?

Next, you need a monitor with an Nvidia G-Sync module. This module replaces the screen's scaler, so it is impossible, for example, to add G-Sync to a tiled Ultra HD display. In today's review we use a prototype with a resolution of 1920x1080 pixels and a refresh rate of up to 144Hz. But even with it you can get an idea of what impact G-Sync will have if manufacturers start putting it into cheaper 60 Hz panels.

You also need a DisplayPort 1.2 cable; DVI and HDMI are not supported. In the short term, this means the only way to run G-Sync on three monitors in Surround mode is to connect them to a triple SLI setup, since each card has only one DisplayPort connector and DVI-to-DisplayPort adapters do not work here. The same goes for MST hubs.

And finally, don't forget driver support. The latest package, version 331.93 beta, is already G-Sync compatible, and we expect future WHQL-certified releases to include it as well.

Test bench

Test bench configuration
  • CPU: Intel Core i7-3970X (Sandy Bridge-E), base clock 3.5 GHz, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading enabled, power-saving features enabled
  • Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
  • RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28, 1.65 V
  • Storage: Samsung 840 Pro SSD, 256 GB, SATA 6 Gb/s
  • Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
  • Power supply: Corsair AX860i, 860 W

System software and drivers
  • OS: Windows 8 Professional 64-bit
  • DirectX: DirectX 11
  • Video driver: Nvidia GeForce 331.93 beta

Now we need to figure out in which cases G-Sync has the biggest impact. Chances are you are already using a 60Hz monitor. 120 and 144 Hz models are gaining popularity among gamers, but Nvidia rightly assumes that the majority of enthusiasts will still stick with 60 Hz.

With V-sync active on a 60Hz monitor, the most noticeable artifacts appear when the card cannot deliver 60 FPS, resulting in annoying jumps between 30 and 60 FPS and visible stuttering. With V-sync disabled, tearing is most noticeable in scenes that require frequent camera rotation or contain a lot of movement. For some players this is so distracting that they simply turn V-sync on and put up with the stuttering and input lag.

With refresh rates of 120 and 144 Hz and higher frame rates, the display refreshes more often, reducing the time a single frame persists across multiple screen scans when performance is poor. However, the problems of active and inactive vertical sync remain. For this reason, we will test the Asus monitor in 60 Hz and 144 Hz modes, with G-Sync both on and off.

G-Sync technology overview | Testing G-Sync with V-Sync enabled

It's time to start testing G-Sync. All that remains is to install a video capture card and an array of several SSDs and get on with the tests, right?

No, that's wrong.

Today we are measuring not performance but quality. In our case the tests can show only one thing: the frame rate at a given moment in time. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. So we have to rely on our carefully verified and eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let readers judge for themselves? Because the camera records at a fixed 60 Hz, and your monitor also plays video back at a constant 60Hz refresh rate. Since G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-Sync on, G-Sync off, 60Hz, 120Hz, 144Hz, ... The list goes on and on. But we'll start with a 60Hz refresh rate and active vsync.

It's probably easiest to start with Nvidia's own demo utility, which swings a pendulum from side to side. The utility can simulate a frame rate of 60, 50 or 40 FPS, or let the frequency fluctuate between 40 and 60 FPS. You can then disable or enable V-sync and G-Sync. Although the test is synthetic, it demonstrates the technology's capabilities well. You can watch a scene at 50 FPS with V-sync on and think, "It's good enough, and the visible stuttering can be tolerated." But after G-Sync is activated you immediately want to say: "What was I thinking? The difference is night and day. How could I live with this before?"

But let's not forget that this is a tech demo; we would like evidence from real games. For that, you need a demanding title, such as Arma III.

For Arma III we installed a GeForce GTX 770 in the test machine and set ultra detail. With V-sync disabled, the frame rate fluctuates between 40 and 50 FPS, but enabling V-sync drops it to 30 FPS. Performance is not high enough to see constant jumps between 30 and 60 FPS; instead, the card's frame rate simply decreases.

Since there was no stuttering to begin with, the difference with G-Sync active was not dramatic, apart from the actual frame rate jumping 10 - 20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We find Arma generally less "jerky" than many other games, so you don't feel any lag.

In Metro: Last Light, on the other hand, G-Sync's influence is more pronounced. With a GeForce GTX 770 the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation and motion blur. From there you can step SSAA from 1x to 2x to 3x to gradually reduce the frame rate.

In addition, the game's environment includes an antechamber where it is easy to strafe back and forth. We ran the level with V-sync active at 60 Hz and entered the city. Fraps showed 30 FPS with 3x SSAA and 60 FPS with anti-aliasing turned off. In the first case, stuttering and lag are noticeable; with SSAA disabled you get a perfectly smooth picture at 60 FPS. However, 2x SSAA causes fluctuations between 60 and 30 FPS, and every duplicated frame is an annoyance. This is one of the games where we would definitely turn V-sync off and simply ignore the tearing; many people have already made that a habit.

G-Sync, however, removes all of these negative effects. You no longer have to watch the Fraps counter, waiting for dips below 60 FPS before lowering yet another graphics setting. On the contrary, you can raise some of them, because even if performance drops to 50 - 40 FPS there is no obvious stuttering. What if you turn vertical sync off? You will find out later.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, a small sample of respondents), but almost all of them understand what vertical sync is and what compromises users have to accept with it. According to them, they resort to V-sync only when the tearing caused by a large mismatch between frame rate and refresh rate becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to mistake, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at maximum settings, and because Crysis 3 is a first-person shooter with very dynamic gameplay, the tearing can be quite noticeable. In the example above, the FCAT output was captured between two frames; as you can see, the tree is cut right through.

On the other hand, when we force V-sync off in Skyrim, the tearing isn't as bad. Note that in this case the frame rate is very high and several frames appear on screen with each scan, so the amount of movement per frame is relatively low. Playing Skyrim in this configuration has its problems, and it may not be the most representative case, but it shows that even with V-sync off the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a pretty clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when vsync is enabled.

The last graph shows that Metro: Last Light with G-Sync at 144Hz generally delivers the same performance as with V-sync disabled. What the graph cannot show, however, is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate will cap at 60 FPS, but there will be no stuttering or lag.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over, have learned to judge visually how good a particular result is; that is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active are immediately apparent: it has the smoothness of V-sync on, without the tearing characteristic of V-sync off. Too bad we can't show the difference in video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games: Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite and Battlefield 4 all visited the test bench. All of them, except Skyrim, benefited from G-Sync. The effect depended on the specific game, but once you saw it, you immediately admitted how readily you had been ignoring the shortcomings that were there before.

Artifacts can still appear. For example, the crawling associated with aliasing is more noticeable with smooth motion; you will most likely want to set anti-aliasing as high as possible to remove the unpleasant jaggies that were less noticeable before.

Skyrim: Special Case

The Creation engine that Skyrim is built on enables vertical sync by default. To test the game at frame rates above 60 FPS, you need to add the line iPresentInterval=0 to one of the game's .ini files.

Skyrim can therefore be tested in three ways: in its original state, letting the Nvidia driver "use the application settings"; with G-Sync enabled in the driver and the Skyrim settings left intact; and with G-Sync enabled and V-sync disabled via the game's .ini file.

The first configuration, with the test monitor set to 60 Hz, delivered a stable 60 FPS at ultra settings with a GeForce GTX 770, so we got a smooth and pleasant picture. However, user input still suffers from latency, and side-to-side strafing revealed noticeable motion blur. Still, this is how most people play on PC. You can, of course, buy a screen with a 144Hz refresh rate, and it really does eliminate the blur, but since the GeForce GTX 770 delivers around 90 - 100 FPS, there is noticeable stuttering when the engine fluctuates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture, probably because of the active vertical sync, even though the technology is supposed to work with V-sync disabled. Now side strafing (especially close to walls) leads to pronounced stuttering. This is a potential problem for 60Hz G-Sync panels, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and despite the active V-sync, G-Sync works at that frame rate without complaint.

Disabling vertical sync entirely in Skyrim results in much "sharper" mouse control, but it introduces tearing (not to mention other artifacts, such as shimmering water). Enabling G-Sync leaves the stuttering at 60Hz, but at 144Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we would not recommend playing that way.

For Skyrim, perhaps the best solution is to disable G-Sync and play at 60Hz, which gives you a consistent 60 FPS at your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of the Asus monitor with G-Sync, we were encouraged by the fact that Nvidia is working on a very real problem affecting games that had yet to be addressed. Until now, you could turn V-sync on or off as you liked, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until tearing becomes unbearable, you are merely choosing the lesser of two evils.

G-Sync solves the problem by allowing the monitor to scan the screen at a variable frequency. Innovation like this is the only way to keep advancing our industry and maintain the technical edge of personal computers over gaming consoles and platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt; however, the company's solution uses DisplayPort 1.2, and just two months after the technology was announced, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you've never seen in action can inspire anyone. But if your first G-Sync experience is Nvidia's pendulum demo, you are bound to wonder whether such a huge difference is even possible, or whether the demo represents a special scenario that is too good to be true.

Naturally, the effect is less clear-cut when testing the technology in real games. On the one hand there were exclamations of "Wow!" and "Unbelievable!"; on the other, "I think I see the difference." The effect of activating G-Sync is most noticeable when going from a 60 Hz to a 144 Hz display refresh rate, but we also tried testing at 60Hz with G-Sync to see what you will (hopefully) get with cheaper displays in the future. In some cases, simply going from 60 to 144Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144Hz refresh rate. The version without G-Sync has already received our Smart Buy award for outstanding performance, but for us personally the 6-bit TN panel is a drawback. We would really like to see 2560x1440 on an IPS panel; we would even settle for a 60Hz refresh rate if that helped keep the price down.

Although we expect a whole wave of announcements at CES, we have not heard official comments from Nvidia about other displays with G-Sync modules or their prices. We are also not sure of the company's plans for an upgrade kit that is supposed to let you install a G-Sync module into an already purchased Asus VG248QE in 20 minutes.

For now, we can say it was worth the wait. In some games the influence of the new technology is unmistakable; in others it is less pronounced. Either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is another interesting thought. Now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on this possibility. What if it has something in the works? The end of 2013 and the beginning of 2014 bring us plenty to discuss, including the Mantle version of Battlefield 4, Nvidia's upcoming Maxwell architecture, G-Sync, AMD's xDMA engine for CrossFire, and rumors of new dual-GPU graphics cards. Right now we don't have enough graphics cards with more than 3 GB (Nvidia) and 4 GB (AMD) of GDDR5 memory, but they cost less than $1000...

Test Methodology

The ASUS ROG SWIFT PG278Q monitor was tested using our new methodology. We decided to ditch the slow and sometimes inaccurate Spyder4 Elite in favor of the faster and more accurate X-Rite i1Display Pro colorimeter. This colorimeter, together with the latest version of the Argyll CMS software package, is now used to measure the display's main parameters. All operations are carried out in Windows 8. During testing, the screen refresh rate is 60 Hz.

In accordance with the new methodology, we measure the following monitor parameters:

  • White brightness at backlight power from 0 to 100% in 10% steps;
  • Black brightness at backlight power from 0 to 100% in 10% steps;
  • Display contrast at backlight power from 0 to 100% in 10% increments;
  • Color gamut;
  • Color temperature;
  • Gamma curves of the three primary RGB colors;
  • Gamma curve in grey;
  • Delta E (according to CIEDE2000 standard).

Calibration and Delta E analysis are performed with DispcalGUI, the graphical front-end for Argyll CMS, in the latest version available at the time of writing. All of the measurements described above are carried out before and after calibration. During the tests we measure the monitor's main profiles: the default one, sRGB (if available) and Adobe RGB (if available). Calibration is carried out in the default profile, except in special cases discussed later. For monitors with wide color gamuts we select the hardware sRGB emulation mode, if available. In the latter case, colors are converted by the monitor's internal LUTs (which can be up to 14 bits per channel) and output to a 10-bit panel, whereas attempting to narrow the gamut to sRGB with the OS's color-correction tools would reduce color-coding accuracy. Before testing begins, the monitor warms up for an hour and all of its settings are reset to factory defaults.

We'll also continue our old practice of posting calibration profiles for the monitors we've tested at the end of the article. At the same time, the 3DNews test lab warns that such a profile will not be able to 100% correct the shortcomings of your particular monitor. The fact is that all monitors (even within the same model) will certainly differ from each other in their small color reproduction errors. It is physically impossible to make two identical matrices - they are too complicated. Therefore, for any serious monitor calibration, a colorimeter or spectrophotometer is necessary. But even a “universal” profile created for a specific instance can generally improve the situation for other devices of the same model, especially in the case of cheap displays with pronounced color rendition defects.

Viewing angles, backlight uniformity

The first thing that interested us about the ASUS PG278Q was its viewing angles, because the monitor uses a TN panel, and TN's biggest problems have always been there. Luckily, things didn't turn out so badly. Of course, IPS panels have wider viewing angles, but we did not have to reposition the ASUS PG278Q often to eliminate contrast and color distortion.

The developers of the ASUS PG278Q could not, however, avoid problems with backlight bleed. The monitor shows slight bleed in all four corners and along the top edge. It is hard to notice while a game is running, but start a movie in a dark room (with the usual black bars at the top and bottom) and the defect immediately becomes apparent.

Testing without calibration

The maximum brightness of the ASUS PG278Q was 404 cd/m², even more than the manufacturer promises. Such a high value is justified by 3D support: with active shutter glasses, the perceived brightness of the monitor can drop by half. The maximum brightness of the black field was 0.40 cd/m², which is also quite good. As a result, static contrast hovers around 1000:1 across the entire backlight range. An excellent result - contrast this high is typical of good IPS panels; MVA, however, remains out of reach.

Our test subject does as well as it needs to with color gamut: the sRGB color space is covered by 107.1%, and the white point sits near the D65 reference.

For games the ASUS PG278Q has a full color palette, but professional photo editing may suffer from slightly oversaturated colors caused by the gamut exceeding sRGB. However, the display is designed for games, so this shortcoming should not be given much weight.

The color temperature of the ASUS PG278Q stayed around 6,000 K during our measurements, 500 K below the norm, which means light shades can take on a slight warm tint.

Only the red gamma curve turned out to be close to the reference; the blue and green curves sagged, although they at least stuck close to each other. The gray scale, meanwhile, is almost fine: in dark tones it barely deviates from the reference curve, and in light tones it drifts away, but not by much.

The average Delta E color accuracy score was 2.08, with a maximum of 7.07. These are not the best results, but first, the ASUS PG278Q is intended for games rather than photo editing, and second, for a TN panel these figures are quite satisfactory.

Testing after calibration

Usually white brightness drops after calibration, often by 10% or more even on fairly high-quality panels. In the case of the ASUS PG278Q it dropped by only about 3%, to 391 cd/m². The black level was unaffected by calibration, so the static contrast ratio fell to roughly 970:1.

Calibration had practically no effect on the color gamut, but the white point returned to its proper place, even if it moved only a little.

After calibration the color temperature rose slightly but did not reach the reference; the gap between the measured and reference values is now roughly 100-200 K instead of 500 K, which is quite tolerable.

The position of the three main gamma curves, unfortunately, did not change much after calibration, while the gray scale began to look a little better.

Calibration had its greatest effect on color accuracy: the average Delta E dropped to 0.36 and the maximum to 1.26. These are excellent results for any panel, and for TN + Film they are simply fantastic.

G-Sync Testing: Methodology

NVIDIA's G-Sync guide suggests game settings that keep the frame rate hovering between 40 and 60 FPS. It is in exactly these conditions, at a 60 Hz refresh rate, that most "freezes" occur with V-Sync enabled. We'll start by comparing three usage scenarios: with V-Sync, without it, and with G-Sync, all at 60Hz.

But remember that raising the refresh rate from 60 to 120/144 Hz by itself makes tearing less noticeable without vertical sync, and with V-Sync it shortens the "freezes" from roughly 17 ms to 8/7 ms, respectively. Is there any real benefit to G-Sync over V-Sync at 144Hz? Let's check that too.

I would like to emphasize that, strictly speaking, a refresh rate is meaningless with G-Sync, so it is not entirely correct to say that we compared, for example, V-Sync and G-Sync at 60 Hz. V-Sync really was at 60Hz, while G-Sync refreshes the screen on demand rather than at a fixed period. Even with G-Sync enabled, however, we can still choose a screen refresh rate in the driver control panel, and FRAPS shows that with G-Sync active exactly the same frame rate ceiling applies as if V-Sync were working. In effect this setting regulates the minimum frame lifetime and, accordingly, the shortest screen refresh interval. Roughly speaking, it sets the frequency range in which the monitor operates: from 30 Hz up to 60-144 Hz.
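A compact way to express that observation (our interpretation of the behaviour, not a documented algorithm) is that the driver's refresh-rate setting acts as a ceiling and the panel's 30 Hz limit as a floor on how long each frame stays on screen:

```python
# Assumed model: with G-Sync the time a frame remains on screen follows the GPU's
# render time, clamped between 1/ceiling (the refresh rate picked in the driver)
# and 1/floor (the panel's 30 Hz minimum, below which frames get repeated).
def frame_display_interval_s(render_time_s: float,
                             ceiling_hz: float = 144.0,
                             floor_hz: float = 30.0) -> float:
    shortest = 1.0 / ceiling_hz
    longest = 1.0 / floor_hz
    return min(max(render_time_s, shortest), longest)

for fps in (200, 90, 50):
    interval = frame_display_interval_s(1.0 / fps)
    print(f"GPU at {fps} FPS -> frame shown for {interval * 1000:.1f} ms "
          f"(~{1.0 / interval:.0f} Hz)")
# At 200 FPS the 144 Hz ceiling kicks in, exactly the FRAPS cap described above;
# at 90 and 50 FPS the panel simply follows the GPU.
```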

To enable G-Sync, open the NVIDIA control panel, find the corresponding link in the left-hand pane and tick its only checkbox. The technology is supported in drivers for Windows 7 and 8.

Then you need to make sure that G-Sync is also enabled in the "3D Settings" section - it can be found in the Vertical Sync submenu.

That's all: G-Sync is now enabled for all games launched in full-screen mode - the feature does not yet work in windowed mode. For testing, we used a bench with a GeForce GTX TITAN Black graphics card.

The tests were carried out in Assassin's Creed: Black Flag and Counter-Strike: Global Offensive. We tested the new technology in two ways: first by simply playing, and then by hunting for tearing with a script that smoothly panned the game camera, in effect "moving the mouse" horizontally. The first method let us evaluate how G-Sync feels "in combat", while the second made the difference between V-sync on/off and G-Sync easier to see.

G-Sync in Assassin's Creed: Black Flag, 60Hz

With V-Sync and G-Sync both off at 60Hz, tearing was clearly visible with almost any camera movement.

The tear is visible in the upper right part of the frame, near the ship's mast

With V-Sync turned on, the tearing disappeared, but "freezes" appeared, which did not benefit the gameplay.

The doubled ship's mast in the photo is one sign of a "freeze"

After enabling G-Sync, both tearing and "freezes" disappeared completely and the game ran noticeably more smoothly. Periodic drops to 35-40 FPS were still noticeable, of course, but thanks to the synchronization between display and video card they did not cause the kind of obvious stutter seen with vertical sync.

However, as they say, seeing once is better than hearing a hundred times, so we made a short video showing the new Assassin's Creed with V-sync on and off, as well as with G-Sync. Of course, the video cannot convey the "live" impression fully, if only because it was shot at 30 frames per second. In addition, the camera "sees" the world differently from the human eye, so the video may show artifacts that are not visible in reality, such as ghosting. Nevertheless, we tried to make the video as clear as possible: at the very least, the presence or absence of tearing is quite noticeable.

Now let's launch Assassin's Creed: Black Flag at minimum settings and see what changes. The frame rate in this mode did not exceed 60 FPS - the configured refresh rate. Without vertical sync, tearing was visible on screen, but as soon as V-Sync was enabled, the tears disappeared and the picture looked almost the same as with G-Sync.

At maximum graphics settings, the frame rate fluctuated around 25-35 FPS. Naturally, tearing without V-Sync and "freezes" with it immediately returned, and even enabling G-Sync could not fix this: at such a low FPS, the GPU itself is the source of the stutter.

G-Sync in Assassin's Creed: Black Flag, 144Hz

With V-Sync and G-Sync disabled, tearing could still be found on screen, but thanks to the 144Hz refresh rate there was far less of it than before. With V-Sync on, the tearing disappeared, but "freezes" occurred more often - almost as at a 60 Hz refresh rate.

Enabling G-Sync, as before, corrected the situation, but the biggest improvement in the picture was noticeable only at high frame rates, 60 FPS and above - and without lowering the settings or adding a second video card of GeForce GTX Titan Black caliber, such a frame rate was unattainable.

G-Sync in Counter-Strike: Global Offensive, 60 and 144 Hz

In online games, gameplay and image quality are affected not only by the video card and monitor but also by ping: the higher it is, the greater the delay in the game's "response". During our tests, ping sat at 25-50 ms and the frame rate hovered around 200 FPS.

Picture settings used in Counter-Strike: Global Offensive

Without G-Sync or V-Sync, CS showed tearing just as Assassin's Creed did. With V-Sync enabled at 60Hz it became harder to play: the frame rate dropped to 60 FPS, and the character began moving unevenly because of the many "freezes".

With G-Sync enabled, the frame rate stayed at 60 frames per second, but the "freezes" became much rarer. They did not disappear completely, but they stopped spoiling the experience.

Now let's raise the refresh rate and see what changes. With G-Sync and V-Sync disabled at 144Hz there is far less tearing than at 60 Hz, though it does not disappear entirely. With V-Sync on, all tearing goes away and the "freezes" become almost invisible: this mode is very comfortable to play in, and movement speed does not suffer. Enabling G-Sync turns the picture into pure candy: the gameplay becomes so smooth that even a 25 ms ping starts to noticeably affect it.

ULMB mode testing

Ultra Low Motion Blur is enabled from the monitor menu, but first you need to turn off G-Sync and set the screen refresh rate to 85, 100 or 120 Hz. Lower or higher frequencies are not supported.

The practical application of this feature is obvious: text smears less when scrolling web pages, and in strategy games and other RTS titles moving units look more detailed.

ASUS ROG SWIFT PG278Q in 3D

The ASUS ROG SWIFT PG278Q is the world's first monitor capable of displaying a stereoscopic image at 2560x1440, thanks to its DisplayPort 1.2 interface - in itself no small achievement. Unfortunately, the monitor has no built-in IR emitter, so we took the emitter from an NVIDIA 3D Vision kit and the glasses from a 3D Vision 2 kit. The combination worked without problems, and we were able to test stereoscopic 3D properly.

We found no ghosting or other artifacts common to stereoscopic video. Occasionally some in-game objects sat at the wrong depth, but that cannot be blamed on the monitor. The ASUS PG278Q is suitable both for watching stereo movies and for playing 3D games - as long as the video adapter can keep up.

Conclusions

Without wishing to downplay NVIDIA's achievement, it should be said that G-Sync is essentially an innovation that rids us of a long-standing and harmful atavism: the regular refreshing of LCD panels, which never actually needed it. It turned out that all this required was small changes to the DisplayPort protocol, which promptly made their way into the 1.2a specification and, according to AMD's promises, will very soon find their way into display controllers from many manufacturers.

So far, however, only the proprietary version of this solution, G-Sync, is available, and we had the pleasure of testing it in the ASUS ROG SWIFT PG278Q. The irony is that this is exactly the kind of monitor on which the benefits of G-Sync are least noticeable: the 144Hz refresh rate alone reduces the notorious tearing to the point where many will be willing to turn a blind eye to it, and with vertical sync the freezes and input lag are less pronounced than on 60Hz screens. In this situation, G-Sync can only polish the smoothness of the game to perfection.

Still, synchronizing screen updates with GPU frame rendering is a more elegant and economical solution than simply refreshing at an ultra-high constant rate. Nor is G-Sync limited to 120/144 Hz panels. The first thing that comes to mind is 4K monitors, which are still limited to 60 Hz both by panel specifications and by video-input bandwidth. Then there are IPS monitors, which also cannot reach 120/144 Hz because of the limitations of the technology itself.

At a 60Hz refresh rate, the effect of G-Sync is hard to overstate. If the frame rate consistently exceeds 60 FPS, plain vertical sync eliminates tearing just as well, but only G-Sync keeps the output smooth when the frame rate drops below the refresh rate. Moreover, G-Sync makes the 30-60 FPS range far more playable, either lowering GPU performance requirements or allowing more aggressive quality settings. And again one's thoughts return to 4K monitors, which demand extremely powerful hardware for good-looking games.

It is also commendable that NVIDIA has adopted pulsed-backlight technology for removing motion blur (ULMB), which we saw earlier in the EIZO Foris FG2421. It's a pity it cannot yet work simultaneously with G-Sync.

The ASUS ROG SWIFT PG278Q itself is attractive above all for combining 2560x1440 resolution with a 144 Hz refresh rate. There were previously no devices with such parameters on the market, and it is high time gaming monitors with this kind of low response time and stereoscopic 3D support outgrew Full HD. The TN panel is hardly worth holding against the PG278Q: this is a genuinely good specimen, with brightness, contrast and, after calibration, color reproduction that IPS displays would envy. Only the limited viewing angles give the technology away. The design, befitting a product of this quality, also deserves praise. The ASUS ROG SWIFT PG278Q receives a well-deserved Editors' Choice award - it turned out that good.

Only the price, around 30 thousand rubles, keeps us from recommending this gaming monitor without hesitation. Moreover, at the time of writing the ASUS ROG SWIFT PG278Q is not yet on sale in Russia, so there is nowhere to see it, or G-Sync, with your own eyes. We hope ASUS and NVIDIA will fix this in the future, for example by showing G-Sync at computer game exhibitions. And the price will probably come down someday...

From the site's file server, you can download the color profile for this monitor, which we received after calibration.

The editors of the site would like to thank Graphitech for providing the X-Rite i1Display Pro colorimeter.

G-Sync technology overview | A Brief History of Fixed Refresh Rates

Once upon a time, monitors were bulky and contained cathode ray tubes and electron guns. Electron guns bombard the screen with photons to illuminate colored phosphor dots, which we call pixels. They draw from left to right each "scan" line from top to bottom. Adjusting the speed of the electron gun from one full upgrade to the next was not very practiced before, and there was no particular need for this before the advent of three-dimensional games. Therefore, CRTs and related analog video standards were designed with a fixed refresh rate.

LCD monitors gradually replaced CRTs, and digital connectors (DVI, HDMI and DisplayPort) replaced analog ones (VGA). But the associations responsible for standardizing video signals (led by VESA) have not moved from a fixed refresh rate. Movies and television still rely on constant frame rate input. Once again, switching to a variable refresh rate doesn't seem necessary.

Adjustable frame rates and fixed refresh rates do not match

Prior to the advent of modern 3D graphics, fixed refresh rates were not a problem for displays. But it arose when we first encountered powerful GPUs: the rate at which the GPU rendered individual frames (what we call frame rate, usually expressed in FPS or frames per second) is inconsistent. It changes over time. In heavy graphics scenes, the card can provide 30 FPS, and if you look at the empty sky - 60 FPS.


Disabling sync causes tearing

It turns out that the variable frame rate of the GPU and the fixed refresh rate of the LCD panel do not work very well together. In this configuration, we are faced with a graphical artifact called "gap". It occurs when two or more incomplete frames are rendered together during one monitor refresh cycle. Usually they are displaced, which gives a very unpleasant effect during movement.

The image above shows two well-known artifacts that are often found but difficult to capture. Since these are display artifacts, you won't see them in normal game screenshots, but our screenshots show what you actually see during the game. To shoot them, you need a camera with a high-speed shooting mode. Or if you have a video capture card, you can record an uncompressed video stream from the DVI port and clearly see the transition from one frame to the next; this is the way we use for FCAT tests. However, it is best to observe the described effect with your own eyes.

The tearing effect is visible in both images. The top one is done with the camera, the bottom one is through the video capture function. The bottom image is "sliced" horizontally and looks misaligned. In the top two images, the left shot was taken on a Sharp screen at 60Hz, the right shot on an Asus display at 120Hz. The tearing on the 120Hz display isn't as pronounced as the refresh rate is twice as high. However, the effect is visible, and appears in the same way as in the left image. This type of artifact is a clear indication that the images were taken with vertical sync (V-sync) disabled.


Battlefield 4 on GeForce GTX 770 with V-sync disabled

The second effect seen in BioShock: Infinite footage is called ghosting. It is especially visible at the bottom of the left image and is related to the screen refresh delay. In short, individual pixels don't change color fast enough, resulting in this type of glow. A single frame cannot convey the effect of ghosting on the game itself. A panel with an 8ms grey-to-gray response time, such as the Sharp, will result in a blurry image with any movement on the screen. This is why these displays are generally not recommended for FPS games.

V-sync: "an sew on the soap"

Vertical sync, or V-sync, is a very old solution to tearing. When this feature is activated, the graphics card tries to match the screen refresh rate by completely removing tearing. The problem is that if your graphics card can't keep the frame rate above 60 FPS (on a 60Hz display), the effective frame rate will jump between multiples of the screen refresh rate (60, 30, 20, 15 FPS, etc.). etc.), which in turn will lead to noticeable braking.


When the frame rate drops below the refresh rate with V-sync active, you will experience stuttering

Moreover, since vsync makes the graphics card wait and sometimes relies on the invisible surface buffer, V-sync can add additional input latency to the render chain. Thus, V-sync can be both a salvation and a curse, solving some problems while causing other disadvantages. An informal survey of our staff found that gamers tend to turn v-sync off, and turn it on only when tearing becomes unbearable.

Get Creative: Nvidia Introduces G-Sync

When starting a new video card GeForce GTX 680 Nvidia has included a driver mode called Adaptive V-sync, which attempts to mitigate the problems of enabling V-sync when the frame rate is above the monitor's refresh rate, and quickly turning it off when performance falls sharply below the refresh rate. While the technology did its job faithfully, it was only a workaround that prevented tearing when the frame rate was below the monitor's refresh rate.

Implementation G-Sync much more interesting. Generally speaking, Nvidia is showing that instead of forcing graphics cards to run at a fixed display frequency, we can force new monitors to run at a variable frequency.


The GPU frame rate determines the refresh rate of the monitor, removing artifacts associated with enabling and disabling V-sync

The Packet data transfer mechanism of the DisplayPort connector has opened up new possibilities. By using variable blanking intervals in the DisplayPort video signal, and by replacing the monitor scaler with a variable blanking module, the LCD panel can operate at a variable refresh rate related to the frame rate output by the video card (within the monitor's refresh rate). In practice, Nvidia has been creative in using the special features of the DisplayPort interface and trying to catch two birds with one stone.

Even before the tests begin, I want to give credit for the creative approach to solving a real problem that affects PC games. This is innovation at its finest. But what are the results G-Sync on practice? Let's find out.

Nvidia sent us an engineering sample of the monitor Asus VG248QE, in which the scaler is replaced by a module G-Sync. We are already familiar with this display. The article is dedicated to him "Asus VG248QE review: $400 24" 144Hz gaming monitor", in which the monitor earned the Tom's Hardware Smart Buy award. Now it's time to find out how Nvidia's new technology will affect the most popular games.

G-Sync technology overview | 3D LightBoost, built-in memory, standards and 4K

As we browsed Nvidia's press releases, we asked ourselves quite a few questions, both about the technology's place in the present and its role in the future. During a recent trip to the company's headquarters in Santa Clara, our US colleagues received some answers.

G-Sync and 3D LightBoost

The first thing we noticed is that Nvidia sent the monitor Asus VG248QE, modified to support G-Sync. This monitor also supports Nvidia's 3D LightBoost technology, which was originally designed to boost the brightness of 3D displays but has long been used unofficially in 2D mode, using a pulsing panel backlight to reduce ghosting (or motion blur). Naturally, it became interesting whether this technology is used in G-Sync.

Nvidia gave a negative answer. While using both technologies at the same time would be the ideal solution, today strobe backlighting at a variable refresh rate results in flickering and brightness issues. Solving them is incredibly difficult, since you need to adjust the brightness and track the pulses. As a result, the two technologies now have to be chosen, although the company is trying to find a way to use them simultaneously in the future.

Built-in G-Sync module memory

As we already know G-Sync eliminates the incremental input lag associated with V-sync, as there is no longer a need to wait for the panel scan to complete. However, we noticed that the module G-Sync has built-in memory. Can the module buffer frames on its own? If so, how long does it take for the frame to pass through the new channel?

According to Nvidia, frames are not buffered in the module's memory. As data arrives, it is displayed on the screen, and the memory serves other functions. The G-Sync processing time is noticeably less than one millisecond. In fact, it is almost the same delay we see with V-sync turned off, and it depends on the game, the video driver, the mouse, and so on.

Will G-Sync be standardized?

A similar question came up in a recent interview with AMD, when a reader wanted to know the company's reaction to G-Sync technology. We wanted to ask the developer directly whether Nvidia plans to bring the technology to an industry standard. In theory, the company could offer G-Sync as an upgrade to the DisplayPort standard that provides variable refresh rates. After all, Nvidia is a member of the VESA association.

However, no new specifications for DisplayPort, HDMI, or DVI are planned. G-Sync already works over DisplayPort 1.2, so the standard does not need to change.

As noted, Nvidia is working on making G-Sync compatible with the technology currently called 3D LightBoost (which will soon get a different name). In addition, the company is looking for ways to reduce the cost of G-Sync modules and make them more accessible.

G-Sync at Ultra HD Resolutions

Nvidia promises G-Sync monitors with resolutions up to 3840x2160 pixels, but the Asus model we are reviewing today supports only 1920x1080. Ultra HD monitors currently use the STMicro Athena controller, which has two scalers to create a tiled display. We wondered whether the G-Sync module supports MST configurations.

Truth be told, 4K displays with variable refresh rates will have to wait. There is no standalone 4K scaler yet; the first should appear in the first quarter of 2014, and monitors equipped with it only in the second quarter. Since the G-Sync module replaces the scaler, compatible panels will start to appear after that point. Fortunately, the module natively supports Ultra HD.

What happens below 30 Hz?

G-Sync can vary the screen refresh rate down to 30 Hz. At very low refresh rates the image on an LCD panel begins to deteriorate, which leads to visual artifacts. If the source delivers fewer than 30 FPS, the module refreshes the panel on its own, avoiding these problems. This means a single image may be displayed more than once, but 30 Hz remains the lower threshold that ensures the best image quality.
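As a rough illustration of the idea (a minimal sketch of our own, not Nvidia's documented algorithm; the repeat logic is an assumption), the module only needs to re-scan the previous frame often enough to keep the panel at or above its 30 Hz floor:

import math

def panel_scan(source_fps, panel_min_hz=30):
    """Repeat each frame enough times that the panel never refreshes below its minimum."""
    if source_fps >= panel_min_hz:
        return source_fps, 1
    repeats = math.ceil(panel_min_hz / source_fps)  # times each frame is shown
    return source_fps * repeats, repeats

for fps in (20, 25, 29, 45):
    scan_hz, repeats = panel_scan(fps)
    print(f"{fps} FPS source -> each frame shown {repeats}x, panel scans at {scan_hz} Hz")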

G-Sync technology overview | 60Hz Panels, SLI, Surround and Availability

Is the technology limited to high refresh rate panels only?

You will notice that the first G-Sync monitor has a very high refresh rate (above the level the technology requires) and a resolution of 1920x1080 pixels. But the Asus display has its own limitations, such as a 6-bit TN panel. We became curious: is G-Sync planned only for high refresh rate displays, or will we also see it on more common 60 Hz monitors? In addition, we would like access to 2560x1440 resolution as soon as possible.

Nvidia reiterated that G-Sync is at its best when the video card keeps the frame rate in the 30-60 FPS range. So the technology can genuinely benefit conventional 60 Hz monitors fitted with a G-Sync module.

Why use a 144 Hz monitor then? It seems that many monitor manufacturers want to implement a low-motion-blur feature (3D LightBoost), which requires a high refresh rate. But manufacturers who decide to skip that feature (and why not, since it is not yet compatible with G-Sync) can build a G-Sync panel for much less money.

Speaking of resolutions, it's shaping up like this: QHD screens with a refresh rate of more than 120Hz could start shipping as early as early 2014.

Are there problems with SLI and G-Sync?

What does it take to see G-Sync in Surround mode?

Now, of course, you don't need to combine two graphics adapters to display an image in 1080p quality. Even a mid-range Kepler-based graphics card will be able to provide the level of performance needed to comfortably play at this resolution. But there is also no way to run two cards in SLI on three G-Sync monitors in Surround mode.

This limitation comes down to the display outputs on current Nvidia cards, which typically have two DVI ports, one HDMI, and one DisplayPort. G-Sync requires DisplayPort 1.2, and adapters will not work (nor will an MST hub). The only option is to connect the three Surround monitors to three cards, one card per monitor. Naturally, we expect Nvidia's partners to start releasing "G-Sync Edition" cards with more DisplayPort connectors.

G-Sync and triple buffering

Triple buffering used to be required to play comfortably with V-sync. Is it needed for G-Sync? The answer is no. Not only does G-Sync not require triple buffering, since the pipeline never stalls, it is actually hurt by it: triple buffering adds an extra frame of latency with no performance gain. Unfortunately, games often enable triple buffering on their own, and it cannot always be disabled manually.

What about games that usually react badly when V-sync is disabled?

Games like Skyrim, part of our test suite, are designed to run with V-sync on a 60 Hz panel (although this sometimes makes life difficult for us because of input lag). Testing them requires modifying certain .ini files. So how does G-Sync behave with games based on the Gamebryo and Creation engines, which are sensitive to vertical sync settings? Are they capped at 60 FPS?

Secondly, you need a monitor with an Nvidia G-Sync module. This module replaces the screen's scaler, so it is impossible, for example, to add G-Sync to a tiled Ultra HD display. In today's review we use a prototype with a resolution of 1920x1080 pixels and a refresh rate of up to 144 Hz. But even with it, you can get an idea of the impact G-Sync will have if manufacturers start putting it into cheaper 60 Hz panels.

Thirdly, a DisplayPort 1.2 cable is required; DVI and HDMI are not supported. In the short term, this means the only way to run G-Sync on three monitors in Surround mode is a triple SLI setup, since each card has only one DisplayPort connector and DVI-to-DisplayPort adapters do not work in this case. The same goes for MST hubs.

And finally, do not forget about driver support. The latest package version 331.93 beta is already compatible with G-Sync, and we anticipate that future WHQL-certified versions will feature it as well.

Test bench

Test bench configuration
CPU: Intel Core i7-3970X (Sandy Bridge-E), 3.5 GHz base clock, overclocked to 4.3 GHz, LGA 2011, 15 MB shared L3 cache, Hyper-Threading enabled, power-saving features enabled
Motherboard: MSI X79A-GD45 Plus (LGA 2011), X79 Express chipset, BIOS 17.5
RAM: G.Skill 32 GB (8 x 4 GB) DDR3-2133, F3-17000CL9Q-16GBXM x2 @ 9-11-10-28 and 1.65 V
Storage: Samsung 840 Pro SSD, 256 GB, SATA 6Gb/s
Video cards: Nvidia GeForce GTX 780 Ti 3 GB; Nvidia GeForce GTX 760 2 GB
Power supply: Corsair AX860i 860 W
System software and drivers
OS: Windows 8 Professional 64-bit
DirectX: DirectX 11
Video driver: Nvidia GeForce 331.93 beta

Now we need to figure out in what cases G-Sync has the biggest impact. Chances are good that you are already using a monitor with a refresh rate of 60Hz. Among gamers, 120 and 144 Hz models are more popular, but Nvidia rightly assumes that the majority of enthusiasts on the market will still stick to 60 Hz.

With V-sync active on a 60 Hz monitor, the most noticeable artifacts appear when the card cannot deliver 60 FPS, resulting in annoying jumps between 30 and 60 FPS and visible stutter. With V-sync disabled, tearing is most noticeable in scenes where you rotate the camera frequently or where there is a lot of movement. Some players find this so distracting that they simply turn V-sync on and put up with the stutter and input lag.
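The 30/60 jump is a direct consequence of double-buffered V-sync: a frame that misses one refresh has to wait for the next, so the effective rate snaps to whole-number divisors of the refresh rate. A minimal sketch (our own idealized model, assuming double buffering and constant render times):

import math

def vsync_fps(render_ms, refresh_hz=60):
    """Effective frame rate with double-buffered V-sync: each frame waits for the
    next whole refresh interval, so the rate snaps to refresh/1, refresh/2, ..."""
    interval_ms = 1000.0 / refresh_hz
    return refresh_hz / math.ceil(render_ms / interval_ms)

for ms in (15.0, 17.0, 20.0, 34.0):
    print(f"{ms} ms per frame -> {1000/ms:.0f} FPS raw, {vsync_fps(ms):.0f} FPS with V-sync at 60 Hz")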

At 120 and 144 Hz and higher frame rates, the display refreshes more often, reducing the time a single frame persists across multiple scans when performance is poor. However, the problems with vertical sync, whether active or not, remain. That is why we will test the Asus monitor in 60 Hz and 144 Hz modes, with G-Sync both on and off.

G-Sync technology overview | Testing G-Sync with V-Sync enabled

It's time to start testing G-Sync. It remains only to install a video capture card, an array of several SSDs and proceed to the tests, right?

No, it's wrong.

Today we are measuring quality, not performance. In our case, benchmarks can show only one thing: the frame rate at a given moment. They say absolutely nothing about the quality of the experience with G-Sync turned on or off. So we will have to rely on our carefully considered and, we hope, eloquent description, which we will try to bring as close to reality as possible.

Why not just record a video and let the readers judge? The problem is that the camera records at a fixed 60 Hz, and your monitor also plays video back at a constant 60 Hz refresh rate. Since G-Sync introduces a variable refresh rate, you would not see the technology in action.

Given the number of games available, the number of possible test combinations is countless. V-sync on, V-sync off, G-Sync on, G-Sync off, 60Hz, 120Hz, 144Hz, ... The list goes on and on. But we'll start with a 60Hz refresh rate and active vsync.

It's probably easiest to start with Nvidia's own demo utility, which swings a pendulum from side to side. The utility can simulate a frame rate of 60, 50, or 40 FPS, or let the frequency fluctuate between 40 and 60 FPS. You can then disable or enable V-sync and G-Sync. Although the test is synthetic, it demonstrates the technology's capabilities well. You can watch the scene at 50 FPS with V-sync turned on and think: "It's quite good, and the visible stutter can be tolerated." But after activating G-Sync, you immediately want to say: "What was I thinking? The difference is like night and day. How could I have lived with this before?"

But let's not forget that this is a tech demo. I would like evidence based on real games. To do this, you need to run a game with high system requirements, such as Arma III.

For Arma III we installed a GeForce GTX 770 in the test machine and chose ultra settings. With V-sync disabled, the frame rate fluctuates between 40 and 50 FPS. But if you enable V-sync, it drops to 30 FPS. The performance is not high enough to see constant jumps between 30 and 60 FPS; instead, the graphics card's frame rate simply decreases.

Since there was no stutter to begin with, there is no significant difference visible when G-Sync is activated, except that the actual frame rate runs 10-20 FPS higher. Input lag should also be reduced, since the same frame is no longer held across multiple monitor scans. We find Arma generally less "jerky" than many other games, so no lag is felt.

In Metro: Last Light, on the other hand, the influence of G-Sync is more pronounced. With the GeForce GTX 770, the game can be run at 1920x1080 with very high detail settings, including 16x AF, normal tessellation, and motion blur. In this case, you can step SSAA from 1x to 2x to 3x to gradually reduce the frame rate.

In addition, the game's environment includes an antechamber where it is easy to strafe back and forth. Running the level with V-sync active at 60 Hz, we entered the city. Fraps showed that with 3x SSAA the frame rate was 30 FPS, and with anti-aliasing turned off it was 60 FPS. In the first case, stutter and lag are noticeable; with SSAA disabled, you get a perfectly smooth picture at 60 FPS. However, enabling 2x SSAA causes fluctuations between 60 and 30 FPS, and every duplicated frame is an annoyance. This is one of the games where we would definitely turn off V-sync and simply ignore the tearing. Many people have developed that habit anyway.

G-Sync, however, removes all of these negative effects. You no longer have to watch the Fraps counter, waiting for drops below 60 FPS as a signal to lower yet another graphics setting. On the contrary, you can raise some of them, because even if performance falls to 40-50 FPS there is no obvious stutter. What happens if you turn off vertical sync? You will find out later.

G-Sync technology overview | Testing G-Sync with V-Sync Disabled

The conclusions in this article are based on a survey of Tom's Hardware authors and friends over Skype (in other words, the sample of respondents is small), but almost all of them understand what vertical sync is and what drawbacks users have to put up with because of it. According to them, they resort to V-sync only when tearing caused by a large mismatch between frame rate and monitor refresh rate becomes unbearable.

As you can imagine, the visual impact of turning V-sync off is hard to miss, although it depends heavily on the specific game and its detail settings.

Take Crysis 3, for example. The game can easily bring your graphics subsystem to its knees at the highest settings, and because Crysis 3 is a first-person shooter with very dynamic gameplay, tearing can be quite noticeable. In the example above, FCAT output was captured between two frames. As you can see, the tree is cut clean in half.

On the other hand, when we force V-sync off in Skyrim, the tearing is not as bad. Note that in this case the frame rate is very high, and several frames appear on screen with each scan, so the amount of movement per frame is relatively small. Playing Skyrim in this configuration has its problems and may not be the most optimal setup, but it shows that even with V-sync turned off the feel of the game can change.

As a third example, we chose a shot of Lara Croft's shoulder from Tomb Raider, which shows a pretty clear tear in the image (also look at the hair and the strap of the tank top). Tomb Raider is the only game in our sample that allows you to choose between double and triple buffering when vsync is enabled.

The last graph shows that Metro: Last Light with G-Sync at 144 Hz generally delivers the same performance as with V-sync disabled. What the graph cannot show is the absence of tearing. If you use the technology with a 60 Hz screen, the frame rate tops out at 60 FPS, but there is no stuttering or lag.

In any case, those of you (and we) who have spent countless hours on graphics benchmarks, watching the same run over and over, have become accustomed to them and can judge visually how good a particular result is. This is how we measure the absolute performance of video cards. Changes in the picture with G-Sync active immediately catch the eye, because you get the smoothness of V-sync on without the tearing characteristic of V-sync off. Too bad we cannot show the difference on video right now.

G-Sync technology overview | Game Compatibility: Almost Great

Checking other games

We tested a few more games. Crysis 3, Tomb Raider, Skyrim, BioShock: Infinite, and Battlefield 4 all visited the test bench. All of them, except Skyrim, benefited from G-Sync. The effect varied depending on the specific game, but once you saw it, you immediately admitted that you had been ignoring the shortcomings that were there all along.

Artifacts can still appear. For example, the crawling associated with aliasing is more noticeable with smooth motion. You will most likely want to set anti-aliasing as high as possible to remove the jagged edges that were less noticeable before.

Skyrim: Special Case

The Creation engine that Skyrim is built on enables vertical sync by default. To test the game at frame rates above 60 FPS, you have to add the line iPresentInterval=0 to one of the game's .ini files.
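For reference, the edit looks something like the snippet below (in our installation the line sits in Skyrim.ini; the exact file and section can vary between setups, so treat the [Display] placement as an assumption):

[Display]
iPresentInterval=0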

Skyrim can therefore be tested in three ways: in its original state, letting the Nvidia driver "use the 3D application settings"; with G-Sync enabled in the driver and Skyrim's settings left intact; and with G-Sync enabled and V-sync disabled in the game's .ini file.

The first configuration, with the test monitor set to 60 Hz, showed a stable 60 FPS at ultra settings with a GeForce GTX 770. Consequently, we got a smooth and pleasant picture. However, user input still suffers from latency, and side-to-side strafing revealed noticeable motion blur. Still, this is how most people play on the PC. Of course, you could buy a 144 Hz screen and it really would eliminate the blur, but since the GeForce GTX 770 delivers a frame rate of around 90-100 FPS, there would be noticeable stutter as the engine fluctuates between 144 and 72 FPS.

At 60 Hz, G-Sync has a negative effect on the picture, probably because of the active vertical sync, even though the technology is supposed to work with V-sync disabled. Now side strafing (especially close to walls) produces pronounced stutter. This is a potential problem for 60 Hz G-Sync panels, at least in games like Skyrim. Fortunately, on the Asus VG248QE you can switch to 144 Hz mode, and despite active V-sync, G-Sync works at that frame rate without complaint.

Disabling vertical sync completely in Skyrim gives much "sharper" mouse control. However, it introduces tearing (not to mention other artifacts, such as shimmering water). Enabling G-Sync still leaves stutter at 60 Hz, but at 144 Hz the situation improves significantly. Although we test the game with V-sync disabled in our video card reviews, we would not recommend playing that way.

For Skyrim, perhaps the best solution would be to disable G-Sync and play at 60Hz, which will give you a consistent 60fps on your chosen graphics settings.

G-Sync technology overview | G-Sync - what are you waiting for?

Even before we received a test sample of the Asus monitor with G-Sync, we were encouraged by the fact that Nvidia is working on a very real problem affecting games that had yet to be addressed. Until now, you could turn V-sync on or off as you pleased, but either decision came with compromises that hurt the gaming experience. If you prefer to leave V-sync off until tearing becomes unbearable, you are merely choosing the lesser of two evils.

G-Sync solves the problem by letting the monitor scan the screen at a variable frequency. Innovation like this is the only way to keep advancing our industry and maintain the technical advantage of personal computers over consoles and other gaming platforms. Nvidia will no doubt face criticism for not developing a standard that competitors could adopt. However, the company uses DisplayPort 1.2 for its solution, and as a result, just two months after its announcement, G-Sync was in our hands.

The question is, is Nvidia delivering everything it promised with G-Sync?

Three talented developers touting the qualities of a technology you have never seen in action can inspire anyone. But if your first experience with G-Sync is based on Nvidia's pendulum demo, you are bound to wonder whether such a huge difference is even possible, or whether the test represents a special scenario that is too good to be true.

Naturally, when testing the technology in real games, the effect is not as clear-cut. On one side there were exclamations of "Wow!" and "Unbelievable!"; on the other, "I think I can see the difference." The effect of activating G-Sync is most noticeable when switching the display refresh rate from 60 Hz to 144 Hz. But we also tried testing at 60 Hz with G-Sync to see what you will (hopefully) get from cheaper displays in the future. In some cases, simply going from 60 to 144 Hz will blow your mind, especially if your graphics card can sustain high frame rates.

Today we know that Asus plans to implement G-Sync support in the Asus VG248QE, which the company says will sell for $400 next year. The monitor has a native resolution of 1920x1080 pixels and a 144 Hz refresh rate. The version without G-Sync has already received our Smart Buy award for outstanding performance. For us personally, though, the 6-bit TN panel is a drawback. We would really like to see 2560x1440 on an IPS panel, and we would even settle for a 60 Hz refresh rate if it helped keep the price down.

Although we expect a whole raft of announcements at CES, we have not heard official comments from Nvidia about other displays with G-Sync modules or about their prices. We are also not sure about the company's plans for an upgrade kit that should let you install a G-Sync module in an already purchased Asus VG248QE monitor in about 20 minutes.

For now, we can say it was worth the wait. In some games the impact of the new technology is unmistakable, in others it is less pronounced. Either way, G-Sync answers the age-old question of whether or not to enable vertical sync.

There is another interesting thought. Now that we have tested G-Sync, how much longer can AMD avoid commenting? The company teased our readers in its interview (in English), noting that it would soon decide on this possibility. What if it has something in the works? The end of 2013 and the beginning of 2014 bring plenty of exciting news to discuss: the Mantle version of Battlefield 4, the upcoming Nvidia Maxwell architecture, G-Sync, AMD's xDMA engine for CrossFire, and rumors of new dual-GPU graphics cards. For now, there are hardly any graphics cards with more than 3 GB (Nvidia) or 4 GB (AMD) of GDDR5 memory that cost less than $1,000...

Do you have a G-SYNC capable monitor and an NVIDIA graphics card? Let's look at what G-SYNC is, how to enable it, and how to configure it correctly to get the full potential out of this technology. Keep in mind that simply turning it on is not everything.

Every gamer knows what vertical synchronization (V-Sync) is. This function synchronizes frames so that screen tearing is eliminated. If you turn vertical sync off on a regular monitor, input lag decreases and the game responds to your commands more quickly, but the frames are no longer synchronized with the display and screen tearing appears.

V-Sync eliminates screen tearing, but it also increases the delay between your input and the picture on screen, making the game less comfortable to play. Every time you move the mouse, the movement seems to happen with a slight lag. This is where G-SYNC comes to the rescue, eliminating both of these shortcomings.

What is G-SYNC?

G-SYNC is a fairly expensive but effective solution for NVIDIA GeForce graphics cards: it eliminates screen tearing without adding latency (input lag). To use it, you need a monitor that contains a G-SYNC module. The module adjusts the screen's refresh rate to the number of frames per second, so there is no additional delay and tearing is eliminated.

Many users, after purchasing such a monitor, only enable NVIDIA G-SYNC support in the NVIDIA Control Panel, convinced that this is all they need to do. In theory that is enough, because G-SYNC will work, but if you want to make full use of the technology you also need to configure a few additional options: setting classic vertical sync appropriately and limiting the FPS in games to a value a few frames below the monitor's maximum refresh rate. Why? You will learn all of this from the following recommendations.

Enabling G-SYNC in the NVIDIA Control Panel

Let's start with the simplest, basic step: enabling the G-SYNC module. This is done in the NVIDIA Control Panel. Right-click on the desktop and select NVIDIA Control Panel.

Then go to the Display section and open the G-SYNC settings. Here you can enable the technology with the "Enable G-SYNC" checkbox. Check it.

You can then specify whether it should work only in full-screen mode, or also in games running in windowed or borderless full-screen mode.

If you select the "Enable G-SYNC for full screen mode" option, then the function will only work in games that have the full screen mode set (this option can be changed in the settings of specific games). Games in windowed mode or fullscreen windows will not use this technology.

If you want "windowed" games to also use G-SYNC technology, then enable the "Enable G-SYNC for windowed and full screen mode" option. When this option is selected, the function intercepts the currently active window and overlays its action on it, enabling it to support modified screen refresh. You may need to restart your computer to activate this option.

How do you check that the technology is actually enabled? Open the Display menu at the top of the window and tick the "G-SYNC Indicator" option. It will show you that G-SYNC is active when you launch a game.

Then go to the "Manage 3D Settings" tab in the sidebar. In the "Global settings" section, find the "Preferred refresh rate" field.

Set this to "Highest available". Some games may impose their own refresh rate on themselves, which may result in G-SYNC not being fully utilized. This setting will override all game settings and will always enable the maximum monitor refresh rate, which is most commonly 144Hz on G-SYNC devices.

That is the basic setup you need to complete to enable G-SYNC. But if you want to make full use of your hardware's potential, read the further instructions.

What should I do with V-SYNC if I have G-SYNC? Leave it enabled or disable it?

This is the most common dilemma for G-SYNC monitor owners. It is generally accepted that this technology completely replaces the classic V-SYNC, which can be completely disabled in the NVIDIA control panel or simply ignored.

First you need to understand the difference between them. The purpose of both functions is theoretically the same, to eliminate screen tearing, but the way they work is very different.

V-SYNC synchronizes frames to the monitor's constant refresh rate. It acts as an intermediary: it holds each rendered frame and displays it only on the next refresh, adapting the output to a constant rate and thus preventing image tearing. The downside is input lag, because V-SYNC must first "capture and hold" the image and only then put it on the screen.

G-SYNC works the other way around: it adjusts not the frames but the monitor's refresh rate to match the number of frames being displayed. Everything is done in hardware by the G-SYNC module built into the monitor, so there is no additional display delay as with vertical sync. This is its main advantage.
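The difference is easy to see on a single frame's timeline. The sketch below is our own simplified model (the numbers and function names are illustrative, not from NVIDIA): with V-SYNC a finished frame waits for the next fixed refresh tick, while with G-SYNC the panel scans out as soon as the frame is ready, limited only by its maximum rate.

import math

def present_vsync(ready_ms, refresh_hz=60):
    """V-SYNC: the frame is held until the next fixed refresh tick."""
    interval = 1000.0 / refresh_hz
    return math.ceil(ready_ms / interval) * interval

def present_gsync(ready_ms, max_hz=144, last_scan_ms=0.0):
    """G-SYNC: the panel scans when the frame is ready, no faster than its max rate."""
    min_interval = 1000.0 / max_hz
    return max(ready_ms, last_scan_ms + min_interval)

frame_ready = 21.0  # frame finishes 21 ms after the previous scan
print(f"V-SYNC shows it at {present_vsync(frame_ready):.1f} ms")  # 33.3 ms (next 60 Hz tick)
print(f"G-SYNC shows it at {present_gsync(frame_ready):.1f} ms")  # 21.0 ms (immediately)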

The whole catch is that G-SYNC only works well while the FPS stays within the supported refresh rate range. That range runs from 30 Hz up to whatever the monitor supports (60 Hz or 144 Hz). In other words, the technology works to its full extent as long as the FPS does not fall below 30 and does not exceed 60 or 144 frames per second, depending on the maximum supported refresh rate. The infographic below, created by Blur Busters, illustrates this well.

What happens if the frame rate goes outside this range? G-SYNC cannot adjust the screen refresh, so outside the range it does nothing. You face exactly the same problems as on a regular monitor without G-SYNC, and classic vertical sync takes over. If it is turned off, screen tearing appears. If it is turned on, you will not see tearing, but input lag (delay) appears.

So it's in your best interest to stay within the G-SYNC refresh range, which is a minimum of 30Hz and a maximum of what the monitor can support (most commonly 144Hz, but 60Hz displays do exist). How to do it? Using the appropriate vertical sync settings, as well as limiting the maximum number of FPS.

So what is the conclusion? When the frame rate drops below 30 FPS, V-SYNC should remain enabled. Such cases are rare, but when they happen, V-SYNC guarantees that tearing will not occur. As for the upper limit, things are simple: you need to cap the maximum frames per second so that you never reach the ceiling where V-SYNC would kick in, thereby keeping G-SYNC in continuous operation.

Therefore, if you have a 144 Hz monitor, set the FPS cap to 142 so as not to approach the upper limit. If the monitor is 60 Hz, set the limit to 58. Even if the computer could produce more FPS, it will not do so. V-SYNC then never engages and only G-SYNC remains active.
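The arithmetic is trivial, but here it is spelled out as a tiny helper (our own sketch; the two-frame margin simply mirrors the 142/58 recommendation above):

def gsync_fps_cap(max_refresh_hz, margin=2):
    """Cap FPS a couple of frames below the monitor's maximum refresh rate
    so classic V-SYNC never engages and G-SYNC stays in control."""
    return max_refresh_hz - margin

print(gsync_fps_cap(144))  # 142
print(gsync_fps_cap(60))   # 58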

Enable vertical sync in NVIDIA settings

Open the NVIDIA Control Panel and go to the Manage 3D Settings tab. In the Global Settings section, find the Vertical sync option and set it to On.

This way, V-Sync is always ready to kick in if the frame rate drops below 30 FPS, which a G-SYNC monitor could not handle on its own.

Limit FPS to a value less than the maximum screen refresh rate

A convenient way to limit frames per second is the RTSS utility (RivaTuner Statistics Server). Of course, the best solution is the limiter built into the game itself, but not every game has one.

Download and run the program, then select the Global entry in the list on the left. Here you can set a common limit for all applications. On the right, find the "Framerate limit" field and set it to 142 FPS for 144 Hz monitors or 58 FPS for 60 Hz devices.

With the limit in place, classic V-sync never activates, no extra delay is added, and the game becomes much more comfortable.

What is vertical sync in games? This function is responsible for displaying games correctly on standard LCD monitors with a 60 Hz refresh rate. When it is enabled, the frame rate is capped at 60 FPS and no tearing appears on the screen. Disabling it raises the frame rate, but introduces the screen tearing effect.

What is vertical sync in games for?

V-sync is a rather controversial topic in games. On the one hand, it seems essential for visually comfortable gameplay, assuming you have a standard LCD monitor.

Thanks to it, no artifacts appear on the screen during play; the picture is stable and free of tearing. The downside is that the frame rate is capped at 60 FPS, so more demanding players may experience so-called input lag, a slight delay when moving the mouse in the game (comparable to artificially smoothed mouse movement).

Disabling vertical sync also has its pros and cons. First of all, the frame rate is no longer limited, which completely removes the input lag mentioned above. This is useful in games like Counter-Strike, where reaction time and accuracy matter: movement and aiming become very crisp and dynamic, and every mouse movement lands with high precision. In some cases we also get a slightly higher frame rate, since V-Sync, depending on the video card, can slightly reduce performance (the difference is around 3-5 FPS). Unfortunately, the drawback is that without vertical sync we get screen tearing: when turning or changing direction in the game, the image visibly tears into two or three horizontal strips.

Enable or disable V-Sync?

Is vertical sync necessary? It all depends on our individual preferences and what we want to get. In multiplayer FPS games, it is recommended to turn off vertical sync to improve aim accuracy. The screen tearing effect, as a rule, is not so noticeable, and when we get used to it, we will not even notice it.

In single-player, story-driven games, on the other hand, you can safely turn V-Sync on. High accuracy is less important there; the environment and visual comfort play first fiddle, so good image quality is worth the trade-off.

Vertical sync can usually be turned on or off in a game's graphics settings. If there is no such option there, you can force it on or off in the video card's settings, either globally or only for selected applications.

Vertical sync on NVIDIA graphics cards

On GeForce graphics cards, the function is found in the Nvidia Control Panel. Right-click on the Windows 10 desktop and select Nvidia Control Panel.

In the sidebar, select Manage 3D Settings under the 3D Settings section. The available settings will be displayed on the right.

The settings are divided into two tabs: global and per-program. On the first tab you set options for all games, for example whether vertical sync is on or off. On the second tab you can set the same parameters individually for each game.

Select the global or per-program tab, then find the "Vertical sync" option in the list. Next to it is a drop-down menu where you can force vertical sync off or on.

V-Sync on AMD graphics

With AMD graphics cards, the procedure is almost identical to Nvidia's. Right-click on the desktop and open the Catalyst Control Center.

Then open the "Games" tab on the left and select "Settings for 3D applications". On the right, a list of available options will be displayed that can be forced to be enabled from the position of the AMD Radeon graphics settings. When we are on the "System Settings" tab, we select for all.

If you need to set parameters individually for a specific game, click the "Add" button and point to the game's EXE file. It will be added to the list as a new entry, and when you switch to it, you can set parameters for that game only.

With the added application (or the general system settings) selected, find the "Wait for Vertical Refresh" option in the list. A selection box will appear where you can force this option on or off.

V-Sync on integrated Intel HD Graphics

If you are using an integrated Intel HD Graphics chip, a control panel is also available. Open it by right-clicking on the desktop or with the Ctrl+Alt+F12 key combination.

In the Intel panel, switch to Settings Mode - Control Panel - 3D Graphics, and then open the custom settings.

Here we find the Vertical Sync field. You can force it on by setting the value to "Enabled", or leave it at "Application Settings". Unfortunately, there is no force-off option in the Intel HD settings; V-Sync can only be forced on. Since vertical sync cannot be disabled at the driver level here, it can only be turned off in the game's own settings.


Windows 10, low FPS and floating mouse :: Counter-Strike: Global Offensive General Discussions


Good day everyone. I recently upgraded my system to Windows 10 (previously Windows 7). After the update I ran into several problems. The first is relatively low FPS compared to before: after installing Windows 10 I seem to have lost FPS. To explain, I have an average system, and on the "seven" I had a good 200-300 FPS. On the "ten" my FPS does not rise above 60, either in the menu or in the game itself. I have searched almost the entire Internet and have not found a solution. The second problem is a slight mouse float, barely perceptible, but it really interferes with accurate aiming. P.S. This problem did not exist before installing Windows 10. My system: GPU: GeForce GTX 660 Ti; CPU: Intel Core i3-3220 3.3 GHz; RAM: 8 GB; HDD (where CS:GO is installed): 2 TB; Monitor: ASUS VK278, 60 Hz; Mouse: Razer DeathAdder 2013; Pad: Razer Goliathus Speed; Keyboard: Razer BlackWidow Ultimate 2013.

Please share your thoughts on this topic. I will be very happy)



Windows 10 update allows you to turn off v-sync and unlock max fps

11.05.2016 02:22

Game projects optimized for the Universal Windows Platform (UWP) with DirectX 12 support can now run without V-Sync forced on. The update also adds support for NVIDIA G-SYNC and AMD FreeSync technologies.

The update will help to avoid twitching and delays on the screen, as well as improve the visual quality of the image.

Microsoft stated that Gears of War: Ultimate Edition and Forza Motorsport 6: Apex will receive patches with this option very soon.

The update will gradually roll out automatically to all computers running Windows 10.


