That's weird. I play Battlefield 4 and StarCraft II on near-max settings on my laptop, and there's a huge difference between turning VSync on and off. In my case, turning it off improves performance a lot.
Three thoughts on that (I'm assuming you're talking about framerates and not input latency):
1. While VSync does require a small amount of GPU/CPU time, that overhead is almost impossible for players to notice, because by design VSync locks the system's framerate to specific values. This is unlike running with VSync off, where players can watch the framerate to see how each setting affects the workload.
2. By design, VSync limits the GPU's output to specific values. If a system with VSync off runs a scene at 200 FPS, then with VSync on and a 60 Hz monitor it'll run at 60 FPS. So yes, the framerate drops, but that's intended.
3. If a system can't maintain a framerate value equal to or higher than the monitor's refresh rate, yes, VSync can, by design, drop the GPU's framerate to simple fractions (e.g., 2/3 or 1/2) of the monitor's refresh rate, and this is also intended.
-- Example: if a system with a 60 Hz monitor can only manage 50 FPS, VSync could drop the framerate down to 40 FPS (i.e., 2/3 of 60) to keep it matched with the monitor.
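The quantization in point 3 can be sketched in a few lines of Python. This is a simplified model that assumes strict double buffering, where every finished frame waits for the next vertical blank; under that assumption the lock lands on whole divisors of the refresh rate (60, 30, 20, ...), while mixed frame times or triple buffering can average out to intermediate rates like the 40 FPS in the example above:

```python
import math

def vsync_fps(refresh_hz: float, render_ms: float) -> float:
    """Effective framerate under strict double-buffered VSync.

    Each finished frame waits for the next vertical blank, so the
    presented frame interval is a whole multiple of the refresh period.
    """
    period_ms = 1000.0 / refresh_hz          # e.g. 16.67 ms at 60 Hz
    blanks_per_frame = math.ceil(render_ms / period_ms)
    return refresh_hz / blanks_per_frame

# A GPU that renders a frame in 5 ms (200 FPS uncapped) gets locked to 60 FPS:
print(vsync_fps(60, 5))    # 60.0
# One that needs 20 ms (50 FPS uncapped) falls to the next divisor, 30 FPS:
print(vsync_fps(60, 20))   # 30.0
```

This is only a back-of-the-envelope model; real drivers, adaptive VSync, and frame pacing all complicate the picture.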
------------------------
If your laptop isn't capable of maintaining 60 FPS in the games you mentioned, and you don't experience screen tearing with VSync off, then playing with VSync off is probably the correct choice for you.
However, for other systems, using VSync might be very useful for some of the reasons I mentioned in my earlier post.
CA used an evolutionary step of the Warscape engine to create TW: Rome 2.
CA has used versions of the Warscape engine for every Total War game since Empire: Total War.
So, no, it's not a completely new engine, but yes, there are some differences from the previous Warscape versions.
While VSync does help with screen tearing (as I wrote above), that is not the only reason to use it. As I also wrote above, VSync can be used to keep a GPU from running at high usage. For instance, if a game with VSync off runs at 300 FPS with the GPU at 99% usage, a player can turn on VSync at 60 Hz and GPU usage will drop dramatically.
As for the GPU power required to use VSync: as I wrote to VN Michael Doan above, the small amount of work needed to make VSync function is almost undetectable to players, because the process itself forces framerates to specific values that are lower than what the system would produce with VSync off.
So, while a GPU might use 0.25% more frametime syncing the frames to the refresh rate, that value will almost always be insignificant compared to the framerate drop required to match the refresh rate.
And, since the GPU will be doing less work with VSync on, because the framerate is lower, the overall power used by the GPU with VSync on will be less than with it off in virtually all cases.
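The usage drop in the 300 FPS example above is roughly proportional to the frame cap: if the GPU spends about the same time on each frame, its busy time scales with frames rendered per second. A back-of-the-envelope sketch (a rough model only; real GPUs also downclock when lightly loaded, so actual numbers will vary):

```python
def capped_gpu_usage(uncapped_fps: float, uncapped_usage: float,
                     cap_fps: float) -> float:
    """Rough estimate: GPU busy time scales with frames rendered per second."""
    if cap_fps >= uncapped_fps:
        return uncapped_usage  # cap is above what the GPU can do: no change
    return uncapped_usage * cap_fps / uncapped_fps

# 99% usage at 300 FPS, VSync-capped to 60 FPS:
print(round(capped_gpu_usage(300, 99.0, 60), 1))  # 19.8
```

So capping 300 FPS to 60 FPS means the GPU renders a fifth of the frames, which is why usage (and power draw) falls so sharply with VSync on.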
I didn't say that VSync improves framerates or input latency.
I said that, in order to avoid the visible latency VSync can cause, people should use the triple buffering option in their driver.
Finally, somebody puts screenshots side by side. I don't see any glaring differences in quality between Rome and Shogun that warrant such criticism of Rome. Can the OP explain?