Is the GTX 1060 6GB fine with this monitor?

Pcenthusiast16

The monitor I own is an Acer 21.5" S220HQL bbid and I'm looking to buy a GTX 1060 6GB. However, I have heard that the 1060 is 'bottlenecked', for lack of a better word, by cheaper monitors. I cant twll if my monitor has free sync or gsync or whatever so adivce will be well appreciated.
 
Sure, it will work. Since it's 60Hz, many games will be limited to 60FPS, and only a handful won't achieve a full 60FPS at max settings. Technically you can hit higher FPS, but that will result in screen tearing and have no benefit; I use VSync to limit that. At least with the 1060 you will have a longer useful life at 60FPS, and with 6GB of VRAM you will have plenty of memory for max settings.

Any lower-end card will have less VRAM, so you won't even be able to use max settings.

If you can comfortably afford the GTX 1060 6GB, I would buy that. Otherwise, step down to the Radeon RX 580 4GB.
 
The only things that matter are:
a) the number of pixels (1920x1080), and
b) the refresh rate (60Hz)

There is really no issue related to your GTX1060. There are plenty of games that can't hit 60FPS with maximum quality settings at 1920x1080, using the GTX1060.

Every game is different.

Here's some more info: https://www.techpowerup.com/reviews/NVIDIA/GeForce_GTX_1060/21.html

You can look at the games and see which ones can hit 60FPS, but I guarantee you should drop the numbers for many of them, possibly by HALF. Why?

For one thing the FPS is an AVERAGE so it's going to drop below the value listed. For another, not every test is done at ULTRA settings. I know that because I have a GTX1080 and several of these games.

Please note that a lower-end CPU will produce lower performance depending on the game. The lower the resolution the more likely the CPU is a bottleneck.

All you really need to concentrate on is tweaking the game settings to achieve your goal. For example, using VSYNC ON you should keep the game locked to 60FPS.

If you use ADAPTIVE VSYNC (NCP-> Manage 3D Settings-> add game...) it turns VSYNC ON and OFF automatically. This lets you tolerate drops below 60FPS (on a 60Hz monitor) without the stutter that normal VSYNC would add.

(You do NOT want to have normal VSYNC ON if you keep dropping below the target as this causes stuttering. It's usually preferable to have some screen tear which is what you get with VSYNC OFF).

GSYNC (NVidia) and Freesync (AMD) are useful but also expensive. Since the monitor is told when to draw a new frame, you don't have to worry about maintaining a specific FPS to avoid things like stutter. There's also no screen tear.
 
AMD cards are way overpriced right now due to the new cryptocurrency MINING craze (it's not Bitcoin being mined on GPUs these days). The GTX 1060 6GB is the best $250-$300USD card right now.

I would not pay any more than $280 (after rebate, not including tax/ship). This is my favorite card:
https://pcpartpicker.com/product/P8wqqs/asus-geforce-gtx-1060-6gb-6gb-rog-strix-video-card-strix-gtx1060-6g-gaming

There are cheaper cards too, but the $30 or so saved for a single-fan EVGA card doesn't seem worth it to me. The Strix should be able to maintain the clock better and may overclock a little higher.
 


You do not have Freesync. Your refresh rate is 60Hz.

You absolutely WILL have screen tearing if VSYNC is OFF. However, sometimes it's not very obvious; it depends on what you're looking at and on the FPS relative to the refresh rate.

The entire purpose of VSYNC is to eliminate screen tearing. Again, if VSYNC is OFF and you do NOT have an asynchronous monitor (Freesync/GSync) you have screen tearing. Full stop. Whether it's obvious is another issue but you DO have it.

BTW, a GTX1060 6GB is only about 30% faster than an RX 470 on average, and in some games it's less than that:
https://www.techpowerup.com/reviews/ASUS/RX_470_STRIX_OC/24.html

In DOOM at one point the GTX1060 6GB was only 10% faster than an RX-470. There's some untapped performance in AMD cards (due to ACE etc) that will start being utilized with newer DX12 and Vulkan games.

I wouldn't upgrade unless going higher than a GTX1060 6GB, especially since you can TWEAK game settings to optimize things. 30% isn't that significant if you compare the same game tweaked to say 60FPS. It may just be the difference between 4xMSAA and 8xMSAA or some combination of settings that may not be very significant visually.

I say that because I have a GTX1080 now and had a GTX680 last year. It's better but not nearly as much as I expected. In fact, much of my catalogue ran near 60FPS anyway at high or Ultra settings. I started having issues with a couple games (mainly due to lack of VRAM as I had only 2GB) which is why I upgraded.
 
https://www.pcper.com/news/General-Tech/Donate-PC-Perspective-Mining-Pool-NiceHash-How

BTW you can make money mining. I dislike the entire concept as it's wasteful but I'll link it. I refuse to sign up even though my card could make $5USD per day after electricity.

There's a few links you can follow such as the calculator to estimate how much money you can make.

The craze is back now that there's a new "algorithm" (sort of) that made the dedicated ASIC machines obsolete. It won't last long likely (a couple months?) but for now people are making money with this.

Which is why AMD cards are way overpriced. AMD likely hates this, since they mainly sell GPUs rather than finished cards. They have deals in place so the GPU cost won't go up, so they are likely not making any extra profit on this (though Asus, Gigabyte, etc. are, or at least resellers like Amazon and Newegg are). After this dies down, their USER BASE will have shrunk, because gamers buying new cards are switching to NVidia while AMD's are overpriced.

Then when the craze dies the market will flood with used, cheap AMD cards.
 


Actually, what you said was "I cant twll if my monitor has free sync or gsync or whatever so adivce will be well appreciated" and later you mentioned vsync.

Stuttering can be caused by several issues, but when it comes to VSYNC it happens when VSYNC is ON and you can't maintain 60FPS (on a 60Hz monitor), because each frame ends up on screen for a whole multiple of 1/60th of a second. If you MISS the next refresh, the same image is drawn again (so that frame occupies 2/60ths of a second instead of 1/60th). So in one second you might get frames that stay on screen for the following TIMEs in seconds:

1/60th, 1/60th, 2/60th, 4/60th, 2/60th, 1/60th etc. which looks worse the more the camera pans (sometimes called "JUDDER").
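To make that arithmetic concrete, here's a small Python sketch of how VSYNC ON rounds every frame's on-screen time up to a whole number of refresh intervals (the render times are invented for illustration, not measurements):

```python
import math

# Sketch: with VSYNC ON, a frame stays on screen for its render time
# rounded UP to the next whole 1/60 s refresh interval (it waits for vblank).
# The render times below are invented examples, not measurements.

REFRESH = 1 / 60  # 60 Hz monitor

def displayed_times(render_times):
    """On-screen duration of each frame: render time rounded up to
    the next whole multiple of the refresh interval."""
    return [math.ceil(t / REFRESH) * REFRESH for t in render_times]

# A frame that takes just over 1/60 s occupies 2/60 s on screen.
renders = [0.015, 0.016, 0.018, 0.035, 0.017]
for r, d in zip(renders, displayed_times(renders)):
    print(f"rendered in {r * 1000:.0f} ms -> shown for {d * 1000:.1f} ms")
```

Note how the 18 ms frame is punished to 33.3 ms on screen even though it only barely missed the refresh — that jump between 1/60 and 2/60 is the judder you see while panning.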

Again, as for SCREEN TEARING I can only tell you that you MUST have VSYNC ON to avoid it for a normal, synchronous monitor. The amount VARIES from not obvious to horrible but it's ALWAYS there if VSYNC is OFF.

Anyway, I don't want to beat a dead horse. At this point you either believe me or you don't. Your concern is getting the GPU, then tweaking each game for its best visual quality vs smoothness ratio.
 


"I cant twll if my monitor has free sync or gsync or whatever so adivce will be well appreciated"
I copied part of your first post, including spelling errors. I'm not sure why you still say you didn't mention Free Sync. It's right there.

As for FAST SYNC, I don't know how you expect this to "help" but I'll tell you how it works.

FAST SYNC:
This has to be forced on in the NVidia Control Panel. It ONLY functions if the FPS is over 2x the Hz rate. For a 60Hz monitor that means your GPU must be pumping out over 120FPS.

What happens is that your GPU spits out a new frame as quickly as it can make one. If you finish TWO inside of 1/60th of a second then the SECOND one is used and the FIRST one is simply discarded. If you manage THREE frames in 1/60th of a second then FRAPS records 180FPS but regardless of that there is physically only a locked (VSYNC) 60FPS on the monitor itself (no screen tearing).

The ONLY point of Fast Sync is that you get improved responsiveness. Less "lag" or "sluggishness". It's better than normal 60FPS VSYNC ON, but not as good as 120FPS VSYNC ON with a 120Hz monitor (which would show 120 distinct frames per second).

I'm sure that's a bit confusing at first.
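If it helps, here's a rough Python sketch of the frame-selection idea described above — at each refresh only the newest completed frame is scanned out, and earlier frames finished in the same interval are thrown away. All timestamps are made-up examples:

```python
# Sketch of Fast Sync's frame selection, per the description above.
# At each refresh the monitor scans out the NEWEST completed frame;
# older frames finished inside the same interval are simply discarded.
# All timestamps here are invented examples, not measurements.

REFRESH = 1 / 60  # 60 Hz monitor

def fast_sync_display(finish_times, n_refreshes):
    """Return, for each refresh tick, the index of the frame shown:
    the most recent frame completed before that tick (None if none)."""
    shown = []
    for k in range(1, n_refreshes + 1):
        tick = k * REFRESH
        done = [i for i, t in enumerate(finish_times) if t <= tick]
        shown.append(done[-1] if done else None)
    return shown

# GPU rendering at ~180 FPS: three frames finish inside each refresh
# interval, so two of every three are discarded and there is no tearing.
finishes = [(i + 0.5) / 180 for i in range(18)]  # 18 frames over ~0.1 s
print(fast_sync_display(finishes, 6))  # -> [2, 5, 8, 11, 14, 17]
```

So FRAPS would report 180FPS, but the monitor only ever shows 60 whole (untorn) frames per second — just fresher ones than plain VSYNC would deliver.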

Again, what you really need to do is TWEAK your game settings for the best experience. I'll reiterate some points if I haven't made them:

Start with a GOAL then tweak TOWARDS that goal:
1) VSYNC ON
- 60FPS locked on 60Hz monitor
- tweak to maintain 60FPS almost 100% of the time
- pro: no screen tear
- con: added lag, and STUTTERING if you drop below the target

2) VSYNC OFF
- adjust FPS for best smoothness, responsiveness, and minimal screen tear (twitch shooters for example may benefit from 200FPS+ and may show no obvious screen tear though it still exists)
- FPS varies
- pro: less lag than VSYNC
- con: screen tearing

3) ADAPTIVE VSYNC:
- forced on in NVidia Control Panel (force per game)
- simply turns VSYNC OFF or ON automatically (OFF below 60FPS on 60Hz monitor)
- con: screen tear below the target (this also happens during 30FPS pre-rendered cut-scenes, since they are below 60FPS)
- pro: no added STUTTERING caused by having VSYNC ON when below the target (this is the ONLY reason this feature exists; I use it for games that have lots of sudden FPS drops/stutters and tweak game settings like shadows, MSAA etc until I maintain 60FPS about 90 to 95% of the time)
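The three options above can be boiled down to a throwaway Python sketch (purely illustrative, assuming a 60Hz monitor; the strings are my own summaries, not driver output):

```python
# Sketch: what each sync mode gives you at a given instantaneous FPS
# on a 60 Hz monitor, summarizing the list above. Purely illustrative.

def sync_behavior(mode, fps, hz=60):
    if mode == "vsync_on":
        if fps >= hz:
            return f"no tearing, FPS capped at {hz}"
        return "no tearing, but stutter (repeated frames)"
    if mode == "vsync_off":
        return "tearing (possibly subtle), uncapped FPS"
    if mode == "adaptive":
        # Adaptive VSYNC: VSYNC ON at/above the refresh rate, OFF below it
        on = "vsync_on" if fps >= hz else "vsync_off"
        return sync_behavior(on, fps, hz)
    raise ValueError(f"unknown mode: {mode}")

print(sync_behavior("adaptive", 45))  # tears instead of stuttering
print(sync_behavior("adaptive", 90))  # behaves like plain VSYNC ON
```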

OTHER:
Just FYI, but even if FRAPS or other FPS software indicate a solid 60FPS you may still experience stutters or freezes. FPS software is not perfect and records when the game REQUESTS a frame draw, not when one actually finishes. It's common to have dropped frames that are never drawn, sometimes so many that the experience is HORRIBLE. So average FPS is not a good indicator of how smooth the game is. This is why analysis now includes FRAME TIME analysis (ideally one second would have SIXTY frames of exactly 1/60th of a second using VSYNC ON with a 60Hz monitor).
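To illustrate why frame times matter more than the average, here's a quick Python sketch of two invented one-second runs that both average exactly 60FPS — only the frame times reveal that one of them stutters badly:

```python
# Sketch: why average FPS hides stutter. Two invented one-second runs,
# both averaging exactly 60 FPS, but one has severe frame-time spikes.

smooth = [1 / 60] * 60                   # sixty frames of ~16.7 ms each
stutter = [1 / 120] * 57 + [0.175] * 3   # 57 fast frames + three 175 ms hitches

for name, times in [("smooth", smooth), ("stutter", stutter)]:
    avg_fps = len(times) / sum(times)    # both come out to 60 FPS
    worst = max(times) * 1000
    print(f"{name}: {avg_fps:.0f} FPS average, worst frame {worst:.0f} ms")
```

An FPS counter reports 60 for both, but the second run freezes visibly three times — exactly the kind of thing only a frame-time graph exposes.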
 