GeForce GTX 770 bad? Please help! Distorted/blurry graphics in game? Have pictures

admaster99

Jan 27, 2014
I just bought a new video card a couple of days ago; it's a PNY GeForce GTX 770. I've been using it for a couple of days now for my average computer usage, like surfing the net and checking email. But I just tried to play a game today, and when I get into the game the graphics become all distorted and blurry. It's hard to explain, but everything looks terrible, to the point that the game is unplayable. So I don't know what to do. Is it possible my video card is bad? There is also a lot of screen tearing, even with V-sync on.

I have tried every fix I can think of:
*More than one game
*Changed HDMI cables
*Completely removed all drivers and reinstalled them, plus tried older drivers with no luck
*Checked all the settings on the TV
*Went through every setting on the video card
*Changed image scaling on the video card

Some pictures to show what's going on. The last three are the best, because I have one taken in game and one out of game, so you can see the difference:
http://forums.anandtech.com/showthread.php?p=36003146&posted=1#post36003146

System Specs:
i7 950 @ 3.07 GHz
16 GB DDR3 RAM
750 W Cooler Master power supply
PNY GTX 770
Sony Bravia EX700 (KDL-60EX700)
Asus Sabertooth X58 motherboard
 
Yes, I'm using a TV for my display. What settings in the Nvidia Control Panel? I've already tried all of them with no success. I also used a DVI-to-HDMI cable, and that made no difference. I don't have another display readily available, so I used a standard VGA cable; not only does it still do it, even at low resolution, but now the screen also goes really dark. It's acting really weird.

This is my TV, an almost top-of-the-line display; I have a hard time believing it's the problem. http://www.amazon.com/Sony-Bravia-60-Inch-back-lighting-KDL-60EX700/dp/B0035ER1LE/ref=sr_1_6?ie=UTF8&qid=1390880673&sr=8-6&keywords=sony+bravia+60

I really appreciate the comments. I'll try to hunt down a monitor tomorrow and hook it up to see what kind of result I get.
 
And when does an image on an HD TV get blurry and distorted?
At full HD the set uses progressive scan, meaning it draws every line of the screen on each frame.

However, if the output of the card sends 1080i instead, things become much blurrier, because the full image is never displayed in a single pass.

It is in fact halved: in each field, every alternate line is missing.

And the effect, as you can see, is blurring. In the picture of the desktop, the letters are not formed properly by the pixels that should be displayed. The actual ghosting is Windows ClearType, which smooths text; it was introduced for Windows with the advent of LCD monitors.
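To make the halving concrete, here is a minimal Python sketch, purely illustrative and not tied to any software in this thread, of how a 1080i signal carries one progressive frame as two alternating half-height fields:

```python
# Illustrative only: how 1080i splits a 1080-line progressive frame
# into two interlaced fields, each carrying every other scan line.
full_frame = list(range(1080))       # line indices 0..1079 of one 1080p frame

even_field = full_frame[0::2]        # field 1: lines 0, 2, 4, ...
odd_field = full_frame[1::2]         # field 2: lines 1, 3, 5, ...

# Each field holds only half the vertical detail of the full frame:
print(len(full_frame))                   # 1080
print(len(even_field), len(odd_field))   # 540 540
```

Each transmitted field carries only 540 of the 1080 lines, which is why fine pixel-precise detail like ClearType text is the first thing to smear when the display has to reassemble the picture.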

By the look of it, you in fact have a conflict between the settings in the Nvidia Control Panel and the actual screen resolution settings in Windows.

I suspect that Windows may be set to 1080i rather than 1080p, and the two settings are clashing.

Set 1080p in the Nvidia Control Panel. Then go to Screen Resolution in Windows.
Click Advanced settings, then click List All Modes.

Select exactly the same resolution and color depth, with 1080p, etc.
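As a sanity check on the idea above, here is a small Python sketch of the suspected mismatch. The mode descriptors and field names are hypothetical, chosen just for illustration; this does not actually query the Nvidia Control Panel or Windows, which would require the Win32 API:

```python
# Hypothetical mode descriptors -- an illustration of the suspected
# conflict, not a real readout of either settings panel.
nvidia_mode = {"width": 1920, "height": 1080, "bpp": 32, "progressive": True}
windows_mode = {"width": 1920, "height": 1080, "bpp": 32, "progressive": False}

def modes_match(a, b):
    """True only if resolution, color depth, and scan type all agree."""
    keys = ("width", "height", "bpp", "progressive")
    return all(a[k] == b[k] for k in keys)

# Same resolution and color depth, but 1080p vs 1080i -> still a conflict:
print(modes_match(nvidia_mode, windows_mode))  # False
```

The point of the check is that two modes can look identical at a glance (1920x1080, 32-bit) and still disagree on progressive versus interlaced scan, which is exactly the kind of conflict being described.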

There is switching going on between the Nvidia and Windows screen resolution settings, and it is likely causing a conflict. A single setting that differs between the two could be the cause.

 
Shaun, I did what you described; the settings in the Nvidia Control Panel are the same as what's in the Windows settings. I only run the desktop at 1360x768. I can put it all the way up to 1080p (1920x1080) and it runs flawlessly, with no image distortion or problems on the desktop. The problem only occurs when I boot up a game, any game for that matter; then the picture gets distorted like what you see in the pictures. I was in game when I took the pictures; I just minimized to the desktop because I was hoping to give a better example by photographing the desktop.

One thing I did notice: when I tried capturing screenshots of the game with screen-capture software, the images looked fine and clear. What does that mean? The same goes for some videos I was trying to capture to show what was going on. If it's the display, what would cause that to happen?
 
To me it means that the TV cannot cope with the image updating very fast.
How long have you had the EX700?

I just read something, and since you said it was blurry when playing a game, I was wondering if this may be the cause:

Video processing: The Sony KDL-EX700 doesn't allow much tweaking of dejudder processing, supplying only Off, Standard, and High options for its MotionFlow control. As we expected, we preferred the look of Off best with film-based sources like most Blu-ray movies, which looked too smooth and videolike in the other two settings. However, we did prefer Sony's lowest-dejudder mode, Standard, to the equivalent modes from Samsung and LG because it didn't introduce as much smoothing and delivered a less videolike look.

 
2-3 years, give or take. Wouldn't I have noticed this on the Radeon 6970?

I'm not sure I would call it blurry; I just said blurry because I didn't know how else to describe it. It's almost like pixelated or jagged edges. I'm getting to the point where I think I might trade in my computer for a PS4; I'm kind of tired of this monkey business. I like the computer for the better graphics and freedom, but I spend more time getting things to work than actually playing games.
 
I had a similar issue using a TV. My LG 1080p TV would display the image fine, but my Soniq 1080p TV looked weird over HDMI or DVI, yet perfect over VGA (a shame, because that TV is 3D). This only happened with AMD cards (7750 and 7950), not Nvidia (GTX 680). No fiddling with settings, refresh rate, or resolution fixed it, so I just used the LG until I got three monitors instead.
 
It may sound like a silly question, but did you uninstall the CCC suite and the driver?

After that, I have to admit I'm at a loss as to why the ATI card worked OK
but the Nvidia 770 does not, unless it's driver related.

It's a very odd problem. Scratching my head now, like you!
 
I feel stupid for saying this, but honestly I did, though not until after I installed the new video card. So I put the new card in, installed its drivers, and completely removed the old ATI drivers. Then I developed the problems I'm having now, so I did a full driver sweep for both to make sure everything was gone, and then reinstalled the latest driver. But I suppose the damage was already done. The more I think about it, that's probably what it is. Now I feel kind of stupid; I guess I'll be doing a fresh install of Windows tomorrow.

I have a spare hard drive, so I'll just reinstall Windows and all the drivers, and if the problem persists I'll try a second monitor, or vice versa. And if it still continues, then I thank Amazon for their great return and replace policies.

Thanks for all the help. I've seen almost everything, but never something like this, so it really stumped me.
 
Just a quick update: I just finished reinstalling Windows, only to find out that it did not work. So as of right now I'm assuming it's the display; I'll try a second display tomorrow and see what that does. If that still doesn't resolve the issue, I'll assume the graphics card is faulty and return it. But if it does resolve the problem, which I highly suspect will be the case, I'm not sure what I'll do; I suppose return it for another Radeon, or perhaps a PS4. I guess we'll see. Thanks again for all your help.
 
Try going through the settings on the TV. Many LCD TVs actually have a setting for when they are hooked up to a computer (a 7-year-old Hitachi and a 5-year-old Sharp TV both had it, so it's likely yours will as well). I'm not sure of the exact label of the setting, but it should change things quite a bit for PC use on a TV.
 
Update: After finally getting my hands on another display, I have determined for sure that it is the display and not the video card. Pretty much what's happening is the same thing that happened in this post: http://www.neogaf.com/forum/showthread.php?t=716174
I'm assuming and hoping that a firmware update will solve the problem. I have, however, not been able to update the firmware yet, because it requires an exact type of flash drive. I just want to give "Pottuvoi" a proper thanks for that post.

Thank you very much for all the help!

The firmware update resolved the problem; it looks and runs great now.
 
Solution


What firmware updates did you download, and how? I seem to have the same problem you had, and I have downloaded every driver I could, AND I am using a monitor, never a TV... I don't know how to solve it...

I would appreciate it very much if you could tell me which firmware updates you installed, where from, and how.
 
Sorry for not replying for months, lol. I just wanted to post a response to this topic in case anybody needs it for reference. I updated the firmware on my Sony Bravia KDL-60EX700 with the latest one provided on Sony's official website. You need a Sony pen drive compatible with your Sony television in order for the TV to detect the firmware update. This resolved the problem.