Tom's Hardware Superposition Benchmark Thread


EquineHero

Reputable
BANNED


Yep, that's HWInfo and not GPU-Z
 

QwerkyPengwen

Splendid
Ambassador
There are many things I could easily say about you and the way you handle this thread.

One thing I will say is that you were clearly offended by my original post where I was lightly venting about some things. And instead of maybe communicating such things and providing some feedback you just stayed silent and ignored it.

My attitude wasn't poor, not in the slightest. You must have just misinterpreted my intentions.

But again, that's neither here nor there. That post was forever ago and I've moved on from that GPU and caring about the score.

At the end of the day things are what they are. My apologies if you've misinterpreted anything I've said so far.
Conversation and communication generally become degraded quite a bit in text form, since there's no way for anyone to feel the context in which things are said.

You have my latest submission. I pushed my current card to its limits without shunt modding or throwing it into a custom loop, and that's what I got for my score.

If my submission isn't to your liking for reasons that keep it from fitting the criteria in your original post, then please feel free to let me know what needs to change. I did use HWiNFO instead of GPU-Z, and that's simply because I don't like GPU-Z as a program and find HWiNFO to be a more accurate tool for real-time monitoring and information.

And I'm not going to go through the process of running Superposition multiple times with those overclocks again just to get another, identical score and then take a screenshot with GPU-Z. I don't see how GPU-Z vs. HWiNFO makes any difference, unless there's a very specific reason you want to see GPU-Z in the picture that you failed to state in your original post. Otherwise, it's just there to show that I'm running the card I say I am, and I can do that with HWiNFO just fine.
 

EquineHero

Reputable
BANNED


Rules are rules. Show GPU-Z or your submission will not be added.

As for how I handle the thread, it is my thread at the end of the day, and as long as it conforms to the site rules I can add or reject submissions at my discretion. Tom's is a fairly open and welcoming forum; if you don't like the way I operate my thread (which is still moderated by The Big Guys™), you're probably more than welcome to start a new thread with a different benchmark suite of your choosing.

I am the [strike]keeper[/strike] librarian of this thread and of the Passmark thread over in the CPUs forum, which has its own similar rules. Passmark itself is a flawed benchmark, as are Unigine Heaven, Firestrike, Time Spy, etc. All synthetic benchmarks are flawed. I can't stop you from disliking a certain benchmark title, but if you want your score added to the leaderboard, I need the score and a GPU-Z window.

Older submissions don't have this requirement because the rule was added after someone Photoshopped their results; those submissions were grandfathered in.

The only results I remove are from banned or no longer active users. I've had my own moderation history due to my personality type and my knack for strongly disagreeing with others. The Tom's Hardware moderators are very direct. I'm trying to stay here.

Once more so that it's clear:

Your submission must include a GPU-Z screenshot and the Superposition results window within the same frame to be added to the leaderboard. This is non-negotiable.
 

QwerkyPengwen

Splendid
Ambassador


That is fine. I understand. I assume, then, that I can't just open the saved Superposition screenshot in a photo viewer with GPU-Z alongside it? I would need to rerun the benchmark until I get my score again?
 

EquineHero

Reputable
BANNED


You will need to run the benchmark again.
 

QwerkyPengwen

Splendid
Ambassador
Here you go: a new screenshot with GPU-Z.

Capture.png


And just as an added bonus, here's a couple of pictures taken with my phone.

phonepic1.jpg

phonepic2.jpg

 

EquineHero

Reputable
BANNED


>takes picture with phone

RUdPyQP.jpg
 

fagetti

Notable
Got a 5-year-old MSI 780 Ti, but at stock it's the same as my 780 Lightning overclocked. I'll post once I hit a stable 3000 score. I'm trying to get 1300-1350 on the core without raising the voltage too much; I'm comfortable below 1.3 V since it's air cooled.

EDIT: EquineHero, have you tried raising your memory or GPU voltage on that 1070? What voltage controller does your model have? Maybe it can be unlocked like my 780 Lightning and 780 Ti; I can go up to 1.45 V without a hard mod (anything above 1.25-1.3 V needs water cooling).
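
(For anyone following along, a rough sketch of why ~1.3 V is the usual air-cooling cutoff: dynamic power scales roughly linearly with clock and quadratically with core voltage. The baseline numbers below are made up for illustration, not measurements from either card.)

[code]
# Rule-of-thumb GPU dynamic power model: P ~ f * V^2.
# Baseline point is hypothetical (250 W at 1000 MHz / 1.15 V), purely illustrative.

def estimated_power(p_base_w, f_base_mhz, v_base, f_new_mhz, v_new):
    """Scale a baseline board power to a new clock/voltage point (very approximate)."""
    return p_base_w * (f_new_mhz / f_base_mhz) * (v_new / v_base) ** 2

baseline = (250.0, 1000.0, 1.15)

for f_mhz, volts in [(1300, 1.25), (1350, 1.30), (1350, 1.45)]:
    watts = estimated_power(*baseline, f_mhz, volts)
    print(f"{f_mhz} MHz @ {volts:.2f} V -> ~{watts:.0f} W")

# The 1.30 V -> 1.45 V step adds far more heat than the extra clock does,
# which is why anything past ~1.3 V tends to want water cooling.
[/code]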
 

EquineHero

Reputable
BANNED


I'd prefer a hard mod over a soft mod. There's less to screw up.

My 1070 Ti is at +180 core and +500 mem; any higher on the memory and it starts artifacting, and it crashes above 2100 MHz, which is already insane for Pascal.

My old 1050 Ti used to hit 2150-2200 MHz, but I sold it off a while back.
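
(Side note for anyone dialing in offsets like this: a minimal logging sketch, assuming an NVIDIA card with nvidia-smi on the PATH, that records what the card actually boosts to while Superposition loops, so you can see the clocks/temps right before artifacts or a crash show up.)

[code]
# Minimal clock/temp/power logger for OC stability testing.
# Assumes nvidia-smi is installed and on the PATH (it ships with the NVIDIA driver).
import subprocess
import time

FIELDS = "timestamp,clocks.current.graphics,clocks.current.memory,temperature.gpu,power.draw"

def sample() -> str:
    result = subprocess.run(
        ["nvidia-smi", f"--query-gpu={FIELDS}", "--format=csv,noheader"],
        capture_output=True, text=True, check=True,
    )
    return result.stdout.strip()

if __name__ == "__main__":
    # One line per second; redirect to a file to keep a record of the run.
    while True:
        print(sample(), flush=True)
        time.sleep(1)
[/code]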
 

fagetti

Notable


That's the same wattage my 780 Lightning pulled in FurMark with a modded BIOS and 1.3 V.

 

EquineHero

Reputable
BANNED


Too bad Pascal is so locked down. I'd love to unlock my 1070 Ti's power limits and max out my PSU to see how good Rosewill really is.
 

Rogue Leader

It's a trap!
Moderator


My Vega 64 LC already draws 330 W on average and can spike to 360 W as is. I can only imagine that if I did this mod it would light my PSU on fire. No thanks.
 

EquineHero

Reputable
BANNED


Get an AX1200i then.
 

Rogue Leader

It's a trap!
Moderator


No thanks, I already upgraded my PSU somewhat because 650 W wasn't cutting it. Driver updates have increased power consumption. There comes a point...
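
(Rough headroom math on that, as a sketch: only the 360 W spike figure comes from the Vega 64 post above; the CPU and rest-of-system numbers are assumptions.)

[code]
# Back-of-the-envelope PSU headroom check.
gpu_spike_w = 360   # Vega 64 LC transient spikes (quoted above)
cpu_load_w  = 150   # assumed CPU package power under combined load
rest_w      = 60    # assumed fans, pump, drives, board, RAM

total_dc_w = gpu_spike_w + cpu_load_w + rest_w

for psu_w in (650, 750, 850):
    load_pct = 100 * total_dc_w / psu_w
    print(f"{psu_w} W PSU: ~{total_dc_w} W worst case -> {load_pct:.0f}% load, "
          f"{psu_w - total_dc_w} W headroom")

# A 650 W unit sits close to 90% load on spikes, i.e. "wasn't cutting it"
# territory, while 750-850 W keeps it in a much more comfortable band.
[/code]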
 

EquineHero

Reputable
BANNED


If I had the money I'd buy an AX1200i in a heartbeat because of the low-load efficiency.
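
(Illustrative sketch of the low-load point; the efficiency values are assumed for the example, not measured AX1200i figures. What matters at desktop idle is how little efficiency a big unit loses at a few percent load.)

[code]
# Why light-load efficiency matters: an idling rig might only pull ~70 W DC,
# which is roughly 6% load on a 1200 W unit. Efficiency values are illustrative.
def wall_draw_w(dc_load_w: float, efficiency: float) -> float:
    return dc_load_w / efficiency

idle_dc_w = 70.0  # assumed idle DC load

for label, eff in [("mediocre unit at light load", 0.78),
                   ("strong digital unit at light load", 0.89)]:
    print(f"{label}: ~{wall_draw_w(idle_dc_w, eff):.0f} W at the wall")

# Roughly a 10 W difference around the clock; small money, but it's the reason
# reviews call out light-load efficiency on 1000 W+ units.
[/code]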
 

EquineHero

Reputable
BANNED


Your results would be better if you had a better motherboard and faster memory. I'll update the table later. I'm busy.
 

JaSoN_cRuZe

Honorable


Yeah, thanks. I'm planning on upgrading to Zen 2 next year!!
 

binderasaf1

Prominent
Hello, dear Superposition people.
There's a mystery I have yet to solve, and I'm super curious what the solution is.
While browsing through Superposition's HOF on Unigine's website, I stumbled upon this score (attached in the pic).
http://i63.tinypic.com/vpx0xw.jpg

There's a 700-ish point difference between us, and 5-6 fps, which is a lot.
Now I'm baffled by this, as both of us have Ryzen systems, the only difference being that I have a 6-core running at 3.95 GHz (he is running 3.9 GHz). The impact on results is marginal at best regardless; if anything, my 50 MHz advantage should help more than his 2 extra cores. We both run 16 GB in dual channel, and my memory is actually clocked higher. GPU-wise, this guy runs his core 50 MHz higher than mine and his memory 90 MHz higher. That difference can never account for such a big gap in score/fps; in the real world it's 1 fps at best. I tried all the regular NVIDIA Control Panel tweaks, but I can't get past ~6346.
Am I missing something? What's the secret sauce behind this big discrepancy in results? I'm pretty sure it's not the clocks...
Help, anyone?
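
(Illustrative arithmetic on those deltas: the score and memory clocks are the ones quoted in the post; the ~2000 MHz core baseline is an assumption for a typical boosting 1080 Ti, used only to turn the 50 MHz gap into a percentage.)

[code]
# Sanity-check: can the quoted clock differences explain the score gap?
my_score  = 6346
score_gap = 700

core_delta_pct = 50 / 2000 * 100             # assumed ~2000 MHz core baseline
mem_delta_pct  = (6300 - 6210) / 6210 * 100  # memory clocks quoted in the thread
score_gap_pct  = score_gap / my_score * 100

print(f"core clock advantage  : ~{core_delta_pct:.1f}%")
print(f"memory clock advantage: ~{mem_delta_pct:.1f}%")
print(f"score gap             : ~{score_gap_pct:.1f}%")

# Even with perfect scaling, a ~2.5% core and ~1.4% memory edge falls far short
# of an ~11% score gap, which is exactly the poster's point.
[/code]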
 


Yeah, big difference... he OC'ed his GTX 1080 Ti to the moon. Look at the memory clocks for the GPU... 6300 MHz...

That's the difference.
 

binderasaf1

Prominent


Nope... mine is clocked at 6210 MHz. I actually benched at 6300 as well, and it certainly wasn't a 5 fps difference. It's also not the "looser memory timings from increased clocks" thing degrading performance, since my score and fps scale consistently with the memory clock... I'm simply at 6210 MHz because that's my actual gaming memory clock. It's not the memory.

 


SP is all GPU memory bandwidth.

 

binderasaf1

Prominent


I can promise you, and I'm saying this from personal experience, that the difference between 6210 MHz and 6300 MHz on the memory is never 5-6 fps. It's 1 fps at most, if not 0.5, in SP. It's most certainly not the memory.
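
(A quick bound on that, using only numbers quoted in the thread; the points-per-fps ratio is derived from the quoted ~700-point / ~5-6 fps gap.)

[code]
# Best case: assume Superposition scaled 1:1 with memory clock. How much could
# 6210 -> 6300 MHz possibly buy?
mem_low, mem_high = 6210, 6300
my_score          = 6346
points_per_fps    = 700 / 5.5   # implied by the quoted gap (700 pts ~ 5-6 fps)

scaling    = mem_high / mem_low - 1   # ~1.45% more memory clock/bandwidth
score_gain = my_score * scaling

print(f"memory clock gain : {scaling * 100:.2f}%")
print(f"best-case benefit : +{score_gain:.0f} points (~{score_gain / points_per_fps:.1f} fps)")

# That's roughly +90 points and well under 1 fps even in the ideal case,
# consistent with the "1 fps at most" estimate and nowhere near the 700-point gap.
[/code]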
 
Status: Not open for further replies.