AMD has officially released FSR 2.0's source code into the wild for anyone to use.
AMD Drops FSR 2.0 Source Code, Takes Shots at DLSS and XeSS : Read more
> While machine learning is a thing, I want to remind everyone that "AI" is just a buzzword, as there is no real artificial intelligence and we are in fact nowhere near it.

So are you some kind of renowned academic in computer science who has the authority to define an industry-wide term? Or some kind of philosophy scholar with clout over what "intelligence" even is?
> The only difference here is the approach, and it's calculated in specialized cores rather than the usual shading cores. XeSS, if Intel didn't lie about its performance, could be way better than DLSS since it can run on specialized and non-specialized cores as well, making it far more versatile.

While XeSS can run on non-specialized hardware, its performance suffers when doing so.
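For context on the "specialized and non-specialized cores" point: XeSS ships two code paths, an XMX (matrix-unit) path for Intel's Arc GPUs and a DP4a fallback for other hardware, which is where the performance gap on non-Intel cards comes from. Below is a minimal sketch of that kind of capability-based dispatch; the type and function names are invented for illustration and are not Intel's actual SDK.

```cpp
// Conceptual sketch of XeSS-style path selection: one upscaler, two backends.
// All names here are invented for illustration; the real XeSS SDK exposes this differently.
#include <cstdio>

enum class UpscalerBackend {
    MatrixUnits,   // Intel XMX (Arc) or similar: fast low-precision matrix math
    Dp4aFallback   // generic path using DP4a / regular shader ALUs on other GPUs
};

struct GpuCaps {
    bool hasMatrixUnits = false;  // hypothetically queried from the driver at startup
    bool hasDp4a = true;
};

UpscalerBackend pick_backend(const GpuCaps& caps) {
    // Same upscaling algorithm either way; the matrix-unit path just runs
    // the network portion faster, hence the performance gap elsewhere.
    if (caps.hasMatrixUnits) return UpscalerBackend::MatrixUnits;
    return UpscalerBackend::Dp4aFallback;
}

int main() {
    GpuCaps arc{true, true}, other{false, true};
    std::printf("Arc-like GPU   -> %s\n",
                pick_backend(arc) == UpscalerBackend::MatrixUnits ? "XMX path" : "DP4a path");
    std::printf("Other GPU      -> %s\n",
                pick_backend(other) == UpscalerBackend::MatrixUnits ? "XMX path" : "DP4a path");
}
```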
> DLSS will lose this game in the long run against FSR; it's FreeSync vs. G-Sync again, and we all know what happened. Proprietary solutions will always lose unless they are way better, which isn't the case here anymore.

Considering G-Sync has been around since 2013 and is still used in high-end monitors, I don't think DLSS will die any time soon. Plus, CUDA is still widely used despite OpenCL also being a thing.
> So are you some kind of renowned academic in computer science who has the authority to define an industry-wide term? Or some kind of philosophy scholar with clout over what "intelligence" even is?

It's widely known, if you read a bit about "AI" and what it really means, that it has not been achieved yet. Just a few days ago there was an article here about an ousted Google engineer who said they had a real AI; he was fired shortly after, and Google walked back everything he said, stating it was in fact not a real sentient AI. Achieving real AI needs something that can resemble a human brain; we don't yet fully understand how the brain works, much less are we able to build one of our own. Go and inform yourself a bit instead of trying to pick my posts apart, which won't succeed anyway.
> While XeSS can run on non-specialized hardware, its performance suffers when doing so.

I wouldn't call a few percentage points "suffering"; you're not well informed here either.
> Considering G-Sync has been around since 2013 and is still used in high-end monitors, I don't think DLSS will die any time soon. Plus, CUDA is still widely used despite OpenCL also being a thing.

G-Sync is barely used in monitors, and even most high-end monitors don't use it anymore. G-Sync had a short window of relevance; by now it has largely been replaced by FreeSync, which is also a well-known fact. Do you inform yourself on tech, or do you just hang around the forum to pick apart what others post and start pointless arguments? I don't see much sense in your post, and I'm not doing your homework for you either.
3 days? 4 weeks?
How many programmers for that duration?
It's either "~4 weeks" or "4 weeks+" (which is analogous to ">4 weeks"). Using both at the same time is stupid and redundant (to a degree). How do you even read that? "It's around over 4 weeks" or "it's around 4 weeks plus"? XD
Oof, that was so much anger. Sorry.
As for what this means, well, I'm not entirely sure, but it is nice when companies just open the door for the world to improve on what they have started.
As for the debacle over "AI" and "machine learning": they're basically buzzwords that get annoying, just like any other term that is overused and, sometimes, used incorrectly. "Cloud", anyone? "Synergy", anyone? Machine learning is not useless; far from it. It lets you narrow down algorithms based on heuristics that would otherwise take humans a long time to tune by hand. Whenever you have a problem where the exact result is too costly to compute (usually non-polynomial in cost; NP), you want heuristics to get you a close-enough result, and yadda yadda.

So the "AI" label on the consumer side is really just saying "hey, we have some algorithms based on running heuristics a lot, and they'll improve things over generic algorithms or less refined heuristics", and most of those are math operations run at quarter/half precision (FP8 or FP16), since you are not looking for accurate results (lots of decimals) but for fast operations. Is this truly useful? Flip a coin; it's truly case by case. A properly trained algorithm can sometimes help a lot, but that takes a lot of iterations and also requires the problem to be fairly complex.

So, that brings us to the temporal-based upscaling. Is it a hard problem to solve from the algorithmic point of view? No, not really, I'd say. The proof is in the pudding: how much better is DLSS 2.x over FSR 2.0? The "generic" heuristics used by FSR are within striking distance, no? And even then, FSR has some image-quality wins, I'd say (subjective, so I won't argue; I can be wrong here). Maybe nVidia needs to train the AI more per game? Maybe they need to improve the backbone of the Tensor cores so the heuristics can be more accurate or process more data per pass? Ugh, so much "whatifism", so I'll just stop here.
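One last thing, to make the "temporal" part concrete: here is a rough sketch of the accumulation step that both FSR 2.0 and DLSS 2.x are built around. It's my own toy simplification with invented names and constants (the fixed 0.9 history weight, for one), not code from either SDK.

```cpp
// Toy temporal accumulation: the core of FSR 2.0 / DLSS 2.x style upscalers,
// stripped of the hard parts (jitter resolve, disocclusion handling, clamping,
// sharpening). Assumes the current frame has already been upsampled to the
// output resolution. Nothing here is lifted from either SDK.
#include <algorithm>
#include <cstddef>
#include <vector>

struct Image {
    int width = 0, height = 0;
    std::vector<float> rgb;  // 3 floats per pixel, output resolution
    float* at(int x, int y) { return &rgb[3 * (static_cast<std::size_t>(y) * width + x)]; }
};

// Fetch the history pixel the motion vector says this output pixel came from.
// Real upscalers sample with filtering and validate the history; we just clamp.
static const float* reproject(Image& history, int x, int y, float mvx, float mvy) {
    int px = std::clamp(x - static_cast<int>(mvx), 0, history.width - 1);
    int py = std::clamp(y - static_cast<int>(mvy), 0, history.height - 1);
    return history.at(px, py);
}

// One frame of accumulation: output = lerp(current, reprojected history, historyWeight).
// The fixed 0.9 weight is the "hand-written heuristic" placeholder.
void accumulate(Image& current, Image& history, const std::vector<float>& motion,
                Image& output, float historyWeight = 0.9f) {
    for (int y = 0; y < current.height; ++y) {
        for (int x = 0; x < current.width; ++x) {
            const float* cur  = current.at(x, y);
            const float* mv   = &motion[2 * (static_cast<std::size_t>(y) * current.width + x)];
            const float* hist = reproject(history, x, y, mv[0], mv[1]);
            float* out = output.at(x, y);
            for (int c = 0; c < 3; ++c)
                out[c] = historyWeight * hist[c] + (1.0f - historyWeight) * cur[c];
        }
    }
}
```

A lot of the interesting differences between the two sit in how that history weight gets decided and how the history gets validated per pixel, which is where the whole "AI vs. hand-written heuristics" argument lives.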
Anyway, again, good to see more (F?)OSS stuff.
Regards.

> I have a serious problem with what AMD showed here...

First of all, "AI" is a buzzword in my opinion; "machine learning" is simply the correct use of language by comparison. That's just a fact.
> First of all, "AI" is a buzzword in my opinion; "machine learning" is simply the correct use of language by comparison. That's just a fact.

I mean, if we want to be pedantic, both are kind of wrong, unless you believe a machine can have "intelligence" or can "learn". The act of learning is interesting in itself, as it requires a certain level of introspection and acknowledgement that, to be honest, machines do not have and maybe can't have. They could emulate it (exact results vs. approximations can be defined) and start from there, I guess, but never at human capacity? As for "intelligence", well, it depends on how you define it. Capacity to solve problems? Capability for analysis? Calculations per second? Heh. I'm not sure, as there are plenty of definitions out there that can be valid from a psychological point of view. Same-ish with learning, but I'm inclined to use the "introspective" one, as it makes the most sense to me: there can only be learning when you can look back and notice a change in knowledge.
Secondly, as far as I know, DLSS 1.0 used "AI" trained on a supercomputer, and since 2.0 it's just trained via the tensor cores in the GPUs, so it's very comparable to what FSR 2.0 does, if you ask me. Both calculate imagery to upscale a lower resolution to a higher one; it's not that complicated. Tensor cores are also just specialized cores that can calculate certain kinds of data more efficiently than the regular shaders.
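To put the "specialized cores" bit in concrete terms: what tensor cores (and Intel's XMX units) accelerate is low-precision matrix multiply-accumulate, the operation a neural network spends almost all of its time on. Here is a toy scalar version of that operation, just to show what the hardware is specialized for; it uses float as a stand-in for FP16 and is obviously nothing like how the silicon implements it.

```cpp
// Toy matrix multiply-accumulate: D = A * B + C.
// Tensor cores / XMX units execute small tiles of exactly this operation, many
// per clock, which is why running a network on them beats running it on the
// general shader ALUs. This scalar loop only shows the math, not the hardware.
#include <cstddef>
#include <vector>

// float as a stand-in; real hardware takes FP16/BF16/INT8 inputs and
// accumulates in higher precision.
using Matrix = std::vector<std::vector<float>>;

Matrix matmul_accumulate(const Matrix& A, const Matrix& B, const Matrix& C) {
    const std::size_t m = A.size(), k = A[0].size(), n = B[0].size();
    Matrix D = C;  // start from the accumulator
    for (std::size_t i = 0; i < m; ++i)
        for (std::size_t j = 0; j < n; ++j)
            for (std::size_t p = 0; p < k; ++p)
                D[i][j] += A[i][p] * B[p][j];  // fused multiply-add, the core op
    return D;
}
```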
> I mean, if we want to be pedantic, both are kind of wrong, unless you believe a machine can have "intelligence" or can "learn". […]

Yeah, you're right, I kind of missed the "learning" in machine learning, but I think the "machine" sets it apart; it's just "machine learning", haha. AI, on the other hand, as you explained well, is just nonsense, as machines can't really learn; they're just programmed. At the same time they aren't sentient, which is another important metric to consider before you can compare them to humans or even animals. In Star Trek (yes, it's just fiction, I know) they said regular computers couldn't achieve this but positronic ones could; in general it was explained well in Star Trek and other sci-fi shows, the tech nonsense aside. AI must essentially be indistinguishable from real intelligence, or even superior to it, otherwise it's not real AI.
Interesting topic for sure. Worthy of having a BBQ to discuss it over, haha.
Regards.
> Yeah, you're right, I kind of missed the "learning" in machine learning, but I think the "machine" sets it apart; it's just "machine learning", haha. […]

And don't forget the backbone of it all: imagination.
> And don't forget the backbone of it all: imagination.

Absolutely, creativity and imagination are central to this definition. For now we only see machines mashing up data to create what they were trained to create, or often just total nonsense, like one of those funny image apps. Siri observing the user to give recommendations isn't very intelligent either, but it's getting there.
Regards
> A sentient A.I. is a different concept from simple A.I. Simple A.I. has been around for a very long time; e.g., a computer program that can play a decent game of tic-tac-toe is a simple A.I., in my opinion.

The issue I would have with your definition is that it is so broad as to include any computer algorithm, since in principle a program that can play a decent game of tic-tac-toe is no different from a program that plays a mediocre game, or even one that plays tic-tac-toe very poorly. Not to say this is an incorrect definition, but I would consider it so broad that the term A.I. loses all meaning.
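To make "simple A.I." concrete: the decent tic-tac-toe player in question is usually nothing more than a small minimax search, along the lines of the toy sketch below (my own illustration, not anyone's reference implementation).

```cpp
// Minimax tic-tac-toe: the canonical "simple A.I." example.
// Board is 9 chars: 'X', 'O' or ' '. X is the maximizing player.
#include <algorithm>
#include <array>

using Board = std::array<char, 9>;

char winner(const Board& b) {
    static const int lines[8][3] = {{0,1,2},{3,4,5},{6,7,8},{0,3,6},
                                    {1,4,7},{2,5,8},{0,4,8},{2,4,6}};
    for (auto& l : lines)
        if (b[l[0]] != ' ' && b[l[0]] == b[l[1]] && b[l[1]] == b[l[2]])
            return b[l[0]];
    return ' ';
}

bool full(const Board& b) { return std::count(b.begin(), b.end(), ' ') == 0; }

// Returns +1 if X can force a win, -1 if O can, 0 for a draw.
int minimax(Board& b, bool xToMove) {
    if (char w = winner(b); w != ' ') return w == 'X' ? 1 : -1;
    if (full(b)) return 0;
    int best = xToMove ? -2 : 2;
    for (int i = 0; i < 9; ++i) {
        if (b[i] != ' ') continue;
        b[i] = xToMove ? 'X' : 'O';
        int score = minimax(b, !xToMove);
        b[i] = ' ';
        best = xToMove ? std::max(best, score) : std::min(best, score);
    }
    return best;
}

// Pick the move with the best minimax score for the side to move.
int best_move(Board b, bool xToMove) {
    int bestIdx = -1, bestScore = xToMove ? -2 : 2;
    for (int i = 0; i < 9; ++i) {
        if (b[i] != ' ') continue;
        b[i] = xToMove ? 'X' : 'O';
        int score = minimax(b, !xToMove);
        b[i] = ' ';
        if ((xToMove && score > bestScore) || (!xToMove && score < bestScore)) {
            bestScore = score;
            bestIdx = i;
        }
    }
    return bestIdx;  // index 0..8, or -1 if the board is already full
}
```

Swap the exhaustive search for a random legal move and the program plays worse without obviously becoming any more or less of an "A.I.", which is exactly the definitional problem being argued here.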
> The only difference here is the approach, and it's calculated in specialized cores rather than the usual shading cores.

If you're saying this, you still did not really understand how DLSS or XeSS work.
> If you're saying this, you still did not really understand how DLSS or XeSS work.

Ah, the deep learning buzzwords; like AMD said, deeply overrated. And XeSS is just a joke since it's unproven tech, just like their alleged GPUs.
> Overrated, or did you just not understand? Many people like to compare DLSS to how it was with G-Sync vs. FreeSync. Heck, people did it with RT before as well, saying that AMD would be able to rival Nvidia's RT performance purely in software, without needing specific hardware.

It's overrated since AMD achieved roughly the same quality without using tensor cores or alleged deep learning. Do we actually know it uses deep learning? No. It's probably very comparable to FSR 2.0, the only difference being that it's calculated in specialized cores instead of shaders.
> It's overrated since AMD achieved roughly the same quality without using tensor cores or alleged deep learning. Do we actually know it uses deep learning? No. It's probably very comparable to FSR 2.0, the only difference being that it's calculated in specialized cores instead of shaders.

With AMD's solution there is no AI being calculated at all, even on the shader cores. Why does DLSS end up looking much better than FSR at lower resolutions? Because the AI part augments the upscaling where data is missing.
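For what it's worth, the practical difference being argued about can be boiled down to who decides the history weight in the temporal blend: a hand-written rule or a trained network. The following is a deliberately crude illustration with invented constants and feature names; it is neither AMD's nor Nvidia's actual logic.

```cpp
// Two ways to decide how much of the reprojected history to keep for a pixel.
// Both are caricatures: FSR 2.0's real heuristics and DLSS's real network are
// far more involved. The point is only where the decision comes from.
#include <cmath>
#include <cstddef>
#include <vector>

// Hand-written rule (FSR-style caricature): trust history less when the color
// changed a lot, e.g. after disocclusion. The 4.0f falloff is an invented constant.
float history_weight_heuristic(float colorDifference) {
    return 0.9f * std::exp(-4.0f * colorDifference);
}

// "Learned" rule (DLSS-style caricature): a tiny fixed model whose weights would
// come from offline training rather than from a programmer's intuition.
float history_weight_learned(const std::vector<float>& features,
                             const std::vector<float>& trainedWeights,
                             float trainedBias) {
    float z = trainedBias;
    for (std::size_t i = 0; i < features.size() && i < trainedWeights.size(); ++i)
        z += features[i] * trainedWeights[i];
    return 1.0f / (1.0f + std::exp(-z));  // squash to a 0..1 blend weight
}
```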
> With AMD's solution there is no AI being calculated at all, even on the shader cores. Why does DLSS end up looking much better than FSR at lower resolutions? Because the AI part augments the upscaling where data is missing.

Better than FSR 1.0 perhaps, not 2.0. I stopped caring about 1.0 as soon as 2.0 launched.