Soaptrail
1080p gamers rejoice! (and wait) Thanks, Intel.
Don't make me laugh when I am drinking my coffee!
Bigger or smaller doesn't make any difference...
We are constantly looking forward. So we are going back to 14nm from 10nm. Bigger is better.
I never said that Intel is in trouble, I said that they're going to lose a craptonne of market share. I don't want Intel dead, I want competition and I want it to be about as even as possible. Having said that, since AMD is so far behind in size and mindshare, I'm glad that AMD has defeated Intel in every single metric.
AMD can come out with a CPU that is 10 times faster and 10 times cheaper and they will still be no danger to Intel.
AMD just doesn't have the capacity to make enough CPUs to make any kind of difference.
A customer that wants a PC now is not going to wait for months for new supply, they are going to make a PC with whatever they can get.
(Which is why AMD has increased sales in recent months...)
Jesus, you really are a fanboy, aren't you? What makes you think that just because Intel owns their own fabs, it costs them almost nothing to make? Operation of those fabs is incredibly expensive. Sure, Intel may save some money by not handing profit to TSMC, GloFo or Samsung, but they're still footing the bill for the land, the building, the equipment that maintains the clean environment, the equipment that etches the dice, the robotics that carry the dice and the most expensive thing in any balance sheet, paid labour.
Bigger or smaller doesn't make any difference...
You can fit more cores into a smaller space with smaller nodes, but if you don't care about space constraints because you are the one making the wafers and you are making them for super cheap, then there is no problem; you just make the die bigger because the cost is practically zero.
You can see that in how much more money Intel has made ever since they started putting more cores into their CPUs: they are using more than double the number of cores and are making about double the amount of money.
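For what it's worth, the "bigger die is basically free" claim is easy to sanity-check with back-of-the-envelope numbers. Here's a rough sketch (the wafer cost and defect density are illustrative assumptions, not Intel's real figures) using the standard dies-per-wafer approximation and a simple Poisson yield model:

```python
import math

def dies_per_wafer(die_area_mm2, wafer_diameter_mm=300):
    # Standard approximation: gross dies on a round wafer minus edge losses.
    radius = wafer_diameter_mm / 2
    return math.floor(math.pi * radius**2 / die_area_mm2
                      - math.pi * wafer_diameter_mm / math.sqrt(2 * die_area_mm2))

def cost_per_good_die(die_area_mm2, wafer_cost_usd=4000, defects_per_cm2=0.1):
    # Poisson yield model: a bigger die is more likely to catch a killer defect.
    yield_fraction = math.exp(-defects_per_cm2 * die_area_mm2 / 100)
    good_dies = dies_per_wafer(die_area_mm2) * yield_fraction
    return wafer_cost_usd / good_dies

for area in (100, 200, 400):  # die sizes in mm^2
    print(f"{area:>3} mm^2 die: ~${cost_per_good_die(area):.0f} per good die")
```

With these made-up numbers, quadrupling the die area roughly sextuples the cost per good die, because larger dice waste more wafer edge and are likelier to catch a defect. "Bigger because the cost is practically zero" only works if you ignore yield and wafer cost entirely.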
I used to think that way but over the years I realised that the more effective way is just to get what you need for the next few years. Let's say, for instance, that you're looking at two video cards. One is $300 and will last for three years. One is $600 and will last for six years. Well, you could buy the $600 card now and get insane fps (that you won't notice) and it will need to be retired in six years. On the other hand, if you buy the $300 card, in three years when it needs retirement, the remaining $300 will get you a card that is superior to the $600 card from three years earlier. That's why it's always better to get "good enough" instead of "ultimate". (A rough version of that arithmetic is sketched below, after this exchange.)
It looks like I will have to build a new gaming PC within the next 6 months, as my 9-year-old overclocked Xeon W3690 can only just barely muster playing Doom at 4K/60 Hz using the GTX 1060 6GB I added several years ago.
Hold on, this is insane... it's probably still good for Diablo II: Resurrected, which is the only thing I am looking forward to (if it gets released).
Actually, I really do need a new gaming PC: my 14-year-old office PC is getting flaky, so I will swap it out with the old gaming PC.
I'll just have to force myself to like Flight Sim, Cyberpunk 2077, etc. to justify putting a 3080, or similar, in the new one.
Take-away: Sometimes a new PC is just about being shiny new, mostly about having a new GPU (if you already had enough CPU cores), and then trying to get many years of use out of it.
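To make the "$300 twice beats $600 once" argument from above concrete, here's a toy sketch. The 25%-per-year improvement in performance per dollar is purely an assumed number for illustration; the point is only that both strategies cost the same $600 in total while the staggered buyer spends years three to six on newer silicon.

```python
# Toy model: relative performance you can buy for a given price at a given date,
# assuming performance per dollar improves ~25% per year (illustrative assumption).
ANNUAL_GAIN = 1.25

def perf(price_usd, years_from_now):
    return price_usd * ANNUAL_GAIN ** years_from_now

print(f"$600 card bought today:      {perf(600, 0):.0f}")  # used for years 0-5
print(f"$300 card bought today:      {perf(300, 0):.0f}")  # used for years 0-2
print(f"$300 card bought in year 3:  {perf(300, 3):.0f}")  # used for years 3-5
# At this rate the year-3 $300 card (~586) is already on par with the original
# $600 card; anything above roughly 26%/year improvement puts it clearly ahead.
```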
Gaming as a workload? That's my kinda work! LOL
Well, in my case I do a lot of VR gaming. I have an Index, and pushing 144 Hz mode is really difficult for my 1080. For my workload in VR, a 3080 would be quite an upgrade. The problem is finding one.
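The 144 Hz struggle is easy to quantify; the headroom is just frame-time arithmetic (the refresh rates below are the Index's selectable modes, nothing else is assumed):

```python
# Frame-time budget at each refresh rate: the GPU has to finish a frame within
# 1000/Hz milliseconds, or the VR runtime falls back to reprojection.
for hz in (80, 90, 120, 144):  # Valve Index refresh modes
    print(f"{hz:>3} Hz -> {1000 / hz:5.2f} ms per frame")
# 144 Hz leaves ~6.9 ms per frame versus ~11.1 ms at 90 Hz, i.e. roughly 60% more
# GPU throughput needed for the same scene, before any supersampling.
```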
Disappointing? I don't know what you're expecting but you're saying that you'd be disappointed with AMD beating Intel in EVERY SINGLE METRIC while still being less expensive. At what point do you say "Hey, that's amazing!"? It really must suck to be impossible to satisfy.
I think this will only be effective if AMD jumps over Intel by a small margin.
I waited on the last Zen generation, but it wasn't as good for gaming as Intel's chips, so I thought I'd just continue waiting. If Zen 3 does deliver a 15-20% IPC increase and we see the benefit in both single- and multi-core performance, there'll be little reason to wait for Intel.
If they edge ahead by a few %, then it will be a disappointing release. Either way, for me, I've been waiting long enough for my upgrade and I'm hoping for stellar results tomorrow!
Not if that's your criteria. There won't be ANY USB4-only boards in 2021. You're going to have to get realistic here. I mean, come on, motherboards today STILL have USB 2.0 on them and will for the foreseeable future. It was less than five years ago when USB 3.0 first appeared and you think that you're going to find a USB 4.0-ONLY motherboard? I'd say "Good Luck" but not even luck would help you there.
So, it looks like I will be building a new PC in 2021.
I don't want any legacy (i.e. non-USB4) ports on my new PC or monitor.
I don't foresee that scenario ever changing because it never has. It's one of the reasons that I use AMD and ATi. Since I don't have a pie-in-the-sky budget, I get the best performance from Team Red. People who pay more for Intel and nVidia for less performance are out of their minds IMO. I can understand paying when there's no alternative but I can't understand paying more just because the boxes are blue and/or green.
I do two things on my computer (games and VMs):
I play games at 1440p (either Intel or AMD will do)
VMs need cores (AMD has a lower price)
I will remain with AMD until the scenario changes.
This is a pearl of wisdom that should never be overlooked. I agree 100%.
Plus, and most importantly, most PC buyers aren't people who visit tech sites 10x per day. They go to the store and get what the store has. Intel just has to convince the system makers, not the buyers, to buy the chip. AMD is great and I love how we have an option, but if I walk into a store today it's still 90% Intel or I have to special order in that AMD system most of the time.
AMD can be 10x better and 10x cheaper, but if they cannot get into systems in stores, in both the number of brands and the volume in stock, it won't matter. And that is where Intel is still the undisputed king: convincing system manufacturers to use them in every SKU.
So it was, so it is, so it will be!
14nm and a new, inevitably buggy architecture. The worst of both worlds!
Nobody said "Intel is dead" (or at least nobody with a brain). This is mostly about AMD finally showing the world that they CAN and HAVE completely defeated Intel. They haven't slain the beast, they've just sent it packing.You say that to a company that makes billions a year even with stiff competition from AMD.
I half-agree because yes, you're right about GPUs not seeing benefit, but man, copying a 30GB directory in less than ten seconds really makes organising hard drives that are measured in terabytes A LOT easier.
PCIe 4.0 will bring little to no benefit to gaming. PCIe 3.0 is still not saturated. Even PCIe 2.0 isn't.
Storage will, but most consumers won't see much benefit outside of large file transfers.
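The "30GB in under ten seconds" figure is consistent with simple link-bandwidth arithmetic. The x4 rates below follow from the PCIe signalling rates and encoding overhead; the drive speeds in the trailing comment are just illustrative.

```python
# Approximate one-direction bandwidth of an x4 link (how NVMe SSDs attach),
# after 8b/10b (PCIe 2.0) or 128b/130b (PCIe 3.0/4.0) encoding overhead.
x4_bandwidth_gbs = {"PCIe 2.0": 2.0, "PCIe 3.0": 3.94, "PCIe 4.0": 7.88}  # GB/s

copy_size_gb = 30
for gen, bw in x4_bandwidth_gbs.items():
    print(f"{gen} x4: ~{bw:.2f} GB/s -> {copy_size_gb / bw:4.1f} s for 30 GB at link speed")
# Real drives don't hit link speed, but a PCIe 4.0 SSD sustaining ~5 GB/s still
# finishes a 30 GB copy in ~6 s, where a SATA drive (~0.55 GB/s) needs ~55 s.
```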
Yep, I'd go even further and say that it's still all just versions of Sandy Bridge.
It's actually not a new uArch; it's Willow Cove, which is in Tiger Lake, so it's actually the second iteration to be used.
Yep, except since we can see it, it's not all that stealthy. LOL
Rocket Lake is 14nm, yes?
Is this announcement technically another delay to 10nm? Because it feels like another stealth delay to 10nm.
The ADDITIONAL cost is practically zero, or at least that's how it looks.
If you think that Intel's cost on a die is practically zero, you are hopelessly, even hilariously, out of your mind.