AMD's Future Chips & SoC's: News, Info & Rumours.

From what I'm seeing the 2080 Ti runs at 90+ FPS in MW.
Of course these are mfg.-released numbers. We'll have to wait a bit for independent reviewers to run head-to-head controlled benches.

I dunno where you got that number from. Here's a recent review by WCCF Tech that includes COD:MW at 4K Ultra:

73 FPS for a 2080 Ti at 4K Ultra; the 3080 hits 101, so it looks like the new AMD card sits mid-way between those cards in that game. I agree, though, that we can't draw much in the way of conclusions until we get some real benchmarks.
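
For a quick back-of-the-envelope on where that would land, here's a sketch in Python. The ~90 FPS figure is the ballpark being thrown around above, and it's a vendor-claimed number, not a measurement, so treat this as positioning only:

# Where a ~90 FPS result would sit between the two WCCFTech numbers
# quoted above; the 90 is a vendor-claimed figure, not a measured one.
rtx_2080ti, rtx_3080, amd_claim = 73, 101, 90
fraction = (amd_claim - rtx_2080ti) / (rtx_3080 - rtx_2080ti)
print(f"{fraction:.0%} of the gap from 2080 Ti to 3080")  # -> 61%
print(f"{amd_claim / rtx_2080ti - 1:+.0%} vs 2080 Ti")    # -> +23%
print(f"{amd_claim / rtx_3080 - 1:+.0%} vs 3080")         # -> -11%

So on those numbers it'd sit a bit closer to the 3080 than to the 2080 Ti, for whatever a vendor slide is worth.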
 

aldaia

Distinguished
x86 isn't going anywhere, due to software compatibility reasons. The last chance to abandon it was Itanium, and we know how that ended.

Really it comes down to the same fundamental problem: Emulating the x86/x64 instruction set costs performance, and consumers have already shown (via Itanium) they aren't willing to accept a short-term performance loss while a replacement architecture matures.

ARM will remain powerful in the embedded market (where it's all but forced out PPC at this point) due to Intel's failure to produce an attractive low-power x86 chip, but the server and desktop markets are software-locked to x86-64, and that isn't changing any time soon.
It looks like Apple doesn't agree.
 

InvalidError

Titan
Moderator
It looks like Apple doesn't agree.
  1. Apple is a negligible slice of the x86 pie
  2. Apple has a mostly platform-agnostic software development environment to facilitate iOS-macOS development, much like C# for Micro$oft and Java for Android, so its native x86-binary-only code base is much smaller than on Windows/x86
  3. this isn't Apple's first time changing platforms for its desktops and laptops, which is one of the main reasons for #2 above
 
It looks like Apple doesn't agree.
I'm sure a lot of pros in the Mac world will love this... Then again, zealots do what zealots do.

Now that ARM is part of nVidia, I wonder if they'll keep licensing it out to everyone else.

They can always go RISC-V, right? lol

Or talk to IBM again.

Cheers!
 
I know Apple is planning their own graphics for now, but I would not be surprised to see Nvidia graphics sneak into Apple products in their second round of ARM hardware.

Perhaps... although Apple fell out with nVidia in a pretty major way a while back - it's been notable that they have stuck exclusively with AMD graphics hardware in their PC / workstation products for many generations now, despite nVidia having an unquestionable lead in both spaces.

I can't imagine Apple will be very happy about this deal - then again they design their own cores based on the ARM instruction set, so they can probably continue to do so as long as nVidia don't mess around with the licensing terms.
 

InvalidError

Titan
Moderator
I can't imagine Apple will be very happy about this deal - then again they design their own cores based on the ARM instruction set, so they can probably continue to do so as long as nVidia don't mess around with the licensing terms.
I would expect most companies that choose to design their own cores to have very-long-term if not perpetual ISA licenses, so Nvidia's ability to screw with them may be limited to new instructions that aren't covered by pre-existing licenses.
 
I would expect most companies that choose to design their own cores to have very-long-term if not perpetual ISA licenses, so Nvidia's ability to screw with them may be limited to new instructions that aren't covered by pre-existing licenses.
Agreements and contracts can be broken. It all depends on how profitable it is for nVidia to do so, in this particular case.

That being said, I hope you're right for everyone's benefit >_<

Cheers!
 

InvalidError

Titan
Moderator
Agreements and contracts can be broken.
You may be able to break an agreement but when that decision is unilateral and the other party has the means and motivation to come at you for compensation and damages, it can get absurdly expensive. There isn't much doubt that Apple is more than ready and able to sue the pants and shirt off of Nvidia if that ever happened, same goes for many of the largest ARM licensees.

What happened when Intel tried to pull x86 licenses from everyone else about 20 years ago? The courts ruled that x86 was vital to national security and forced Intel to honor its x86 licenses to ensure the government has more than one viable source. Given how ARM-based chips are pervasive throughout the modern digital infrastructure (between your PC and connected peripherals, you likely have over a dozen ARM-based MCUs running things from your mouse and keyboard to your SSD and monitor OSD), the same national security argument could be made here.

Between the many private sector deep pockets with axes to grind and national security concerns, Nvidia deciding to arbitrarily pull licenses so it can satisfy its own greed would be unlikely to end well for Nvidia.
 
You may be able to break an agreement but when that decision is unilateral and the other party has the means and motivation to come at you for compensation and damages, it can get absurdly expensive. There isn't much doubt that Apple is more than ready and able to sue the pants and shirt off of Nvidia if that ever happened, same goes for many of the largest ARM licensees.

What happened when Intel tried to pull x86 licenses from everyone else about 20 years ago? The courts ruled that x86 was vital to national security and forced Intel to honor its x86 licenses to ensure the government has more than one viable source. Given how ARM-based chips are pervasive throughout the modern digital infrastructure (between your PC and connected peripherals, you likely have over a dozen ARM-based MCUs running things from your mouse and keyboard to your SSD and monitor OSD), the same national security argument could be made here.

Between the many private sector deep pockets with axes to grind and national security concerns, Nvidia deciding to arbitrarily pull licenses so it can satisfy its own greed would be unlikely to end well for Nvidia.
Quote the whole thing for context, please. Your reply cut off my reasoning after that point, which more or less echoes your own and made my quote look wrong... It's like you're one of those bad TV shows that clip the guest's answer to fit the editorial agenda, haha.

And I have no idea why it's a national security concern (or was), but ARM is not really the only thing used at the controller level; you also have RISC-V and other RISC-based ISAs there. I doubt microcontrollers use complex ISAs anyway.

It's all a matter of cost/benefit at the end of the day.

Cheers!
 
ARM currently has a practical monopoly on mobile devices and governments use tons of those, so it would be a national security concern if someone locked up the ARM supply.

Meh, there are still a lot of PPC6/PPC7s floating around, but defense and embedded are certainly moving toward ARM. I don't think I've ever seen a RISC-based CPU used in an embedded system.
 

InvalidError

Titan
Moderator
I don't think I've ever seen a RISC-based CPU used in an embedded system.
Broadband routers are one class of embedded systems, and many used MIPS chips for a long time. Almost anything with a graphical UI or network connectivity has an embedded system of some sort, and the architecture is often either whatever the lead developer(s) felt most comfortable with, or whoever offered the best deal if the devs don't mind re-tooling their workflow and sorting out the differences.
 
So... AMD has purchased Xilinx. I don't really know all that much about the latter - has anyone got any thoughts on the rationale behind the deal, possible future uses of the tech and such?

I'm genuinely interested, but I haven't really been able to find anything about it beyond some observations on the financials of the deal (consensus seems to be AMD overpaid)...
 

InvalidError

Titan
Moderator
So... AMD has purchased Xilinx. I don't really know all that much about the latter - has anyone got any thoughts on the rationale behind the deal, possible future uses of the tech and such?
FPGAs are often used in AI and in high-speed, large-scale routers/switches. Intel bought Altera to integrate some of Altera's FPGA tech into its larger-scale CPUs, Nvidia bought Mellanox to use its high-speed interconnect stuff in its own AI/HPC products, and now AMD is buying Xilinx to avoid getting left behind on high-speed scalable "smart" interconnects; it may also have plans to integrate some FPGA tech into future EPYCs.
 
So... AMD has purchased Xilinx. I don't really know all that much about the latter - has anyone got any thoughts on the rationale behind the deal, possible future uses of the tech and such?

I'm genuinely interested, but I haven't really been able to find anything about it beyond some observations on the financials of the deal (consensus seems to be AMD overpaid)...

I can speak to this somewhat, since I use Xilinx's toolset on numerous projects.

Xilinx is a fabless semiconductor company best known for developing the first FPGAs, and it was among the first to create/support programmable SoCs. As a result, they are fairly well established within embedded markets. They have their own Eclipse-based development studio that's aimed more at system-level programming than the more established IDEs (MSVC/GCC), to make it easier to program for said HW. [I can personally say the IDE is "OK"; I've had fewer odd compiler bugs than with the WindRiver Diab compiler at least... It's not MSVC, but I guess nothing else really is.]
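
To give a flavour of what "programmable SoC" means in practice: on a Zynq-style part, your FPGA logic shows up to the on-chip ARM cores as memory-mapped registers. Here's a minimal Python sketch of poking one from Linux - note the 0x43C00000 base address and the two-register layout are made up for illustration; the real ones come out of your design's address map:

import mmap
import os
import struct

# Hypothetical base address of a custom FPGA peripheral as seen by
# the ARM cores; in a real project this comes from the design's
# address map, not from anything quoted here.
FPGA_REG_BASE = 0x43C00000

fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)  # needs root
try:
    regs = mmap.mmap(fd, mmap.PAGESIZE, offset=FPGA_REG_BASE)
    regs[0:4] = struct.pack("<I", 1)            # write control register 0
    (status,) = struct.unpack("<I", regs[4:8])  # read status register 1
    print(f"status register: {status:#010x}")
    regs.close()
finally:
    os.close(fd)

In the real flow the toolchain generates the addresses for you; the point is just that once the bitstream is loaded, driving FPGA logic from software is ordinary register I/O.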

In short: Xilinx is a major player within the embedded market. Their FPGA/SoC designs and toolsets could potentially be leveraged by AMD to get a foothold in more embedded applications, where x86 is basically non-competitive.

There's also Xilinx's patent portfolio to consider; owning the guys who created FPGAs and a leading SoC designer is sure to bring some useful patents that can be leveraged against other chipmakers (read: Intel/NVIDIA) for some extra cash down the line.

For what Xilinx brings as a stand-alone company, AMD certainly overpaid. But if AMD can leverage what Xilinx brings with what AMD already possesses, AMD could potentially make that back and then some.
 
I can speak to this somewhat, since I use Xilinx's toolset on numerous projects.

Xilinx is a fabless semiconductor company best known for developing the first FPGAs, and it was among the first to create/support programmable SoCs. As a result, they are fairly well established within embedded markets. They have their own Eclipse-based development studio that's aimed more at system-level programming than the more established IDEs (MSVC/GCC), to make it easier to program for said HW. [I can personally say the IDE is "OK"; I've had fewer odd compiler bugs than with the WindRiver Diab compiler at least... It's not MSVC, but I guess nothing else really is.]

In short: Xilinx is a major player within the embedded market. Their FPGA/SoC designs and toolsets could potentially be leveraged by AMD to get a foothold in more embedded applications, where x86 is basically non-competitive.

There's also Xilinx's patent portfolio to consider; owning the guys who created FPGAs and a leading SoC designer is sure to bring some useful patents that can be leveraged against other chipmakers (read: Intel/NVIDIA) for some extra cash down the line.

For what Xilinx brings as a stand-alone company, AMD certainly overpaid. But if AMD can leverage what Xilinx brings with what AMD already possesses, AMD could potentially make that back and then some.

Thanks - it sounds like this is similar to the ATI acquisition: it will probably be a number of years before the benefits of the purchase are really apparent. Also, good point on the patent side.
 
Intel snatched Altera, Nvidia snatched Mellanox; it would be awkward if AMD were the only CPU/GPU designer without ludicrous-speed interconnects of some sort.

Makes sense; with NVIDIA trying to get into CPUs, and with Intel trying to get into GPUs, you are seeing a major wave of acquisitions between AMD, Intel, and NVIDIA. In short: we're going to see a LOT of contraction as these three gobble everyone up.
 
Makes sense; with NVIDIA trying to get into CPUs, and with Intel trying to get into GPUs, you are seeing a major wave of acquisitions between AMD, Intel, and NVIDIA. In short: we're going to see a LOT of contraction as these three gobble everyone up.

It's interesting - as AMD said many years ago, "The future is fusion"... It's funny that they've gone quiet about all that heterogeneous compute stuff, but actually that is exactly the way things are going: graphics cards with dedicated ray tracing, AI and now even storage acceleration; CPUs with dedicated video transcode engines, as well as pretty much the whole motherboard chipset....
 
AMD got SeaMicro's "Freedom Fabric" a long time ago, and now they have scaled it to... well, pretty much everything. Their implementations are very interesting.

Also, not that HyperTransport was bad either, but (I think) they've bolted it nicely onto IF now.

Cheers!