Sound card incompatible with AMD

JeanPi

Honorable
Jul 30, 2015
Hello everyone. I'm making this thread on behalf of a friend who's a producer in the music industry. Just last night, he finished setting up his new PC for music production, which includes an AMD Ryzen 7 1700. Everything is running great on the new hardware, but there's one detail he missed: his sound card connects through Thunderbolt, and Thunderbolt is incompatible with AMD from what I understand. The question here is: what alternatives does he have besides trying to send back his mobo and CPU and buying an Intel CPU and mobo?

Thanks!
 

Eximo

Titan
Ambassador


That is good to know. I don't have any AMD systems at the moment but I will keep it in mind for my brother who does music production casually. He's been in the market for a new system for a while, though I think for him a laptop makes more sense at this point.

 
Not here to disagree with the above answers, but I wanted to add: while Intel may have made the licensing royalty-free and also open-sourced Thunderbolt, a license must still be acquired and all products certified by Intel, unless you like getting sued.

Now that USB 3.1 Gen 2 Type-C supports DP video and USB 3.2 doubles bandwidth, there's little benefit in companies chasing down the TB rabbit hole and subjecting themselves to Intel's certification process, unless you fall into the niche of needing that 40 Gb/s of bandwidth and two 4K displays at 60 Hz versus one over a Type-C connector.
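Quick back-of-envelope in Python (my own sketch, assuming raw 24-bit RGB and ignoring blanking and encoding overhead, so real streams need a bit more) showing why two 4K60 displays are about the only case where TB's headroom matters:

```python
# Rough bandwidth comparison: Thunderbolt 3 vs USB 3.2 Gen 2x2.
# Assumes raw 24-bit RGB pixels; blanking/overhead ignored.

TB3_GBPS = 40.0    # Thunderbolt 3 nominal link rate, Gb/s
USB32_GBPS = 20.0  # USB 3.2 Gen 2x2 nominal link rate, Gb/s

def display_gbps(width, height, hz, bits_per_pixel=24):
    """Raw pixel bandwidth of one display stream in Gb/s."""
    return width * height * hz * bits_per_pixel / 1e9

one_4k60 = display_gbps(3840, 2160, 60)  # ~11.9 Gb/s
two_4k60 = 2 * one_4k60                  # ~23.9 Gb/s

print(f"one 4K60 stream:  {one_4k60:.1f} Gb/s")
print(f"two 4K60 streams: {two_4k60:.1f} Gb/s")
print("fits in USB 3.2?", two_4k60 < USB32_GBPS)  # False
print("fits in TB3?", two_4k60 < TB3_GBPS)        # True
```

One 4K60 stream fits comfortably over the Type-C alt-mode path; it's only the dual-display (or eGPU) case that actually needs TB's 40 Gb/s.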

TB has been demonstrated working on a Threadripper machine, albeit non-trivially, but it shows there really are no insurmountable technical hurdles for the AMD side of the aisle to implement TB if it were interested in doing so.

My suspicion is TB was controlled too tightly for too long and is going to lose out to USB 3.2 Gen 2 Type-C despite Intel's 4th quarter attempt at improving adoption of the port.

Both Gigabyte and ASUS have had TB add-in boards; ASUS had theirs way back in 2012, with AM3+ motherboards ready to go with support for it, but my understanding is the card was never certified by Intel, so...

It looks like peripherals supporting USB 3.1 Gen 2 or USB 3.2 are a cheaper, easier way forward, rather than propping up an overly restricted standard.
 

InvalidError

Titan
Moderator

The biggest benefit of TB is low latency - it is fundamentally external PCIe, which bypasses the need for USB host/target controllers, USB packetizing and abstraction, an extra PCIe hop or two, etc.

IIRC, rumors from earlier this year said the next iteration of USB will either introduce a native PCIe alt-mode or adopt TB outright.
 
While I agree with you wholeheartedly about Thunderbolt's low latency and its ability to carry PCIe beyond the confines of the motherboard, and that there are applications where this is undoubtedly ideal, such as external GPUs and some high-end audio processing, I really don't see TB as the best interface going forward. I also don't see the industry jumping on TB, mostly for political reasons rather than technical merit.

The biggest reason I see TB as having any traction at all is Apple's forcing it down the throats of their users whether they wanted it or not.

TB cables have chips in them to facilitate proper operation and actually need those chips to attain the touted bandwidth numbers, plus a two-second Google search shows these cables tend to cost well over $10, which to a lot of the world market is an issue and limits adoption. Are we really concerned about host/target controllers at this point, since both the host and target end up having some sort of controller anyway? Most smartphones have processing power to spare, which isn't to say I agree with wasting CPU cycles, but I'm also not a proponent of requiring expensive "smart" cables either. Most consumers aren't looking for expensive solutions, and this added cost works against TB and keeps it niche.

You bring up rumors, which I'm not against, but if you want to talk rumors, USB 4 is rumored to have 100 Gb/s bandwidth, and final specs may be announced as soon as 2020. I have run across no rumors about TB 4 and have no idea when its specs are supposed to be announced, if TB even has enough market share left for them to come to fruition.

I think I can understand if Intel is leery of certifying TB equipment for the TR4 platform: on a lot of Intel equipment, TB is routed through the PCH and its four DMI lanes, which may make for an unfavorable comparison, whereas on TR4 the PCIe lanes route straight from the processor, eliminating that potential bottleneck. I still see even this as niche outside of review sites and the select few parties that have an interest in the drama it promotes. How many real-world situations push a 40 Gb/s interface to its limits on a regular basis?
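To put numbers on that bottleneck (my own rough math, using nominal PCIe 3.0 rates): four Gen 3 lanes - the same width as Intel's DMI 3.0 uplink - carry about 31.5 Gb/s of payload after 128b/130b encoding, and that link is shared with everything else behind the PCH, while TB3 is marketed at 40 Gb/s:

```python
# Sketch: usable bandwidth of a 4-lane PCIe 3.0 link (same width as
# Intel's DMI 3.0 uplink) versus Thunderbolt 3's nominal 40 Gb/s.

GT_PER_LANE = 8.0     # PCIe 3.0 raw transfer rate, GT/s per lane
ENCODING = 128 / 130  # 128b/130b line-code efficiency
LANES = 4

link_gbps = GT_PER_LANE * ENCODING * LANES  # ~31.5 Gb/s
print(f"PCIe 3.0 x4 payload: {link_gbps:.1f} Gb/s")
print("TB3 nominal:         40.0 Gb/s")
# That x4 uplink is shared with SATA, USB, NICs, etc. hanging off the
# PCH, so TB routed through it can bottleneck; CPU-direct lanes don't.
```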

Companies have worked around USB's latency issues since its inception, and can continue to do so. It isn't as though multi-track recording with added effects began at the onset of TB in the marketplace. Companies that choose to forgo the wider adoption of the USB interface for TB can continue to do so. There should always be plenty of ways to get the same job done.

I can also understand if the market for equipment that truly needs what TB brings to the table is so low that the cost to pursue it for AMD, Gigabyte, ASUS, etc., is higher than what they see it bringing in sales.

I see newer USB revisions as the market's interface going forward, as history shows USB has been well accepted despite its issues. Whether Intel relents and drops the remainder of their certification hoops, I'm unwilling to wager on. They may be a company made of smart people, but they've made their fair share of silly decisions, and I'm not going to bet that hard heads suddenly become soft just to save an interface.

The current situation seems unfortunate, as I like the idea of pairing external GPU solutions to small form factor computing solutions. Here's to hoping that the more open USB standard grows to include the functionality of TB, rather than as it is now, the other way around.
 

InvalidError

Titan
Moderator

There is a tentative standard to bring PCIe to SD cards for speeds beyond 100 MB/s; you'll want external PCIe of some sort to interface with those at full speed. Thumb drives and external SSDs will want some form of external PCIe too as ultra-fast devices become both more widespread and affordable. With more mobile devices and laptops capable of supporting docks, a more capable expansion mechanism than USB3 is likely to be much desired too.
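For scale (my own numbers, assuming a single PCIe 3.0 lane, which is what the SD Express proposal uses): one Gen 3 lane gives roughly 985 MB/s of payload, versus the ~104 MB/s ceiling of legacy UHS-I cards:

```python
# Rough throughput comparison: legacy UHS-I SD bus versus a single
# PCIe 3.0 lane (the transport SD Express proposes to use).

UHS_I_MBPS = 104  # UHS-I bus speed ceiling, MB/s

# 8 GT/s per lane, 128b/130b encoding, 8 bits per byte -> ~985 MB/s
pcie3_lane_mbps = 8e9 * (128 / 130) / 8 / 1e6

print(f"UHS-I:       {UHS_I_MBPS} MB/s")
print(f"PCIe 3.0 x1: {pcie3_lane_mbps:.0f} MB/s")
print(f"speedup:     ~{pcie3_lane_mbps / UHS_I_MBPS:.1f}x")
```

Roughly an order of magnitude more headroom per lane, which is why a USB3-class interface starts to look like the choke point for such cards.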

You only need chips in the cable plugs if the cable length is going to be significant or unknown, or the transmission medium requires signal conversion. In the case of a thumb drive, external SSD, or PCIe SD-card reader, the cable length is practically zero and termination can be left to the controller IC. If the cable is permanently attached, as is often the case for port multipliers, cable compensation can be built into the main device instead of the plug for runs up to several inches, long enough for most cases.

USB 3.0 doesn't go much beyond 10' without active cables or daisy-chained hubs either, and this distance will shrink as speeds go up until we go straight optical (with optional power) from port to port.