AMD: Thunderbolt Another Proprietary Standard

[citation][nom]Snaggy7[/nom]Let's get one thing clear, it's a hella fast connection that could leverage a larger variety of media devices. AMD's argument is mute. Comparing USB 3.0 (with its less-than-ideal current speeds) to "Thunderbolt" is ridiculous. "Thunderbolt's" 10 Gb/s (both up and down, with minimal overhead) trumps pretty much everything available now. Don't worry, the devices (Apple has a year before everyone else) are on the way. Also, the option of going optical still remains. So, expect to see optical variations of "Thunderbolt" next year.[/citation]USB is a standard. Intel's solution is not. I'd rather get a device that has USB 3.0 and/or eSATA first, and have a "thunderbolt" connection as a secondary one. If "thunderbolt" is the only connector, that limits what I can connect it to.

If Intel gave a damn about standards, they would have worked to improve USB and SATA further. We already got a USB 2.0 spec that vastly increased speeds over USB 1.1 using the same connector; I don't see why a future spec couldn't do the same again while staying backwards compatible with earlier versions.

In reality, Intel just wanted another proprietary technology they could license. I mean, they gave it to Apple first and foremost; that should tell you a lot. This is about Control and Profit.
 
AMD does have a point in that the tech is limited in where it would maximize its potential -- namely replacing USB 3.0, but that's one VERY hard sell considering how prominent USB has become.

Though, I do think that fiber optic can be the future with regards to components speaking to each other. But yes, with the half-assed, "safe" approach that Thunderbolt is taking, you're certainly not going to get much of a WOW factor.
 
[citation][nom]Snipergod87[/nom]I can see the issue with linking displays over Light Peak when using multiple displays. But as of now there is no other standard with that bandwidth that will support devices like external hard drives.[/citation]
Because external HDs are even close to using up the bandwidth of USB 3.0 right??
 
@AMD - bitter much?

Only time will tell with Thunderbolt, but the surprise move of using a DisplayPort connector could turn out to be a stroke of genius. For one, any (future) DisplayPort display could, for a few bucks extra, become a true docking station for a laptop, with USB ports, FireWire ports, webcam, audio. It would probably not even cost that much to add in things like eSATA and Ethernet ports. All with one connector to a laptop.

There's also a good chance that GPU manufacturers (well, nVidia to start, AMD begrudgingly a little later) would start bundling Thunderbolt chips or Thunderbolt pass-through capability on their graphics cards.

This is also the kind of technology that could cause even gamers to migrate away from desktops. Laptop CPUs are almost as powerful as desktop CPUs, but the same cannot be said for laptop GPUs. If Thunderbolt takes off, I'm sure it won't take long until someone releases a Thunderbolt-connected box that has a x16 PCIe slot, audio, USB, etc. Something that could take a full-size desktop GPU and connect it to a laptop: one cable and your lightweight portable laptop becomes a gaming powerhouse. One caveat on the current state of Thunderbolt: 10 Gb/s in each direction is only 1.25 GB/s, so a card in that x16 slot would really be running at something like PCIe 2.0 x2 bandwidth. A real constraint, but plenty of games would still be playable.
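
Quick back-of-the-envelope on that, using my own assumed figures (PCIe 2.0 at roughly 500 MB/s usable per lane, Thunderbolt at 10 Gb/s per direction):

[code]
# Sanity check: how much of a desktop GPU's PCIe bandwidth could Thunderbolt feed?
TB_GBPS = 10                        # Thunderbolt, per direction, per channel
PCIE2_LANE_MB_S = 500               # PCIe 2.0 usable throughput per lane, approx.

tb_mb_s = TB_GBPS * 1000 / 8        # 10 Gb/s -> 1250 MB/s
lanes = tb_mb_s / PCIE2_LANE_MB_S   # how many PCIe 2.0 lanes Thunderbolt can match

print(f"Thunderbolt per direction: {tb_mb_s:.0f} MB/s")
print(f"Equivalent PCIe 2.0 lanes: ~x{lanes:.1f}")               # ~x2.5
print(f"Full x16 slot would want: {16 * PCIE2_LANE_MB_S} MB/s")  # 8000 MB/s
[/code]

So the slot could be physically x16, but electrically it would behave more like a x2 or x4 link.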
 
If 10 Gb/s can be made affordable enough for consumer desktops and laptops, this should crash prices on iSCSI, blade, and other HPC infrastructure. It should be a better fit in that environment anyway, where upgrade cycles are long and proprietary tech is less of an issue.

That said, if TB does speak PCI-E, wouldn't it follow that external desktop-spec graphics cards for laptops are not only feasible but relatively trivial? (Even if they would require their own power brick or two =p)

I imagine a TB-to-PCI-E x16 slot adapter mated to a couple of 200W pico-PSUs (the ones used in ITX PCs) should allow a vanilla desktop card to run.

Or I could just be dreaming...
 
[citation][nom]southernshark[/nom]Intel makes good processors for desktop computers. That's about it. The rest of their products are subpar.[/citation]

Intel makes some of the best NICs available on the market. Also, their SSDs are solid performers that in some cases blow the competition away.

Subpar? I don't think so.

I'm not saying all Intel products are great (their IGPs leave much to be desired), but Intel does make some high-quality products outside of CPUs.
 
Well, the fact is, for Thunderbolt to be successful it has to hit a few different points all at once: it's gotta be pushed by vendors, it's gotta be at least a few times faster, and of course it needs to be cheap. It has absolutely no history, so trying to jump in with it is going to be extremely hard for Intel. I honestly doubt it's gonna be much of a success. Light Peak seemed interesting: say you have two optical channels, in and out, then throw power into the mix (5 or 12 volts, doesn't matter), and you'd have something fairly cheap and very backwards compatible. The frequencies you can reach with light will always dominate copper, so you'd have an insane amount of bandwidth limited only by the devices at either end, and backwards compatibility with newer technology would never be a problem. It's something that can grow without changing the cables or ports. That's what we need right now.
 
Everything about a copper-based TBolt seems like two steps back. Daisy-chained devices on one connection? There's a reason SCSI isn't a common connection on consumer devices. There's also a reason docking stations aren't top-selling accessories. I mean, what are you trying to tell me, that I can have my graphics, hard drives, and CPU as different devices all chained together through the TBolt connection? That's better than a desktop that keeps everything in one box? It has its uses for docking laptops, but I can't imagine the whole graphics-box type deal becoming a reality until it's an optical connection. Electrical connections will have issues like interference that bring their speeds and latency down to less than ideal.

Transmission speeds aren't their bragging point, since DP and other connections can release newer specs and up their own speeds. It won't replace USB either, since it could never have USB's backwards compatibility. And it won't be a big plus for SSDs until they're built for TBolt; for now they'll be dropped to SATA speeds anyway.

There's potential here, especially when these connections go optical. Optical connections have their own issues though, like fragile, inflexible cables and the fact that the light signal has to be converted to and from an electrical signal at both ends. I think Intel realized they weren't quite ready to make the optical connection happen, so they're using the copper-based tech to try and get the connections out there.
 
[citation][nom]schmich[/nom]Because external HDs are even close to using up the bandwidth of USB 3.0 right??[/citation]

No, they aren't, but they could be. We're already damned close to maxing out SATA 6 Gb/s. If you get an external enclosure with two of those drives in RAID 0, you would need Light Peak; eSATA and USB 3.0 would do poorly.
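
Rough numbers (assuming ~500 MB/s sequential per current SATA 6 Gb/s SSD, and raw line rates before protocol overhead):

[code]
# Does a two-SSD RAID 0 outrun eSATA / USB 3.0? (assumed, approximate figures)
ssd_mb_s = 500          # a fast SATA 6 Gb/s SSD, sequential reads, approx.
raid0 = 2 * ssd_mb_s    # RAID 0 striping roughly doubles sequential throughput

interfaces = {
    "USB 3.0 (5 Gb/s)":      5 * 1000 / 8,    # 625 MB/s raw, less after overhead
    "eSATA (6 Gb/s)":        6 * 1000 / 8,    # 750 MB/s raw
    "Thunderbolt (10 Gb/s)": 10 * 1000 / 8,   # 1250 MB/s per direction
}
for name, raw in interfaces.items():
    verdict = "enough" if raw >= raid0 else "bottleneck"
    print(f"{name}: {raw:.0f} MB/s raw -> {verdict} for a {raid0} MB/s RAID 0")
[/code]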
 
Let's get one thing clear, it's a hella fast connection that could leverage a larger variety of media devices. AMD's argument is mute.
heheh.. moot i tell u, moot!
 
Intel had the opportunity, as some of the more informed users have commented, to deliver some truly astonishing throughput, but decided instead to make something just "ok," which is why, as much as I hate to say it, AMD may have a point here. I'm not complaining though: SB setup in a couple weeks for me, BD nowhere in sight. 🙁
 
If they actually deliver the "light" version and hit the spec'd 50 Gb/s, I could easily see this product saturating the market.
 
[citation][nom]newbie_mcnoob[/nom]Go Intel! Maybe this proprietary interface will be as successful as your early Pentium 4 motherboards that used RDRAM. Oh wait...[/citation]

Where do you guys come from? RDRAM was EXTREMELY successful for the Pentium 4, so much so that DDR was considered an inferior solution. RDRAM wasn't very successful on the Pentium III, but that's got nothing to do with the RDRAM. DDR didn't fare any better, because the Pentium III could not handle the extra bandwidth. The Pentium 4 was a different animal, and benefited greatly from the additional bandwidth of RDRAM, and to a lesser extent DDR.
 
[citation][nom]banthracis[/nom]Problem is, at 10 Gb/s Thunderbolt is slower than DP 1.2. Yes, it's twice the speed of USB 3.0, but considering that SATA III is 6 Gb/s and already looking short with the new SF SSDs, this is hardly a big improvement. Intel had the opportunity to release a format that could have done 50 Gb/s or 100 Gb/s easy and would have been a major revolution. Heck, with 50 Gb/s it could have replaced both a PCIe x32 link and a DP link and allowed cabling of huge lengths without signal loss. It would have made a modular PC setup possible, with the PC in a (well ventilated) closet somewhere, a 1600p monitor on the other side of the house, and the GPU replacing your space heater. The point is that Intel choked and decided to just stick with copper. Can they upgrade to fiber? Sure, but any feature they release in the future will have to be backwards compatible and work with current devices. Do you really think Apple is OK with their customers coming in next year and asking why their MacBook Pro can't run a new 3D monitor with Thunderbolt, despite Intel advertising it as a possibility with the new fiber-based Thunderbolt?[/citation]
I agree with everything you said except for the last part, where you implied Apple cares about their customers. Apple LOVES to make their customers buy new products due to some craptastic marketing ploy; it's what they're best at. It will be FireWire all over again.
 
[citation][nom]schmich[/nom]Because external HDs are even close to using up the bandwidth of USB 3.0 right??[/citation]

http://www.overclockers.com/wp-content/uploads/2009/11/HDTune-Vertex.jpg

This is a first generation OCZ Vertex. As can be clearly seen, SATA is crushing USB 3.0 in throughput. One can only imagine what this chart would look like with one of the new Vertex 3 drives. So your snarky attempt at sarcasm just looks like ignorance at this point.

Something else of major importance that can be learned from this chart is another of USB's Achilles' heels: CPU utilization. USB 3.0's CPU utilization is through the roof compared to any of the other standards. Granted, the chart is showing 10x utilization, which means USB 3.0 is actually using around 10% instead of the near-100% the chart shows. But it needs to be noted that the test system is a liquid-cooled 4.4 GHz Core i7 870 system. Not exactly a typical computer. And again, this is testing a 2+ year old Vertex drive. CPU utilization is not going to get any better connecting a much faster current-generation SSD.
 
[citation][nom]TA152H[/nom]Where do you guys come from? RDRAM was EXTREMELY successful for the Pentium 4, so much so that DDR was considered an inferior solution. RDRAM wasn't very successful on the Pentium III, but that's got nothing to do with the RDRAM. DDR didn't fare any better, because the Pentium III could not handle the extra bandwidth. The Pentium 4 was a different animal, and benefited greatly from the additional bandwidth of RDRAM, and to a lesser extent DDR.[/citation]

Yes, but early RDRAM was 400 MHz at best on a 16-bit bus, and suffered slightly from heat and latency issues as compared to DDR. Sure, Rambus was snuffed out of the market thanks to practices that wouldn't look out of place at, say, Intel, but Intel was gravitating away from them anyway.

I don't see how DDR was considered inferior to Rambus, considering that the first DDR products were faster than the comparable Rambus ones.
 
I'm a little confused on this bit:

The DisplayPort 1.2 standard offers up to 17 Gb/s of peak bandwidth for displays. The total bandwidth for a Thunderbolt channel is only 20-percent higher than one PCI Express 3.0 lane and about 52-percent higher than a single USB 3.0 port. "AMD-based platforms support USB 3.0 which offers 4.8 Gb/s of peak bandwidth, AMD natively supports SATA 6 Gb/s with our 8-series chipsets," the AMD official claimed.

10 Gb/s is a lot more than 1.52x 4.8 Gb/s. And that's 10 Gb/s in each direction.
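
Just checking the arithmetic on the figures quoted above (nothing assumed beyond the 10 and 4.8 Gb/s numbers):

[code]
# Checking AMD's "52-percent higher than USB 3.0" claim against the quoted figures
tb_gbps  = 10.0   # Thunderbolt, per direction
usb_gbps = 4.8    # USB 3.0 peak, as quoted

print(f"52% higher than USB 3.0 would be {1.52 * usb_gbps:.2f} Gb/s")  # 7.30 Gb/s
print(f"Thunderbolt is actually {tb_gbps / usb_gbps:.2f}x USB 3.0, "
      f"i.e. {(tb_gbps / usb_gbps - 1) * 100:.0f}% higher")             # ~108%
[/code]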

PCI Express 3.0 bandwidth is great when you've got stuff inside the computer. Less useful when we're talking external peripherals. How is this even relevant?

6 Gb/s is incomparable to 10 Gb/s bi-directional.

I think Apple had a great example where they were reading hundreds of MB/sec of video to and from an external device while using the same connector for the display. The data transfer didn't touch the display's bandwidth, and in the example above, it would only compete if data was being written to the external drive. Even then, at 3 Gb/s for uncompressed 1080p @ 60 fps, there's still tons of bandwidth to go around.
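
For reference, here's the raw math on uncompressed 1080p (assuming 24-bit color and ignoring blanking overhead, so real link traffic would run a bit higher):

[code]
# Raw uncompressed 1080p bandwidth at a few refresh rates (24 bpp, no blanking)
W, H, BPP = 1920, 1080, 24
for fps in (30, 60, 120):
    gbps = W * H * BPP * fps / 1e9
    print(f"1080p @ {fps:3d} fps: {gbps:.2f} Gb/s")
# ~2.99 Gb/s at 60 fps, and even 120 fps (~5.97 Gb/s) fits well inside 10 Gb/s
[/code]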

With a single port, Thunderbolt lets you get that 6 Gb/s of the newest SATA interfaces with 1080p output, AND lets you input the same stuff SIMULTANEOUSLY.

Consider having an external that can simultaneously read and write 1.2 GB/sec, and having it connected to a computer with a single port. That same port which can be repurposed for an external display or something else as needed.

I see Thunderbolt as a great way to cut down on the number of ports a laptop needs. In the next few years we're going to see DVI and VGA vanish completely from laptops, which will be nice. We'll have USB 3.0, Gigabit and DisplayPort/Thunderbolt ports as our only I/O. And in theory, Thunderbolt could fulfill the same roles as USB 3.0 and Gigabit in the future.

Here's another way to look at it. You COULD use a DisplayPort connector on a laptop, or you could use a Thunderbolt port to get the same display connectivity (technically less bandwidth from laptop to display, but even a 1080p 120 Hz display won't saturate 10 Gb/s, and realistically only fringe gamers and professionals will use daisy-chained displays) and have the potential to pull down data through the same port, either simultaneously or instead of the display. With the exception of daisy-chaining 4 or more 1080p displays, you lose nothing in swapping a DisplayPort connector out for a Thunderbolt one.
 
[citation][nom]Snipergod87[/nom]No, they aren't, but they could be. We're already damned close to maxing out SATA 6 Gb/s. If you get an external enclosure with two of those drives in RAID 0, you would need Light Peak; eSATA and USB 3.0 would do poorly.[/citation]
Hard drives don't even come close to saturating SATA 3 Gb/s, let alone SATA 6 Gb/s.
 
Aimed at professionals huh? Then why the hell would you put Thunderbolt in an Apple notebook??? All the pros I know use Windows. It seems to be another theoretical marketing tool.
 
It's a good reason for a lot of Mac fans to buy a new one. They can rationalize it easier since they really do want a shiny new one. They're good at putting these carrots out.
 