8350rocks :
jwcrellin :
So you guys think that USB 3.1 Type-C will take a larger role in upcoming platforms?
If I were a gambling man, I would advise you not to hold your breath for USB 3.1 Type-C to catch on. The main draw right now is charging mobile devices more quickly...not much else. Regular USB 3.2 is likely going to be the way forward, and there are already USB specs that match Thunderbolt for bandwidth without the proprietary schtick that comes with TB, they just are not mainstream yet.
There is currently no USB spec that is faster than TB, short of the rumored USB 4.0 with a 10Gbps bandwidth. The fastest available USB is 3.1 with a 10Gbps bandwidth, slower than just one direction of TB.
I see what Intel is trying to do with TB. USB was supposed to be an "all-in-one" port, but there are still plenty of non-USB ports. TB is trying to tie everything into a single port.
gamerk316 :
8350rocks :
gamerk316 :
8350rocks :
jimmysmitty :
gamerk316 :
jimmysmitty :
Thunderbolt is 20Gbps bidirectional (40Gbps total) vs 10Gbps for USB 3.1 Gen 2. Double the bandwidth. It can also carry 10GbE networking, and there is still no consumer 10GbE router, and very few consumer boards come with a 10GbE NIC built in.
So how are any of those better? Thunderbolt can run over USB Type-C, provides double the bandwidth, and is more versatile.
@logainofhades, it also looks like the 7740K will be forgoing the IGP, which might give it more OCing headroom.
You forget Thunderbolt is several times more expensive to implement, and has the other problem of not being universal in the same way USB has become. Seriously people, let's not go back to the days of connector wars; USB is good enough, and will be upgraded as needed. We don't need a half dozen different connectors that cover every possible use case.
Yet Intel's idea was to use the USB connector to keep compatibility while offering better bandwidth.
I understand sometimes open is better, but the biggest problem is that open specs sit idle. How long did it take to move to USB 3? And even then it still had multiple issues, such as encoding overhead that cut 20% of the bandwidth off the top and required another step to clean up.
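That 20% encoding overhead is easy to put numbers on. A back-of-envelope sketch (line rates are the published spec values; the helper function is just mine for illustration):

```python
# Why 8b/10b line encoding costs USB 3.0 20% of its raw bandwidth,
# and how the later 128b/132b encoding cleaned that up.
# "Effective" here ignores protocol overhead, only line encoding is counted.

def effective_gbps(line_rate_gbps, data_bits, total_bits):
    """Usable bandwidth after line-encoding overhead."""
    return line_rate_gbps * data_bits / total_bits

# USB 3.0: 5 Gbps raw, 8b/10b encoding (8 data bits sent as 10 line bits).
usb30 = effective_gbps(5.0, 8, 10)      # 4.0 Gbps usable, 20% lost

# USB 3.1 Gen 2: 10 Gbps raw, 128b/132b encoding (~3% overhead).
usb31 = effective_gbps(10.0, 128, 132)  # ~9.7 Gbps usable

print(f"USB 3.0 effective:      {usb30:.1f} Gbps (20% overhead)")
print(f"USB 3.1 Gen 2 effective: {usb31:.2f} Gbps (~3% overhead)")
```

So the move from 8b/10b to 128b/132b is the "step to clean up" in question: same idea, far less tax per bit.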
Sometimes a new interface has to come and wipe out the old one. TB 3 can run on USB Type-C, so to me it is a win-win. Sure it is more expensive now, but as with anything, time makes it cheaper, as does wider implementation.
USB 3 was not cheap to start. I remember when they first hit, they were quite a bit more expensive than USB 2. In fact, at the shop I worked at, we waited until the price dropped quite a bit before going from our USB 2 Xporter drives to the USB 3 Xporter drives.
Then you have things like DisplayPort, which makes near-quarterly updates to its spec, with newer versions advancing displays at each step. On the opposite end, you have things like JEDEC that plod along.
That's because you don't need updates to the base specification that often. Nevermind the cost of retooling production for each individual specification. Unless there's a glaring need, there isn't much reason for all these specs to constantly be adding features at a breakneck pace.
And I'll say again: DisplayPort has no purpose. It's DOA outside of PC monitors, as HDMI is ubiquitous in home theaters now, and HDMI offers pretty much the same feature set at a lower cost. I have no idea what problem DisplayPort was ever trying to solve.
Not having to pay a royalty to HDMI, because DisplayPort is an open, royalty-free spec rather than a privatized spec requiring a license.
Also, DP allows up to 8K through a single connection; even the newest HDMI does not allow that without two cables.
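For a sense of scale, the raw pixel rates behind that claim are simple to compute. A rough sketch (24 bits per pixel, ignoring blanking intervals and protocol overhead; the link numbers are the published effective rates for each spec at the time of this thread):

```python
# Back-of-envelope: why 8K is a bandwidth problem for display links.

def raw_gbps(width, height, refresh_hz, bits_per_pixel=24):
    """Uncompressed pixel data rate in Gbps (no blanking, no overhead)."""
    return width * height * refresh_hz * bits_per_pixel / 1e9

uhd_4k = raw_gbps(3840, 2160, 60)  # ~11.9 Gbps
uhd_8k = raw_gbps(7680, 4320, 60)  # ~47.8 Gbps (4x the pixels of 4K)

# Approximate effective data rates after line encoding:
hdmi_20 = 14.4   # HDMI 2.0: 18 Gbps raw link
dp_14   = 25.92  # DP 1.4 (HBR3): 32.4 Gbps raw link

print(f"4K60 needs ~{uhd_4k:.1f} Gbps, 8K60 needs ~{uhd_8k:.1f} Gbps")
# 8K60 exceeds even DP 1.4's link rate, hence compression (DSC),
# chroma subsampling, or multiple cables.
```

So neither link carries uncompressed 8K60 on its own; DP gets there with Display Stream Compression, while HDMI at this point needs two cables.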
And...who needs 8k again? Call me when 4k becomes standard. And by then, HDMI will update the spec to cover that case.
I am more of a fan of DP than HDMI, though. Still plenty of issues to be had with it. I would prefer if DP replaced HDMI, but that probably won't happen, since HDMI is backed by major electronics companies (like Panasonic and Sony) and major television/movie studios like Fox and Warner Brothers.
YoAndy :
There are really no actual performance gains between Sandy Bridge and Kaby Lake.
It's all in clock speed, which you could achieve with Sandy Bridge.
Isn't it funny that Ivy Bridge, Haswell, and Broadwell couldn't hit the same clocks as Sandy Bridge, and now Skylake can? So they can create a fake performance increase by slowly reaching back up to the clocks they had before, and pretending their specialized architecture extensions, which aren't used by 99% of applications, are "gains".
http://www.anandtech.com/show/11083/the-intel-core-i3-7350k-60w-review
Yet a 2c/4t CPU is matching an i7-2600K in most areas.
There are plenty of performance gains outside of clock speed. The problem is that the majority of consumers will not see or utilize them. The other problem is that Intel doesn't thrive on consumer chips; their bread and butter is the professional and server/HPC market, which they own over 90% of, and where they can make 2-10x the margins per CPU.