NEXT GENERATION MOTHERBOARDS

seerwan

Distinguished
May 10, 2009
Dear Tom's,

When will the next-generation motherboards become standard?

What i mean by next-gen, i mean motherboards with all the new connection standards, which are:-

1 - USB 3.0
2 - SATA 6Gb/s
3 - PCI Express 3.0

Thanks.
 

lordszone

Distinguished
Aug 15, 2006
hi
for now usb 3.0 and sata 3 6gb/s is already here. dont know when pci express 3.0 will come. these features are only in the motherboard supporting intel core i5 and i7 1136 socket. if u wait a bit, u can get these features on other platforms as well. in my opinion pci express 3.0 is not worth waiting because current vga cards for pci express 2.0 dont show noticeable improvements when running them on pci express 1.0. and i think apart from pci express 3.0 all the rest will be standard in late this year or the next year .
 

sportsfanboy

Distinguished
Power savings are a good thing as long as performance doesn't suffer. It took the auto industry forever to figure that out; maybe the computer industry will as well. A Corvette with 500hp that gets 25-30mpg highway... Maybe a video card that can run on one 6-pin connector and crunch modern games. Wishful thinking, perhaps.
 

seerwan

Distinguished
May 10, 2009
Yo, OP here.

USB 3.0 and SATA 6Gb/s may have started coming out, but on which boards? None that I know of...
except some Gigabyte P55 boards, and I just had a look at them: they have 2 SATA 6Gb/s and 6 SATA 3Gb/s ports, plus 2 USB 3.0 and 6 or 8 USB 2.0 connections...
That's not exactly standard, is it? :s

But doesn't the ATI 5970 saturate the PCI-E 2.0 slot just a little? It said so in its review on Tom's, I believe... Two 5870s in CrossFire surpassed the 5970 slightly, since each 5870 had a full 16 lanes, whereas the 5970's two GPUs had to share one slot...
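For rough numbers on what sharing lanes costs, here's a quick back-of-envelope sketch (my own arithmetic from the published line rates, not from the Tom's review; real-world throughput is lower due to protocol overhead):

```python
# Per-direction PCIe lane bandwidth from line rate and link encoding.
# PCIe 1.0/2.0 use 8b/10b encoding; PCIe 3.0 switched to 128b/130b.

def lane_bandwidth_mb(gt_per_s, encoding_ratio):
    """Usable MB/s per lane: transfer rate x encoding efficiency / 8 bits per byte."""
    return gt_per_s * 1e9 * encoding_ratio / 8 / 1e6

pcie1 = lane_bandwidth_mb(2.5, 8 / 10)     # 250 MB/s per lane
pcie2 = lane_bandwidth_mb(5.0, 8 / 10)     # 500 MB/s per lane
pcie3 = lane_bandwidth_mb(8.0, 128 / 130)  # ~985 MB/s per lane

for name, bw in [("1.0", pcie1), ("2.0", pcie2), ("3.0", pcie3)]:
    print(f"PCIe {name}: x16 = {bw * 16 / 1000:.1f} GB/s, x8 = {bw * 8 / 1000:.1f} GB/s")
```

So a dual-GPU card splitting a PCIe 2.0 x16 slot gives each GPU roughly x8 worth of bandwidth (4 GB/s each way), which is about what two single cards in x8/x8 CrossFire get anyway.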
 

jennyh

Splendid


I'm playing at 5040x1050, using two 5770s.

LotRO on 'Very High' (DX9 though; High on DX10).
WoW runs perfectly on max, pretty much.
DA: Origins looks great; no FPS numbers, but it's smooth anyway.

Going on that, not much won't run near max on a single 5770 at 1680 or 1920 resolution.
 

Hellboy

Distinguished
Jun 1, 2007



It's a shame your AMD processor is holding you back... :)
 

seerwan

Distinguished
May 10, 2009


Why two 5770s instead of a single 5850? :??:
 


That would be incorrect.

The Gigabyte GA-790XTA-UD4 (AM3, 790X chipset) has SATA 6Gb/s and USB 3.0, as one example.

PCIe Gen3 has been delayed because PCI-SIG said so.


 

seerwan

Distinguished
May 10, 2009


Sorry, you're right, there's at least one Gigabyte motherboard with an AMD socket that has them.

However, it still has only 2 SATA 6Gb/s and only 2 USB 3.0 connections... that's not making them standard...



When will all the connections be SATA 6Gb/s and USB 3.0, with PCI-E 3.0? Q4 2010? Q1 2011? Q3 2011?...

 


What I think (which is pretty messed up anyway) is that the debate is raging over wattage for Gen3. We have now hit the 300W spec limit for a card. That works out to roughly a 19A 12V rail feeding your card through the cables, plus 75W from the slot.

For some reason I'm thinking the 'powers that be' are pressing for more jigga-watts.




Need more components that meet the higher spec. Chicken or Egg? :p
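To show where those numbers come from, here's my own back-of-envelope arithmetic for the 300W ceiling (the connector wattages are the usual PCIe add-in card budget; the rail math is mine, not quoted from the spec):

```python
# How the PCIe 300 W add-in card budget breaks down.
SLOT_W = 75        # power from the x16 slot itself
SIX_PIN_W = 75     # one 6-pin PEG connector
EIGHT_PIN_W = 150  # one 8-pin PEG connector

total = SLOT_W + SIX_PIN_W + EIGHT_PIN_W          # 300 W ceiling
cable_amps = (SIX_PIN_W + EIGHT_PIN_W) / 12.0     # current on the 12 V rail feeding the cables

print(total)                  # 300
print(round(cable_amps, 2))   # 18.75 -> roughly the "19A rail"
```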

 

rodney_ws

Splendid
Dec 29, 2005


I'm curious... are you saying that future GPUs will require more power than they do now, or that things are getting a little ridiculous with plugging multiple power cables into our video cards? Just for simplicity's sake, I'd like to see more power delivered through the GPU slot so we don't have to worry about the 6- and 8-pin connections.
 

BadTrip

Distinguished
Mar 9, 2006



Simplicity? Yes, but I don't want that much power going through my board, if it's even possible.
 

DarkMantle

Distinguished
Aug 6, 2008


Exactly!!! You might want to read this review, Hellboy:

CPU scaling with the Radeon HD 5970
Round 2: Phenom II X4 scaling.

http://www.legionhardware.com/document.php?id=869&p=12

"The Phenom II X4 results were quite different to those recorded when testing with the Core i7 processors, though this was not necessarily a bad thing. When operating at lower clock speeds, the Phenom II X4 did not fair all that well, as we saw a sharp decline in performance. However when clocked at 3.0GHz and beyond, the Phenom II X4 really picked up the pace, and in many cases was able to outclass the Core i7.

In games such as Wolfenstein, Call of Duty: Modern Warfare 2, Tom Clancy’s H.A.W.X, BattleForge and Far Cry 2 the Phenom II X4 processors were actually faster when clocked up near 4GHz! This is quite amazing as out of the 9 games tested, the Phenom II X4 series was faster than the Core i7’s in 5 of them. Although the margins were very limited, the Phenom II X4 was found to be faster, and had it just managed to match the Core i7 series with the Radeon HD 5970, we would have been impressed."
 
My guess is, keep an eye on this gen's high-end gfx cards. Currently it's a downclocked 5970, at 288 watts.
If nVidia wants that halo product, there's nothing holding back a non-PCI-standard card, much like the MARS card sporting two full 285s. If this continues, it'll move the current metric for power.
 

Kewlx25

Distinguished
There are 1500MB/sec SSDs coming out soon. How long until 1500MB/sec is standard?
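For context on why that number matters, here's my own quick arithmetic (assuming the spec line rates and 8b/10b encoding, not a quote from any SSD review):

```python
# Usable SATA bandwidth after 8b/10b link encoding.

def sata_usable_mb(gbit_line_rate):
    """Usable MB/s: line rate x 8/10 encoding efficiency / 8 bits per byte."""
    return gbit_line_rate * 1e9 * (8 / 10) / 8 / 1e6

print(sata_usable_mb(3))  # SATA 3Gb/s -> 300.0 MB/s
print(sata_usable_mb(6))  # SATA 6Gb/s -> 600.0 MB/s
# A 1500 MB/s SSD overruns even SATA 6Gb/s, so it has to ride PCIe lanes
# instead -- one more reason the bus keeps needing those evolutionary steps.
```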

I can see a large change in computers in the next 10 years that will dwarf any historical computer-tech jump.

Some people see PCIe 2.0 and say "why do we need 3.0?". Well, in order to reach those radical tipping points, you need to take the smaller evolutionary steps.

As IO bandwidth increases and latency drops, locality starts to become less important. What sounds stupid now may be a major breakthrough in a few years.

Some day we may even see a complete merger of GPUs and CPUs. Computing on the GPU may become about as transparent as computing on the CPU.

There's a huge push to simplify the basics and management of multi-threaded programming. I bet some of these new libraries will soon be able to detect GPU-friendly workloads and transparently schedule work on the GPU(s).

I can't wait to see what we have in 10 years.
 
