Look Who's Here: Matrox With Two Graphics Cards

Given that they're using AMD graphics chips, and AMD's own cards with those chips are probably cheaper, I don't see what they're hoping to gain. And if they only wanted to serve this kind of niche, why didn't they just continue their own line? It doesn't need to be competitive with high-end GPUs, but if they still want to be in this low-end stuff, why not at least have their own product?
 
Man, I miss Matrox! My first real GPU setup was a G550 paired with an RT2500; that thing was a 2D beast back in 2001, and I remember struggling to find HDDs fast enough to feed those cards for video editing. To think that my phone now has more video editing capability than that costly, power-hungry setup.

*sigh* Remember when Matrox tried to get into gaming... those were fun times!
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010
If Matrox can get these things to drive that many displays at 60 Hz, and in one case do it with a passive cooler, why can't AMD and Nvidia? You can't drive three 4K displays at 60 Hz with their top-end consumer-grade dual-GPU solutions, and those require beastly coolers.

My guess is that these cards aren't worth a flip at gaming; that's the biggest difference. On the other hand, it shows me that both of the major desktop graphics card companies are either holding back or thinking inside a box that prevents them from truly innovating. The hardware press tells us we'll have to be satisfied with a slower GPU development cycle because of the difficulties of continued die shrinks, but frankly, I think they just got lazy and relied on die shrinks so they could keep a smaller R&D budget.

I don't buy it for a second.

And we can certainly make use of serious performance boosts. 2560x1440 @ 144Hz. 4K @ 60Hz. Now is not the time for Nvidia and AMD to drag their proverbial feet.
 

sykozis

Distinguished
Dec 17, 2008
1,759
5
19,865
These cards aren't aimed at gamers. Most of what Matrox does is 2D, which is quite easy to render at 4K. Even my R7 240 can handle 2D at 4K... the "depth of field" is what kills a graphics card trying to render a 3D scene at 4K.
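As a rough back-of-the-envelope illustration of that point (my numbers, not anything from the article): scanning out a finished 2D desktop at 4K is a small, fixed memory-bandwidth cost, while a 3D renderer has to shade every one of those eight-million-plus pixels each frame.

```python
# Rough scanout-vs-shading arithmetic at 4K (illustrative numbers, not benchmarks).

WIDTH, HEIGHT, BYTES_PER_PIXEL, REFRESH_HZ = 3840, 2160, 4, 60  # 32-bit colour at 60 Hz

pixels = WIDTH * HEIGHT                                  # ~8.3 million pixels per frame
scanout_gbps = pixels * BYTES_PER_PIXEL * REFRESH_HZ / 1e9

print(f"Pixels per 4K frame:        {pixels / 1e6:.1f} M (vs {1920 * 1080 / 1e6:.1f} M at 1080p)")
print(f"Scan-out bandwidth @ 60 Hz: ~{scanout_gbps:.1f} GB/s, trivial next to 100+ GB/s of GDDR5")

# A 3D renderer, by contrast, has to shade every one of those pixels each frame,
# often several times over with overdraw and AA, so its cost scales with resolution.
SHADER_OPS_PER_PIXEL = 500  # made-up ballpark, purely for illustration
shading_gops = pixels * SHADER_OPS_PER_PIXEL * REFRESH_HZ / 1e9
print(f"Shading work @ 60 Hz:       ~{shading_gops:.0f} G ops/s at {SHADER_OPS_PER_PIXEL} ops/pixel (assumed)")
```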
 

Innocent_Bystander

Honorable
May 2, 2013
76
0
10,640
Thousands of $$$$. And they will be lapped up by the target audience, simply because Matrox's professional line of products is awesome.

I've only used their frame grabbers, but I never had a complaint about their performance or the company's service when I needed it.
 
For those asking why 60 vs. 30 Hz, etc., and why our current batch of consumer cards doesn't do this: it comes down to the RAMDACs on the cards. We have long since moved away from analogue video signals to digital, so "RAMDAC" is kind of a misnomer now; it's really just the chip that converts the contents of a video frame in memory into a digital signal going out to an individual display, usually over TMDS. The speed and quality of that RAMDAC is what determines the maximum output resolution and signaling rate. What Matrox has done is take a typical dGPU and put several high-end professional RAMDACs on it. There are still limitations, which is why you get 3x60 or 6x30 at maximum resolution; I'm willing to bet there are three RAMDAC chips capable of running in split-channel mode.
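A minimal sketch of the bandwidth math behind that 3x60 / 6x30 split, assuming standard CEA-861 4K timings and a roughly 340 MHz single-link TMDS ceiling (my assumptions for illustration, not Matrox's published specs):

```python
# Back-of-the-envelope pixel-clock math (approximate CEA-861 timings; illustrative only).

def pixel_clock_mhz(h_total, v_total, refresh_hz):
    """Pixel clock in MHz for a given total raster (active + blanking) and refresh rate."""
    return h_total * v_total * refresh_hz / 1e6

# 3840x2160 active video sits inside a 4400x2250 total raster with CEA-861 blanking.
pclk_4k60 = pixel_clock_mhz(4400, 2250, 60)   # ~594 MHz
pclk_4k30 = pixel_clock_mhz(4400, 2250, 30)   # ~297 MHz

SINGLE_LINK_TMDS_MHZ = 340                    # HDMI 1.4-era single-link ceiling (assumed)

print(f"4K @ 60 Hz needs ~{pclk_4k60:.0f} MHz, more than a {SINGLE_LINK_TMDS_MHZ} MHz single link")
print(f"4K @ 30 Hz needs ~{pclk_4k30:.0f} MHz, which fits in one {SINGLE_LINK_TMDS_MHZ} MHz link")

# With a fixed pool of output transmitters, the same total bandwidth can be spent as
# fewer full-rate outputs or more half-rate ones, so you get 3 x 4K60 or 6 x 4K30.
half_rate_links = 6
print(f"{half_rate_links} half-rate links -> {half_rate_links} displays @ 30 Hz "
      f"or {half_rate_links // 2} displays @ 60 Hz")
```

The takeaway is just that one 4K60 stream needs roughly the link bandwidth of two 4K30 streams, which is consistent with the 3x60-or-6x30 spec if the card pairs up its output channels.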
 

Poul Wrist

Distinguished
Jul 16, 2013
33
0
18,540
I actually do want to know the price :p I have a need for something like this in a security-camera PC I'm looking at building for a client.
 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010


Yes, I said that in my post (though you did go into more detail).

I suspect those who down-voted me didn't thoroughly read what I wrote.

 

Marcus52

Distinguished
Jun 11, 2008
619
0
19,010


This points to the kind of thing I'm talking about.

If the power can't come directly from the GPU, then find another way to do it. There is no reason at all that separate RAMDACs for gaming graphics can't be designed and built (there is actually some of that going on in current GPUs made by Nvidia and AMD) to assist the GPU in getting more data out to the displays, faster.

Instead of thinking "well it can't be done with current hardware designs", I want them to think "How CAN we do it?"

 

zhunt99

Reputable
Mar 10, 2014
79
0
4,640
First time I'm hearing "power-efficient" and "AMD" in the same sentence.

Congratulations to AMD on the progress.

You must not recall the HD 5000 and 6000 series that trumped their Nvidia counterparts in power efficiency, or the Athlon X2 vs. Pentium 4 days.
 


The old FX Nvidia chips were awful on power as well. So were the 8800 and 9800 GTX+.
 


If you mean the FX as in the Nvidia 5000 series, those things were pretty darn weak and hot to begin with. The fastest one in the series only had an 8-pipeline configuration, and it was a really hot-running card that needed extra power, so it had to be horribly inefficient. It's kind of like the dark days of graphics cards: from the Voodoo 3, GeForce, GeForce 2, up to the GeForce 4 and the start of the ATI Radeon cards, you had good quality and performance for the time with low power consumption. Then you get the Nvidia FX 5000 series, and for a while both sides lag in performance and power consumption; then you get the Nvidia 6000 series and the ATI X800 cards, and all of a sudden performance is top notch and power is low again.
 
Yup, I'm using an FX 5950 Ultra in my PIII rig (a bit overkill, really; see sig for details).

It's one of the earliest dual-slot cards, and also the first to use the current naming conventions.
 