News: Intel Arc A770 and A750 Limited Edition Unboxed

saunupe1911

Distinguished
Apr 17, 2016
203
74
18,660
Wow, the lack of full 48Gbps HDMI 2.1 is a huge letdown! The HTPC and small form factor PC folks would have eaten this up. Extremely missed opportunity. C'mon Intel!!!!
 
Wow, the lack of full 48Gbps HDMI 2.1 is a huge letdown! The HTPC and small form factor PC folks would have eaten this up. Extremely missed opportunity. C'mon Intel!!!!
It has 3x DP 2.0 ports (I believe all of the Intel Limited Edition GPUs have DP 2.0), so there might be a (likely expensive) active adapter down the road that would do the trick. I do agree it'd be great if it had HDMI 2.1 as well, but I'm assuming they didn't have the spec ready when the silicon was originally done.
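For anyone curious why the HTPC crowd cares so much, here's a quick back-of-the-envelope sketch of the bandwidth math in Python. The blanking allowance and link-efficiency factors are my own rough assumptions, not spec-exact figures:

# Rough sanity check of the bandwidth math behind the HDMI 2.1 / DP 2.0 discussion.
# Link rates and the blanking allowance below are approximations, not spec-exact numbers.

def bitrate_gbps(h, v, hz, bits_per_pixel, blanking=1.12):
    """Approximate uncompressed video bitrate, with a rough allowance for blanking."""
    return h * v * hz * bits_per_pixel * blanking / 1e9

# 4K @ 120 Hz, RGB 4:4:4, 10-bit color (30 bits/pixel) -- a typical HTPC/gaming target
target = bitrate_gbps(3840, 2160, 120, 30)

links = {
    "HDMI 2.0 (18 Gbps TMDS, 8b/10b)":    18.0 * 8 / 10,
    "HDMI 2.1 (48 Gbps FRL, 16b/18b)":    48.0 * 16 / 18,
    "DP 2.0 UHBR10 (40 Gbps, 128b/132b)": 40.0 * 128 / 132,
}

print(f"4K120 RGB 10-bit needs roughly {target:.1f} Gbps uncompressed")
for name, effective in links.items():
    verdict = "fits" if target <= effective else "needs DSC or chroma subsampling"
    print(f"  {name}: ~{effective:.1f} Gbps -> {verdict}")

In other words, 4K120 at 10-bit won't squeeze through HDMI 2.0 uncompressed, which is why people want either full-bandwidth HDMI 2.1 on the card or a DP 2.0 active adapter that can carry it.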
 

waltc3

Reputable
Aug 4, 2019
423
226
5,060
Good of you to mention that AMD makes GPUs, too--otherwise someone might think these Arcs can't compete with any AMD GPU. Sort of a bizarre oversight by Intel PR, imo. One day soon, I imagine, you'll get the green light to test them, and won't that be fun? Intel is presumably hard at work getting its drivers into shape before the NDA lapses, which is probably why you can't test right now. There are so many rumors about these GPUs that it's hard to separate the wheat from the chaff, so I'll be glad when the reviews can finally happen.
 

Co BIY

Splendid
Am I the only one still flabbergasted that people are celebrating the emergence of what are essentially entry-level gaming cards at over $300?
While I sympathize with this sentiment, $300 isn't what it used to be.

The 960 was $199 at release in 2015, and $1 in 2015 is about $1.25 today. That puts the 960's MSRP at roughly $250 in today's dollars, so inflation alone would account for half the price increase. Large performance increases in the new cards and the diminishing returns of Moore's Law make these prices seem more reasonable.
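A quick sketch of that arithmetic, for anyone who wants to check it (the 1.25 multiplier is the rough cumulative-inflation figure from the post above, not an exact CPI number):

# Back-of-the-envelope check of the inflation argument above.
gtx_960_msrp_2015 = 199          # launch MSRP of the GTX 960 (USD)
inflation_multiplier = 1.25      # rough 2015 -> today cumulative inflation (assumed)
new_card_price = 300             # ballpark price being complained about

adjusted_msrp = gtx_960_msrp_2015 * inflation_multiplier     # ~$249 in today's dollars
nominal_increase = new_card_price - gtx_960_msrp_2015        # ~$101 total jump
inflation_share = adjusted_msrp - gtx_960_msrp_2015          # ~$50 of it is just inflation

print(f"960 MSRP in today's dollars: ~${adjusted_msrp:.0f}")
print(f"Inflation accounts for ~${inflation_share:.0f} of the ~${nominal_increase} increase to ${new_card_price}")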

People are also celebrating a third entrant into the GPU marketplace. The market dynamics with three competitors are much better than with two.
 

Warrior24_7

Distinguished
Nov 21, 2011
31
18
18,535
Good of you to mention that AMD makes GPUs, too--otherwise someone might think these Arcs can't compete with any AMD GPU. Sort of a bizarre oversight by Intel PR, imo. One day soon, I imagine, you'll get the green light to test them, and won't that be fun? Intel is presumably hard at work getting its drivers into shape before the NDA lapses, which is probably why you can't test right now. There are so many rumors about these GPUs that it's hard to separate the wheat from the chaff, so I'll be glad when the reviews can finally happen.

Why is it good to mention AMD? Nvidia makes the best cards; that's why Intel targeted them. If anybody should be afraid of more competition in this space, it's AMD. Even the pros, as well as the AMD fanboys, pair AMD CPUs with Nvidia GPUs! So far, Intel's very first card seems impressive!
 

KyaraM

Admirable
If the drivers are anywhere near as buggy as Intel's previous releases, write me.

I'll send you your favorite alcohol in sympathy.
GN mentioned in one of his videos about two months back that Intel fixed half the bugs he had found within a couple of weeks. IIRC that was over 100 bugs, so they do take it seriously.
 

JoBalz

Distinguished
Sep 1, 2014
101
42
18,640
Well, interesting news, just as my MSI RTX 3060 arrived Friday. Actually, I knew Arc was about to be released, but I was highly hesitant to be an early adopter only to run into buggy drivers. If the cards had been out for six months to a year and the driver bugs had been worked out, I might have considered one. I do like the idea that there might actually be some competition for NVIDIA and AMD.
 
I recently came across several reports on different tech channels that Intel has dropped its entire discrete GPU program and called it quits, after investing over $1 billion just to get it started. Of course, none of this came officially or directly from Intel corporate communications, but it was nevertheless prominently posted. Why would anyone report and write such a negative thing, when a third major GPU OEM with deep pockets is good for the consumer? Thoughts?

Intel has a history of pumping and dumping. Do you know how many "leading edge" products Intel invested a ton in, just to dump them as soon as they were released?

They have a really bad rep this way.

That said, Intel will likely keep Raja for corporate AI processor work. Fewer driver headaches and bigger profit margins in AI.

If anything, Raja seems competent at delivering decent compute throughput, even if he falls short on gaming, planning, and resource management.
 

KyaraM

Admirable
I recently came across several reports on different tech channels that Intel has dropped its entire discrete GPU program and called it quits, after investing over $1 billion just to get it started. Of course, none of this came officially or directly from Intel corporate communications, but it was nevertheless prominently posted. Why would anyone report and write such a negative thing, when a third major GPU OEM with deep pockets is good for the consumer? Thoughts?
Because Intel-haters need their daily dose of bs about Intel to keep hating. Whether the rumor is true or not is irrelevant; they need to be fed.
 

shady28

Distinguished
Jan 29, 2007
430
299
19,090
Because Intel-haters need their daily dose of bs about Intel to keep hating. Whether the rumor is true or not is irrelevant; they need to be fed.

^^^ That.

Plus the leaker, "MLID" aka Moore's Law Is Dead, basically got debunked by Gamers Nexus, who called their contacts at Intel (far more credible, I might add) and essentially got laughed at.

MLID is a fake.

I haven't been too sure about him ever since he had an "anonymous Intel engineer" show up on his channel who mostly said "uhhmmm" and couldn't seem to string a complete sentence together.

At this point though, pretty much everything that guy has said in the last 6 months has turned out to be garbage.
 

Co BIY

Splendid
I recently came across several reports on different tech channels that Intel has dropped its entire discrete GPU program and called it quits, after investing over $1 billion just to get it started. Of course, none of this came officially or directly from Intel corporate communications, but it was nevertheless prominently posted. Why would anyone report and write such a negative thing, when a third major GPU OEM with deep pockets is good for the consumer? Thoughts?

Because sometimes the traders and hedge funds need to create ripples to trade on. They don't need waves when they can leverage a ripple they started.