News Vivian Lien Joins Intel to Build Arc Discrete Graphics Business

InvalidError

Titan
Moderator
Great, Intel throwing more money at marketing to "fix" its hardware and driver development issues. Maybe things would work better if Intel made sure it had something actually worth marketing before ramping up marketing staffing. In this market of out-of-control GPU prices, all Intel really needs to do to "market" its GPUs is undercut the competition with the best GPUs it can realistically put out for a given price until they are good enough to actually compete.
 
Reactions: RodroX

cyrusfox

Distinguished
How will MLID respond to this contradiction of his prognostication of the end of consumer dGPU Arc? Hope Intel can stick around as a third player; it is hard to break into an established market.
 

InvalidError

Titan
Moderator
Hope Intel can stick around as a third player; it is hard to break into an established market.
Breaking into the GPU market when your company already has 20+ years' worth of patents to ward off GPU patent trolls shouldn't be too hard. Where Intel really screwed itself with Alchemist is by aiming too high at the high end and too low at the low end, leaving it with nothing really worth talking about at either end once all of the driver woes are factored in, unless you bought it for one of the few things it does better than AMD and Nvidia.

The A380 needed to be about 20% faster to consistently make sense next to the RX480, RX5500, RX6500, GTX1650S, etc., which would have enabled Intel to charge $20-30 extra for it, and the $290 A750 makes very little sense next to the $220 RX6600.

Had Intel been a little more aggressive at the low end where it came closest to making perfect sense in this crazy GPU market, it could have been great. The low-end is also likely more tolerant of less-than-perfect drivers and game compatibility in bleeding-edge games.
 

ikernelpro4

Reputable
BANNED
Aug 4, 2018
162
69
4,670
The problem is people are bat-crazy stupid.

How can companies and footballers nowadays be considered to be lying when they are literally THE source??

I am sick and tired of people painting their own picture even when the very source they are speculating about has given a definitive answer.

"Hey, instead of believing the people in question, let's trust some random Twitter and internet users with zero affiliation to anything, and continue to speculate despite there being a DEFINITIVE answer."

People are either dumber than, or as deaf and ignorant as, a coat of paint...
 
Reactions: cyrusfox
The problem is people are bat-crazy stupid.

How can companies and footballers nowadays be considered to be lying when they are literally THE source??

I am sick and tired of people painting their own picture even when the very source they are speculating about has given a definitive answer.

"Hey, instead of believing the people in question, let's trust some random Twitter and internet users with zero affiliation to anything, and continue to speculate despite there being a DEFINITIVE answer."

People are either dumber than, or as deaf and ignorant as, a coat of paint...
There's a reason why trials exist: you can't trust everyone when they have a reason to lie.

I'm not saying MLiD is right on this one, but your premise is already flawed. Otherwise scams wouldn't exist.

As for the news itself, well, I can't help but have the exact same knee-jerk reaction as InvalidError. On the other hand, maybe they just can't throw more engineers at the problems they have? I don't know. Intel won't tell us anything, or their shares will drop and people will get sued XD

Regards.
 
Reactions: bit_user

InvalidError

Titan
Moderator
As for the news itself, well, I can't help but have the exact same knee-jerk reaction as InvalidError. On the other hand, maybe they just can't throw more engineers at the problems they have?
Depends on what sort of bottleneck they are bashing their skulls on. Based on the sorts of issues Alchemist ran into at launch, I'd guess Intel came up tragically short in the QA and validation testing departments. It probably needs a couple of test and validation engineers to create more test cases.
 
Reactions: -Fran-

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,457
1,002
21,060
Depends on what sort of bottleneck they are bashing their skulls on. Based on the sorts of issues Alchemist ran into at launch, I'd guess Intel came up tragically short in the QA and validation testing departments. It probably needs a couple of test and validation engineers to create more test cases.
I think they need a lot more than just that; they're just 20+ years behind nVIDIA & AMD/ATi.

It's going to take a Sisyphean effort to climb that mountain in record time.
 

InvalidError

Titan
Moderator
I think they need a lot more than just that; they're just 20+ years behind nVIDIA & AMD/ATi.
While it may be possible to break large problems into smaller pieces so you can have more people working on them, there are still limits to how far that can go before you have so many people that they cannot get anything done without tripping over someone else's feet. The question is how close their current setup is to those practical limits.
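The diminishing-returns point above can be sketched with a toy model in the Brooks's-law spirit: each added engineer contributes capacity, but every pair of engineers costs some coordination overhead. The overhead constant here is invented purely for illustration and says nothing about Intel's actual organization.

```python
# Toy model of team scaling: each engineer adds capacity, but every pair
# of engineers consumes a fixed slice of it in coordination overhead.
# The overhead constant is invented for illustration, not measured.

def effective_throughput(n, overhead_per_pair=0.01):
    """Work per unit time for a team of n, where each of the n*(n-1)/2
    communication pairs eats a fixed slice of capacity."""
    coordination = overhead_per_pair * n * (n - 1) / 2
    return max(n - coordination, 0.0)

if __name__ == "__main__":
    for n in (10, 50, 100, 150, 200):
        print(f"{n:4d} engineers -> effective throughput {effective_throughput(n):6.1f}")
```

With these made-up numbers, throughput peaks at around 100 people and then falls; near 200 the team barely outproduces a single engineer. The point is only qualitative: adding heads helps until coordination costs dominate.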
 

Kamen Rider Blade

Distinguished
Dec 2, 2013
1,457
1,002
21,060
While it may be possible to break large problems into smaller pieces so you can have more people working on them, there are still limits to how far that can go before you have so many people that they cannot get anything done without tripping over someone else's feet. The question is how close their current setup is to those practical limits.
That's something we won't know, despite the army of GPU Engineers that Intel has hired.

I'm still rooting for them, but from everything I've heard from insider leaks, the GPU division is on VERY THIN ICE and the upper management wants to pull the plug on them while Intel CEO Pat Gelsinger is fighting to keep them.

The consumer gaming division specifically is the one on "VERY THIN ICE".

The professional/enterprise GPU side is fine and isn't in danger at the moment; it will continue marching to the beat of its own drum as long as it can find buyers for its GPUs.
 

bit_user

Titan
Ambassador
One has to wonder: how many more coders could Intel have gotten for the same salary they will pay for this new marketing role?

Besides that, really, Dell and Alienware gaming? I bet there are a few ex-EVGA GPU experts looking for new job opportunities....

It never ceases to amaze me how big old-school companies are so "stiff" and short-sighted when looking for new people.
 
One has to wonder: how many more coders could Intel have gotten for the same salary they will pay for this new marketing role?

Besides that, really, Dell and Alienware gaming? I bet there are a few ex-EVGA GPU experts looking for new job opportunities....

It never ceases to amaze me how big old-school companies are so "stiff" and short-sighted when looking for new people.
Well, if Intel can't open up new opportunities to sell cards in the future, then more coders won't have anything to do because the department won't exist anymore...

Having a good product and not being able to sell it is just as bad, if not worse. With a well-networked person that buyers trust, you have more opportunities to sell your improved products. They have to find the right balance between the two, and we are in no position to even know what that balance looks like.
 
Well, if Intel can't open up new opportunities to sell cards in the future, then more coders won't have anything to do because the department won't exist anymore...

Having a good product and not being able to sell it is just as bad, if not worse. With a well-networked person that buyers trust, you have more opportunities to sell your improved products. They have to find the right balance between the two, and we are in no position to even know what that balance looks like.

Yeah, at this point I really doubt any gamer or professional who uses GPUs knows who this woman is, outside the media and the few of us reading this news. Maybe she can make a name buyers know in the future; trust, on the other hand, is really hard to earn and is going to take a lot of time (allow me to doubt this will be the end result, since I never heard anything about her in the past, not from Asus, nor from Dell/Alienware).

As you pointed out, Intel probably wants not only an improved product but also a working product, with drivers that don't cause a headache for every minor setting users want to apply on their system.

Now, considering Intel and Dell's long-time relationship over the many years they've been doing business, it's really not a surprise they got someone from a "known source" lol.

I can only think of one case in mass-consumer PC tech of a person that many buyers knew and sort of trusted (even if they didn't know exactly why they trusted him): Steve Jobs, and he's long gone.
 

rluker5

Distinguished
Jun 23, 2014
914
595
19,760
Breaking into the GPU market when your company already has 20+ years' worth of patents to ward off GPU patent trolls shouldn't be too hard. Where Intel really screwed itself with Alchemist is by aiming too high at the high end and too low at the low end, leaving it with nothing really worth talking about at either end once all of the driver woes are factored in, unless you bought it for one of the few things it does better than AMD and Nvidia.
Intel has experience with iGPUs that are very hardware-limited and do OK at 30 fps. Their dGPUs are much less hardware-limited, and they expose the driver disorganization and overhead that was concealed by their previous, slower hardware.

You can see this when you increase resolution. The Intel card gains relative strength because its frame time is a combination of a fast GPU hardware part (which scales with resolution) and a slow, static driver-overhead part. When you increase the GPU hardware part of the frame time, the static driver-overhead part does relatively less damage.

They probably thought they could get their driver house in order, but haven't yet. From what I've seen, if Intel can get their static chunk of driver dead time down to reasonable levels, the A770 will compare to the RX6700XT.
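The frame-time argument above can be sketched as a toy model: total frame time is a resolution-dependent render portion plus a fixed per-frame driver-overhead portion. All numbers below are invented for illustration; they are not measurements of any real card.

```python
# Toy frame-time model: total frame time is a resolution-dependent GPU
# render portion plus a fixed per-frame driver-overhead portion.
# All numbers here are invented for illustration.

def fps(render_ms, driver_overhead_ms):
    """Frames per second given per-frame render time and fixed overhead."""
    return 1000.0 / (render_ms + driver_overhead_ms)

# Hypothetical card with 4 ms/frame of fixed driver overhead, compared
# against the same silicon with zero overhead.
for res, render_ms in (("1080p", 8.0), ("4K", 24.0)):
    share = fps(render_ms, 4.0) / fps(render_ms, 0.0)
    print(f"{res}: {share:.0%} of overhead-free fps retained")
```

In this sketch the card keeps about 67% of its overhead-free fps at 1080p but about 86% at 4K: the fixed overhead hurts relatively less as the render portion of the frame time grows, which matches the relative-strength-at-high-resolution observation.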
The A380 needed to be about 20% faster to consistently make sense next to the RX480, RX5500, RX6500, GTX1650S, etc., which would have enabled Intel to charge $20-30 extra for it, and the $290 A750 makes very little sense next to the $220 RX6600.
What was the RX6600's price at Arc's launch? Just because the market has decided AMD cards are of little value and their prices have utterly cratered doesn't mean Arc cards have been mispriced; their price has just held up much better.
The RX6600 also only compares to the A750 in low-resolution, high-framerate uses, due to the A750's driver overhead. Go to 1440p, 60 fps scenarios and the 6600 is way behind.
Had Intel been a little more aggressive at the low end where it came closest to making perfect sense in this crazy GPU market, it could have been great. The low-end is also likely more tolerant of less-than-perfect drivers and game compatibility in bleeding-edge games.
Bleeding-edge games are what Arc does well; it is some of the older ones it does badly in. For example, AC Odyssey plays better than Valhalla, but Origins is unplayably slow. Basically the same game graphically. Also, you get more frames in W3 than W2. But there are a lot of older ones that run fine at 4K, as older ones should with a new midrange card.

TBH, the fastest fix Arc can have for its driver-overhead problem is that frame-interpolation stuff that is coming out. It should scale that overhead down, and hopefully find another use for those XMX cores that are right in the pipeline.

The arch is basically sitting at the relatively high-latency 60 fps anyway. Not as much to lose.
 
Reactions: cyrusfox
Yeah, at this point I really doubt any gamer or professional who uses GPUs knows who this woman is, outside the media and the few of us reading this news. Maybe she can make a name buyers know in the future; trust, on the other hand, is really hard to earn and is going to take a lot of time (allow me to doubt this will be the end result, since I never heard anything about her in the past, not from Asus, nor from Dell/Alienware).

As you pointed out, Intel probably wants not only an improved product but also a working product, with drivers that don't cause a headache for every minor setting users want to apply on their system.

Now, considering Intel and Dell's long-time relationship over the many years they've been doing business, it's really not a surprise they got someone from a "known source" lol.

I can only think of one case in mass-consumer PC tech of a person that many buyers knew and sort of trusted (even if they didn't know exactly why they trusted him): Steve Jobs, and he's long gone.
With "networked" and "buyers" I didn't mean the common gamer; all the OEMs have people making the decisions on what to use in their product lines, and if Vivian Lien has a good relationship with these people, that could help Intel tremendously.
 
With "networked" and "buyers" I didn't mean the common gamer; all the OEMs have people making the decisions on what to use in their product lines, and if Vivian Lien has a good relationship with these people, that could help Intel tremendously.

Oh, I see, that makes sense in that context. Still, when OEMs want to sell their products (with an included dGPU from Intel), they probably want the GPU to be decent and the drivers to cooperate lol.

But yes, I can see your point.
 

InvalidError

Titan
Moderator
What was the RX6600's price at Arc's launch? Just because the market has decided AMD cards are of little value and their prices have utterly cratered doesn't mean Arc cards have been mispriced; their price has just held up much better.
Alchemist launched ~18 months later than originally planned and based on Intel's "treasure hunt" prize valuation, the A770 "Limited Edition" original MSRP was intended to be closer to $700. So the A7xx already had their prices cut by ~50% due to the combination of failing to produce consistent performance, being massively late to launch and GPU prices collapsing in general before launch.

Given Intel's poor reputation with graphics drivers from their first IGP and the i740 to present, I can't imagine the A7xx being particularly popular at $300 beyond enthusiasts who wish to tinker and don't mind losing $300 if Intel ultimately fails to deliver decent drivers.
 
Reactions: -Fran-

cyrusfox

Distinguished
I can't imagine the A7xx being particularly popular at $300 beyond enthusiasts who wish to tinker and don't mind losing $300 if Intel ultimately fails to deliver decent drivers.
There is enough interest in the top Intel SKU to deplete stock. ASRock always seems to have stock, though :)

Still trying to obtain an Intel Limited Edition A770, if nothing else to put on my wall with all these Optane sticks (I currently have some H10, M10, and an engineering-sample M15 I got secondhand). The A770 is currently listed for $450-550 on eBay (completed listings average $430); Newegg is always out of stock (the A750 is in stock right now, though). I will be near a Micro Center for Thanksgiving, so I will try that as well, but that location currently has no stock (Columbus has 25+, though!).

This card will be doing AV1 encoding and professional applications (content creation) more than gaming for my use.
 
Reactions: bit_user

rluker5

Distinguished
Jun 23, 2014
914
595
19,760
Alchemist launched ~18 months later than originally planned and based on Intel's "treasure hunt" prize valuation, the A770 "Limited Edition" original MSRP was intended to be closer to $700. So the A7xx already had their prices cut by ~50% due to the combination of failing to produce consistent performance, being massively late to launch and GPU prices collapsing in general before launch.

Given Intel's poor reputation with graphics drivers from their first IGP and the i740 to present, I can't imagine the A7xx being particularly popular at $300 beyond enthusiasts who wish to tinker and don't mind losing $300 if Intel ultimately fails to deliver decent drivers.
18 months earlier, graphics cards cost twice as much, and that first prize included other things. Yes, Intel has also reduced their prices. The market has changed a lot since the latest mining craze. I wish I had waited 18 months on the 3080 I bought.

Right now on Newegg, the ASRock Challenger A750 costs the same as the ASRock Challenger 6600XT. The AMD card is faster at 1080p, about the same at 1440p, and slower at 4K. It also has better drivers and more consistent gameplay, but fewer features, and, being based on a mature arch, it has less headroom for driver improvement and definitely less "new" novelty. Seems like a fair price for both: the AMD card is better for those who value its strengths more, and the same goes for Intel's.

But you got me, I'm one of those enthusiasts who like to tinker. Right now I have a reference A750 in this PC, and it is not perfect, but good enough where it counts.

But it also isn't my main gaming gpu. Just serves the living room.
 
Reactions: bit_user

bit_user

Titan
Ambassador
I can only think of one case in mass-consumer PC tech of a person that many buyers knew and sort of trusted (even if they didn't know exactly why they trusted him): Steve Jobs,
I think he was overrated. He was a little bit visionary, extremely demanding, and a decent showman.

On the other hand, Apple had a fairly toxic work culture, under him. Lots of paranoia and secrecy. He could be impulsive and sure knew how to harbor a grudge. I suspect he was also a skilled manipulator of the people around and under him.

The way he left his first wife & daughter to fend for themselves, in spite of all his wealth and success, also shows he wasn't a very nice person.

Had his health not failed when it did, I doubt he'd be regarded so fondly.
 

bit_user

Titan
Ambassador
Bill Gates was a legit nerd back when he was the figurehead of MS, and the nerds that used his products trusted him because of it.
https://www.buzzfeednews.com/articl...95-bill-gates-trapped-himself-inside-doom-and
Even more than a nerd, he was ruthlessly competitive.

Being a nerd doesn't make someone trustworthy. It might make their technology trustworthy, but nerds tend to be antisocial and can violate social norms without even realizing it. Even if they're more emotionally intelligent than that, just having good technology doesn't mean your interests or priorities align with mine. I think Google is a prime example of this.
 
Reactions: rluker5