News Intel Goes on Game Dev Hiring Spree Before Alchemist Gaming GPU Launch

When I saw the headline, I was hoping Intel had started a game development studio to produce PC games optimized for their new GPU hardware. Looks like what really happened is that a new outreach team has been stood up. Not quite what I was hoping for. Looks similar to what Nvidia and AMD do with PC games that are biased to perform better on one brand's hardware.

I think Intel has the most incentive to start up a dedicated PC gaming studio of all the GPU manufacturers. They have no console business like AMD and Nvidia have. The industry desperately needs games that fill the gap between over-monetized AAA casino games and indie experiences.
 

Sorry, but I take the consumer perspective here. I am sick and tired of all these shenanigans that aim at vendor exclusivity and restriction of consumer choice. A separate studio screams exclusive titles, and I'd rather see consoles choke on bad ones than thrive. If I had any say, I'd in fact enforce cross-platform title compatibility and ownership (they say "buy", dammit, not "get suckered") across platforms.

I'm not sure I actually like Intel entering the dGPU business, when we all know they mainly aim at taking a slice of AMD's and Nvidia's dGPU cake. But if they do and I buy, I certainly want their GPUs to deliver the best performance they are capable of, if only to keep AMD and Nvidia sharp and, hopefully, to keep some of their hardware readily available for gamers too, not just miners and HPC. After all, it was gaming that got GPGPU to where it is today.

In today's reality that means supporting game engines, not games. Of course, some tuning for each hardware/engine combination will still be required as long as vendors need to wring performance from different strengths and capabilities; less demanding legacy titles should just run on anything that pushes pixels.

But I certainly wouldn't want to have to buy an AMD, an Intel and an Nvidia GPU (with distinct systems?) just to be able to play every game that catches my fancy: their choice and life cycles must be completely independent!

Hardware, game engines, game shops and game studios should be kept strictly apart, never using vertical lock-in to skew or restrict consumer choice. I want to be able to play my Steam titles on my PC (any GPU), my Xbox, my PlayStation, on Nvidia/Google/AWS clouds and on mobile via remote rendering, in any mixture thereof, without having to repurchase a single title, for as long as my heirs care to play the games I bought.
 
You do have a point, but that's how the market works. Besides, the games the different vendors promote don't differ that much in practice; you might lose 20% performance at most, and that won't make too much of a difference.
 
Sorry, but I take the consumer perspective here. I am sick and tired of all these shenanigans that aim at vendor exclusivity and restriction of consumer choice. A separate studio screams exclusive titles, and I'd rather see consoles choke on bad ones than thrive. [...]
At no point did I mention a desire for exclusivity or for Intel to produce games that only work on its GPUs. You are making an incorrect assumption that I am calling for exclusives. I hate exclusives and very much prefer to play on my platform of choice (PC).

I want to reinforce that I'd like to see Intel, Nvidia, and AMD all open up their own game studios. The goal is not to lock specific titles to specific products. Sure, optimizing for a specific brand is commonplace in the PC gaming industry. Exclusivity? No. The PC gaming ecosystem has a huge hole in the middle: games more ambitious than indie experiences, but short of the over-monetized, over-marketed, over-hyped, rehashed-yearly AAA titles. In normal times, I think the GPU vendors have an incentive to fill that hole to help move their products. Of course, these aren't normal times -- we're in the shortage era, but it won't last forever.
 
This isn't about making games, though. It's about hiring people who know how to make games so they can learn how the GPU works and then support developers in making the best use of the architecture. If they understand game development, there's less of a communication gap.

If the hardware you're selling doesn't have a strong software team behind it to support people making stuff for it, developers aren't going to like using that hardware. For Intel, not having a strong software support team is basically shooting the horse before the gates open.

The only time I'd like to see a hardware manufacturer develop consumer software is when it's open-sourced (or at the very least "shared source"), because then it serves as an example of how to use the hardware. Otherwise, what's the point?