News China's ACCEL Analog Chip Promises to Outpace Industry Best in AI Acceleration for Vision Tasks


Admin

A new research paper published in Nature by a team at Tsinghua University describes an analog computing chip, ACCEL, that reportedly delivers 3,000 times the performance of Nvidia's A100 and 4,000 million times higher energy efficiency at computer vision tasks. Considering the potential markets for devices such as these, and the potential for further miniaturization within some aspects of ACCEL's architecture, the design may yet find a path to fabrication at scale.

China's ACCEL Analog Chip Promises to Outpace Industry Best in AI Acceleration for Vision Tasks : Read more
 

George³

I wonder, with such superior performance, why it doesn't run the tests without errors. The article describes supremacy with a huge performance advantage, so what is missing here? Will yottaflops of performance be needed for 100% recognition? And how are FLOPS, which are a digital measure, related to an analog chip?
 

bit_user

I'm sure there are serious limitations in the optical and analog portions. I think it would make the most sense for realtime computer vision problems, where the optical and analog portions comprise only the first few layers of the network. That's where most of the compute load tends to occur. For the downstream layers, you could then transition to digital processing and benefit from the greater flexibility and improved resilience to noise that it provides.
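Roughly like this, as a toy numpy sketch (made-up layer sizes and noise level, purely to illustrate the analog/digital split, not how ACCEL actually works):

```python
import numpy as np

rng = np.random.default_rng(0)

def analog_frontend(image, weights, noise_std=0.05):
    """Simulate the optical/analog first layer: a matrix-vector
    multiply whose output picks up analog noise."""
    activation = weights @ image
    noise = rng.normal(0.0, noise_std * np.abs(activation).mean(),
                       activation.shape)
    return np.maximum(activation + noise, 0.0)  # noisy rectified output

def digital_backend(features, w1, w2):
    """Downstream digital layers: exact, flexible, noise-free."""
    hidden = np.maximum(w1 @ features, 0.0)
    return w2 @ hidden  # class scores

# Made-up sizes: 32x32 grayscale input, 256 analog features, 10 classes.
image = rng.random(32 * 32)
w_analog = rng.normal(size=(256, 32 * 32)) / 32.0
w1 = rng.normal(size=(64, 256)) / 16.0
w2 = rng.normal(size=(10, 64)) / 8.0

scores = digital_backend(analog_frontend(image, w_analog), w1, w2)
print("predicted class:", int(np.argmax(scores)))
```

The point being: if the digital back-end is trained with the front-end's noise present, it can learn to tolerate it.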

However, right away, I have two questions:
  1. How susceptible is it to damage by bright light? If it were used in a security camera or robot, would it be permanently degraded by direct sunlight? Laser pointers?
  2. What's the effect on spatial resolution? This could have a serious impact on applicability. Like, maybe you could use it in a package-sorting facility, where you could ensure all of the labels showed up at a reasonable size, but not really in video surveillance applications, where important image features can be relatively small.

If these problems can't be addressed in a cost-effective way, then it seems like a nice theory but of limited applicability. Perhaps it can still be useful for robotics, even at a reduced resolution - insect eyes have shown you can do a lot with relatively little spatial resolution.
 

bit_user

I wonder, with such superior performance, why it doesn't run the tests without errors. The article describes supremacy with a huge performance advantage, so what is missing here? Will yottaflops of performance be needed for 100% recognition?
IMO, some loss of accuracy is probably inherent in a pure analog system. A better way to limit that is a hybrid optical/analog/digital design, where digital layers can compensate for and filter some of the analog layers' noise.

You'd probably never get to 100% accuracy (I'm not aware of any digital networks which do, either). But you only need "good enough". It's not like humans are 100% accurate on the same datasets.

And how are FLOPS, which are a digital measure, related to an analog chip?
That's pretty simple: you look at how much computation would be needed to perform the same fundamental operations in the digital domain.
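For instance (back-of-the-envelope, with invented numbers rather than anything from the paper): an analog stage that effectively applies an N-input, M-output weight matrix to each frame is doing what a digital chip would do with N*M multiply-accumulates.

```python
# Digital-equivalent OPS for a hypothetical analog matrix-vector multiply.
# One N-input, M-output pass = N*M multiplies + N*M adds = 2*N*M ops.

n_inputs = 1024        # hypothetical input vector length
m_outputs = 256        # hypothetical output vector length
frames_per_sec = 1e6   # hypothetical analog throughput

ops_per_frame = 2 * n_inputs * m_outputs          # 524,288 ops per frame
equivalent_ops = ops_per_frame * frames_per_sec   # ~5.2e11 ops/sec
print(f"~{equivalent_ops:.2e} digital-equivalent ops/sec")
```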
 
Nov 4, 2023
I find myself wondering if you have taken into account that Chinese universities and laboratories have a habit of publishing papers about things that are sketchy at best. I have read several articles on this; maybe you should too.
I think it may be endemic to their culture, given the governmental oversight and direction. I would not put my name on an article spouting about a paper from a Chinese source until it was absolutely proven by outside sources. But hey, I am just a hillbilly from Missouri.
 

bit_user

It’s perhaps interesting to witness how, even with all the sanctions being applied to China, the country’s research and development is allowing it to catch up with – and in some ways, apparently improve upon – whatever it was being impeded from. Ingenuity – being able to go around limitations – is undoubtedly the way China is thinking about sanctions.
@Francisco Alexandre Pires, this line of research has been going on longer than that:

When you're tempted to take such a leap, it's a good idea to check yourself.

It’s also important to understand that this generation of photonics-based analog chips is being worked on at extremely relaxed lithography levels. ACCEL, for instance, was manufactured on a standard 180-nm CMOS technology for the Electronic Analog Computing unit (EAC) – the brains of the operation. Naturally, further performance, clock frequency, and efficiency improvements could be gained from further miniaturizing the process towards lower CMOS nodes (Nvidia’s H100 is fabricated at a 4 nm process).
I'm not too sure about that. Analog ICs tend to use much older nodes. It's unclear how much they're held back by the process node, especially when you consider that analog circuits on newer, smaller nodes should face much greater issues with noise.
 

bit_user

I find myself wondering if you have taken into account that Chinese universities and laboratories have a habit of publishing papers about things that are sketchy at best. I have read several articles on this; maybe you should too.
It's published in Nature, which is a highly-respected, US-based academic journal. That means it will have passed a gauntlet of reviewers familiar with the domain. So, I trust they have some real innovation and plausible data behind it.
 

vanadiel007

Wait, I thought we were baking a superconductor in the Chinese kitchen sink last time I visited.
Good thing it's 4,000 million billion times more energy efficient. I mean, 4,000 million. Errr, 4 billion? 0.004 trillion? Umm, very big much better!
 

Dementoss

It's published in Nature, which is a highly-respected, US-based academic journal. That means it will have passed a gauntlet of reviewers familiar with the domain. So, I trust they have some real innovation and plausible data behind it.
I agree Nature is a highly respected scientific journal, but I can't help thinking these extraordinary performance claims smack more of Chinese government hyperbole than scientific fact. If these devices are ever made available for independent testing outside China, then we may find out what they can really do.
 

aberkae

I agree Nature is a highly respected scientific journal, but I can't help thinking these extraordinary performance claims smack more of Chinese government hyperbole than scientific fact. If these devices are ever made available for independent testing outside China, then we may find out what they can really do.
This happens to drop just when there are sanctions placed on Nvidia's hardware. It sounds like a quote from Sun Tzu's The Art of War:
"Appear weak when you are strong, and strong when you are weak."
I remember such claims during the previous crypto mining craze coming from that region, and none of them came to fruition. The best card they have to offer is the MTT S80, which performs worse than a GTX 750 Ti, as per the Gamers Nexus review. I guess if you want to catch Nvidia's attention, you use their marketing. 😅
 

bit_user

This happens to drop just when there are sanctions placed on Nvidia's hardware.
The underlying research takes years, especially to fab & refine prototype implementations. Even the publication process takes months, since it involves submitting the paper, which puts it into a queue. Then reviewers have to read it and submit their feedback & questions to the authors. That repeats for a couple of iterations before a final decision is made to publish it.

So, the timing must be coincidental, although China's push to spur its own innovation has probably made more resources available to such researchers, in recent years.
 

aberkae

The underlying research takes years, especially to fab & refine prototype implementations. Even the publication process takes months, since it involves submitting the paper, which puts it into a queue. Then reviewers have to read it and submit their feedback & questions to the authors. That repeats for a couple of iterations before a final decision is made to publish it.

So, the timing must be coincidental, although China's push to spur its own innovation has probably made more resources available to such researchers, in recent years.
Nvidia also claims 1,000,000x performance gains every 10 years, so there is that. Competition is definitely needed. Strange that, with all the resources and knowledge Intel has, even they couldn't achieve such things.
 

Giroro

South Korean kitchen for the superconductor. Bad memory, sir.
The South Korean research had been around for years, yet the publicity dollars and media cash grab strangely lasted only as long as people thought that fraudulent Beijing video was real.

I think similarly this story about this chip is only being picked up right now because "China tech good, west tech bad" stories always generate a lot of ad revenue, which comes from somewhere, for some reason.
 

bit_user

Nvidia also claims 1,000,000x performance gains every 10 years, so there is that. Competition is definitely needed. Strange that, with all the resources and knowledge Intel has, even they couldn't achieve such things.
The comparison between this thing and Nvidia's A100 is weird, because they're not solving the same sorts of problems. The approach described in this paper doesn't scale well, nor can it be used for training. It's clearly aimed at low-power, embedded computer vision applications, which is not something you'd ever do with an A100.
 

Joeio

It's published in Nature, which is a highly-respected, US-based academic journal. That means it will have passed a gauntlet of reviewers familiar with the domain. So, I trust they have some real innovation and plausible data behind it.
A highly respected UK-based journal. FTFY.

Regarding the article above, I find a ton of problems. The reproduced figures refer to simulated data. The speculation about moving the photonics element from 180 nm to other process nodes seems to disregard that, for photonics, you need to interact with light of a specific wavelength. Moving to a 10 nm process, for example, means the X-rays you'd be using would go right through the chip.
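As a quick sanity check on the wavelength point (a trivial photon-energy calculation, nothing specific to ACCEL):

```python
# Photon energy E = h*c / wavelength. Visible light is ~400-700 nm;
# a 10 nm wavelength lands in the EUV / soft X-ray range.

h = 6.626e-34   # Planck constant, J*s
c = 2.998e8     # speed of light, m/s
eV = 1.602e-19  # joules per electron-volt

for wavelength_nm in (700, 400, 10):
    energy = h * c / (wavelength_nm * 1e-9) / eV
    print(f"{wavelength_nm:4d} nm -> {energy:7.1f} eV")
# 700 nm -> ~1.8 eV, 400 nm -> ~3.1 eV, 10 nm -> ~124 eV
```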

As you say, what is in the paper has passed peer review, so it is likely to hold water. But I doubt that this article accurately reflects the expected impact.
 

Giroro

I'm not entirely sure what this chip is supposed to do (image recognition?) or how it's supposed to do it (fancy camera?).

But I'm still pretty confident in saying it's not a general purpose GPU, can't do the same things as a GPU, and is therefore completely inappropriate to compare its performance to a GPU outside of the context of the single, extremely specific thing this chip is custom designed to do.

Especially when it comes to things like the number of operations, frequency, and compute performance. It's unclear how and why those metrics could even be applied to analog image processing.
 

Rob1C

It's a legitimate technology and the way of the future, but they weren't first:
 

gg83

I find myself wondering if you have taken into account that Chinese universities and laboratories have a habit of publishing papers about things that are sketchy at best. I have read several articles on this; maybe you should too.
I think it may be endemic to their culture, given the governmental oversight and direction. I would not put my name on an article spouting about a paper from a Chinese source until it was absolutely proven by outside sources. But hey, I am just a hillbilly from Missouri.
No truer words have been spoken!
 

deesider

For this article, I'm sure the analog chip is good at what it's specialized for but, like others have said, comparing this kind of technology to a GPU is pretty irrelevant. Also, this type of photonic and analog computing has been around for a while; this is more an evolution of what already exists, not revolutionary at all.
The Nvidia A100 isn't really a GPU either (it doesn't have a display output)
 

2+2

Just wondering how the efficiency of this analog chip compares with the efficiency of the analog human brain.

How many decades will it be before semiconductors are at parity with the brain?
 
The comparison between this thing and Nvidia's A100 is weird, because they're not solving the same sorts of problems. The approach described in this paper doesn't scale well, nor can it be used for training. It's clearly aimed at low-power, embedded computer vision applications, which is not something you'd ever do with an A100.
Plus, one could argue this tech is an analog ASIC, which is never a fair comparison to a general-purpose GPU.
 