D-Wave’s ‘Quadrant’ Machine Learning Does More With Less Data


bit_user

This sort of goes without saying. When you can find the optimal solution to a multivariate problem and can express a machine learning task in that form, you can use fewer variables to achieve a result of similar quality. In machine learning, the number of training samples required is largely a function of the size of the model, so needing fewer training samples follows naturally from enabling smaller models.
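To make that relationship concrete, here's a toy sketch in plain NumPy (my own illustration, nothing to do with D-Wave's hardware or the article): it just shows the generic point that a model with fewer parameters tends to reach a given test error with fewer training samples than a larger one fit to the same data.

```python
import numpy as np

rng = np.random.default_rng(0)

def true_fn(x):
    # Underlying target: a cubic, i.e. only 4 "true" parameters.
    return 1.0 - 3.0 * x + 2.0 * x**3

def fit_and_score(n_params, n_train, n_test=1000, noise=0.1):
    """Fit a polynomial with `n_params` coefficients by least squares on
    `n_train` noisy samples; return mean squared error on clean test data."""
    x_tr = rng.uniform(-1, 1, n_train)
    y_tr = true_fn(x_tr) + rng.normal(0, noise, n_train)
    w, *_ = np.linalg.lstsq(np.vander(x_tr, n_params), y_tr, rcond=None)
    x_te = rng.uniform(-1, 1, n_test)
    return np.mean((np.vander(x_te, n_params) @ w - true_fn(x_te)) ** 2)

for n_params in (4, 16):          # small model vs. large model
    for n_train in (20, 80, 320):
        mse = np.mean([fit_and_score(n_params, n_train) for _ in range(50)])
        print(f"params={n_params:2d}  train samples={n_train:3d}  test MSE={mse:.4f}")
```

Run it and the 4-parameter model already generalizes well at 20 samples, while the 16-parameter model needs far more data to close the gap. That's the ordinary statistical effect the comment is pointing at; whether D-Wave's formulation actually buys you the smaller model is the interesting claim.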

What they're not saying is the downside: that this approach doesn't scale well to larger models. Otherwise, they could plausibly put Nvidia's cloud training business out of business virtually overnight. It will probably be a while before Nvidia or anyone else building training-oriented machine learning chips has to worry about these guys.
 
