News Shopify CEO: teams must prove AI can’t do a task before asking for new hires, resources

Classic MBA thinking:

"The Shopify CEO says that the e-commerce platform has been growing by 20% to 40% annually — and that its people must also “improve by at least that every year just to re-qualify.” More than that, Lütke thinks that this performance isn’t terribly ambitious anymore, specifically because of the availability of AI tools,"
This clown is going to burn out his team and create a lot of unhappy customers. Within a year, they'll either have a new CEO or he'll be backpedaling and saying that he was too optimistic.

Yes, AI is good at some things. It's not equally good at everything, and it takes work to use. That makes this quote a particular knee-slapper:

"Lütke believes that AI is a tool that multiplies productivity by 10x, and when paired with people who “contribute 10x of what was previously thought possible,” then Shopify could get 100x the work done."

At first, I thought he was double-counting the 10x. Then, I realized he thinks there are some people who are 10x as productive as others. I've seen people who are maybe 2-3x as productive as the average tech worker, but not 10x. And if you now make them spend all their time trying to coax usable work products from AI, it's going to bog them down to the point where they're not a whole lot more productive than most of their peers, who are doing the same thing.

The saddest part is that you just know loads of his fellow CEOs are going to jump on this particular bandwagon, if they haven't done so already.

Lastly, not that I'm in the market for Shopify's services, but I certainly wouldn't go near them at this point. And, given a choice between a business that uses Shopify and one that uses a competitor, I'm going with the one that's not using Shopify.
 
I think AI as a development tool is awesome, and it does improve productivity if used right. But I think he went all-in too soon. Unless there is a lot of review and tinkering with AI code, the chance that something goes bust is too large for any business. He should do it with one team and analyse the results before pushing it company-wide.
 
He clearly hasn't "chatted" with the useless AI bots that are appearing all over on websites.

I've been using Copilot in Visual Studio with some pretty impressive results. However, I have to pay very close attention to what it recommends, as it's regularly 95% accurate, and that last 5% can be easily overlooked.

It's much like the AI photo generating systems that create cats with 6 legs and birds with 3 legs.
 
AI can't do the job because AI can't really do much of anything useful, with any reasonable certainty that the end result will be good, or even comprehensible...
Unless Shopify's business model is based around unreadable, information-devoid word-salad spam blogs and semi-mangled pictures of celebrities in compromising positions.
...But I'm pretty sure Shopify's business is mostly just based on selling scammers tools to create and host unmoderated fake storefronts where they can sell imaginary products at too-good prices and generate fake tracking numbers.

Maybe AI could help with that, but I'm pretty sure the scammers will just keep doing it themselves, for free, even without Shopify.

Good luck proving a negative to a stubborn, irrational dummy.
 
I have used ChatGPT 3.5 to assist me with programming tasks, but gave up on it because its error rate was too high. How do they get around this problem when using it for business tasks? Perhaps by using one AI to check the output of another AI.
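To illustrate what I mean, here's a minimal sketch of the "one AI checks another" idea, assuming the OpenAI Python client. The model names and the "reply LGTM or list the bugs" convention are placeholders I made up for the example; it doesn't eliminate the error rate, it just gets you a second opinion before a human has to look at it.

```python
# Rough sketch: one model drafts code, a second model reviews it.
# Model names and the pass/fail convention are assumptions for illustration.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def generate_code(task: str) -> str:
    """First model drafts a solution to the programming task."""
    resp = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model name
        messages=[
            {"role": "system", "content": "You are a careful programmer. Return only code."},
            {"role": "user", "content": task},
        ],
    )
    return resp.choices[0].message.content

def review_code(task: str, code: str) -> str:
    """Second model acts as a strict reviewer of the first model's draft."""
    resp = client.chat.completions.create(
        model="gpt-4o",  # assumed model name
        messages=[
            {"role": "system", "content": "You are a strict code reviewer. Reply 'LGTM' "
             "if the code correctly solves the task, otherwise list the bugs."},
            {"role": "user", "content": f"Task:\n{task}\n\nCandidate code:\n{code}"},
        ],
    )
    return resp.choices[0].message.content

if __name__ == "__main__":
    task = "Write a Python function that returns the n-th Fibonacci number iteratively."
    draft = generate_code(task)
    print(draft)
    print("Reviewer verdict:", review_code(task, draft))
```

Of course, the reviewer model can be just as confidently wrong as the generator, so this only catches some of the errors.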

Same here. I caught ChatGPT flat-out lying multiple times.

That said, it is helpful.

This is why I say AI is mostly hype that will eventually turn into Search Engine 2.0. That is, helpful, but not accurate.
 
So employees should provide data on how he can eliminate their current jobs? Lol, good luck with that.
AI is a tool, not a crutch, and certainly not a replacement for human critical thinking or originality.
 
So employees should provide data on how he can eliminate their current jobs? Lol, good luck with that.
Well, what he said is that he wants data justifying why a new hire would be needed and not just more AI. Otherwise, a manager would not be given permission (and budget) for the position they want to fill.

For existing employees, he expects them to be 10x as productive as they were before embracing AI, and then to become another 20% to 40% more productive each year. That last part is particularly silly, because fast-growing businesses traditionally hire more employees to sustain the growth. I'm not sure where he thinks the additional productivity is going to come from, unless he's decided that's coincidentally the amount of improvement AI will be accruing annually.

Like I said, this guy sounds like a typical B-school D-bag.
 
That's fair; in hindsight, I overreached with my own thought process.

I think we could agree he likely wouldn't stop there if given the chance though.
Like you said, B-school D-bag thinking he's going to find an easy way to jack up margins.
 
This clown is going to burn out his team and create a lot of unhappy customers. Within a year, they'll either have a new CEO or he'll be backpedaling and saying that he was too optimistic.
Yeah, that "must improve 20% to 40% annually" is downright toxic. I wouldn't be surprised if they also emply stack ranking...


At first, I thought he was double-counting the 10x. Then, I realized he thinks there are some people who are 10x as productive as others. I've seen people who are maybe 2-3x as productive as the average tech worker, but not 10x. And if you now make them spend all their time trying to coax usable work products from AI, it's going to bog them down to the point where they're not a whole lot more productive than most of their peers, who are doing the same thing.
Oh, the 10x engineers definitely do exist. You'll usually find them at workplaces that have performance KPIs... if you're of average intelligence, it usually isn't too hard to figure out how to game that kind of system.

If you're doing anything *worth* doing, the current ML tools (f**k calling it AI) can provide some modest improvements. Anybody claiming a 10x or greater increase in <whatever> metric was producing slop even before adding LLMs to the slop mix.