Yes, I've heard of LISP machines. The machines I'm talking about were designed for ALGOL, which is general enough to execute most other languages efficiently, including what were the best Simula and APL compilers. The instruction set is:
https://en-academic.com/dic.nsf/enwiki/711647
"FWIW, I think you draw a somewhat artificial distinction between hardware and software, when you suggest that a meaningful difference exists between implementing security in hardware vs. at the language level."
I can't see what you are getting at there. Security should be at the system level, below which programmers can program or languages can generate instructions. That is, the instruction set itself is secure. How much of that is in hardware or microcode is an open question.
Many operations are very common, like string operations. Instead of compilers generating the same sequence of RISC instructions over and over, put that sequence of instructions in on-chip cache, although that is an implementation detail. But it is system organisation that has more to do with speed than the instruction set, which is where RISC was somehow magically supposed to be faster; it had more to do with getting everything on a single chip.
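To make that concrete, here is a rough C sketch of the kind of sequence a compiler open-codes at every call site today for a bounds-checked string copy. The function name and the trap behaviour are purely illustrative, not any particular compiler's output; the point is that the same test/move/advance pattern gets emitted again and again, which is exactly the sort of thing that could live once in microcode behind a single string instruction.

    #include <stddef.h>
    #include <stdlib.h>

    /* The pattern a compiler re-emits inline at each call site:
       a bound test, a move, and an advance, per element. */
    void copy_bounded(char *dst, size_t dst_len,
                      const char *src, size_t src_len)
    {
        for (size_t i = 0; i < src_len; i++) {
            if (i >= dst_len)   /* bound test on every store */
                abort();        /* stand-in for a trap */
            dst[i] = src[i];    /* the actual move */
        }
    }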
We rely far too much on compilers generating safe code. The fact is that if compilers can generate bad code, then hackers can get at that level. Security must be built in at the lowest levels, or we are always trying to catch up, adding after-the-fact utilities to make up for the weakness.
For general computing, memory accesses must be tested to be within bounds. That check should be done in the microinstructions. The RISC processors (or any processors) provide that microinstruction level. All processors just execute a stream of instructions; they don't know about branches or loops or calls. Part of the system schedules those actual instructions for execution. Some native instructions can be used in the system instruction set.
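As a sketch of what I mean by a checked access, here in C is roughly what a descriptor-based "load word" microinstruction would do. The struct and the names are hypothetical, loosely modelled on the old Burroughs-style data descriptor (base plus length), not any real instruction set.

    #include <stddef.h>
    #include <stdint.h>
    #include <stdlib.h>

    /* A descriptor names an area of memory: where it starts and
       how long it is.  Programs hold descriptors, never raw addresses. */
    struct descriptor {
        uint32_t *base;    /* start of the protected area */
        size_t    length;  /* number of words in it       */
    };

    /* What "load word via descriptor" would do in microcode:
       the index is checked against the length before the address
       is ever formed, so an out-of-bounds reference cannot exist. */
    uint32_t load_word(const struct descriptor *d, size_t index)
    {
        if (index >= d->length)
            abort();           /* the hardware would raise an interrupt */
        return d->base[index];
    }

Done at that level, the check is not something a compiler can forget or a hacker can route around, which is the point.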
There are many ways to design such systems; we are just fixated on one particular way from the most simplistic 'computer architecture' courses.
"Anyway, I think this isn't the sort of stuff Linus was talking about."
He was certainly making a point about the infrastructure behind supporting an architecture like x86. However, we are finding that big companies are losing their monopoly there and there are many chip fabricators who will build what we want. That means we can have different system architectures for different applications.
For general multi-purpose, multi-programming computing, which is the majority of user systems these days, we can do much better than x86 or raw RISC. We should be doing much better than C as a system language. We are stuck in this 50-year-old mindset. I suspect Linus is somewhat stuck in it himself.
However, he made the point about the semantic gap from hardware to software, and that the hardware people should design what is required by software. What is required in software these days, thinking at the system level, is security.
"As for what system-level architecture is ultimately best, I think it's really best to start at the OS level and figure out what kind of OS architecture & constructs you want to streamline. Then, find the impedance mismatches vs. current hardware and think about ways to refactor it so it can more naturally and efficiently implement the target OS."
Yes, but we need to stop this 'one-size-fits-all' mentality. Linus did mention IoT devices. Certainly these are single-purpose simple systems not needing what general systems need.
It should not be x86, RISC-V, ARM, Windows, or C everywhere. That is certain sectors trying to impose a monopoly. Computing has grown beyond the special purposes of the 1960s and 70s, when guys in white lab coats spoke in hushed voices and attended machines behind security walls, to common use where hardware is cheap.
Perhaps Linus did not realise it, but he has brought up an issue where we can start to think differently to solve the most pressing problems in modern computing. We need to be more imaginative in our approaches, and yes, that imagination has existed since 1961.