I believe you need to do far more research into the subject, and you'll come to the opposite conclusion. This is a huge open question among the big thinkers. It is not known how it works. So for you to say that is completely off base.
Okay, maybe I overestimated you. Just "thinking about thinking" indeed probably won't tell you much. You need to actually look at what's being discovered about thinking and cognition in the fields of neuroscience and AI. Of course, AI is a lot more accessible, since AI researchers aren't constrained by complex biological systems that have evolved over millions of years, but can instead focus on fundamental concepts that can be described and implemented in very abstract ways.
It's like how some of the best Ancient Greek philosophers thought very long and hard about the fundamental nature of life, and yet all they came up with were ridiculous notions of earth, water, fire, and air as its basic elements. The only way to understand something as complex as life or intelligence is to build on scientific principles and research that have been built up over many years. However, if you do that, even someone of unremarkable intelligence can gain working knowledge and insight into these complex subjects.
Without consciousness, there is no intelligence. Period.
Algorithms can’t think or feel. They can’t be conscious. Neither can computers, so there’s no intelligence.
This mystical thinking is a dead end. You're essentially saying: "it's too complex for me to understand (even if I didn't try very hard), therefore it must be mystical and fundamentally incomprehensible". That has never been a winning argument against science. I don't buy it.
https://www.wordnik.com/words/intelligence
Read the first line.
"noun The ability to acquire, understand, and use knowledge"
... I assert that computers will never understand the knowledge; therefore, there is no intelligence.
That hinges on your definition of "understanding". I think your error is in treating it as some mystical process, when the reality is much more straightforward. It basically involves fitting information into a framework or mental model, so it can not only be recalled but also related to other facts and information and reasoned about. At no point does "feeling" or "consciousness" enter the picture.
In a computational system, there's typically some sort of processing applied to inputs. Sometimes it's very elaborate. If you simply mean that data needs to be processed in order to be used in some way, then I'd agree. Otherwise, I think you need to explain how knowledge is different from simply the result of some transformation that enables information to be acted upon and reasoned about.
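To make that concrete, here's a minimal sketch of the kind of purely mechanical "understanding" described above: facts stored in a framework (subject, relation, object triples) that can be recalled, related to other facts, and reasoned about by simple inference. The facts and function names are illustrative assumptions, not any real system's API.

```python
# Hypothetical toy knowledge base: facts fit into a framework of
# (subject, relation, object) triples.
facts = {
    ("sparrow", "is_a", "bird"),
    ("bird", "is_a", "animal"),
    ("bird", "can", "fly"),
}

def query(subject, relation):
    """Recall: look up directly stored facts."""
    return {o for (s, r, o) in facts if s == subject and r == relation}

def infer_is_a(subject):
    """Reason: follow 'is_a' links transitively to relate facts."""
    found, frontier = set(), {subject}
    while frontier:
        nxt = set()
        for s in frontier:
            for o in query(s, "is_a"):
                if o not in found:
                    found.add(o)
                    nxt.add(o)
        frontier = nxt
    return found

# A sparrow is a bird, and a bird is an animal, so the system
# "understands" that a sparrow is an animal without any stored
# fact saying so directly.
print(infer_is_a("sparrow"))
```

Nothing in this process involves feeling or consciousness; it's just a transformation that lets stored information be recalled, related, and acted upon.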
You could say that about people, too. Everything we say or do is composed mostly of things we've seen, heard, read, etc. Maybe there's some original thought or idea, but that's filtered through all the skills we've learned through observation and training. I don't see how the same isn't applicable to AI.