You know computing, but you lack programming knowledge. Every computer needs to be told what to think, linear or not. The fact that the AI doesn't follow the linear pattern and is more of an abstract calculating machine still doesn't negate that you need to have an external input somewhere along the line for it to actually do its thinking. There must be a beginning somewhere.
Actually I have been programming for 20+ years, so I do NOT lack programming knowledge.
A piece of software can be written with general guidelines (rules?) and allowed to learn associations and methods via experimentation.
The programmer didn't TELL the program what it learned; they told it HOW to learn. This is the same way that nature equipped humans with the built-in ability to learn, and even pre-installed some knowledge (e.g. physical things like pain is bad, how a heart beats, the difference between quiet and loud, etc.).
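To make that concrete, here's a minimal toy sketch (all names and numbers invented for illustration) of a program that is only given HOW to learn -- a reward signal and a value-update rule -- and works out WHAT to do entirely by experimentation:

```python
# The programmer supplies only the "general guidelines": a list of abilities,
# a reward signal, and an update rule. Which button is correct is never
# written into the program; it is discovered by trial and error.

ACTIONS = ["press_red", "press_green", "press_blue"]
values = {a: 0.0 for a in ACTIONS}   # learned valuations, all unknown at start

def reward(action):
    # The environment, not the programmer, decides what pays off.
    return 1.0 if action == "press_green" else 0.0

for trial in range(90):
    if trial < 30:
        a = ACTIONS[trial % 3]              # early on, try everything
    else:
        a = max(values, key=values.get)     # later, do what worked best
    # The HOW: nudge the stored valuation toward the observed reward.
    values[a] += 0.1 * (reward(a) - values[a])

best = max(values, key=values.get)
print(best)  # → press_green, learned rather than told
```

The point of the sketch is that the line `return 1.0 if action == "press_green"` lives in the *environment*, not the learner: swap the rewarded action and the same unmodified program learns the new answer.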
Also do you really think that humans think prior to conception? Hello... talk about turning on the computer!
Take a neural network with enough connections, connect it to some input devices (nerve endings for skin, muscles, taste, hearing, sight), preconfigure some basic rules (our hindbrain is pretty much hard-wired and does not need to learn how to work), add the ability to store and replay nerve recordings (remember tastes, smells, images), and preset some valuations (e.g. recoil from pain, eat to avoid/remove hunger pangs, etc.), and you have a primitive brain.
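That recipe -- hard-wired reflexes, preset valuations, and replayable memory -- can be sketched in a few lines of toy code. Everything here (stimulus names, valuation numbers) is hypothetical, just to show the three parts are separate:

```python
# Toy "primitive brain": a hard-wired reflex table (the hindbrain), preset
# valuations (the built-in drives), and a stored record of past stimuli
# (the replayable nerve recordings).

REFLEXES = {"sharp_pressure": "recoil"}        # hard-wired, never learned
VALUATIONS = {"pain": -10, "food": +5}         # preset drives

class PrimitiveBrain:
    def __init__(self):
        self.memory = []                        # stored stimulus/feeling pairs

    def sense(self, stimulus, feeling):
        self.memory.append((stimulus, feeling))
        if stimulus in REFLEXES:
            return REFLEXES[stimulus]           # reflex path bypasses learning
        # Otherwise replay experience: has this stimulus been good or bad?
        score = sum(VALUATIONS.get(f, 0) for s, f in self.memory if s == stimulus)
        return "approach" if score > 0 else "ignore"

brain = PrimitiveBrain()
print(brain.sense("sharp_pressure", "pain"))   # recoil (hard-wired)
print(brain.sense("sweet_smell", "food"))      # approach (valuation-driven)
```

Notice the reflex never consults memory at all, which matches the point that the hindbrain "does not need to learn how to work."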
Emotions developed side by side with intelligence, not before it. To make an AI "feel" you need to have some sort of calculation, algorithm, or formula actually lay out the groundwork for the emotion.
Emotions developed long before SAPIENT intelligence. My comment was that so many authors and even programmers assume that an AI would develop emotions AFTER developing complex thinking. You restate my opinion that emotions are REQUIRED for any intelligence beyond the simplest. They are how creatures react and evaluate their reactions: e.g. anger means attack, sadness means regret something and learn to change your actions, fear means run away; then jealousy, lust, etc. The more involved the emotions, the more intelligent the creature can be.
Look at Data from Star Trek. By all accounts the most advanced "thinking" machine, yet he still needed a pre-programmed emotion chip to be installed, one with algorithms that specifically reacted to situations in a mathematical manner.
This is the biggest example of my point. Data HAD to have some emotions to act anywhere near as human as he did prior to getting the chip. Emotions are really a mathematical bias in how a creature reacts to stimuli, and a brain is based on AMOUNTS of input from section to section of the brain. That's why chemical stimulants and depressants affect us the way they do. They enhance or depress the effects of various areas of the brain, or increase or decrease the reaction to certain hormones in the brain.
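"Emotion as a mathematical bias" is easy to show in miniature. In this hypothetical sketch the same stimulus produces different reactions depending only on the bias weights, and a single `chemical_gain` multiplier plays the role of a stimulant or depressant scaling those weights up or down:

```python
# Emotion modeled as per-reaction weights biasing the response to one and
# the same stimulus; chemical_gain mimics a stimulant (>1) or depressant (<1).

STIMULUS = {"threat": 0.6}

def react(emotion_bias, chemical_gain=1.0):
    scores = {
        "attack": STIMULUS["threat"] * emotion_bias["anger"] * chemical_gain,
        "flee":   STIMULUS["threat"] * emotion_bias["fear"]  * chemical_gain,
        "ignore": 0.3,   # baseline: do nothing unless a drive outweighs it
    }
    return max(scores, key=scores.get)

angry  = {"anger": 2.0, "fear": 0.5}
scared = {"anger": 0.5, "fear": 2.0}

print(react(angry))                      # → attack
print(react(scared))                     # → flee
print(react(scared, chemical_gain=0.1))  # → ignore: the "depressant"
                                         #   dampens both drives below baseline
```

Nothing about the stimulus changed between the three calls; only the weighting did, which is the sense in which the emotion IS the bias.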
Actually, hormones are the precursors to emotions. They are the cause and the carriers of the various emotions.
This is why emotions are necessary for an AI. Without emotions an intelligence has no valuation of actions. Emotions are mental state. And human/primate emotions are not that much harder to implement than simple pain/fight/fear/mate emotional states. They are in fact easier to implement than the more advanced intellectual states.
After all, all animals react with some degree of emotion, and the more advanced mammals share many, if not most, human emotions (dogs react with hate, love, fear, jealousy, etc.).
Your basic fallacy is your hang-up on software needing to be told what to do. A good AI just needs to be programmed with some rules, abilities, and valuations; it can then learn what it can do, evaluate what it should or wants to do, and react accordingly. Just because most current software isn't programmed this way doesn't mean computers are forever incapable of being programmed this way.
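The "rules, abilities, and valuations" recipe can be sketched as well. In this made-up example the agent is never told which of its abilities does anything; the environment's rules are hidden from it, and it discovers what it can do by trying each ability and scoring the result against its built-in valuations:

```python
# Agent given only: a list of abilities and a table of valuations.
# What each ability actually does in the world is learned, not programmed in.

ABILITIES = ["push", "pull", "wait"]
VALUATION = {"door_open": +1, "stuck": -1, "nothing": 0}   # built-in preferences

def world(state, action):
    # Environment rules -- opaque to the agent.
    if state == "closed_door" and action == "push":
        return "door_open"
    if action == "pull":
        return "stuck"
    return "nothing"

# Learn what each ability does by experimentation, then evaluate and react.
learned = {a: VALUATION[world("closed_door", a)] for a in ABILITIES}
choice = max(learned, key=learned.get)
print(choice)  # → push: discovered, evaluated, and chosen by the agent
```

Change the `world` function and the same agent code learns a different answer, which is exactly the "told HOW, not WHAT" distinction from earlier in the thread.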