I'm gonna get flamed to no end for this but...

What if memory could be made from a calculation? Suppose the answer to a calculation is repeated over and over while another processor reads it. You would need a heck of a lot of processors to 'make' memory where there is none, but the system would be able to 'function' without memory even though you effectively have 'memory'.

Ara

WTF -- This qualifies under the category of "not even wrong".

Again, in order to do a calculation you need to have memory.
 
Problem: processors have memory; technically even your FSB carries state (a 1 = on, a 0 = off). Memory is the only way that anything on a computer works. Overheated VIDEO MEMORY is what causes artifacts, not usually an overheated GPU, so your entire theory is flawed. Sorry.
 
Can some hardware person help point this guy to the name of such "basic computing", such as "discrete logic" or other "non-programmable" "COMPUTING"?

Unfortunately I am not the foremost in digital design expertise. However, I am familiar with a class of hardware that fits the input -> process -> output model: finite state machines. Finite state machines are the building blocks of digital controllers, and digital controllers are used extensively in modern processor designs. Guess what: the building blocks of finite state machines are flip-flops, the exact structure that is the basis of all computer memory. So... this was no loophole either. Memory is necessary for doing computation.
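
To make that concrete, here is a minimal finite state machine sketched in Python rather than in gates; the sequence detector is an arbitrary example of my choosing, and the single `state` variable plays the role the flip-flop plays in hardware. Delete it and the machine can no longer relate one input to the next.

```python
# Minimal finite state machine: detects two 1s in a row in a bit stream.
# The `state` variable is the machine's memory -- in hardware it would be
# a flip-flop. Without it, no input can be related to any earlier input.

def detect_11(bits):
    state = 0  # 0 = last bit was not a 1, 1 = last bit was a 1
    outputs = []
    for b in bits:
        outputs.append(1 if (state == 1 and b == 1) else 0)
        state = 1 if b == 1 else 0  # next-state logic feeding the flip-flop
    return outputs

print(detect_11([0, 1, 1, 0, 1, 1, 1]))  # [0, 0, 1, 0, 0, 1, 1]
```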

Oh, and BTW, any experienced technician with proper equipment could eventually figure out the failure mechanisms for a given IC and predict the outcome of overheating. Of course, it would require figuring out the manufacturing idiosyncrasies of that particular chip, or at least their electrical byproducts. It's not magic being witnessed during this experiment but the expected failure, which shows up as seemingly random due to the imperfect manufacturing processes of silicon electronics.
 
A computer cannot function without memory, just as you cannot function without a brain!!! PERIOD!!!

THERE IS NO GHOST IN THE MACHINE... there is only programming (you can program ghosts, though 🙂)

ohhhh...... and before a computer dies... it gives up one :wink:

i really suggest that you drop this insanity....otherwise it looks like you have too much time on your hands.....you need to get out more.
 
You are taking away a lot of what I would consider part of a "general purpose" or "programmable" computer. Without memory of some type to store the program to execute, it becomes a set of "hard-wired circuits" that does only a set of pre-defined tasks.

RoundSparrow seems to have hit the nail on the head with this one, oolceeoo. I know of many computers without 'memory'; they are called light switches.

I still don't really see how heating up a computer to the point that it is destroyed links to its use of memory? Obviously I'm completely missing the point of this. However, I am inclined, at this point, to agree wholeheartedly with Ned Flanders.
 
Now, you will agree silicon is a man-made material, where sand is superheated and then refined.
Humans are composed of organic material vastly more complex than simple silicon.
Here is what happens when you overheat a CPU. CPUs run at a certain clock speed, meaning there is a fixed amount of time for the CPU to complete a calculation. When heat is applied, each calculation takes more time to complete, because the resistance in the CPU is increased, causing two calculations to get mixed together. Hence artifacts, or even random outcomes. Not to say anomalies do not happen at normal temperatures, because they do. To find out why they happen, one would need to analyze the millions of transistors in a microprocessor, which is not feasible: you would have to repeat the same calculation over and over, get different results, then analyze which transistors behaved differently and why.
What happens when a microprocessor reaches what we will call its dying point is that the silicon transistors fuse together, creating a solid connection. This point of fusion depends on the material used, the quality of that material, and the process by which it was manufactured.
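
A toy illustration of that timing-failure mechanism, sketched in Python. The linear delay model and every number in it are made up purely for illustration; real thermal behavior in silicon is far messier. The point is only the structure of the failure: once heat pushes the gate delay past the clock period, the output register latches a stale value, and two calculations get mixed.

```python
# Toy model (illustrative only): gate delay grows with temperature; once
# it exceeds the clock period, the register re-latches the previous
# (stale) result instead of the freshly computed one.

CLOCK_PERIOD_NS = 1.0

def gate_delay_ns(temp_c, base_delay_ns=0.8, coeff=0.002):
    # Assumed linear model: ~0.2% more delay per degree C (made-up numbers).
    return base_delay_ns * (1 + coeff * (temp_c - 25))

def run(inputs, temp_c):
    latched = 0              # value currently held in the output register
    results = []
    for a, b in inputs:
        correct = a ^ b      # the combinational logic computes XOR
        if gate_delay_ns(temp_c) <= CLOCK_PERIOD_NS:
            latched = correct    # result was ready before the clock edge
        # else: too slow -- the stale value is kept, "mixing" calculations
        results.append(latched)
    return results

pairs = [(0, 1), (1, 1), (1, 0)]
print(run(pairs, 25))    # [1, 0, 1]  correct at room temperature
print(run(pairs, 160))   # [0, 0, 0]  stale garbage when overheated
```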
 
Try to ignore those who are calling you an idiot. They have no imagination.

After thinking on this topic, I felt it necessary to elaborate further on your experiment.

Indeed, if you could create a computer with no memory, you would have an extremely powerful system. Everything would run in real time. Memory creates latency; latency means waiting. Thus your theory is indeed an interesting one. The problem with creating a computer with absolutely no memory is this: every single component would have to be in complete sync. Even a nanosecond of de-sync would cause the computer to crash. Memory exists so that components can operate out of sync: a necessary evil.
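
A quick sketch of that last point, with Python threads standing in for hardware components and a queue standing in for a FIFO buffer (the speeds and counts are arbitrary). The buffer is exactly the "necessary evil": it is memory, and it is what lets the fast stage and the slow stage run out of sync without anything crashing or being lost.

```python
# Two stages at different speeds. The queue between them is memory;
# without it, the producer would be forced into lockstep with the
# slower consumer (or data would be lost).

import queue
import threading
import time

def producer(buf, n):
    for i in range(n):
        buf.put(i)            # fast stage: never waits while the buffer has room

def consumer(buf, n, out):
    for _ in range(n):
        item = buf.get()
        time.sleep(0.01)      # slow stage
        out.append(item)

buf = queue.Queue()           # the buffer -- i.e., memory -- between the stages
out = []
t1 = threading.Thread(target=producer, args=(buf, 5))
t2 = threading.Thread(target=consumer, args=(buf, 5, out))
t1.start(); t2.start(); t1.join(); t2.join()
print(out)                    # [0, 1, 2, 3, 4] despite the speed mismatch
```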

Furthermore, if you were to create a computer without memory, it would be impossible for the computer to evolve and become self-aware or intelligent. Our memory defines who we are. If we could not remember anything, what good would we be? We would not create, learn, or even survive. We'd all forget (or never know) that things like fire are dangerous.

Memory is probably the single most important capability that we humans have. It would be logical to assume the same of computers. Thus, while a computer without memory may be a powerful one, it's probably an impractical one. Its only use would be doing things like calculating pi, which would still require memory in some form or fashion.

-mpjesse
 
first off, you're an idiot..

input, processing, and output

how would the "processing" work without memory?

ok, let's put it this way: i wouldn't be able to type this msg without memory, cause each character is being stored.. if i were to print this page, it would go to a buffer (memory) and be sent to my printer, which has another buffer of memory... see the problem?
 
OK, apparently my previous comment was removed. I apologize for calling anybody an idiot. I do, however, have quite a good imagination.

The original hypothesis states that "computers can run without memory".

Our distinguished and honorable oolceeoo is asking the community how he could test this hypothesis. So far the method he has chosen to test this hypothesis is to run old computers in a hot environment to the point of thermal breakdown. While I could certainly imagine a table full of steaming hot 386's speaking to each other in their best Turing meta-language... somehow pragmatism gets the best of me.

A computer without memory is like a square circle. It makes no sense, even conceptually. Think of the theoretical Turing machine. It still needs tape, doesn't it? Take the tape away and you don't have a computing machine anymore.
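
For the curious, a minimal Turing machine sketched in Python (the unary-increment program is just an arbitrary example of mine). The `tape` dict is the machine's memory; take it away and, exactly as the post says, you no longer have a computing machine.

```python
# Minimal Turing machine: appends a 1 to a unary number (i.e., increments it).
# The tape IS the memory; nothing can be computed without it.

def turing_increment(tape):
    state, head = 'scan', 0
    while state != 'done':
        symbol = tape.get(head, 0)   # blank cells read as 0
        if state == 'scan':
            if symbol == 1:
                head += 1            # keep scanning right over the 1s
            else:
                tape[head] = 1       # write the extra 1 on the first blank
                state = 'done'
    return tape

tape = {0: 1, 1: 1, 2: 1}            # unary "3"
print(sorted(turing_increment(tape).items()))  # four cells of 1: unary "4"
```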
 
I do get out often enough. Please don't take any jabs at my character, keep it to my idea. I appreciate all those who offer constructive criticism to my theory. I'll read through each of your posts again since I didn't expect so many replies.

I can only defend a new idea so far, and as of right now it is nothing more than that. I'm using my imagination, something that I'm finding few people still use today.

No one I meet in real life is going to care about this idea except for my close friends or professors. It is extremely difficult to continue when everyone either doesn't understand, doesn't care, or just thinks it's crazy. But I believe in it, and this is just the beginning.
 
If you do not already, smoke some reefer... you will be flooded with many like ideas... and if you do it in the company of some friends, you will find the majority thinking along with you at how deep and introspective you are. :?
 
But I hope that maybe at least one reader will take me seriously.

Hi, everyone!
(Just a remark: I'm new at this forum, so I haven't read that much... which partially explains my ecstatic statement further on: 'It almost took my breath away!')
I find it one of the best threads I've been through so far, because it defies common-sense logic.
I'm not an expert in computing, although a very interested... non-expert (...)
Actually, the basic statement "computers can [exist], [work], [function] without memory" can be proven both ways, 'right' and 'wrong' simultaneously, without stepping into metaphysics. It's the modus operandi that is definitely wrong, according to the second law of thermodynamics (see 'entropy'). [Reductio ad absurdum: take the device's temperature down close to absolute zero and infer what happens then.] Perhaps surprisingly, it's the same 2nd law, taken into the quantum realm, that does not forbid both answers, 'right' and 'wrong'.
It all comes down to electrons. Taken as a current (a voltage differential at a transistor's gate), basically all that matters is their charge, their energy, and the path width on whatever medium. This doesn't take into account (yet!) their spins, for instance (see 'spintronics'). For a given amount of time, charge (current) can be stored on whatever medium, allowing the output to stay coherent with the voltage input submitted. Then you can say you have 'memory': a coherent, expected result. But you know nothing about each of the electrons in the process (it gets tougher with photons...).
Now (it's already happening...), suppose the transistor's gap width narrows in such a way that a single electron suffices to change the transistor's state. According to the uncertainty principle of quantum mechanics, even if you are able to store a single electron (and you are), you cannot rely on the output, since it will not be coherent with the given input. The end result might be utterly unexpected, to say the least (of course, there's quantum computing, which has its limitations too). Although everything is there, physically, can you now say that you have 'memory'?
After all, even randomness must be stored, somewhere & somewhen. But, can we call it 'memory'?

I'm not proposing to give a quantum physics course, here; i'm not a physicist, anyway.

[Here are some references, if you're interested: the Einstein-Podolsky-Rosen paper (the EPR paper); Mach's Principle [Ernst Mach]; John S. Bell's inequalities; Alain Aspect's experiments (1982); and, of course, lots of quantum thermodynamics!]

As for a device working without memory, I don't know what the future will bring (I do have some thoughts, though...); as for your modus operandi, I think you should stick with just the concept and try to get some expert technical advice.

Hope this [rather long but not ended!] dissertation helped, in any way.
 
I'm afraid all the artifacts you mention amount to is the famous room with thousands of monkeys at typewriters: sooner or later, one will turn out the complete works of William Shakespeare. That is, sooner or later those graphics artifacts will, for a split second, resemble your current desktop, the Mona Lisa, or something meaningful. (Ignore that the paper the monkeys type on can be considered memory.)

As somebody mentioned earlier, I also appreciate people thinking "outside the box." People called some of our best known inventors idiots (probably worse), but they persevered. While this particular implementation might be waaay beyond left field, keep on playing outside of the box with different ideas. Learn from the ones that don't work & press on.

As an aside, I've considered the opposite of heat: cold. Superconductivity (and scientists are coming up with new materials that superconduct at warmer and warmer temperatures all the time) may hold many answers to bringing computers to the next level, whatever that will be. 😀
 
When I say memory, I mean ALL forms of memory: HDD, RAM, ROM, cache, registers, everything. A computer is comprised of input, processing, memory, and output. Take memory out of that list and you would get input, processing, and output. I'd venture to say that this would make a pretty powerful computer indeed!
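
For what it's worth, here is what that reduced list seems to collapse to, sketched in Python (the particular transform is arbitrary): with no memory anywhere, "processing" is a pure function of its input. It can map inputs to outputs, but it cannot count, accumulate, follow a stored program, or even notice it has seen an input before; one variable of state is already memory, and is what makes those things possible.

```python
# Input -> processing -> output with zero memory: a pure function.
# Same input, same output, forever; no history can exist anywhere.

def memoryless_computer(x):
    return 3 * x + 1              # any fixed, stateless transform

print(memoryless_computer(5))     # 16
print(memoryless_computer(5))     # 16 again -- it cannot know it repeated

# One variable of state is already "memory", and suddenly counting works:
count = 0
def stateful_computer(x):
    global count
    count += 1                    # this single stored value is memory
    return 3 * x + 1, count

print(stateful_computer(5))       # (16, 1)
print(stateful_computer(5))       # (16, 2) -- history now exists
```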

Dude, someone beat you to this idea LONG ago. They called it your *BASIC* (Electric Typewriter), not the ones that had memory and stored docs and spell check.

and before that, they even invented one that eliminated the *processing*; they called it (The Manual Typewriter)

THINK "real" hard about this for a little while, please
 
this is NOT to say you are stupid, but my thinking is maybe you are barking up the wrong tree

compared to what technology could really be... we have enough dreamers out there (watch sci-fi television for a while), a little Star Trek or somethin'

a computer is a device. when it comes to the process of building a computer, it's designed for something very specific, regardless of the technology required to make it. i know everything is really small, and it's cool to think about real AI, about real anomalies being something significant

but it's as simple as this: what you are doing is short-circuiting a very simple machine. CPUs aren't anything at all; they respond the way we tell them to. even a Blue Screen of Death is something we are telling it to do - unintentional, yes, but anyway

shorting out a battery is a malfunction, but it doesn't mean anything besides a dead battery or a fire (in this case)....

the current technology isn't as advanced as you're thinking
 
All of these responses give me a lot to contemplate. I think that I'm being misunderstood somewhat, or I'm not wording what I'm trying to do right. It's sort of like the man whose name escapes me who proposed that parallel lines could indeed intersect, and who invented a new geometry that led to the development of the atomic bomb.

I know there's no little magic man inside computers that is going to pop out. I'm going to try and see if I can better explain this after some thought.
 
ya know this really sorta does have my interest piqued
i've had to reread everything you've said so far several, several, several times to get my brain to latch onto your basic idea here (your fault 😛 )

i don't think removal of all memory is the key

since all computers have memory built into the CPU, the test you are trying to do isn't an exact science.

removal of the BIOS might be an idea, though, since that's the basics of all computers - then at least the test won't be tainted by the motherboard telling the computer what to do - this way nothing will be recognized or operable - then apply variable heat/cold/warmth to see what type of output you get - in your case the temperature being the input - process is seriously missing from this equation - because with technology being what it is, this will be very hard to reproduce and get any kind of standard output

but it could lead to something a lot more - maybe you understand my vagueness (is this a word?)
 
A human example of what you're talking about: consider what kind of 'computing' you would be able to do if you couldn't remember more than 1 second of your life at a time. Perhaps you could do simple math, but you'd have no idea that the reason you were doing it was to file your tax return, finish a test, etc. You couldn't do it.

Take it one step further. You're computing the sum
43 + 89. First of all, where do you store the numbers 43 and 89? Where do you store the fact that you want to add them? When you add 9 + 3 to get 11, where do you store the 1 in the ones place, and the 1 in the tens place?

Basically, I'm afraid you couldn't get very far like this. (In fact, strictly speaking, you couldn't get anywhere, per the example above: processors themselves must store the instruction being executed somewhere in order to process it.) Both computers and humans require context for almost all non-trivial applications. If your word processor has no memory, where is your term paper? If you don't have any memory, how could you even spell a word, let alone form a coherent thought?

In the end, lack of memory may work if the entire problem can be solved in a single step, but that is about it.

Sorry.
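
To make the stored values in that example explicit, here is digit-by-digit addition sketched in Python; the routine is only an illustration, not anyone's actual method. Every piece the post asks about has to live somewhere: the two operands, the running carry, and the partial result are all memory.

```python
# Digit-by-digit addition with every stored value made explicit.

def add_by_digits(a, b):
    a_digits = [int(d) for d in str(a)][::-1]   # stored operand 1, ones first
    b_digits = [int(d) for d in str(b)][::-1]   # stored operand 2, ones first
    result, carry = [], 0                        # stored partial result and carry
    for i in range(max(len(a_digits), len(b_digits))):
        da = a_digits[i] if i < len(a_digits) else 0
        db = b_digits[i] if i < len(b_digits) else 0
        s = da + db + carry
        result.append(s % 10)                    # this column's digit
        carry = s // 10                          # carry held for the next column
    if carry:
        result.append(carry)
    return int("".join(map(str, result[::-1])))

print(add_by_digits(43, 89))   # 132
```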

OMFG!!! Hey MacCleod... I can't believe someone else hasn't jumped all over this already.... I think you have to pull the heater away from your own memory module. I think your math processor is a little overheated. 9+3= ... Ummm, let me take my socks off.... well, it ain't 11!!!

:twisted:
 
Being an electrical engineer, I would have to say that this is without a doubt the dumbest idea I've ever heard... are you serious?

This idea should go in the trash can next to some really bad Robin Williams movie..... like Bicentennial Man.

I don't understand why people think computers can be brought to life...
my best guess is that it comes from their lack of grasp of the general principles on which computers work, and most of all loneliness.

A computer is a machine, kinda like a car.
If you heat up the engine, the car will start burning oil and malfunctioning; nothing good will ever come of it. It'll certainly not start thinking on its own.
It only does what we engineers have designed it to do... in the case of a processor, there's a whole lot of logic gates that process commands with a certain desirable outcome. It is all programmed.

I'm sure that with enough complicated circuits you could mimic human thought, but this wouldn't be a randomly created machine... it would take years if not decades to make something like this. It would also require far more advanced technology than we have today.

You could say that a million monkeys typing on typewriters could write Shakespeare... I don't think that's gonna happen any time soon. Same with your idea.

Even if we could bring a computer to life... you should ask yourself why?
What good would this do?

Just the thought of a computer impersonating human emotions or thinking on its own makes me cringe... probably because i've watched way too many Terminator movies.

But seriously,
what would be the benefits of having a machine that acts like a human?
Humans make mistakes; machines don't. How do you account for that?
would you want a machine that makes mistakes?
After all, that's the reason we made calculators... to not make mistakes and to do it faster.

Another thing to think about...
If you make a machine better than a human... wouldn't the theory of evolution state that the human would be weeded out? :?

This is why I really hate people who try to make computers think for themselves. In a sense you're trying to eliminate humanity, but you don't know it.

And if you were in a certain movie... I'd hunt you down with a double-barrel shotgun, on a Harley, w/ cool sunglasses. 8)