NASA and USAF Looking for a Next-Gen Space Processor

Oracle's Larry Ellison or one of Apple's "top brass" with their $100 million salaries should be able to finish this $2 million space-applications job in a week. Are they really worth that much, or would they just forward the work to someone else?
 
IIRC, processors used in space are on a whole other level from computer processors. This is because they get exposed to environments/radiation that we don't worry about on earth. Also they must be pretty much flawless, it's a little difficult to RMA a processor that is in orbit around another body. 😉
 
@kunzite. You recall correctly. This isn't about performance so much as getting the processor to perform the same in space as it would on the ground, where it's shielded by the Earth's magnetic field. A high-end processor from four years ago would handle the computational tasks they'd put to it, no problem. Radiation wreaks more havoc the smaller the process node; transistors that small are easily influenced by it. That's why the critical computers that maintain the power grid are so heavily shielded against solar flare activity, and even then it's hardly 100% perfect. In space they need a near-perfect solution.
 
Why does NASA need a space processor? Are we planning to sell it to the Russians for 'their' rockets, since we don't launch any of our own anymore? $18 billion was the actual 2012 space budget, and $17 billion is this year's proposal... that's a lot of dollars and cents for a country with zero active rockets.
 
I vote for the AMD FX. It's the world's first 8-core desktop processor, and it could be the world's first 8-core space processor. Perform 8 space experiments simultaneously!
 
Wow, a whole lot of stupid out today :pfff:

As kunzite already pointed out, processors for space-based and radiation-hardened applications have a completely different set of requirements than their commercial terrestrial counterparts.

Radiation wreaks absolute havoc on unhardened electronics and even a single error in computing can spell disaster in a mission critical system.
Extreme ECC usage, exotic manufacturing techniques and robust design principles for rad-hard chips virtually ensure that a chip of any reasonable size will not offer performance comparable to commercial counterparts.
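
To make the ECC point concrete, here's a minimal sketch (a hypothetical illustration, not anyone's actual flight code) of the SEC-DED Hamming scheme that ECC memory is built on: a single flipped bit is corrected, a double flip is at least detected.

```c
/* SEC-DED (single-error-correct, double-error-detect) Hamming code on a
 * 4-bit nibble -- the same idea ECC memory applies to 64-bit words.
 * Hypothetical sketch for illustration only. */
#include <stdint.h>
#include <stdio.h>

/* Encode 4 data bits into an 8-bit codeword: bits 1..7 hold the
 * Hamming(7,4) code, bit 0 holds overall parity for double-error detection. */
static uint8_t secded_encode(uint8_t nibble)
{
    uint8_t d1 = (nibble >> 0) & 1, d2 = (nibble >> 1) & 1;
    uint8_t d3 = (nibble >> 2) & 1, d4 = (nibble >> 3) & 1;
    uint8_t p1 = d1 ^ d2 ^ d4;              /* covers positions 1,3,5,7 */
    uint8_t p2 = d1 ^ d3 ^ d4;              /* covers positions 2,3,6,7 */
    uint8_t p3 = d2 ^ d3 ^ d4;              /* covers positions 4,5,6,7 */
    uint8_t cw = (uint8_t)((p1 << 1) | (p2 << 2) | (d1 << 3) |
                           (p3 << 4) | (d2 << 5) | (d3 << 6) | (d4 << 7));
    uint8_t overall = 0;
    for (int i = 1; i < 8; i++) overall ^= (cw >> i) & 1;
    return cw | overall;                    /* overall parity in bit 0 */
}

/* Returns the corrected nibble, or -1 for an uncorrectable double error. */
static int secded_decode(uint8_t cw)
{
    uint8_t b[8];
    for (int i = 0; i < 8; i++) b[i] = (cw >> i) & 1;
    int syndrome = (b[1] ^ b[3] ^ b[5] ^ b[7])
                 | (b[2] ^ b[3] ^ b[6] ^ b[7]) << 1
                 | (b[4] ^ b[5] ^ b[6] ^ b[7]) << 2; /* position of the flip */
    uint8_t parity = 0;
    for (int i = 0; i < 8; i++) parity ^= b[i];
    if (syndrome && !parity) return -1;     /* two flips: detect only */
    if (syndrome) b[syndrome] ^= 1;         /* one flip: correct it */
    return b[3] | (b[5] << 1) | (b[6] << 2) | (b[7] << 3);
}

int main(void)
{
    uint8_t cw = secded_encode(0xB);
    cw ^= 1 << 5;                           /* simulate a single-event upset */
    printf("recovered: 0x%X\n", (unsigned)secded_decode(cw)); /* 0xB */
    return 0;
}
```

Real ECC hardware does the same thing combinationally on every memory access; the point is that the correction costs extra storage and logic that a commercial chip doesn't have to carry.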

Cooling is also much more complex in space: you have to deal with extreme temperature swings, and there is no surrounding medium to conduct or convect heat away from the spacecraft.
To remove excess heat from the system, you either rely on a relatively inefficient radiative system or carry a quickly depletable store of coolant gas/fluid onboard.
Increasing a chip's performance generally increases its power draw and cooling requirements, both of which are hard constraints that need to be carefully balanced in the system.
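
A quick back-of-envelope with the Stefan-Boltzmann law (the numbers here are illustrative assumptions, not mission figures) shows how little a radiator actually buys you:

```latex
% Net power radiated by a flat panel (Stefan-Boltzmann law).
% Assumed: emissivity \varepsilon = 0.9, area A = 1\,\mathrm{m}^2,
% panel at T = 300\,\mathrm{K}, deep-space background T_{env} \approx 4\,\mathrm{K}.
P_{\mathrm{rad}} = \varepsilon \sigma A \left(T^4 - T_{\mathrm{env}}^4\right)
                 \approx 0.9 \times 5.67\times10^{-8} \times 1 \times 300^4
                 \approx 410\ \mathrm{W}
```

So a full square meter of room-temperature radiator sheds roughly 410 W, and that's the ideal case with the panel facing cold space rather than the sun.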

Additionally, QC needs to be extremely high, to the point of shipping 100% defect-free chips (probably to the effect of 90%+ scrap rates on less-than-perfect dies).
Some redundancy should be built in to deal with the inevitable damage that occurs from long use in hostile environments, further increasing chip complexity and reducing the die area available for pure performance gains.
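
As a concrete example of that built-in redundancy, here's a minimal sketch (hypothetical, not any vendor's actual design) of the majority vote at the heart of triple modular redundancy (TMR), where three copies of a result are compared and any single corrupted copy is outvoted:

```c
/* Majority vote over three redundant copies of a value: each output bit
 * takes whatever value at least two of the three copies agree on, so a
 * single upset copy is silently outvoted. Illustrative sketch only. */
#include <inttypes.h>
#include <stdint.h>
#include <stdio.h>

static uint32_t tmr_vote(uint32_t a, uint32_t b, uint32_t c)
{
    return (a & b) | (a & c) | (b & c);
}

int main(void)
{
    uint32_t good = 0xDEADBEEF;
    uint32_t hit  = good ^ (1u << 13);      /* one copy takes a bit flip */
    printf("%08" PRIX32 "\n", tmr_vote(good, hit, good)); /* DEADBEEF */
    return 0;
}
```

Rad-hard-by-design parts apply the same vote in hardware, often at the flip-flop level, which roughly triples the transistor count for the same logical work; that's a big part of why die area doesn't go to raw performance.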

en.wikipedia.org/wiki/Radiation_hardening

science.nasa.gov/science-news/science-at-nasa/2001/ast21mar_1/
 
I agree with outlw, kunzite, and shloader....

Come on, people....... Did everyone forget that space is a different beast from Earth?

You just can't "throw" everyday computer hardware into space and "hope" it lasts......
 
Nope... Vacuum = heat not transferring (efficiently) into its environment.

Now it can still cool down by radiating heat (like the sun heating the Earth), but that's a highly inefficient way of cooling our CPUs, as all our cooling designs are based on transferring heat from one mass to another (heatsink to air).

In a vacuum, to put it simply, if you're going to let the CPU live, you're going to have to do the exact opposite of overclocking.
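
The "opposite of overclocking" point falls straight out of the usual first-order CMOS power relation (illustrative, textbook-level figures):

```latex
% First-order dynamic power of CMOS logic: activity factor \alpha,
% switched capacitance C, supply voltage V, clock frequency f.
P_{\mathrm{dyn}} \approx \alpha C V^2 f
```

Because voltage enters squared, undervolting plus underclocking is a huge lever: halving both V and f cuts dynamic power to roughly one eighth, which is exactly what you want when radiating is your only way to dump heat.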
 
Not completely. They're typically processors that have been tested for quite a while, typically older designs that are known to have longevity. Case in point: when I was in college, my program was working with Motorola on their Iridium satellite network (this was the early 90s). We were irradiating their chips to simulate time in space. I asked one of the guys who came down every other week or so what the processors were, and he mentioned that they weren't a new design, just tweaks to an older one (the chips were originally designed in the mid-to-late 80s).

Typically what you will see is a processor with larger transistors (likely on processes like 65nm or 90nm, compared to the present 32nm and soon 22nm). Because the transistors are larger, they aren't as likely to be harmed by the gamma and X-rays flying around in space, or at least take longer to degrade than the newer processes would.
 
The answer is obvious: the ultra-low-power but very capable mobile processors used in smartphones, tablets and other ultra-thin form factors would seem the blatantly obvious choice. There should be no need to spend millions researching and contracting a whole new processor.
 
$2 million in funding... watch them come up with an Intel Celeron with a fancy name, or one cooled using space dust. ...now where's that extra $20 million, 'cuz our execs are expecting bonuses this year, all-expenses-paid trip to....
 
As it happens, I have some experience in this area. I know NASA and the AFSC have tried to adapt "commodity" computing hardware to space operations with varying degrees of success (laptops on manned missions, x86 on Hubble, TI DSPs on various photorecon sats). However, if you look at the history of their "let's pick a processor for the next 10-20 years" initiatives, you'll see some recurring things when they choose processors:

1) Must be able to be space rated. That means wide thermal tolerances (learn some basic thermo...vacuum makes shedding heat much, much harder) and the ability to be radiation hardened (either through manufacture or shielding).
2) Must have a wide power profile to cater to wildly different mission power budgets.
3) Traditionally, it needs to be available in multiple fabrication processes and from multiple vendors.
4) Traditionally, there's been some bias for solutions brought forward by companies with historical military industrial ties.
5) They often have significantly more redundancy than ground-based systems (ECC is just a start; see the scrubbing sketch after this list).
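
On point 5: ECC only helps if corrections happen before errors pile up, so flight software typically runs a memory scrubber in the background. Here's a minimal sketch (hypothetical; a real system would wire this to its actual memory map and schedule):

```c
/* Minimal ECC-scrubbing sketch: walk a region of ECC-protected memory,
 * reading each word (which lets the memory controller correct any latent
 * single-bit upset) and writing the clean value back, so single-bit errors
 * don't accumulate into uncorrectable double errors. */
#include <stdint.h>

static void scrub_region(volatile uint64_t *begin, volatile uint64_t *end)
{
    for (volatile uint64_t *p = begin; p < end; p++) {
        uint64_t w = *p;   /* read: ECC hardware corrects single-bit errors */
        *p = w;            /* write back the corrected word */
    }
}
```

Run periodically, this shrinks the window in which two upsets can land in the same word to a single scrub interval.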

So you can look at a couple of processors that made the grade:

MIL-STD-1750A: The classic. An Air Force proposal for a standard-for-everything, 16-bit processor architecture. Implemented by dozens of companies in a number of novel fabrication technologies, including SoS (Silicon on Sapphire) for intrinsic radiation hardness. Optional, standardized MMU & FPU. Still flying in air and space on a bunch of platforms, even though obsoleted by the USAF in 1996.

RAD6000/RAD750: These are radiation-hardened, space-flight-qualified versions of the IBM Power/PowerPC processors from IBM Federal Systems Group. They're on literally hundreds of spacecraft and other military systems.

CDP1802: Not really an official choice, but very suitable for space use. A weird but interesting 8-bit processor from RCA & Harris (deep military ties). Not only was it CMOS (power friendly), it could be clocked down to DC (0Hz) and put into suspended animation. The simple design allowed it to be fabbed on SoS. It's on Galileo, among others.

Now... if I had a bunch of money lying about, I'd grab an ARM license, poach some folks from the big ARM fab shops and a couple of old-timers who know SoS and ECL and such, and make a run at it. I strongly suspect, however, that with IBM going to an ARM-like licensing model, the advantage is with a RAD750 descendant.

Intel and AMD don't have anything that's even in the ballpark.
 