Originally Posted by Nautilus
I just don't get it. Why use such an old architecture when you could get like 10x (probably more) performance with an extremely power-efficient ARM CPU, or perhaps an ULV x86 CPU? I'm sure hardening ARM or x86 against radiation is also possible.
Isn't a powerful CPU a must for scientific research? Assuming that analyzing chemical compounds and taking hi-res pictures require a lot of CPU power, why would they go with an ancient IBM CPU?
It's about what the guys I worked with in the industry for a while call 'heritage'.
You want something that has a proven track record and that you know is as reliable as you can make it.
Modern processors are far more complex, so there's simply more to go wrong: smaller feature sizes and billions more transistors mean more potential points of failure and more sensitivity to radiation-induced upsets.
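If it helps to see the scale argument in numbers, here's a purely illustrative back-of-envelope sketch in Python. The bit counts and per-bit upset rates are made up for the comparison, not real figures for any actual part; the point is just that more susceptible bits and a higher per-bit upset rate multiply into a lot more expected upsets over a mission.

# Illustrative only: expected single-event upsets per year scale roughly
# with the number of susceptible bits times the per-bit upset rate.
# All numbers below are hypothetical, not measured SEU data.

def upsets_per_year(bits, upset_rate_per_bit_day):
    """Expected upsets per year given a per-bit daily upset rate."""
    return bits * upset_rate_per_bit_day * 365

# Hypothetical older, large-geometry rad-hard part vs. hypothetical modern COTS part
old_rad_hard = upsets_per_year(bits=128e6, upset_rate_per_bit_day=1e-10)
modern_cots = upsets_per_year(bits=64e9, upset_rate_per_bit_day=1e-8)

print(f"older rad-hard part: ~{old_rad_hard:.1f} expected upsets/year")
print(f"modern COTS part:    ~{modern_cots:.0f} expected upsets/year")

With those (invented) inputs the older part sees a handful of upsets a year while the modern one sees hundreds of thousands, which is the kind of gap that makes the simpler, proven part attractive.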
The (fairly crude) example they gave me when we were talking about the main communications satellite design they sell is this:
Imagine expecting a car to run 24/7 for a lifetime of 15-20 years, with zero servicing and zero repairs.
It's no good sending your $2.5 billion rover to Mars only to find it doesn't work when it gets there. You don't need the power of a more modern CPU when an older, more reliable one will do, and as I said, the increased complexity is seen as a disadvantage.