Class on 11/28 started off with a 30-minute talk on the subject of space computing. The first topic was a brief overview of command sequencing and satellite simulation, presented by Cody, who had previous experience in this field as an intern at the Jet Propulsion Laboratory. Because space missions are costly and unforgiving of errors, all command sequences sent to spacecraft are typically simulated on ground computers before being uplinked and executed on the spacecraft itself.

The conversation then turned to the effects of space radiation on computers, and how spacecraft hardware and software must be designed to be radiation hardened. The key reason is that space is filled with charged particles, trapped in planetary magnetospheres, streaming out in the solar wind, or arriving as cosmic rays. When these particles strike computer components, they can cause unexpected changes to stored data and program state, an event called a single event upset, or SEU. We also learned that continued exposure to radiation can cause permanent damage to electronics. Because of these effects, a process known as radiation hardening is important for keeping computers in space operating reliably over long periods. The basics of radiation hardening were covered, including the use of alternative materials in integrated circuits, less susceptible designs for particularly sensitive components, hardware redundancy and error checking, and careful software design.
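To make the redundancy-and-voting idea concrete, here is a minimal sketch (my own illustration, not any flight implementation) of software-level triple modular redundancy, where a value is stored three times and every read is settled by majority vote so a single flipped copy is outvoted:

```python
# Toy sketch of software triple modular redundancy (TMR): store three
# copies of a value and settle every read by majority vote, so a single
# event upset in one copy is outvoted by the other two. Illustration
# only; real flight systems rely on hardware voting and ECC memory.
class TMRValue:
    def __init__(self, value):
        self.copies = [value, value, value]

    def write(self, value):
        self.copies = [value, value, value]

    def read(self):
        a, b, c = self.copies
        if a == b or a == c:   # any two matching copies win the vote
            return a
        if b == c:
            return b
        raise RuntimeError("uncorrectable: all three copies disagree")

counter = TMRValue(42)
counter.copies[1] ^= 0x04      # simulate an SEU flipping one bit
assert counter.read() == 42    # the two intact copies outvote it
```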

Dr. Wagstaff spoke about a project she had worked on at JPL called the Basic Instrumentation Tool for Fault Localized Injection of Probabilistic SEUs, or BITFLIPS for short. It is a set of programs for simulating the space radiation environment in software: bits in a running program's memory are flipped unexpectedly so developers can test how their code handles spurious radiation effects. Even with testing measures like these, unexpected problems are usually encountered during actual space missions.
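The core of that kind of testing is easy to illustrate. The sketch below (a hypothetical illustration of the idea, not the actual BITFLIPS code, which instruments running programs) flips one randomly chosen bit in a buffer the way an SEU might:

```python
import random

def inject_seu(data: bytearray) -> int:
    """Flip one randomly chosen bit in-place, mimicking a single event
    upset, and return the index of the flipped bit. Hypothetical sketch;
    the real BITFLIPS toolkit instruments running programs."""
    bit = random.randrange(len(data) * 8)
    data[bit // 8] ^= 1 << (bit % 8)
    return bit

frame = bytearray(b"science telemetry frame")
flipped = inject_seu(frame)
print(f"flipped bit {flipped}: {frame!r}")
```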

Staying on the topic of software, debugging problems on distant spacecraft also came up, with the Mars rover Spirit as an example. Dr. Wagstaff told the story of Spirit's flash memory anomaly, which occurred shortly after it landed on Mars. Communication with the rover was lost, but ground stations still picked up occasional signals from Spirit. Through debugging on the ground, it was found that the file system on the MERs routinely suffered indexing overflows that caused unexpected system restarts. After finding this, the MER team developed a workaround but could not fix the underlying problem.

Another topic discussed relating to radiation hardening was how radiation-hardened hardware tends to lag years behind current computer technology. An example of this is the main on-board computer of the recently launched Mars Science Laboratory, a.k.a. Curiosity. Its $200,000 RAD750 computer sports a radiation hardened version of IBM's PowerPC 750 core, clocked at 200MHz. Although it is the year 2011, this hardware is similar to that of a first generation Power Macintosh G3, which dates to 1997. Older missions, like the Mars Exploration Rovers Spirit and Opportunity, have even more limited 20MHz RAD6000 computers that might be on par with a fast calculator. Despite these challenges, even the MERs were capable of basic autonomy feats such as image-based obstacle avoidance.

Curiosity, running a limited 200MHz RAD750 processor

Aside from the harsh radiation environment and limited computer hardware, the communications side of space computing was also discussed in class. Unlike terrestrial networks, where data can cross the globe fast enough for the delay to go mostly unnoticed, communicating with spacecraft outside of Earth orbit involves long delays imposed by the finite speed of light. For example, when Mars is at its furthest point from Earth, round-trip communication with Mars probes can take upwards of 40 minutes. In addition to the long delays, deep space communication requires large and complex radio equipment, such as the very large dish antennas of the Deep Space Network discussed in class. On top of all of this, data rates between Earth and deep space are commonly low, restricting the amount of data that can be exchanged with distant spacecraft.
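The delay figures follow directly from the distances involved. A quick back-of-the-envelope calculation (the distances are approximate, since the Earth-Mars separation varies widely over the synodic cycle):

```python
# Back-of-the-envelope light delays to Mars. The Earth-Mars distance
# swings between roughly 55 and 400 million km over the synodic cycle.
C_KM_S = 299_792  # speed of light in km/s

for label, dist_km in [("closest approach", 55e6), ("farthest", 400e6)]:
    one_way_min = dist_km / C_KM_S / 60
    print(f"{label}: one-way {one_way_min:.1f} min, "
          f"round trip {2 * one_way_min:.1f} min")
# farthest: one-way ~22 min, round trip ~44 min
```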

70m Antenna - Deep Space Network, Madrid Station

After our discussion on space computing, the class transitioned to student presentations on various topics they have been researching this term. Austin Sharp gave the first presentation, on early digital computers in the USSR, including the Strela and the BESM, built during the mid-1950s for artillery and nuclear weapons calculations respectively. Although the Soviets were catching up with the U.S. in many other fields at the time, these attempts at digital computers ultimately failed to meet their goals. One reason given was the high level of competition, rather than cooperation, between the Strela and BESM teams. Austin noted that cooperation between von Neumann, Goldstine, Eckert, and Mauchly in the U.S. ultimately resulted in ENIAC and the start of many successful computer projects that the USSR could not rival at the time.

Cody Hyman gave the second presentation, on general purpose electronic analog computing following World War II. The talk covered the importance and widespread use of electronic analog computers in the postwar decades. Analog electronic computers are devices that use analog circuits to model other physical systems, and they could often solve certain classes of problems faster than the digital computers of the day. Some of the first electronic analog computers were designed specifically for simulating guided missiles, but they quickly became more general purpose and went into mass production. While almost entirely extinct today, analog computers were presented as an important and widely used tool in science and engineering between roughly 1950 and 1970, in applications ranging from flight computers on the Apollo lunar landers to ICBMs, nuclear reactor cooling simulators, and aircraft design.
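The programming model is worth a sketch. An analog computer "program" is a patch of integrators, summers, and coefficient potentiometers wired to mirror a differential equation; the short digital simulation below (my own illustration, with made-up coefficients) mimics such a patch for a damped oscillator x'' = -kx - cx':

```python
# Digital mimic of an analog-computer patch for x'' = -k*x - c*x'.
# On a real machine, a summing amplifier forms the highest derivative
# and a chain of op-amp integrators produces x' and x continuously;
# here tiny Euler steps stand in for the continuous integration.
k, c = 4.0, 0.5          # coefficient "potentiometer" settings (made up)
x, v = 1.0, 0.0          # integrator initial conditions
dt = 1e-3
for _ in range(int(5.0 / dt)):   # simulate 5 seconds
    a = -k * x - c * v   # summer: form x'' from the fed-back x and x'
    v += a * dt          # first integrator:  x'' -> x'
    x += v * dt          # second integrator: x'  -> x
print(f"x(5.0) = {x:.3f}")
```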

Austin Valeske made the third and final presentation of the day, on the Airy tape, one of the first documented instances of debugging. This now familiar technique in computer science came about when Maurice Wilkes, the creator of EDSAC, found that a program to evaluate the Airy integral (the solution to the differential equation y''(x) = xy(x)) contained 20 errors in its 126 lines. This led to the investigation of techniques including peeping, in which one inspects memory after each instruction; post-mortem debugging, in which memory is dumped after the program terminates; and using interpreters to step through a program.
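For a sense of what that 126-line tape was computing: the Airy equation can be integrated step by step from known initial values, which is essentially the task Wilkes's program performed. A few lines of modern code (a sketch of the mathematics only, not a reconstruction of the EDSAC program) reproduce the idea:

```python
# Step-by-step integration of the Airy equation y''(x) = x*y(x) using
# simple Euler steps from the standard Ai(0) initial values. A sketch
# of the mathematics only, not a reconstruction of Wilkes's EDSAC code.
y, dy = 0.3550280539, -0.2588194038   # Ai(0) and Ai'(0)
x, dx = 0.0, 1e-4
while x < 1.0:
    y, dy = y + dy * dx, dy + x * y * dx
    x += dx
print(f"Ai(1) is approximately {y:.4f}")   # tabulated value is about 0.1353
```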