
The Evolution of Computing and its Impact on History

Author Archives: Cody Hyman

Class Summary: 11/28

Tuesday, 29 Nov 2011

Posted by Cody Hyman in Class Summary


Class on 11/28 started off with a 30-minute talk on space computing. The first topic was a brief overview of command sequencing and satellite simulation, presented by Cody, who has previous experience in this field as an intern at the Jet Propulsion Laboratory. Because space missions are very costly and sensitive, command sequences sent to spacecraft are typically simulated on ground computers before being uplinked and executed on the spacecraft itself.

The conversation then moved to the effects of space radiation on computers and how hardware and software on spacecraft have to be designed to be radiation hardened. The key reason is that space is filled with charged particles, trapped in planetary magnetospheres or carried by the solar wind and cosmic rays. When these particles strike computer components, they can cause unexpected changes to stored data and program state, in what is called a single event upset (SEU). We also learned that continued exposure to radiation can cause permanent damage to electronics. Because of these effects, radiation hardening is important for keeping computers in space operating reliably over long periods of time. The basics of radiation hardening were covered, including the use of different materials in integrated circuits, less susceptible designs for particularly sensitive components, hardware redundancy and error checking, and careful software design.
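
To make the redundancy-and-error-checking idea concrete, here is a minimal sketch (in Python, purely illustrative rather than any actual flight implementation) of triple modular redundancy: three copies of a value are kept, and a bitwise majority vote masks a single upset.

    # Triple modular redundancy (TMR) sketch: store three copies of a value
    # and take a bitwise majority vote so a single flipped bit in one copy
    # cannot corrupt the result.
    def majority_vote(a, b, c):
        return (a & b) | (a & c) | (b & c)

    original = 0b10110010
    copies = [original, original ^ 0b00001000, original]  # one copy suffers an SEU
    assert majority_vote(*copies) == original              # the vote recovers the value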

Dr. Wagstaff spoke about a project she had worked on at JPL called the Binary Instrument Toolkit for Fault Localized Injection of Probabilistic SEUs, or BITFLIPS for short. This project is a set of programs for testing how software responds to spurious radiation effects by simulating the space radiation environment, where bits in memory may be flipped unexpectedly. Despite testing measures like these, unexpected problems are usually still encountered during actual space missions.
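
The general idea of injecting simulated SEUs into a program's data can be sketched as follows; this is only an illustration of the concept, not the actual BITFLIPS code.

    import random

    def inject_seu(memory: bytearray, n_flips: int = 1, seed=None) -> None:
        """Flip n random bits in a byte buffer to mimic single event upsets."""
        rng = random.Random(seed)
        for _ in range(n_flips):
            byte_index = rng.randrange(len(memory))
            memory[byte_index] ^= 1 << rng.randrange(8)

    # Example: corrupt a buffer and check whether a simple checksum notices.
    data = bytearray(b"command sequence for the spacecraft")
    checksum = sum(data) % 256
    inject_seu(data, n_flips=1, seed=42)
    print("checksum still valid?", sum(data) % 256 == checksum)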

On the topic of software, debugging problems on distant spacecraft was also brought up, with the Mars rover Spirit as an example. Dr. Wagstaff told the story of Spirit's flash memory anomaly, which occurred shortly after it landed on Mars. Communication with the rover was lost, but ground stations still picked up occasional signals from Spirit. Through debugging on the ground it was found that the file system on the Mars Exploration Rovers (MERs) routinely suffered indexing overflows that caused unexpected system restarts. After finding this, the MER team developed a workaround but could not fix the underlying problem.

Another topic discussed in relation to radiation hardening is how it tends to lag behind current computer technology. An example is the main on-board computer of the recently launched Mars Science Laboratory, a.k.a. Curiosity. Its $200,000 RAD750 computer sports a radiation-hardened version of IBM's PowerPC 750 core, clocked at 200MHz. Although it is the year 2011, this hardware is comparable to a very dated late-1990s Power Macintosh G3. Older missions, like the MERs Spirit and Opportunity, have even more limited 20MHz RAD6000 computers that might be on par with a fast calculator. Despite these constraints, even the MERs were capable of basic autonomy feats such as image-based obstacle avoidance.

Curiosity, running a limited 200MHz RAD750 processor

Aside from the harsh radiation environment and limited computer hardware, the communications side of space computing was also discussed in class. Unlike terrestrial networks, where data can cross the globe fast enough to be mostly unnoticeable, communicating with spacecraft outside of Earth orbit involves long delays due to the finite speed of light. For example, when Mars is at its farthest from Earth, round-trip communication with Mars probes can take upwards of 40 minutes. In addition to the long delays, deep-space communication requires large and complex radio equipment, such as the very large dish antennas of the Deep Space Network discussed in class. On top of all of this, the point was made that data rates between Earth and deep space are commonly low, restricting the amount of data that can be sent to and from distant spacecraft.

70m Antenna - Deep Space Network, Madrid Station
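
The 40-minute figure follows from simple arithmetic: distance divided by the speed of light, doubled for the round trip. A quick back-of-the-envelope check (the ~400 million km value below is the approximate maximum Earth-Mars separation):

    # Round-trip light time to Mars near maximum Earth-Mars separation.
    SPEED_OF_LIGHT_KM_S = 299_792.458   # km per second
    max_earth_mars_km = 400e6           # approximate maximum separation

    one_way_min = max_earth_mars_km / SPEED_OF_LIGHT_KM_S / 60
    print(f"one way: {one_way_min:.1f} min, round trip: {2 * one_way_min:.1f} min")
    # roughly 22 minutes one way, about 44 minutes round trip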

After our discussion on space computing, the class quickly transitioned to student presentations on topics they have been researching this term. Austin Sharp gave the first presentation, on early digital computers in the USSR, including the Strela and the BESM, built during the mid-1950s for artillery and nuclear weapons calculations respectively. Although the Soviets were catching up with the U.S. in many other fields at the time, these early digital computers ultimately failed to meet their goals. One reason given was the intense competition, rather than cooperation, between the Strela and BESM teams. Austin noted that cooperation between von Neumann, Goldstine, Eckert, and Mauchly in the U.S. ultimately resulted in ENIAC and the start of many successful computer projects that the USSR could not rival at the time.

Cody Hyman gave the second presentation, on general-purpose electronic analog computing after World War II. The talk covered the importance and common use of electronic analog computers following WWII. Analog electronic computers are devices that use analog circuits to model other systems, and they could typically solve certain classes of problems faster than the digital computers of the day. Some of the first electronic analog computers were designed specifically for simulating guided missiles, but they quickly became more general purpose and went into mass production. While almost entirely extinct today, analog computers were presented as an important and widely used tool in science and engineering between 1950 and 1970, in applications ranging from the flight computers on the Apollo lunar landers to ICBMs, cooling simulators for nuclear reactors, and aircraft design.

Austin Valeske gave the third and final presentation of the day, on the Airy tape, one of the first documented instances of debugging. This now familiar technique in computer science came about when Maurice Wilkes, the creator of EDSAC, found that a program to evaluate the Airy integral (the solution of the differential equation y''(x) = x·y(x)) contained 20 errors in its 126 lines. This led to the investigation of techniques including "peeping", where one inspects memory after each instruction; post-mortem debugging, where memory is saved after the program terminates; and using interpreters to step through the program.
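
For reference, the equation behind Wilkes' program can be integrated numerically in a few lines. The sketch below uses a crude fixed-step Euler scheme and the standard initial values for the Airy function Ai; it bears no resemblance to the original EDSAC program and is included only to show what was being computed.

    # Integrate the Airy equation y''(x) = x * y(x) with a fixed-step Euler scheme.
    def airy_euler(y0, dy0, x_end, steps=10_000):
        h = x_end / steps
        x, y, dy = 0.0, y0, dy0
        for _ in range(steps):
            y, dy = y + h * dy, dy + h * (x * y)  # simultaneous Euler update
            x += h
        return y

    # Starting from Ai(0) ~ 0.3550 and Ai'(0) ~ -0.2588, the solution decays:
    print(airy_euler(y0=0.3550, dy0=-0.2588, x_end=2.0))  # roughly Ai(2) ~ 0.035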

 

Class Summary: 11/7

Tuesday, 8 Nov 2011

Posted by Cody Hyman in Class Summary


Continuing from the last class, we began the morning by discussing objections to and shortcomings of the Turing test, the state of artificial intelligence, and more of the proceedings of previous Loebner Prize competitions.

Although Loebner Prize judges have yet to be convinced by a machine, one argument against the Turing test brought up in class is its inability to distinguish between intelligence and the appearance of intelligence. Searle's Chinese Room, a thought experiment, was brought up to illustrate this point. The experiment imagines a person in a room, with no knowledge of the Chinese language, tasked with writing responses to messages written in Chinese using a book that gives an appropriate Chinese response to any input. If the entire process is done from the book, the person would appear to be conversing in Chinese but would not have any understanding of what they are reading or writing. Likewise, the machines attempting to win the Loebner Prize respond in a programmed fashion and do not yet understand what they are saying. The use of the Turing test as a metric for machine intelligence was also questioned because it is fairly subjective and, as we saw with the Loebner Prize, depends on the judge's experience with pseudo-intelligent conversation machines.
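
As a toy illustration of "answering from the book" (not how any actual Loebner Prize entry was written), a lookup table of canned responses can produce plausible-sounding replies with no understanding behind them:

    # A toy "Chinese Room": pattern -> canned reply, with no understanding involved.
    CANNED_REPLIES = {
        "how are you": "I'm doing well, thanks for asking!",
        "do you understand me": "Of course. Why do you ask?",
    }

    def reply(message: str) -> str:
        key = message.lower().strip("?!. ")
        return CANNED_REPLIES.get(key, "Interesting. Tell me more.")

    print(reply("Do you understand me?"))  # sounds sensible; understands nothing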

Other topics relating to the history of AI were also discussed. The term AI originated with John McCarthy's proposal for the 1956 Dartmouth conference, a summer-long research workshop to study the learning capabilities of machines. McCarthy was unable to reach his lofty goals, many of which have still not been attained. We also learned about the general slump in AI research (the "AI winter") that continued up through the 1990s, until computer hardware adequate for many of these problems started to become available.

After concluding our discussion on artificial intelligence, the Loebner Prize, and the Turing Test, we transitioned into talking about advancements in computer architecture leading to microcomputers.

A modern integrated circuit

We began this discussion by outlining the evolution of computers from vacuum tubes to discrete transistors, to integrated circuits, and eventually microcomputers. Integrated circuits are entire circuits created on a single piece of semiconductor through the process of photolithography. It was discussed how this method of making circuits is advantageous over hand assembly because it produces smaller, cheaper, more efficient, and less error-prone circuits, a necessity for the creation of microcomputers. Today almost every computer is built mostly from integrated circuits.

Along with the introduction of the integrated circuit, we discussed the advances in memory that made the modern computer possible. Two categories of memory were covered in class: serially accessed storage memory and random access memory (RAM).

On the subject of RAM, we watched a short video covering three types: magnetic core memory, static RAM (SRAM), and dynamic RAM (DRAM). Although antiquated today, core memory was predominant in the early era of electronic computing up through the 1970s. We learned how core memory uses wires hand-woven through small magnetic rings (toroids), with electric currents storing bits of data magnetically. Being hand assembled, these devices commonly held only up to a few kilobits. We also briefly discussed the two forms of RAM commonly seen today, SRAM and DRAM, which use transistors and capacitors respectively to store bits of information.

After talking about RAM, we moved on to how information has been stored on computers and how storage methods have changed since the introduction of electronic computers. The first method to catch on after the era of punch cards was magnetic tape, which stored data magnetically on large reels. One example was the 1.6 kilobit/inch tape shown the previous week, where one large 700-inch reel could hold only 140kB. Another storage device was the floppy disk, which progressed from monstrous 8", 80kB disks down to 5.25" and later 3.5", 1.44MB disks. The class then examined the innards of a 3.5" floppy disk to see the magnetic film disk inside the cartridge where the data is actually stored.
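
The 140kB figure for the tape follows directly from the stated recording density and reel length; a quick check of the arithmetic:

    # Tape capacity = recording density (bits per inch) x reel length, in bytes.
    density_bits_per_inch = 1.6e3   # 1.6 kilobit per inch
    reel_length_inches = 700
    capacity_kB = density_bits_per_inch * reel_length_inches / 8 / 1e3
    print(capacity_kB, "kB")        # 140.0 kB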

Similar in function to the floppy disk are hard disk drives, which were also discussed. Using multiple spinning magnetic platters and read/write heads on movable arms, hard disks can store very large amounts of data compared with other storage media. Two old, disassembled hard disk drives were passed around the room for a hands-on look at how the devices work (which hasn't changed much in recent decades).

On a side topic, we also talked about the increasing use of non-volatile solid-state (semiconductor-based) drives in place of hard disks. These devices replace moving parts with integrated circuits, allowing faster operation; however, they have not yet matched the storage density of traditional hard disks.

At the end of the conversation we also discussed the origins of the magnetic-platter hard disk with the IBM 350 "RAMAC" disk drive. Using fifty two-foot-diameter platters spinning at 1200rpm and a single read/write head, the IBM 350 could hold 5MB of data. Compared to modern disk drives this capacity seems tiny, but the device was a computing breakthrough when it was introduced in 1956.
