
The Evolution of Computing and its Impact on History


Monthly Archives: October 2011

Computers: The Ultimate Machines of War

31 Monday Oct 2011

Posted by Nathan Hinkle in War is in the air


War is the driver of history – many of our most significant inventions have been the result of advances in military technology. Wars have influenced the development of computers in particular, both necessitating their advancement and at times delaying researchers’ progress. The earliest computing machines were used primarily for generating mathematical tables, which could be applied to nearly any industry, and for tabulating massive amounts of data. Military leaders quickly recognized how computers could be applied to their needs, though. One area in which this was particularly evident was cryptography – the business of encrypting and cracking secret messages.

The Enigma machine was one of the first electromechanical devices used for the encryption and decryption of secret messages. Built by German engineers Arthur Scherbius and Richard Ritter in the 1920s, it used a series of rotors with internal wiring to route keypresses through the machine, encoding each letter. After each letter had been encoded, the machine would advance the rotors, so each letter in the message would be encrypted differently. With various other mechanisms to complicate reversing the code, there were about 10,000,000,000,000,000 possible combinations (Singh, pg. 136). The Enigma machine was used by Nazi Germany in World War II to encrypt nearly all of its radio transmissions. Deciphering this information was of crucial importance to the British military, and an entire campus at Bletchley Park was established as a base of operations for the cryptanalysts. One of the foremost researchers, Alan Turing, developed his own machines, designed to brute-force the ciphers based on known pieces of information. These machines, which he dubbed “bombes”, were another significant step in the progress of computing. By the end of the war, 49 of them were in use at Bletchley Park (Singh, pg. 181).

War did not solely advance the progress of computers, though. With significant resources being put into fighting World War II, available computers were almost exclusively purposed for wartime calculations. An example of this was the Harvard Mark I, often hailed as the first fully functional and stable electromechanical computer. Built by IBM under the direction of Howard Aiken, it was donated to Harvard University for use in research.
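
To make the rotor-stepping idea concrete, here is a deliberately simplified Python sketch (a single made-up rotor, no reflector or plugboard, so it is not the real Enigma wiring): because the rotor advances after every keypress, the same plaintext letter comes out as a different ciphertext letter each time.

    # Toy single-rotor cipher illustrating the stepping described above.
    # The rotor wiring is invented for the example; the real Enigma used
    # several rotors, a reflector, and a plugboard.
    import string

    ALPHABET = string.ascii_uppercase
    ROTOR = "EKMFLGDQVZNTOWYHXUSPAIBRCJ"  # arbitrary fixed wiring (hypothetical)

    def encrypt(message, start_position=0):
        position = start_position
        ciphertext = []
        for ch in message.upper():
            if ch not in ALPHABET:
                continue
            # Route the letter through the rotor at its current offset.
            index = (ALPHABET.index(ch) + position) % 26
            ciphertext.append(ROTOR[index])
            position += 1  # the rotor steps after every letter
        return "".join(ciphertext)

    print(encrypt("AAAA"))  # prints EKMF: identical letters encrypt differently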

When the Mark I was completed in 1944, IBM gave it to Harvard as a gift. That spring it was installed at the university but was immediately leased for the duration of the war by the US Navy, desperate for gunnery and ballistics calculations. Aiken, a naval reserve officer, was put in charge of the Mark I for the Bureau of Ships – Williams, pg. 112

Although the actual construction of the Mark I was completed rapidly, it was immediately commissioned by the Navy, and researchers at the university were shortchanged of the opportunity to truly take advantage of these new computing resources. Had it not been for the war, computer research might not have progressed quite so rapidly, but the technology would have reached civilian hands much sooner.

The role of women in the history of computing goes back as far as Babbage’s era. Frequently, women were employed to complete the manual calculations (as human “computers”) for men’s work in engineering, astronomy, and other fields. The division was largely a matter of sexist beliefs that only men should do the actual innovation and that manual computation was a waste of their time (Ceruzzi, pg. 240). Nevertheless, jobs as “computers” were very popular with women, as they were still a step up from the common secretarial work which was oftentimes the only other option. The women who took these jobs were often very proud of their work, for though the labor was menial, it did require significant mathematical ability (Ceruzzi, pg. 239).

It is therefore unsurprising that many of the first computer “programmers” were women as well – it was merely an extension of the existing tradition of men determining what needed to be calculated, and women executing the calculation. Many of the women programmers were hired because they were mathematicians – one of the few fields women could study at the advanced level – even though they had absolutely no formal training with computers (Williams, pg. 113).

Computers are crucial in modern warfare, to a far greater extent than Aiken or Hopper could have ever predicted. Fighter jets are flown by advanced electronics, computer-powered satellites beam high-resolution imagery to military intelligence agencies, and cyber-espionage is of increasing concern for governments worldwide.

One of the areas in which military technology has come to depend extensively on computers is unmanned aerial vehicles (UAVs), often referred to as “drones”. The United States Air Force and other government agencies have been using UAVs since the mid-1990s. Originally they were used for reconnaissance missions, but they have since been fitted with powerful missiles and other weapons (USAF, 2010). Some have questioned the ethical implications of such dangerous weapons being controlled by soldiers who operate them from the other side of the world from where the fighting occurs. Most of the approximately 700 Predator drones in Iraq are controlled from an Air Force base in Nevada, where soldiers may have limited perception of what is actually happening on the ground when they fire their missiles (Harris, 2006).

Another area in which computers are of increasing importance in modern warfare is not on the battlefield but in cyberwarfare. A highly publicized example was the Stuxnet virus, which infected industrial control systems running uranium enrichment equipment in Iran. It is unknown to this day who was responsible for the virus, but investigations have implicated both the Israeli and American governments as possible perpetrators of the attack. The virus caused centrifuges to spin out of control, resulting in serious damage to Iran’s enrichment systems and setting its nuclear program back by years, which may have disrupted attempts to build an atomic bomb. With various nations establishing dedicated cyberdefense units in their militaries, this type of sabotage may become the next major front of global wars.

John von Neumann was quoted as saying,

If you say why not bomb them tomorrow, I say, why not today. If you say at five o’clock, I say why not one o’clock. – Rheingold

In this quote, “them” refers to the USSR. Von Neumann was very closely involved with the development of atomic weapons before and during the Cold War, and was a strong advocate of a preventive strike against the USSR. This was at a time of intense fear of global war, and von Neumann believed that if the United States did not attack the USSR first, it would surely be decimated. Nevertheless, his views were seen by many as extreme; a majority of the American population was not advocating a preemptive attack on the USSR. Looking back at the Cold War era from our current vantage point gives us perspective on just how devastating an all-out nuclear war would have been – which von Neumann’s proposed attack surely would have provoked. Yet had I been alive at that time, it is difficult to say whether I would have agreed or disagreed. To be sure, there was significant fear of the Soviets, but as a generally pacifistic individual, I believe I would have advocated against starting a war with such a formidable opponent.

 

Sources:

Ceruzzi, P. E. (1991). When Computers Were Human. IEEE Annals of the History of Computing, 13(3), 237-244.

Harris, F. (2006, June 2). In Las Vegas a pilot pulls the trigger. In Iraq a Predator fires its missile. The Telegraph.

Rheingold, H. (2000). Johnny Builds Bombs and Johnny Builds Brains. In Tools for Thought.

Singh, S. (2000). The Code Book: The Science of Secrecy from Ancient Egypt to Quantum Cryptography.

USAF. (2010, July 20). MQ-1B Predator. Retrieved October 29, 2011, from US Air Force Information Factsheets: http://www.af.mil/information/factsheets/factsheet.asp?fsID=122

Williams, K. (n.d.). Improbable Warriors: Mathematicians Grace Hopper and Mina Rees in World War II.

 

Class Summary: 10/26

27 Thursday Oct 2011

Posted by John Diebold in Class Summary


We began class by talking about the paper “When Computers Were Human,” written by Paul Ceruzzi. Dr. Wagstaff started by asking us whether or not we felt that the reading questions were helpful. There was a general lack of response due to it being early and several students having not yet arrived. We then went on to talk about the paper itself, which described how in the early 1940s, humans did all of the necessary scientific and engineering calculations by hand or with mechanical calculators. This was a barrier in many scientific fields, as calculations took an increasingly long time as they became more complex. Many research facilities had to keep hiring more and more human computers to keep up with all the calculations they had to do. Even doing this did not give them big increases in computing power, because unlike electronic computers, whose power has increased exponentially, the output of human computers could not scale that way.

We then began discussing what the job of a human computer was like. Most computers were women, and their jobs consisted almost entirely of doing math problems, not any design or testing. That was left up to the engineers, who were usually men. To us today, the job of being a computer sounds pretty boring, but women at the time considered it a good job. This was due mostly to the fact that it was better than most other jobs they could get at the time and it paid well. It also allowed them to contribute to the war effort, and the Navy employed many women computers who were given ranks and titles.

We then watched part of the video ENIAC, an interview with a woman who worked on ENIAC. She talked about how they had to program with patch cables, and how reliability was always a big issue – you could never be sure that the machine was working correctly. To deal with this, they would run a test program before and after the actual program, so they could make sure the machine was working properly beforehand and that nothing had gone wrong while the program ran.

Next we watched the video FIRST COMPUTER ENIAC. The part we watched talked about how they had to physically wire the machine. ENIAC was built in a circular room, so the lead programmer would stand in the middle yelling instructions to women standing next to certain parts of the computer, who would wire the machine. The video also showed someone double-checking some of the calculations with an abacus, because at the time that was much more reliable than the computer.

We then discussed the UNIVAC computer, which was finished in 1951. Computers need the ability to do logical operations and they also need memory. In today’s computers, operations are done in the CPU. At the time of UNIVAC, they were done with vacuum tubes. Vacuum tubes are large and burned out a lot, so they were not ideal. Later computers used transistors instead of vacuum tubes, which are much smaller and more reliable. Today many transistors can be fit onto one silicon chip. Vacuum tubes are not used very much anymore, although they are still used in certain things like amplifiers.

For memory, the UNIVAC used mercury delay lines. These sent acoustic waves through tubes full of mercury. Mercury was used so that the waves would propagate slowly. The tube of mercury had to be kept at a constant temperature; otherwise the waves would propagate too slowly or quickly.  A few years after UNIVAC, they started using magnetic tape as memory, which was a huge advance at the time. They had machines to convert a deck of punch cards into a magnetic tape. This magnetic tape could then be read much faster by the computer than punch cards.
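
As a rough back-of-the-envelope illustration of how a delay line stores bits (the figures below are illustrative assumptions, not UNIVAC’s actual specifications), a short Python calculation shows why the travel time through the mercury – and therefore the temperature – matters:

    # Bits circulate as acoustic pulses in a mercury column and are re-injected
    # at one end as they arrive at the other. All numbers here are assumed for
    # illustration, not UNIVAC's actual specifications.
    speed_of_sound_in_mercury = 1450.0   # m/s, approximate room-temperature value
    column_length = 1.5                  # meters of mercury (assumed)
    pulse_rate = 1_000_000               # pulses (bits) per second (assumed)

    delay = column_length / speed_of_sound_in_mercury   # seconds for one trip
    bits_in_flight = delay * pulse_rate                  # bits stored "in the tube"
    print(f"delay = {delay * 1000:.2f} ms, roughly {bits_in_flight:.0f} bits in flight")

    # Because the speed of sound in mercury changes with temperature, the delay
    # (and hence the timing of every stored bit) drifts unless the column is
    # held at a constant temperature.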

We then watched the video: UNIVAC: Remington-Rand Presents the UNIVAC. The video talks about how to program the UNIVAC.  Programmers at the time would write a program and then had to compile it themselves. Then typists would type the compiled program into a console that would put the program on a magnetic tape. This machine was also backwards-compatible with punch cards.

Dr. Wagstaff then gave each of us an unpunched card which we attempted to write a short message on. This took a while to do, even for short messages, and we talked about ways that the process could be made easier. One of these would be to have the more commonly used letters, such as e and a, be easier to punch than other letters. This concluded our discussion for the class and Dr. Wagstaff asked us to find and bring in a fact about IBM’s Deep Blue computer or Watson, the computer on Jeopardy.
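
One way to see the appeal of that idea is a small sketch: if frequently used letters get codes with fewer holes, punching a message by hand takes less total work. The five-column code width and the frequency-based assignment below are made up for illustration – real Hollerith card codes were fixed, not frequency-based.

    # Assign hole patterns so the most frequent letters need the fewest holes.
    # Purely illustrative; actual punch cards used fixed zone/digit codes.
    from collections import Counter
    from itertools import combinations

    def patterns_by_hole_count(width=5):
        """Yield hole patterns, fewest holes first."""
        for n_holes in range(1, width + 1):
            for holes in combinations(range(width), n_holes):
                yield holes

    def assign_codes(sample_text):
        freq = Counter(c for c in sample_text.upper() if c.isalpha())
        patterns = patterns_by_hole_count()
        # Most frequent letters are paired with the patterns needing fewest holes.
        return {letter: next(patterns) for letter, _ in freq.most_common()}

    codes = assign_codes("we each punched a short message onto an unpunched card")
    print(codes)   # the most common letter gets a single-hole pattern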

What if the Difference Engine existed in the 1800’s?

24 Monday Oct 2011

Posted by Kiri Wagstaff in Alternate History, News


The second assignment for this class focused on speculating about alternate history.

What if Charles Babbage had completed his Difference Engine in the 1830’s, and the engines were then mass-produced, spreading outward into all areas of calculation?

Students each selected a key event from the period 1830 to 1880 and discussed how it might have been altered by the availability of Difference Engine technology. These events included political, scientific, technological, and financial happenings that together provide an eclectic view of the time period:

  • Charles Darwin’s journey on the HMS Beagle (1831)
  • The Kowloon Incident and China’s Opium Wars (1839)
  • The discovery of Neptune (1846)
  • Italian revolutions from Austrian control; Austrian hot-air balloon attacks (1848)
  • The collision of the S.S. Arctic and the Vesta (1854)
  • The use of cryptography in the American Civil War (1860’s)
  • The recovery of Lee’s Special Order 191 in the American Civil War (1862)
  • Building the U.S. Transcontinental Railroad (1863-1869)
  • William Shanks’s calculation of pi to 707 digits (1873)
  • The Vienna Stock Exchange collapse (1873)

The alternate histories make for fascinating reading. Links are provided above to the submissions the students have shared publicly. Read on!

Class Summary 10-24-11

24 Monday Oct 2011

Posted by Sarah Fine in Class Summary


We started class reviewing the feedback from last class—it seems that everyone is pretty satisfied, which is great!  From now on, we will be doing about a half-and-half mix of reading review and class discussion, which is very similar to the previous class format.  A new element will be focused (non-graded) reading questions to aid the reading process, given out via email.

There will be a new blog post with the class’s topics for Assignment #2, as we all kicked butt on the assignment with an average of 9.3/10.

We ended with Hollerith last time, so today we continued right where we left off.  It was mentioned that at this moment in history, IBM started down its path to creating the modern computer by making calculating machines.  The company began as CTR, and became successful selling punch cards to other companies.  This was a very lucrative venture because each punch card can only be used once.  Howard Aiken designed the first real computer in the modern sense, produced by IBM at a cost of about $1.5 million in modern dollars – and it wasn’t even a commercial venture.  The machine was given to Harvard University and named the Harvard Mark I (although officially it was called the Automatic Sequence Controlled Calculator).

This brings us to today’s real topic, Grace Hopper.  She worked as a mathematician for the Navy, and when Aiken requested women from WAVES to do calculations (women were often used for calculations during this time, which I found surprising), Hopper was assigned to work on the Mark I.  We read her paper, “The Education of a Computer,” in which she talked about programming computers.  Programming in her time was very low level in comparison to modern programming.  She envisioned using programming languages to speed up and enhance the accuracy of computer calculations, because at the time, only raw numbers could be entered into computers.  To demonstrate this process, we played a game called “Robo Rally” in groups of four.

Each team of two was given a robot piece and the following instruction codes:

000: Forward 1

101: Forward 2

010: Turn right

011: Turn left

100: Back up 1

The goal of the game is to reach the goal marker in 10 moves or less.  The conveyor belt spaces move you one or two spaces after your turn (depending on the number of arrows in the space), the gear spaces rotate you 90 degrees after your turn, and the black spaces are death holes that you fall through and die.

The actual game, of course, does not work this way.  Instead of binary codes, players are given cards with symbols on them which represent possible moves.  This is better because sequences of zeros and ones have no semantic meaning to us, and it is very easy to make clerical errors.  Hopper rightly saw that a language to program computers would make programming far more human-friendly.
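
For reference, here is a minimal sketch of an interpreter for the three-bit codes listed above. Only those five instructions are handled; board features such as conveyor belts, gears, and holes are left out of the sketch.

    # Minimal interpreter for the three-bit robot codes used in the class exercise.
    # (x, y) position plus a heading; headings cycle North -> East -> South -> West.
    HEADINGS = [(0, 1), (1, 0), (0, -1), (-1, 0)]   # N, E, S, W as (dx, dy)

    def run(program, start=(0, 0), heading=0):
        x, y = start
        for code in program.split():
            if code == "000":                       # Forward 1
                x, y = x + HEADINGS[heading][0], y + HEADINGS[heading][1]
            elif code == "101":                     # Forward 2
                x, y = x + 2 * HEADINGS[heading][0], y + 2 * HEADINGS[heading][1]
            elif code == "010":                     # Turn right
                heading = (heading + 1) % 4
            elif code == "011":                     # Turn left
                heading = (heading - 1) % 4
            elif code == "100":                     # Back up 1
                x, y = x - HEADINGS[heading][0], y - HEADINGS[heading][1]
            else:
                raise ValueError(f"unknown instruction: {code}")
        return (x, y), heading

    print(run("000 101 010 000"))   # forward 1, forward 2, turn right, forward 1

Writing the program as a string of 0s and 1s is exactly the clerical-error problem Hopper wanted to solve; the symbol cards in the real game play the role of a more human-friendly notation.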

Grace Hopper did eventually create a programming language, which she called FLOW-MATIC.  At the time, programmers used flow charts to plan their binary programs, which inspired the name FLOW-MATIC for her language.  Remington Rand advertised the change this way: “Mastering the knowledge of the complicated techniques and symbols of conventional computer flow charts requires a long training period.  Flow-Matic charting, however, can be easily grasped by anyone with knowledge of the application to be programmed.”

A fun anecdote about Grace Hopper is the story of the “first bug,” which happened on the Harvard Mark II in 1947.  At the time, the word “bug” was already used to describe a flaw in a physical design, but after a moth was actually found in the machinery, disrupting a computer program, “bug” also became a term for a programming flaw or mistake, and “debugging” became the process of fixing such mistakes.

Finally, we watched this video:

Grace Hopper 60 Minutes Interview in 1982

For next time, the readings are available on the syllabus.  Questions will be sent out via email, including a request for a saying, less than 80 characters, to be brought in by each student.

Class Summary: 10/19

19 Wednesday Oct 2011

Posted by Nick Lowery in Class Summary


Class today began with a short feedback session, in which we filled out short notes stating something we thought was going well in the class, and something that we thought could be improved.

From there we jumped straight into sharing interesting quotes and passages from the reading, “Johnny Builds Bombs and Johnny Builds Brains”. Topics of favorite quotes were very diverse: how von Neumann managed to win vast amounts of financial support from the government, likely due to his charisma and well-placed connections (rather unlike our old friend Babbage); the somewhat lucky rise of Mauchly and Eckert, and their fortuitous partnership with Goldstine, who had grown increasingly frustrated with army policies; von Neumann’s diverse and rather charmed existence, with his intellectually star-studded parties and major contributions to the fields of game theory, quantum physics, and operational research (and later the study of life itself and the construction of self-replicating automata, to be called von Neumann machines); and finally the mess about who came up with which ideas first during this period of extremely rapid innovation.

This last topic started the discussion of rights and patents during this period. It began with the innovation of the stored program (and the infamous First Draft, which led many to give the credit solely, and perhaps unjustly, to von Neumann, who likely put only his name on the manuscript because it was only the draft version), as well as the arguments between von Neumann and co.’s ENIAC machine and Atanasoff and Berry’s ABC. The disagreement stemmed from a short visit by Mauchly to Atanasoff, where the exchange of ideas eventually leading to the construction of the ENIAC may or may not have taken place. While that disagreement was “solved” by a Minnesota court in 1973 (in favor of the ABC), discussion is still ongoing and unclear about who was responsible for which ideas during this time of extremely rapid innovation.

Discussion then flowed into an attempt to organize the figures and machines that took part in the computer revolution. What we came up with as a class was sort of a mish-mash of connected events and tangled ideas; however, this disarray was actually reflective of the time, in which many people were sharing ideas with others, as well as coming to similar conclusions through independent work. Dr. Wagstaff has graciously organized this information by hardware technology and chronologically:

1. Mechanical computers: Differential Analyzer (1931)

2. Electromechanical computers: Z3 (1941), Harvard Mark 1 (1944)

3. Electronic computers:

– ABC (1942, first vacuum tube logic, 300 tubes, binary representation, not programmable)

– Colossus (1944, 1500 tubes, limited programming with cables)

– ENIAC (1946, 18,000 tubes, decimal representation, programmed with cables)

– EDSAC (1949, Cambridge, 3000 tubes, binary, first stored program, using mercury delay line memory)

– Manchester Mark 1 (1949, first stored program, using cathode ray tube memory)

– ACE (1950, 1450 vacuum tubes, mercury delay line memory, 1 MHz)

– EDVAC (1951, 6,000 tubes, mercury delay lines)

– UNIVAC (1951, 5,200 tubes, mercury delay lines, first commercially available computer in US)

A bit more detail can also be found on this Wikipedia page, which includes a fully chronological table of events from the 1940’s. There were also several theoretical constructs included in this discussion, including self-replicating von Neumann machines, universal Turing machines, and how one could turn a Turing machine into a von Neumann machine by adapting the Turing machine’s output with a robotic construction device.

As for the figures, we discussed how the various groups formed and influenced one another. This ends up being somewhat of a web, so I shall arbitrarily choose Turing as our starting point. Turing made his initial contributions while at Princeton studying under Church, a mathematical logician. It was here that he initially came into contact with von Neumann, a student of Hilbert’s, though the contact didn’t foster much in the way of later partnerships; Turing returned to England to aid with code breaking during the war, and later drew up the plans for ACE. ACE was eventually built after Turing left the NPL (at the time it was the fastest computer in the world at 1 MHz), while Turing oversaw the construction of the Manchester Mark I. On the other side of the pond, von Neumann joined with Goldstine, Mauchly, and Eckert in efforts that eventually led to the construction of the ENIAC (this group’s interactions with those who constructed the ABC were mentioned earlier), which later blossomed into the EDVAC. Other somewhat more independent mentions were Aiken, who was responsible for the Harvard Mark I, and the group at MIT who constructed the Differential Analyzer.

And, finally, any discussion mentioning von Neumann machines would be incomplete without the thoughts of philosopher Randall Munroe.

Class Summary: 10/17

19 Wednesday Oct 2011

Posted by Austin Valeske in Class Summary


We picked up right from where we left off the previous Wednesday, and as part of a brief review we discussed the moral and political consequences of code breaking.

The primary example of this dilemma involved the British losing fifty to eighty of their ships a month. As this was basically as fast as they could build them, it was an obvious problem, with the loss of life compounding the issue. The British intercepted an Enigma-encoded message from the Germans that, upon decoding, revealed the location of nine German warships. The Royal Navy didn’t want to let on that they had broken Enigma, lest the Germans change the cipher again, so the British destroyers were only told the location of seven of the ships, with command fearing that if all the ships were sunk the Germans would catch on. The destroyers, however, ran into the other two ships on their way to sink the seven, and sank them as well. Despite this, the Germans were so sure that Enigma was unbreakable that they assumed there was a spy somewhere in their ranks. They didn’t consider the possibility that Enigma could be at fault.

Moving on, we discussed another decrypting machine that was employed in the British war effort: Colossus. Colossus was a giant computer that was designed not to crack Enigma, but to crack the Lorenz cipher, a code that was used between Hitler and his field generals. In some ways it was more advanced than the Enigma decryption techniques that Turing and the British cryptographers employed – it used 1500 vacuum tubes and was the first machine to employ them for computation – but it also looked for matching keys using brute force. Once a matching key was found, the user had to manually do the decryption.

Turning to the reading, we first reviewed Turing’s paper “On Computable Numbers.” It was quickly apparent that few people in the class understood much of Turing’s paper beyond a basic outline and what he was trying to prove, so we started breaking it down.

We first covered some background on why the paper was written in the first place. The mathematician David Hilbert wanted mathematics to be complete and encompass all knowledge, and to be a system in which every presented problem could be solved. Kurt Gödel then showed that this actually wasn’t possible, in his paper “On Formally Undecidable Propositions of Principia Mathematica and Related Systems.” Gödel’s paper was difficult to understand, but Turing’s paper made the result more comprehensible by using something he called a Turing Machine to help explain the ideas.

We then took some time to define a Turing Machine, and Dr. Wagstaff took the time to draw one on the board. A Turing Machine is made of some sort of input/output system, in this case a tape, and something that can read and write to the tape. The tape contains various symbols, all drawn from a fixed set of possible symbols. In the drawing on the board, the current symbol is denoted Si. The machine also has a configuration qi, which we’d refer to as a ‘state.’ The state carries implicit memory: for example, if you’re in state one and see an A, do this; if you’re in state two and see an A, do that. This description helps to clarify the tables in the book, which are examples of Turing programs; Dr. Wagstaff drew an example as well. What the machine does in reading these programs is read the state qi, read the symbol Si, print a symbol Sk based on certain parameters (or not), and then move to a new state q’. Additionally, an entire Turing Machine can be recorded as a number. This is crucial because another Turing Machine can then read in that number and simulate that machine.
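
To make this concrete, here is a small Python sketch of a Turing machine simulator in the spirit of those tables: each rule maps (state, scanned symbol) to (symbol to print, move, next state). The example program, which writes 0 1 0 1 … on successive squares, is a simplified version of the first example machine in Turing’s paper.

    # Small Turing machine simulator: rules map (state, scanned symbol)
    # to (symbol to print, head move, next state).
    def simulate(rules, steps, blank=" "):
        tape = {}                      # sparse tape: position -> symbol
        position, state = 0, "q1"
        for _ in range(steps):
            scanned = tape.get(position, blank)
            symbol, move, state = rules[(state, scanned)]
            tape[position] = symbol                      # print Sk
            position += {"R": 1, "L": -1, "N": 0}[move]  # move the head
        return "".join(tape.get(i, blank) for i in range(min(tape), max(tape) + 1))

    # Two-state program that prints 0 and 1 on alternating squares
    # (a simplified version of the first example in Turing's paper).
    rules = {
        ("q1", " "): ("0", "R", "q2"),   # in q1 on a blank: print 0, move right, go to q2
        ("q2", " "): ("1", "R", "q1"),   # in q2 on a blank: print 1, move right, go to q1
    }
    print(simulate(rules, steps=8))      # -> 01010101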

So how is this relevant to the mathematical completeness that Hilbert so wanted and that Gödel and Turing showed wasn’t possible? It comes down to the Entscheidungsproblem, also called the Decision Problem. Basically, the problem asks if it’s possible to design a system that can take any logical or mathematical statement and decide if it’s true or not. Turing modified this and called it the Halting Problem. If a program halts, it ends; if it doesn’t, it continues in an infinite loop. The primary question of the Halting Problem, then, is whether it’s possible to design a program that can read in another program and determine whether that other program halts or not. Turing addresses this with what is essentially a more complicated version of the proof (by contradiction) Dr. Wagstaff outlined on the board. Assume that, like a Turing Machine, any program can be written as a number, and that you can write a program H(p) that reads in any program p and returns 1 or 0 depending on whether p halts or does not halt, respectively. Then define another program G(p) such that if H(p) returns 0, G(p) returns 0, and if H(p) returns something other than 0, G(p) goes into an infinite loop. Now, every program can be written as a number, so running G(G) is possible. And then the complicated part: G(G) takes action based on the output of H(G). If G does not halt, then H(G) returns 0, which means that G(G) returns 0. This is a contradiction, because G supposedly does not halt, yet it returns 0 (and therefore halts) when presented with itself. Likewise, if G does halt, then H(G) returns 1, which means that G(G) goes into an infinite loop. This too is a contradiction, as G supposedly halts, but goes into an infinite loop. This means that the assumption that H(p) can be written is false. No such program is possible to write.
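
The construction can also be written out as a short, deliberately paradoxical Python sketch. The function halts below stands in for the hypothetical decider H and cannot actually be implemented – that is the whole point; the sketch only shows how G is built from H and why feeding G to itself forces the contradiction described above.

    # Sketch of the diagonal argument above. `halts` plays the role of the
    # hypothetical decider H(p); it is assumed, for the sake of contradiction,
    # to answer whether program p halts when run on itself. No such function
    # can actually be written.
    def halts(program) -> bool:
        """Hypothetical decider H: True if program(program) would halt."""
        raise NotImplementedError("assumed to exist only for the contradiction")

    def G(program):
        # G does the opposite of whatever H predicts about `program`:
        if halts(program):
            while True:        # H says it halts, so G loops forever
                pass
        else:
            return 0           # H says it loops forever, so G halts immediately

    # Feeding G to itself creates the contradiction:
    #   if halts(G) is True,  then G(G) loops forever, so halts(G) should have been False;
    #   if halts(G) is False, then G(G) returns 0 (halts), so halts(G) should have been True.
    # Either way H answers wrongly, so the assumed decider H cannot exist.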

Moving on from complicated proofs, we looked at the reading from Diamond Age. We first went over some background on the passage to give it some context, and then went into the details of the passage. The main character, the girl, is playing a game with a “primer,” which the class likened to an iPad. Princess Nell is her character, and her character has been imprisoned by automatons. She then has to administer a Turing Test to determine if her captors are human or machine, as this will determine her method of escape. After learning that her captors are machines, she escapes to the top of the tower, where she finds the skeleton of the Duke of Turing. She reads his books and journals, complete with references to ‘bugs in the machine,’ and masters them. After learning how to write her own programs in the chains used to hold programs, she becomes the Duchess of Turing.  The passage mostly focused on the Turing Test, figuring out if her captors were human or not, but it does involve a Turing Machine that functions as the lock on her door. The numbers on the lock describe what state the machine is in, and with their help she was able to reverse engineer the lock by running different chains through the machine and seeing how the states changed.

There was also brief mention of the book Gödel, Escher, Bach: An Eternal Golden Braid by Douglas Hofstadter, which discusses knowledge, meaning, and thinking.

We then transitioned and watched a clip from the movie Breaking the Code, which is based on a play about Alan Turing. (Interestingly, this clip isn’t in the American cut of the film.)  The clip involves Turing explaining his work to someone reviewing it, and Turing basically explains what we were talking about for the first half of class. This person says that Turing’s paper is “baffling,” specifically pointing out the title. After a request to explain, Turing gives a very detailed and helpful explanation of what a Turing Machine is for and how it relates to the Entscheidungsproblem. He explains that it is about trying to prove right from wrong, and in a review of the history of attempts at this he mentions someone trying to break everything down into pieces of pure logic. He notes that this, of course, failed, and that attempts to analyze mathematical axioms led to new types of mathematics. He describes how Hilbert thought it was possible to have a fundamental system for mathematics, with consistency, completeness, and decidability. Turing then notes that Gödel showed this was impossible, and that math is either inconsistent or incomplete. Turing realized that he would have to have a system for proving all mathematical statements past, present, and future for Hilbert to be correct, which is what his Turing Machine idea was designed to explore. He notes that, of course, a Turing Machine cannot do this.

Class then closed with the assignment for the next class.

The Transcontinental Calculation

17 Monday Oct 2011

Posted by Nathan Hinkle in Alternate History


In the first half of the 1800s, getting from one side of the United States to the other was a significant affair. There were no planes back then (of course), no cars, and the overland route via covered wagon was even more treacherous than the video games of our youth alluded to. In 1863, workers broke ground on the US transcontinental railroad. By 1869, one could ride from Nebraska to California in a week, instead of the six treacherous months previously required. It was one of the most remarkable feats of civil engineering – not to mention sheer labor – of the 19th century. Not only did the railroad unite the country with transportation, it also enabled a new era of communication: telegraph lines were installed next to the railroad, allowing messages to be sent instantly across the country.

One can imagine the number of calculations required to build a single trestle, let alone an entire 1780 miles of railroad. Only about 25% of the workers involved in the project were actually physical laborers doing the blasting, digging, and other heavy work. It is some of the other workers who would have most benefited from access to a difference engine.

For one, there is the obvious need for engineers to perform calculations regarding where it is most efficient to lay the route, how strong bridges must be to support the trains, how far it can be between refueling points without trains running out of coal and other supplies, and countless other mathematical problems. Indeed, any engineering project in that era would have benefited greatly from access to more accurate and varied tables of numbers.

A less immediately obvious, but undeniable application for a difference engine is all of the accountants and workers responsible for ensuring sufficient supplies. Given a certain number of expected miles of construction, how many railroad ties does one need to order? How much rail? When should you send the shipments of materials to optimize the number of trains you send, without losing valuable work time for lack of parts? Indeed, the benefit to accounting and management might have been greater than the advantages for the engineers.
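
To give a flavor of that bookkeeping, here is a back-of-the-envelope sketch; the per-mile figures are rough assumptions for illustration, not historical procurement data.

    # Back-of-the-envelope supply estimate of the kind a railroad accountant
    # would have needed. All per-mile figures are illustrative assumptions.
    miles_of_track = 1780
    ties_per_mile = 2640            # assumed: roughly one tie every two feet
    rail_weight_lbs_per_yard = 56   # assumed weight of iron rail per yard

    ties_needed = miles_of_track * ties_per_mile
    # Two rails per track, 1,760 yards per mile, 2,000 pounds per ton:
    rail_tons = miles_of_track * 2 * 1760 * rail_weight_lbs_per_yard / 2000
    print(f"{ties_needed:,} ties and roughly {rail_tons:,.0f} tons of rail")

Every one of those quantities would itself have been broken into shipment schedules and budgets – exactly the kind of repetitive arithmetic that, as argued here, would have benefited from mechanical help.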

Had the difference engine been available in 1863, would it have had any lasting impact in the context of the railroad, or would it merely have eased the burden on overworked engineers and accountants? It’s hard to say. Even if use of a difference engine had made it cheaper to construct the railroad, it might not have been completed any sooner – the primary delays were caused by bad weather and treacherous conditions. Where the tables derived from a difference engine might have been more useful would have been for the hundreds of people starting new businesses in the recently opened-up territories of the West. The railroad led to a massive expansion of the population in the western US, and to be certain, many of those people would have found the tables that a difference engine could have made available to be quite handy.

Babbage vs. Evolution

16 Sunday Oct 2011

Posted by Andrew Atkinson in Alternate History


May 22, 1826 marks the first voyage of one of the most important ships in history, the HMS Beagle. The Beagle and the HMS Adventure departed together on a several-year-long mission to Patagonia and Tierra del Fuego to conduct a hydrographic survey, which involves measurement and description of the ocean and coastal regions. Like regular surveying, hydrographic surveying requires trigonometry and precise calculations. This task was difficult in the 19th century, especially on top of the challenge of navigation. The Strait of Magellan, off the coast of Tierra del Fuego, is one of the most important but dangerous water passages in the world. In the weeks of surveying this especially difficult area, Captain Pringle Stokes went into a long phase of depression, ending in his suicide. He was replaced by Robert FitzRoy.

After his success on the first voyage, FitzRoy was put in charge of the Beagle’s second voyage, which departed in late 1831. On the first voyage, FitzRoy had wanted an expert on geology, so for the second, he decided to “endeavor to carry out a person qualified to examine the land; while the officers, and myself, would attend to hydrography”. He wanted a naturalist to go on land to learn about the geology. He had the additional requirement that the naturalist be someone that would make him a good companion. FitzRoy’s friend, Dr. John Henslow sent a letter to someone he thought might fulfill the position. The letter read,

“…that I consider you to be the best qualified person I know of who is likely to undertake such a situation— I state this not on the supposition of yr. being a finished Naturalist, but as amply qualified for collecting, observing, & noting any thing worthy to be noted in Natural History. Peacock has the appointment at his disposal & if he can not find a man willing to take the office, the opportunity will probably be lost— Capt. F. wants a man (I understand) more as a companion than a mere collector & would not take any one however good a Naturalist who was not recommended to him likewise as a gentleman. … there never was a finer chance for a man of zeal & spirit… Don’t put on any modest doubts or fears about your disqualifications for I assure you I think you are the very man they are in search of.”

The letter was to Charles Darwin. You know the rest of the story.

But what if Charles Babbage had completed his difference engine before all of this? He originally proposed his idea to the Royal Astronomical Society in 1822. Assume that everything ran smoothly and the invention was completed by 1828. The difference engine would have been able to quickly, cheaply, and accurately produce important trigonometric tables that would have been hugely beneficial to navigation and hydrographic surveying. In this case, the first voyage of the Beagle could have run much more efficiently, and Captain Stokes could have enjoyed a relaxing trip to Tierra del Fuego instead of shooting himself in his cabin. Robert FitzRoy wouldn’t have taken over as captain, so he wouldn’t have requested a naturalist/companion for his next voyage, and Darwin would not have been on board. He would have continued his plan of becoming a priest instead of writing one of the most influential works of all time.
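
A quick sketch of how a difference engine produces such tables may help: once the initial differences of a polynomial are set up, each new table entry requires only repeated addition, which is what made the process mechanizable; trigonometric tables were built the same way, from piecewise polynomial approximations. The Python below tabulates x^2 + x + 41, a polynomial Babbage is said to have used in demonstrations.

    # Method of finite differences, the principle behind the difference engine:
    # after the initial differences are set, each new table entry needs only
    # additions, one per difference column.
    def difference_table(values, order):
        """Return [f(0), first difference, second difference, ...]."""
        diffs = [values[0]]
        level = values
        for _ in range(order):
            level = [b - a for a, b in zip(level, level[1:])]
            diffs.append(level[0])
        return diffs

    def tabulate(initial_diffs, n):
        """Extend the table n steps using additions only."""
        diffs = list(initial_diffs)
        table = [diffs[0]]
        for _ in range(n):
            # Each column adds in the column below it, top to bottom.
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
            table.append(diffs[0])
        return table

    f = lambda x: x * x + x + 41                                 # demonstration polynomial
    seed = difference_table([f(x) for x in range(3)], order=2)   # [41, 2, 2]
    print(tabulate(seed, 7))   # [41, 43, 47, 53, 61, 71, 83, 97] -- all by addition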

So what if Darwin had never written “On the Origin of Species”? It’s fair to assume that the theory of evolution through natural selection would still have become scientifically accepted, since Darwin was not the first or only person to suggest it. However, Darwin was by far the most convincing, thorough, and methodologically rigorous of the early proponents of evolution. “On the Origin of Species” laid important groundwork not only in evolution, but in all the life sciences, because of its strong use of the hypothetico-deductive method. Prior to this, naturalists would mostly just describe, name, and study the anatomy of species. Darwin used reasoning, analogy, and large amounts of evidence to form his “long argument”, which laid new foundations for the scientific method in biology. It was well devised and argued strongly, making it an extremely persuasive work that inspired the evolutionary movement and exemplified proper scientific methodology. Without Darwin, natural selection would have been years behind, as would the foundations of biological research in general. This is perhaps what would have happened if Babbage had finished his machine and vastly improved the availability of accurate trig tables for celestial navigation and surveying.

But luckily Babbage never finished his engine…

Alternate History: Discovery of Neptune

16 Sunday Oct 2011

Posted by Jenelle Parson in Alternate History


Neptune was almost discovered by Galileo, but he mistook it for a star. Lalande, a French astronomer who created tables of planetary positions, also recorded Neptune’s position but likewise thought it was a star. John Herschel, son of Uranus’s discoverer William Herschel, also thought that it was a star. When Delambre was computing tables of Uranus, he discovered discrepancies in its position. During his time at Cambridge, John Couch Adams decided to begin investigating the irregularities of Uranus’ orbit. At roughly the same time, a French astronomer, Urbain Jean Joseph Le Verrier, also recognized the irregularities in the orbit of Uranus and thought they were due to an undiscovered planet. He then did computations based upon Newton’s gravitational laws and deduced the location of the undiscovered planet. Le Verrier gave his calculations to Johann Gottfried Galle, who then discovered the planet Neptune.

[Image: John Couch Adams]

According to many stories, and as stated in Jacquard’s Web, the reason the planet was not discovered from Adams’s calculations was that “[i]nstead of instigating a major telescopic search that would almost certainly have resulted in the discovery of the new planet, Neptune, Airy chose not to act on Adam’s information.” (104). Because Airy overlooked Adams’s calculations, it was decided that Adams and Le Verrier deserved equal credit for the discovery of Neptune. However, according to the Neptune file, found again in 1998, this is not the full story.

[Image: Urbain Le Verrier]

In the file it is revealed that, instead of being ignored by Airy, Adams was actually vague and inconsistent about the planet’s position. Adams’s predictions ranged over 20 degrees of the sky, and after a six-week search at the Cambridge University Observatory the planet had still not been found. This was far different from Le Verrier’s calculations, which were one degree off of the actual planet’s location; Galle found the planet in half an hour. Adams’s journal transcriptions show “him still working on a problem which (one gathers) it was first necessary to solve in order to achieve a full solution.” (Kollerstrom, 5.42). After Galle’s discovery based on Le Verrier’s prediction, British astronomers contrived a selective story of events. Only Adams’s more accurate mathematical results were made public, making it appear as if Adams had predicted the exact location of the planet. While Le Verrier protested at the time, it was in vain. He became very bitter about the lack of recognition for his work.

[Image: The god Neptune]

If there had been a difference engine to assist with Le Verrier’s calculations, I feel that he would have predicted the location of the planet more quickly. While a difference engine would also have helped Adams, it was shown that his earlier calculations were more accurate than his later ones; for this reason I suspect that Adams’s predictions would have just become continually worse. From this, I feel that Le Verrier would have been recognized as the true discoverer of Neptune, and Adams would be acknowledged for his calculations and work toward locating the planet, but wouldn’t be considered a co-discoverer.

If it had not been for the lack of recognition of his discovery, I feel that Le Verrier would have been a more likeable person and wouldn’t have been so unpopular. This would have resulted in a more productive appointment as director of the Paris Observatory, one that probably would not have ended with him being overthrown and then, when later reinstated, stripped of most of his authority. A more productive appointment would have led to more astronomical discoveries, and those discoveries would have made Paris a center of astronomical research.

One discovery that I feel would have happened sooner is that of Pluto. Since Le Verrier had already done work on the orbit of Neptune, I feel that if the orbit of Neptune had continued to be tracked, Le Verrier would have discovered perturbations in the orbit due to the planet Pluto. Such perturbations would have been linked through Newtonian orbital theory, which could have been further developed by Le Verrier if his view of it had not been poisoned by his lack of recognition for the discovery of Neptune. Such an early discovery of Pluto would have given it historical significance as a planet, and perhaps its planetary status would not have been taken away.

Sources:

Coimbra, Miguel. “Neptune – God of the Seas and Oceans.” Web. 12 Oct. 2011. <http://www.miguelcoimbra.com/images/gallery2.php?bimg=galerie/books/2romans/neptune.jpg&l=820&h=820>

Essinger, James. Jacquard’s Web. New York: Oxford University Press, 2002. Print.

Kollerstrom, Nicholas. “Recovering the Neptune Files.” RAS Research (2003): 5.23-5.24. Web. 12 Oct. 2011. <http://www.dioi.org/kn/neptunefile.pdf>

O’Connor, John, and Edmund Robertson. “Neptune and Pluto.” The MacTutor History of Mathematics archive. University of St Andrews. Sept. 1996. Web. 12 Oct. 2011. <http://www-history.mcs.st-and.ac.uk/HistTopics/Neptune_and_Pluto.html>

O’Connor, John, and Edmund Robertson. “Urbain Jean Joseph Le Verrier.” The MacTutor History of Mathematics archive. University of St Andrews. Dec. 1996. Web. 12 Oct. 2011. <http://www.gap-system.org/~history/Biographies/Le_Verrier.html>

“Portrait of John Couch Adams.” DSpace. University of Cambridge. 2008. Web. 12 Oct. 2011. <http://www.dspace.cam.ac.uk/handle/1810/214762>

Sheehan, William. “Secret Documents Rewrite the Discovery of Neptune.” Social Sky & Telescope: The Essential Magazine of Astronomy (2003): n. pag. Web. 12 Oct. 2011. <http://www.skyandtelescope.com/news/3307531.html>

“Urbain Le Verrier.” Random Knowledge. WordPress. 11 Mar. 2008. Web. 12 Oct 2011. <http://randomknowledge.wordpress.com/2008/03/11/urbain-le-verrier/>

William, David. “Neptune Fact Sheet”. Planetary Fact Sheets. NASA. Nov. 2010. Web. 12 Oct. 2011. <http://nssdc.gsfc.nasa.gov/planetary/factsheet/neptunefact.html>

Assignment 2: Southern Victory

15 Saturday Oct 2011

Posted by Austin Sharp in Alternate History


The historical event I have selected is the recovery of Lee’s Special Order 191 by a Union soldier during the American Civil War. The order detailed Lee’s intentions and how he was splitting his forces while invading Maryland and Pennsylvania. The order was intended to be destroyed (it was found wrapped around several cigars), but instead was found and relayed to George McClellan, commander of the Union Army of the Potomac. McClellan had previously been outmaneuvered and outfought by Lee’s Army of Northern Virginia multiple times. However, with this order he was able to predict Lee’s movements, and forestalled the invasion of the North at the Battle of Antietam. Many historians believe that McClellan, a notoriously over-cautious and slow-moving general, could have taken greater advantage of the order. Antietam was a very bloody battle, with heavy casualties on both sides. Lee’s army did retreat, but McClellan, fearing a trap, refused to pursue, despite the insistence of President Lincoln. Hindsight shows that the Army of Northern Virginia was not in good shape, and if McClellan had pressed his advantage it could have been destroyed or severely damaged. A few days later, Lincoln removed McClellan from command for failing to take full advantage of his intelligence.
However, the Battle of Antietam did allow Lincoln to make the Emancipation Proclamation. This was crucial, because the President’s advisors had convinced him to delay the announcement until after a Union victory, so as to not seem like a move of desperation. The result of the Emancipation Proclamation was that France and Britain could not convincingly recognize the Confederacy as a legitimate nation, due to slavery now being a central issue of the war.
Harry Turtledove, the foremost contemporary alternate history author, used Special Order 191 as the point of divergence for his epic alternate history series Southern Victory, where the pertinent copy of Special Order 191 is in fact destroyed. I would propose a similar change as part of a ripple effect from Babbage’s Difference Engines becoming widespread and well-used.
Had Difference Engines been finished, used, and proved helpful enough for common use, the technology of the period leading up to 1860 could have been wildly affected. Babbage would have continued to be prominent, and it seems reasonable to assume that his other ideas, inventions, and interests would have become more important among the scientists and engineers of his day. In addition, if the Difference Engine had succeeded, it’s likely that other mechanical computation devices would have been invented, much as electro-mechanical devices began to flourish after Hollerith’s initial success in the 1890s.
One field that would have been the key beneficiary of these advances would have been cryptography. By World War I, military cryptography was commonplace; however, Special Order 191 was not encrypted, which allowed the Union army to quickly realize its importance, forward it up the chain of command, and understand it. Had cryptography spread to the Confederacy’s armed forces, it would have at the very least taken the Union some time to decrypt the order, were it even realized as important at all by the corporal who found it.
In actual history, McClellan’s deficiencies as a commander were such that even with fantastic military intelligence, he was only able to fight Lee to a standstill, barely enough of a success to allow the Emancipation Proclamation to go out. Without Special Order 191 in hand, it seems more than likely that Lee would have once again humiliated Union armed forces, this time on their own soil, and possibly given the Confederacy enough of an advantage to win the war. Had the CSA’s advantage after Lee’s Maryland campaign been seen as sufficient, Britain and France would likely have recognized the South and broken the blockade to restore the flow of cotton exports and to hurt the USA.
Such a vast change in the power balance on the American continent would have had vast consequences. Certainly, Britain and France would have been enemies of the United States, rather than eventual allies, due to their effective alliance with the CSA. Furthermore, assuming German unification proceeded as in actual history, the USA and Germany could well have applied the principle of “the enemy of my enemy is my friend”, and found common ground in the later 1800s, maybe even in World War I. Had the United States never entered World War I against the Central Powers, but rather been tied down by a war at home (or at least the prospect of being counterbalanced by the Confederate States), the entire 20th century would look completely different. Everything from German backlash to the Treaty of Versailles and the dismantling of the Ottoman Empire to the October Revolution in Russia could have had vastly different outcomes, with a German-Union alliance, and an independent Confederate States of America.
The farther one moves forward from Special Order 191, the greater the implications become. With simple knowledge of encryption, and perhaps even something as simple as a substitution or rotation cipher, the importance of Special Order 191, or at least its meaning, would never have been realized (or at least not soon enough). That paper, wrapped around cigars, is one of the hinges upon which history has turned, and with the technological advances that could have been precipitated by the Difference Engine, history could have turned in a very different direction.
