Preface
At the beginning of the 20th century, a great mind began to think about his interests in technology and art. His name was Konrad Zuse. As a young man, he had been looking for fields in which he could use his creativity, but he was disappointed by fixed rules that left him no room for free thought. Having started with mechanical engineering, he soon switched to architecture, where he was disappointed again because he could only draw predefined Doric and Ionic columns and could not create something he had in mind himself. So, he switched to civil and construction engineering. During his years of study, he thought about automation that could ease real-life tasks, such as automatic cameras or programmable instruments that could take over complicated and tedious calculations. He also built a working vending machine from which one could select goods with a dial and retrieve them after inserting coins.
Note
Picture taken from Wikimedia Commons, user ArtMechanic, licensed under CC BY-SA 3.0: http://creativecommons.org/licenses/by-sa/3.0/legalcode.
Finally, he was so possessed by the idea of building a binary computer that he started to construct one in the living room of his parents' house; his parents were not quite amused about this. He was the first engineer to use the binary number system, invented by Gottfried Wilhelm Leibniz (1646-1716), in a computing machine. Based on it, he reinvented two-state logic without knowing that such a thing already existed. Following his ideas, he finally succeeded, after two years, in building the world's first working computer, the Z1, which he finished in 1937. It was a mechanical apparatus driven by the motor of a vacuum cleaner, which seemed more practical to him than using electronic parts.
The Z1 was the first binary computer with an input and output system, a memory, an arithmetic unit, and a program execution unit. Programs could be loaded from punched cards. However, the mechanical parts of the Z1 were not very reliable. Its successor, the Z2, worked with electromechanical relays, was able to perform floating-point operations, and had a 16-bit memory as well as a 10 Hz system clock.
At that time, computers were huge and filled whole rooms. However, the move from relays to vacuum tubes soon increased the achievable computation speed. The most famous computer built with such tubes was the Electronic Numerical Integrator and Computer (ENIAC). Its power supply had to provide almost 200 kW, which is about 500 times more than a standard desktop computer requires nowadays. Adding or subtracting two numbers took 200 microseconds, and calculating the square root of a number took around a third of a second.
Further developments in electronics and the discovery of semiconductors laid the foundation for the invention of the transistor. The first working transistor, a point-contact device, was invented at Bell Laboratories and presented on December 23, 1947. For the first time, it was possible to control electric currents with a small control signal in a much more reliable way than before.
In 1949, Werner Jacobi invented a semiconductor amplifier that used five transistors on a single semiconductor substrate. This development went largely unnoticed at first, but it provided the basis for further miniaturization. The approach gained more and more popularity from 1958 onward. Robert Noyce invented a fully integrated semiconductor circuit that also included the wire interconnections and already used photolithographic and diffusion processes for its fabrication.
After several improvements in transistor fabrication, such as self-aligned gate structures, the semiconductor company Intel® developed the world's first universally applicable microprocessor for the Japanese calculator company Busicom. This was the Intel 4004, the first Central Processing Unit (CPU) on a single chip, and it was first made available on November 15, 1971.
The 4004 ran at a maximum clock speed of 740 kHz and consisted of around 2,300 transistors. Executing one instruction took eight clock cycles, that is, one machine cycle, which resulted in a maximum throughput of 740,000 / 8 = 92,500 instructions per second.
However, this was just the beginning. The number of bits a processor could work with at once increased from four to eight, starting with the 8008 microprocessor. Speed and supported memory sizes also increased dramatically over the first decade. Computers became affordable and are now an integral part of almost every home.
Modern processors consist of billions of transistors and run at clock speeds of around 4 GHz, which leads to instruction throughputs of up to 4 billion instructions per second. Memory sizes have increased from a few kilobytes to terabytes over the last decades, and transistor feature sizes have shrunk by a factor of roughly a thousand, from 10 microns to 15 nanometers.
Additionally, miniaturization has made it possible to build a fully working computer the size of a credit card that has a million times more memory than ENIAC, consumes 50,000 times less power, and runs a million times faster.
Combining several of these small computers increases their calculation capacity even further, leading to the exciting world of parallel computing and supercomputing clusters.
In this book, you will learn about a state-of-the-art representative of these tiny computer boards, the BeagleBone Black, which is a complete computer system in itself. Its abilities go far beyond what anybody could have dreamed of 50 years ago. You will learn how to integrate several of these small boards into a fully working supercomputer cluster, which gives you the possibility of scaling up its calculation power freely.
Practical examples will demonstrate what this immense power is really useful for nowadays.