Computer History - Computer and Technology

The scale and use of computers in the world is now so large that it is hard to ignore them. Computers appear to us in so many forms that we often fail to see them for what they really are. People interact with a computer when they buy their morning coffee from a vending machine. As they drive to work, the traffic lights that so often hinder them are controlled by computers in an effort to speed the journey along. Accept it or not, the computer has invaded our lives.

The origins of the computer are much like those of many other inventions and technologies of the past: it evolved from a relatively simple idea or plan designed to make a job easier and faster. The first basic computers were designed to do just that: to count. They performed basic mathematical functions such as multiplication and division and displayed the results in a variety of ways. Some computers displayed the results as a binary representation on a row of LEDs. Binary uses only ones and zeros, so a lit LED represented a one and an unlit LED represented a zero. The irony is that the user then had to perform yet another mathematical step, translating the binary result into decimal, before the answer was readable.
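
To make that translation concrete, here is a minimal sketch in Python (not part of the original article; the function name and LED pattern are hypothetical) showing how a row of LED states reads as a binary number and converts to the decimal value a user actually wants:

```python
def leds_to_decimal(leds):
    """Interpret a list of LED states (True = lit = 1, False = unlit = 0),
    most significant bit first, as a decimal integer."""
    value = 0
    for lit in leds:
        value = value * 2 + (1 if lit else 0)
    return value

# Example: the pattern lit, unlit, lit, lit reads as binary 1011,
# which is decimal 11 -- the conversion early users did by hand.
print(leds_to_decimal([True, False, True, True]))  # -> 11
```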

One of the first computers was called ENIAC. It was monstrously huge, about the size of a standard railroad car. It contained electronic vacuum tubes, heavy-gauge wire, angle irons, and knife switches, to name just a few of its components. It is hard to believe that computers had evolved into small, suitcase-sized machines by the 1990s.

Computers eventually evolved into less archaic-looking devices near the end of the 1960s. They had shrunk to the size of a small car and processed pieces of information at faster rates than the older models. Most computers at this time were called “mainframes” because many computers were linked together to perform a given function. The primary users of these computers were military agencies and large corporations such as Bell, AT&T, General Electric, and Boeing, organizations with the funds to afford such technology. However, operating these computers required extensive expertise and manpower. The average person could not have fathomed trying to operate and use these million-dollar processors.

The United States was given the title of computer pioneer. It was not until the early 1970s that nations such as Japan and the United Kingdom began applying their own technology to computer development, which resulted in newer components and smaller computers. The use and operation of computers evolved into a form that people of average intelligence could handle and manipulate without much fuss. Once the economies of other nations began to compete with the United States, the computer industry expanded at a great rate. Prices dropped dramatically and computers became affordable for the average household.

Like the invention of the wheel, the computer is here to stay. Operating and using computers in the era of the 1990s became so easy and simple that we may have taken many things for granted. Almost everything of use in society requires some form of training or education. Many people say the typewriter was the predecessor of the computer, and the typewriter certainly required training and experience to operate at an efficient and proficient level. Children are taught basic computer skills in the classroom to prepare them for the future development of the computer age.

The history of computers began about 2,000 years ago with the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. When these beads are moved according to programming rules memorized by the user, all ordinary arithmetic problems can be solved. Another important invention from around the same time was the astrolabe, used for navigation.

Blaise Pascal is usually credited with building the first digital computer, in 1642. It added numbers entered with dials and was made to help his father, a tax collector. In 1671, Gottfried Wilhelm von Leibniz designed a computing machine that was finally built in 1694. It could add and, after a few changes, multiply. Leibniz invented a special stepped-gear mechanism for introducing the addend digits, and this mechanism is still in use.

The prototypes that Pascal and Leibniz built were not used widely and were considered odd curiosities until, a little more than a century later, Thomas of Colmar (Charles Xavier Thomas) created the first successful mechanical calculator that could add, subtract, multiply, and divide. This was followed by a series of much-improved desktop calculators from various inventors, so that by about 1890 the range of improvements included accumulation of partial results, storage and automatic re-entry of past results (a memory function), and printing of results. Each of these still required manual installation. These improvements were made primarily for commercial users, not for the needs of science.

While Thomas of Colmar was developing the desktop calculator, a series of very interesting developments in computing was begun in Cambridge, England, by Charles Babbage (after whom the computer store “Babbage’s” was later named), a professor of mathematics. In 1812, Babbage realized that many long arithmetic computations, especially those needed to produce mathematical tables, were really a series of predictable, constantly repeated actions. From this he suspected that it should be possible to perform them automatically. He began designing an automatic mechanical calculating machine, which he called the difference engine. By 1822 he had a working model to demonstrate. Financial assistance was obtained from the British government, and Babbage began building a full difference engine in 1823. It was intended to be steam-powered and fully automatic, even printing the resulting tables, and commanded by a fixed program of instructions.

The difference engine, while limited in its adaptability and applicability, was truly a major advance. Babbage continued to work on it for the next ten years, but in 1833 he lost interest because he thought he had a better idea: the construction of what would now be called a general-purpose, fully program-controlled, automatic mechanical digital computer. Babbage called this idea the Analytical Engine. The ideas behind this design showed a great deal of foresight, although this could not be fully appreciated until a full century later.

The plans for this engine called for a decimal computer operating on numbers of 50 decimal digits (or words) and having a storage capacity (memory) of 1,000 such numbers. The built-in operations were to include everything a modern general-purpose computer would need, even the all-important conditional control transfer capability, which would allow commands to be executed in any order, not just the order in which they were programmed.
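
As a loose illustration of why that conditional control transfer matters, the sketch below (Python, entirely hypothetical and far simpler than anything Babbage drew up) runs a toy instruction list in which a conditional jump moves execution out of strict programmed order:

```python
# A toy program: each instruction is (operation, argument). The conditional
# jump ("JUMP_IF_ZERO") transfers control out of strict sequential order,
# which is the capability the Analytical Engine design anticipated.
program = [
    ("LOAD", 2),          # accumulator = 2
    ("SUB", 2),           # accumulator = 0
    ("JUMP_IF_ZERO", 5),  # if accumulator == 0, jump to instruction 5
    ("LOAD", 99),         # skipped when the jump is taken
    ("PRINT", None),
    ("LOAD", 7),          # jump target
    ("PRINT", None),
]

accumulator = 0
pc = 0  # program counter: index of the next instruction
while pc < len(program):
    op, arg = program[pc]
    if op == "LOAD":
        accumulator = arg
    elif op == "SUB":
        accumulator -= arg
    elif op == "PRINT":
        print(accumulator)  # prints 7, because the jump skipped LOAD 99
    elif op == "JUMP_IF_ZERO" and accumulator == 0:
        pc = arg
        continue  # control transferred; skip the normal increment
    pc += 1  # default: execute instructions in programmed order
```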

As one can see, it took a great deal of intelligence and fortitude to arrive at the 1990s style and use of computers. People have come to assume that computers are a natural progression in society and take them for granted. Yet just as learning to drive a car takes skill and practice, so does learning to use a computer.

Computers in society have become difficult to pin down. Exactly what they consist of and what actions they perform depends heavily on the type of computer. Saying that someone has a typical computer does not necessarily narrow down what that computer can do. Computer styles and types cover so many different functions and procedures that it is difficult to name them all. It was easy to identify the purpose of the original computers of the 1940s when they were first invented: they performed mathematical functions many times faster than any person could calculate. Since then, however, the evolution of the computer has created many styles and types, each heavily dependent on a well-defined purpose.

Computers in the 1990s fell roughly into three groups: mainframe computers, networking units, and personal computers. Mainframe computers were extremely large units capable of processing and storing huge amounts of data in the form of numbers and words; they were the first type of computer developed, in the 1940s. Users of these machines ranged from banking firms to large corporations and government agencies. They were usually quite expensive but were designed to last at least five to ten years, and they required a well-educated, experienced staff to operate and maintain them. Harry Wulforst, in his book Breakthrough to the Computer Age, contrasts the first mainframes of the 1940s with those of the 1990s by likening the difference to the “…contrast with the roar of the engine that powered the first flights of the Wright brothers at Kitty Hawk and the roar of the mighty engines on the Cape Canaveral launch pad.”

Works cited

Wulforst, Harry. Breakthrough to the Computer Age. New York: Charles Scribner’s Sons, 1982.

Palfreman, Jon, and Doron Swade. The Dream Machine: Exploring the Computer Age. London: BBC Books, 1991.

Campbell-Kelly, Martin, and William Aspray. Computer: A History of the Information Machine. New York: Basic Books, 1996.
