Introduction
The computer is one of the outstanding scientific inventions, with the modern machine tracing back to the mid-twentieth century. Many advances have followed its advent. For instance, the organization of data has become both more efficient and more effective, and communication has been revolutionized. Moreover, human critical thinking and analytical skills have made significant strides alongside the technology, and many calculations that used to be done manually have now been computerized. Important economic sectors such as business and recreation have benefited greatly from this technological growth.
Modern computing devices have a higher efficiency level than the first-generation machines. They are also smaller, which enhances portability. This paper reviews the history of computers from ancient times to the modern era.
Pre-computer hardware
The word computer was first recorded in 1613 (Allan 27). The term referred to an individual whose task was to perform calculations, and this usage became the norm. A computer, in other words, was not a device as we know it today but a living person. That sense persisted until roughly the end of the first half of the twentieth century, although the early decades of the century witnessed a gradual shift in meaning toward a device used to carry out calculations rather than a human being.
Historical overview
One of the most ancient counting aids resembled tallying sticks (Ceruzzi 23). Later developments saw the use of cones, known collectively as calculi. In this system, objects or items were bound together in set quantities to facilitate easy counting, and tiny, well-shaped rods were used.
Later on, a counting instrument called the abacus was introduced. Its main purpose was to assist mathematicians in arithmetic: it allowed reckoning and the application of arithmetic operations without going through a long, tedious counting process. Simply built but large analog computers followed during the Middle Ages to assist astronomers in their calculations (Ceruzzi 44). In 1206, an astronomical clock was invented by Al-Jazari (Allan 59); this clock laid the foundation for programmable analog devices. It was followed in 1623 by the first mechanical calculator, invented by Wilhelm Schickard. This machine was digital, and it marked the real onset of the computer age.
Thereafter, the technique of using cards punched with holes made its entry into computing. Joseph Marie Jacquard built a loom whose operation was controlled by punched cards, which helped early computer scientists appreciate the idea of programming.
The beginning of the 20th century was characterized by the use of desktop calculators. Earlier inventions were restructured, and features were added to allow them to run on electric power. Before and during the First World War, manual and electrical analog computers were considered the best in computing technology and were seen as the future of the field. James Thomson took analog computing a notch higher in 1876 when he devised the differential analyzer (Swedin & Ferro 63). The outbreak of the Second World War in 1939 was accompanied by a shift in computing from analog to digital; in essence, modern computer technology started during this period. Components moved from extensive wiring to electronic boards. The war years were characterized by three development stages that took place simultaneously.
American scientists were also pursuing their own developments around this time. For instance, Claude Shannon expounded on the relationship between Boolean logic and electrical circuits (Ceruzzi 27). One of the multi-purpose computers built in the United States was the ENIAC, which stood for Electronic Numerical Integrator and Computer. Its outstanding features were its high speed and its ability to handle complicated programs. Although ENIAC was perceived as state-of-the-art technology, it had some limitations (Swedin & Ferro 83), which prompted the need to improve both its capacity and efficiency. This led to the introduction of the first-generation computers, and EDVAC came into existence as a result; unlike ENIAC, it could store its programs.
By the start of the mid-twentieth century, commercial computers with faster multipliers were already in use, a modification of previous machines. The second-generation computers used transistors instead of vacuum tubes and had lower power requirements. The IBM 1401 was very popular in the global market. The third-generation computers did not follow immediately; between the two generations, hybrid systems such as the UNIVAC 494 took centre stage (Allan 176).
In the 21st century, the computer technology base has broadened. For instance, multi-core central processing units are in large-scale use, and several versions of microcomputers and microprocessors are in use today.
Conclusion
The history of the computer traces back to the early seventeenth century, an era characterized by manual and crude forms of performing arithmetic. Interestingly, the word computer then referred to a human being rather than a device as it is known today. Major technological advances in computing have eventually resulted in microprocessors and microcomputers that consume less power, occupy less space, are highly programmable, and are generally efficient.
Works Cited
Allan, Roy A. A History of the Personal Computer: The People and the Technology. 1st ed. Ontario: Allan Publishing, 2001. Print.
Ceruzzi, Paul E. A History of Modern Computing. 2nd ed. MA: Techset Composition Ltd, 2003. Print.
Swedin, Eric Gottfrid, and David L. Ferro. Computers: The Life Story of a Technology. Westport: Greenwood Press, 2005. Print.