Computing hardware
evolved from machines that needed separate manual action to perform
each arithmetic operation, to punched card machines, and then to stored-program computers.
The history of stored-program computers relates first to computer
architecture, that is, the organization of the units to perform input
and output, to store data and to operate as an integrated mechanism.
Before the development of the general-purpose computer, most
calculations were done by humans. Mechanical tools to help humans with
digital calculations were then called "calculating machines", referred to by proprietary names, or, as they are now, simply calculators. The humans who operated these machines were themselves called computers. Aside from written numerals, the first aids to computation
were purely mechanical devices which required the operator to set up the
initial values of an elementary arithmetic operation, then manipulate
the device to obtain the result. A sophisticated (and comparatively
recent) example is the slide rule, in which numbers are represented as lengths on a logarithmic scale and computation is performed by setting a cursor and aligning sliding scales, thus adding those lengths. Numbers could be represented in a continuous "analog" form, in which, for instance, a voltage or some other physical property was made proportional to the number. Analog computers, like those designed and built by Vannevar Bush before World War II, were of this type. Alternatively, numbers could be represented in the form of digits, automatically manipulated by a mechanism. Although this last approach required more complex mechanisms in many cases, it made for greater precision of results.
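To make the slide-rule principle described above concrete, the short Python sketch below multiplies two numbers by adding their logarithmic "lengths". The function names are hypothetical, chosen only for illustration.

import math

# Sketch of the slide-rule principle: a number is represented by a length
# proportional to its logarithm, and multiplication is carried out by
# adding two such lengths and reading the result back off the scale.

def to_length(x: float) -> float:
    """Position of x on a base-10 logarithmic scale."""
    return math.log10(x)

def from_length(length: float) -> float:
    """Number corresponding to a given position on the scale."""
    return 10 ** length

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply a and b by aligning (adding) their logarithmic lengths."""
    return from_length(to_length(a) + to_length(b))

print(slide_rule_multiply(2.0, 3.0))  # roughly 6.0, up to floating-point rounding

On a physical slide rule the precision of the result is limited by how finely the scales can be read, which is why the digital approach, despite its more complex mechanisms, gave greater precision.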
In the United States, the development of the computer was underpinned
by massive government investment in the technology for military
applications during World War II and then the Cold War. The latter superpower
confrontation made it possible for local manufacturers to transform
their machines into commercially viable products.[1]
It was the same story in Europe, where adoption of computers began
largely through proactive steps taken by national governments to
stimulate development and deployment of the technology.[2]
The invention of electronic amplifiers made calculating machines much
faster than their mechanical or electromechanical predecessors. Vacuum tube (thermionic valve) amplifiers gave way to solid-state transistors, and then rapidly to integrated circuits, which continue to improve, placing millions of electrical switches (typically transistors) on a single elaborately manufactured piece of semiconductor the size of a fingernail. By defeating the tyranny of numbers,
integrated circuits made high-speed and low-cost digital computers a
widespread commodity. There is an ongoing effort to make computer
hardware faster, cheaper, and capable of storing more data.
Computing hardware has become a platform for uses other than mere
computation, such as process automation, electronic communications,
equipment control, entertainment, education, etc. Each field in turn has
imposed its own requirements on the hardware, which has evolved in
response to those requirements, as with the touch screen, which emerged to provide a more intuitive and natural user interface.
As all computers rely on digital storage, and tend to be limited by the size and speed of memory, the history of computer data storage is tied to the development of computers.