
The history of computers

The first computers


The history of computers dates back well beyond the 1900s; in fact, computing devices have been around for more than 5,000 years.

In ancient times, a “computer” was a person who performed numerical calculations under the direction of a mathematician.

Some of the best-known early devices were the abacus and the Antikythera mechanism.

Around 1725, Basile Bouchon used perforated paper on a loom to set the pattern to be reproduced on cloth. This ensured that the pattern was always the same and nearly eliminated human error.

Later, in 1801, Joseph Jacquard (1752–1834) used the idea of the punched card to automate more devices, with great success.

The first computers?


Charles Babbage (1792–1871) was ahead of his time: using the idea of the punched card, he developed the first computing devices intended for scientific purposes. He began his Difference Engine in 1823 but never completed it. He later started work on the Analytical Engine, which he designed in 1842.

Babbage is also credited with inventing computing concepts such as conditional branches, iterative loops, and index variables.

Ada Lovelace (1815–1852) was Babbage’s colleague and a founder of scientific computing.

Many people improved on Babbage’s inventions. George Scheutz, along with his son Edvard, began work on a smaller version, and by 1853 they had built a machine that could process 15-digit numbers and calculate fourth-order differences.
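
To see what “calculating fourth-order differences” means in practice, here is a minimal Python sketch of the finite-difference method these engines mechanized (the function names are illustrative, not from any historical source). A fourth-degree polynomial has constant fourth differences, so every subsequent value can be produced using only additions, which is exactly the operation the machine’s mechanical columns performed.

    # Seed the engine: the polynomial's value at `start` plus its finite
    # differences, taken from order+1 consecutive sample points.
    def initial_differences(poly, start, order):
        row = [poly(start + i) for i in range(order + 1)]
        diffs = [row[0]]
        while len(row) > 1:
            row = [b - a for a, b in zip(row, row[1:])]
            diffs.append(row[0])
        return diffs  # [p(start), first diff, second diff, ...]

    # Each step adds every difference into the column above it -- pure
    # addition, which is all the machine's gear columns had to do.
    def run_engine(diffs, steps):
        diffs = list(diffs)
        values = []
        for _ in range(steps):
            values.append(diffs[0])
            for i in range(len(diffs) - 1):
                diffs[i] += diffs[i + 1]
        return values

    # Tabulate p(x) = x**4 + 2*x + 1, a fourth-degree polynomial.
    p = lambda x: x**4 + 2*x + 1
    print(run_engine(initial_differences(p, 0, 4), 8))
    print([p(x) for x in range(8)])  # the two lists match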

One of the first notable (and successful) commercial uses of computing machinery was by the US Census Bureau, which used punch-card equipment designed by Herman Hollerith to tabulate data from the 1890 census.

To compensate for the cyclical nature of the Census Bureau’s demand for his machines, Hollerith founded the Tabulating Machine Company (1896), one of the companies that merged in 1911 to form the Computing-Tabulating-Recording Company, later renamed IBM.

Later, Claude Shannon (1916–2001) first suggested the use of digital electronics in computers, and in 1937 J. V. Atanasoff began work on the first electronic computer, one able to solve 29 simultaneous equations with 29 unknowns. This device, however, was not programmable.

During the war years computers evolved at a rapid pace, but due to wartime restrictions many projects remained secret until much later; a notable example is the British military “Colossus”, developed in 1943 by Tommy Flowers and his team for the codebreakers at Bletchley Park.

During World War II, the US Army commissioned John W. Mauchly to develop a device for calculating ballistics. The machine was only ready in 1945, but the Electronic Numerical Integrator and Computer, or ENIAC, proved to be a turning point in the history of computing.

ENIAC proved to be a very efficient machine, but not an easy one to operate: any change in its task required physically rewiring the device. Engineers were well aware of this obvious problem and developed the “stored program architecture”.

John von Neumann (an ENIAC consultant), together with Mauchly and his team, developed EDVAC; this new design stored its program in the same memory as its data.
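
As a rough illustration of the stored-program idea (a toy sketch with an invented instruction set, not EDVAC’s actual design), the key point is that instructions and data share one memory, so changing the program means writing new memory contents rather than rewiring hardware:

    # Toy stored-program machine: instructions and data live in the same
    # memory, so "reprogramming" means storing new values into memory
    # instead of rewiring the hardware. The instruction set is invented.
    def run(memory):
        acc, pc = 0, 0                      # accumulator, program counter
        while True:
            op, arg = memory[pc]
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                return memory

    # Cells 0-3 hold the program; cells 4-6 hold the data.
    memory = {
        0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", None),
        4: 2, 5: 3, 6: 0,
    }
    print(run(memory)[6])  # 5 -- a new program is just new memory contents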

Eckert and Mauchly later developed what was arguably the first commercially successful computer, the UNIVAC.

Software technology during this period was very primitive. The first programs were written in machine code. In the 1950s, programmers used a symbolic notation, known as assembly language, and then translated it into machine code by hand. Later, programs known as assemblers performed the translation task.
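
For a sense of what that translation involves, here is a toy assembler in Python (the mnemonics and opcodes are invented for illustration). It does mechanically what 1950s programmers first did by hand: map each symbolic instruction to a numeric machine word.

    # A toy assembler: each line of symbolic source becomes one machine
    # word, with the opcode in the high bits and the address in the low.
    OPCODES = {"HALT": 0x0, "LOAD": 0x1, "ADD": 0x2, "STORE": 0x3}

    def assemble(source):
        words = []
        for line in source.strip().splitlines():
            mnemonic, *operand = line.split()
            address = int(operand[0]) if operand else 0
            words.append((OPCODES[mnemonic] << 8) | address)
        return words

    program = """
    LOAD 4
    ADD 5
    STORE 6
    HALT
    """
    print([hex(w) for w in assemble(program)])
    # ['0x104', '0x205', '0x306', '0x0']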

The age of the transistor, the end of the inventor.


The late 1950s saw the end of valve-driven computers. Transistor-based computers took over because they were smaller, cheaper, faster, and much more reliable.

Corporations, rather than inventors, were now producing the new computers.

Some of the best known are:

  • TRADIC at Bell Laboratories in 1954,
  • TX-0 at MIT Lincoln Laboratory
  • IBM 704 and its successors, the 709 and 7094. The latter introduced I/O processors that managed data transfer between I/O devices and main memory.
  • The first supercomputers, the Livermore Atomic Research Computer (LARC) and the IBM 7030 (also known as Stretch)
  • The Texas Instruments Advanced Scientific Computer (TI-ASC)

The basics of the computer were now in place: with transistors, computers were faster, and with the stored-program architecture they could be used for almost anything.

Soon, new high-level programming languages arrived: FORTRAN (1956), ALGOL (1958), and COBOL (1959). Cambridge and the University of London cooperated in the development of CPL (Combined Programming Language, 1963), and Martin Richards of Cambridge developed a subset of CPL called BCPL (Basic Combined Programming Language, 1967).

In 1969, the CDC 7600 was released; it could perform 10 million floating-point operations per second (10 Mflops).

The years of the network.


Starting in 1985, the race was on to pack as many transistors as possible into a computer, each one capable of a simple operation. But aside from becoming faster and able to perform more operations, the basic design of the computer did not evolve much.

The concept of parallel processing came into wider use starting in the 1990s.

In the area of computer networking, both wide area network (WAN) and local area network (LAN) technology developed at a rapid pace.

For a more detailed history of the computer, see [http://www.myoddpc.com/other/history_of_computer.php].
