User:Thewiing


==Computer History==

The starting point can be placed in the 17th century (1623), when the scientist Wilhelm Schickard created a machine that could add and subtract numbers. [[File:Pascal.jpg|thumb|157x157px]] But the first widely known mechanical calculator was the arithmometer of the famous French scientist and philosopher Blaise Pascal. Its main element was the cogwheel, whose invention was in itself a key event in the history of computer technology. It is worth noting that evolution in the field of computer technology is uneven and spasmodic: periods of accumulation are followed by breakthroughs in development, after which comes a period of stabilization, during which the results achieved are put to practical use while knowledge and strength accumulate for the next leap forward. After each turn, the process of evolution moves to a new, higher stage.

[[File:Leibniz.jpg|thumb|173x173px]] In 1671 the German philosopher and mathematician Gottfried Wilhelm Leibniz also created an arithmometer, based on a special gear wheel known as the Leibniz wheel. The Leibniz arithmometer, unlike its predecessors, performed all four basic arithmetic operations. With this the period ends, and mankind spent almost a century and a half accumulating strength and knowledge for the next round of the evolution of computer technology. The eighteenth and nineteenth centuries were a time when various sciences, including mathematics and astronomy, developed rapidly, and they often encountered problems requiring long and laborious computations.

Another well-known figure in the history of computer technology was the English mathematician Charles Babbage. [[File:Babbage.jpg|thumb|178x178px]] In 1823 Babbage began working on a machine for calculating polynomials; more interestingly, besides performing the calculations directly, this machine was to output the results by printing them on a plate for photographic printing. It was planned that the machine would be powered by a steam engine. Due to technical difficulties, Babbage was unable to carry his project through to the end. Here, for the first time, the idea arose of using an external (peripheral) device to output the results of calculations. Note that another inventor, Scheutz, did build a machine of Babbage's design in 1853 (it turned out even smaller than planned). Probably Babbage enjoyed the creative process of finding new ideas more than embodying them in something tangible. In 1834 he outlined the principles of another machine, which he called the "Analytical Engine." Technical difficulties again prevented him from fully realizing his ideas, and he was only able to bring the machine to the experimental stage. But it is precisely ideas that drive scientific and technological progress, and Charles Babbage's ideas were developed and used by other scientists.
Thus in 1890, on the eve of the 20th century, the American Herman Hollerith developed a machine that worked with data tables (the first Excel?). The machine was controlled by a program on punched cards, and it was used in conducting the 1890 census of the USA. In 1896 Hollerith founded the company that was the predecessor of IBM. With the death of Babbage, another break came in the evolution of computer technology, lasting until the 1930s. From then on, the entire development of mankind became inconceivable without computers.

Naturally, because of the large proportion of mechanical parts, these machines were doomed. It was necessary to look for a new, more technological element base. Then the invention of Lee de Forest was remembered: in 1906 he created a three-electrode vacuum tube called the triode. Thanks to its functional properties it became the most natural replacement for the relay.

In 1946 the first general-purpose computer, ENIAC, was created in the USA at the University of Pennsylvania. ENIAC contained 18 thousand vacuum tubes, weighed 30 tons, occupied an area of about 200 square meters, and consumed enormous power. It still used decimal arithmetic, and programming was carried out by rewiring connectors and setting switches. Naturally, such "programming" caused many problems, above all from incorrectly set switches.

The ENIAC project is associated with the name of another key figure in the history of computing, the mathematician John von Neumann. It was he who first proposed writing the program and its data into the machine's memory so that they could be modified in the course of the work if necessary. This key principle was later used in creating the fundamentally new computer EDVAC (1951). In this machine binary arithmetic was already used, and the main memory was built on ultrasonic mercury delay lines. The memory could store 1024 words.
Each word consisted of 44 binary digits.

[[File:Edvac.jpg|centre|thumb|400x400px]]
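To put EDVAC's memory in modern terms, here is a quick back-of-the-envelope calculation (a sketch; the figures of 1024 words and 44 bits per word come from the text above, and the 8-bit byte is a modern convention EDVAC itself did not use):

```python
# EDVAC mercury-delay-line memory, as described above
words = 1024          # number of stored words
bits_per_word = 44    # binary digits per word

total_bits = words * bits_per_word
total_bytes = total_bits / 8  # modern 8-bit bytes, for comparison only

print(total_bits)    # 45056 bits
print(total_bytes)   # 5632.0 bytes, i.e. about 5.5 KB
```

In other words, a 30-ton machine's entire main memory would fit many thousands of times over in a single modern memory chip.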

After the creation of EDVAC, mankind realized what heights of science and technology could be reached by the human-computer tandem. The industry began to develop quickly and dynamically, although here too there was a certain periodicity, connected with the need to accumulate a certain body of knowledge for the next breakthrough. Up to the mid-1980s, the evolution of computer technology is usually divided into generations. For completeness, here are brief characteristics of these generations:

* The first generation of computers (1945-1954). During this period a typical set of structural elements making up a computer took shape. By this time developers had arrived at roughly the same idea of which elements a typical computer should comprise: a central processing unit (CPU), random access memory (RAM), and input/output (I/O) devices. The CPU, in turn, should consist of an arithmetic logic unit (ALU) and a control unit (CU). Machines of this generation ran on a vacuum-tube element base, because of which they consumed an enormous amount of energy and were not very reliable. They were used mainly to solve scientific problems. Programs for these machines could already be written not in machine language but in assembly language.

* The second generation of computers (1955-1964). The change of generations was determined by the appearance of a new element base: instead of the bulky vacuum tube, miniature transistors began to be used in computers, and delay lines as elements of main memory were replaced by memory on magnetic cores. This ultimately led to reduced size and increased reliability and performance of computers. Index registers and hardware for floating-point operations appeared in computer architecture, and instructions for calling subroutines were developed.
* The third generation of computers (1965-1970). The change of generations was again due to a renewal of the element base: instead of transistors, integrated circuits of varying degrees of integration began to be used in the various units of a computer. The chips made it possible to place dozens of elements on a plate a few centimeters in size. This, in turn, not only increased the performance of computers but also reduced their size and cost. Relatively inexpensive and small machines appeared: minicomputers, which were actively used to control various technological production processes and in data collection and processing systems. The increase in the power of computers made it possible to execute several programs on one computer simultaneously. To do this, it was necessary to learn to coordinate the simultaneously performed actions, for which the functions of the operating system were expanded. Alongside active developments in hardware and architecture, the share of developments in programming technology grew: at this time the theoretical foundations of programming methods, compilation, databases, operating systems, and so on were being actively developed.
Packages of application programs for the most diverse areas of human activity were created. It now became an unacceptable luxury to rewrite all programs with the advent of each new type of computer, so a tendency arose to create computer families, that is, machines compatible from the bottom up at the hardware-software level. The first such family was the IBM System/360 series and its Soviet counterpart, the ES EVM series.
* The fourth generation of computers (1970-1984). The next change in the element base led to a change of generations. In the 1970s large and very large scale integrated circuits (LSI and VLSI) were actively developed, which made it possible to place tens of thousands of elements on a single chip. This entailed a further substantial reduction in the size and cost of computers. [[File:Intel 4004.jpg|thumb|130x130px]] Working with software became friendlier, which led to an increase in the number of users. In principle, with such a degree of integration it became possible to try to create a functionally complete computer on a single chip. Such attempts were made, although they were mostly met with a mistrustful smile. Probably there would have been fewer smiles had anyone foreseen that this very idea would cause the demise of large computers within some fifteen years.
* The fifth generation of computers (1984 to the present day) can be called the microprocessor generation. Note that the fourth generation ended only in the early 1980s; that is, the "parents," in the form of large machines, and their rapidly maturing and growing "child," the microprocessor, coexisted relatively peacefully for almost ten years. For both of them this time was only beneficial: the designers of large computers accumulated enormous theoretical and practical experience, and microprocessor developers managed to find their own, albeit initially very narrow, niche in the market.

In 1978 Intel completed development of the 16-bit 8086 processor. It had a sufficiently large register width (16 bits) and address bus width (20 bits), which allowed it to address up to 1 MB of RAM.
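The 1 MB figure follows directly from the 20-bit address bus: 20 address lines give 2^20 distinct byte addresses. A minimal sketch of the arithmetic, including the 8086's well-known segment:offset scheme for composing a 20-bit address from two 16-bit registers (a detail not covered in the text above):

```python
# 20 address lines -> 2**20 distinct byte addresses
address_bits = 20
addressable_bytes = 2 ** address_bits

print(addressable_bytes)                   # 1048576 bytes
print(addressable_bytes // (1024 * 1024))  # 1 MB

def physical_address(segment: int, offset: int) -> int:
    """8086 real-mode address: 16-bit segment shifted left by 4, plus 16-bit offset."""
    return ((segment << 4) + offset) & 0xFFFFF  # wraps within the 20-bit space

print(hex(physical_address(0xFFFF, 0x000F)))  # 0xfffff, the top of the 1 MB space
```

This is why 16-bit registers alone were not a limitation: combining two of them let the 8086 reach the full megabyte.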
