The history of the emergence and development of information technology. Early history of information technology

Sixty-three years after C. Babbage's death, someone was finally found who took upon himself the task of creating a machine similar, in its principle of operation, to the one to which Babbage had given his life. It was a German student, Konrad Zuse (1910 - 1995). He began work on his machine in 1934, a year before receiving his engineering degree. Zuse knew nothing of Babbage's machine, of the work of Leibniz, or of Boolean algebra, which is well suited to designing circuits from elements that have only two stable states.

Nevertheless, he proved a worthy heir to W. Leibniz and G. Boole: he brought the by-then forgotten binary number system back to life and used something akin to Boolean algebra in designing his circuits. In 1937 the machine Z1 (which meant Zuse 1) was ready and began to work.

Like Babbage's machine, it was purely mechanical. The use of the binary system worked a miracle: the machine occupied only two square meters on a table in the inventor's apartment. The word length was 22 binary digits, and operations were performed in floating point: 15 bits were assigned to the mantissa and its sign, 7 to the exponent. The memory (also built from mechanical elements) held 64 words (against 1,000 in Babbage's design, which further reduced the machine's size). Numbers and the program were entered by hand. A year later the machine gained a device for entering data and programs from a strip of movie film onto which the information was punched, and the mechanical arithmetic unit was replaced by one built from telephone relays. The Austrian engineer Helmut Schreyer, a specialist in electronics, helped K. Zuse in this. The improved machine was named Z2. In 1941 Zuse, with Schreyer's participation, built a relay computer with program control (Z3), containing 2,000 relays and repeating the main characteristics of Z1 and Z2. It became the world's first fully relay-based digital computer with program control and was operated successfully. Its dimensions only slightly exceeded those of Z1 and Z2.

Back in 1938, Schreyer had proposed using vacuum tubes instead of telephone relays to build Z2. At the time K. Zuse did not approve of the idea, but during the Second World War he himself came to the conclusion that a tube version of the machine was possible. When the two men presented the idea to a circle of scientists, they were ridiculed and dismissed: the figure they quoted, 2,000 vacuum tubes needed to build the machine, was enough to cool the hottest heads. Only one of the listeners supported their plan. Undeterred, they submitted their proposal to the military department, pointing out that the new machine could be used to decipher Allied radio messages.

But the chance to create in Germany not only the first relay computer but also the world's first electronic computer was missed.

By this time K. Zuse had organized a small company, and through its efforts two specialized relay machines, S1 and S2, were built. The first was used to calculate the wings of "flying torpedoes", the winged projectiles that bombarded London; the second, to control them. The S2 turned out to be the world's first control computer.

By the end of the war K. Zuse had built one more relay computer, the Z4. It would be the only one of his machines to survive; the rest were destroyed in the bombing of Berlin and of the factories where they were built.

Thus K. Zuse set several milestones in the history of computing: he was the first in the world to use the binary number system in building a computer (1937), and he created the world's first program-controlled relay computer (1941) and the first specialized digital control computer (1943).

These truly brilliant achievements, however, had no significant impact on the development of computing elsewhere in the world.

The fact is that, because the work was secret, nothing was published or publicized about them, and they became known only several years after the end of the Second World War.

Events in the USA developed differently. In 1944 the Harvard University scientist Howard Aiken (1900-1973) created the first relay-mechanical digital computer in the USA, MARK-1 (at the time it was considered the first in the world). In its characteristics (speed, memory capacity) it was close to the Z3, but it differed dramatically in size: 17 m long, 2.5 m high, 5 tons in weight, 500,000 mechanical parts.

The machine used the decimal number system. As in Babbage's machine, gears served as the counters and memory registers. Control and communication between them was carried out by relays, of which there were more than 3,000. H. Aiken made no secret of how much of the machine's design he had borrowed from C. Babbage. "If Babbage were alive, I would have nothing to do," he said. A remarkable quality of the machine was its reliability: installed at Harvard University, it worked there for 16 years.

Following MARK-1 the scientist built three more machines (MARK-2, MARK-3 and MARK-4), also using relays rather than vacuum tubes, explaining this by the unreliability of the latter.

Unlike Zuse's work, which was carried out in secrecy, MARK-1 was developed openly, and the creation of a machine so unusual for its time quickly became known in many countries. K. Zuse's daughter, who worked in military intelligence and was then in Norway, sent her father a newspaper clipping reporting the American scientist's grandiose achievement.

K. Zuse could triumph: he had been ahead of his emerging rival in many respects. He later sent Aiken a letter saying as much. In 1980 the German government granted him 800,000 marks to recreate the Z1, which he did together with the students who had once helped him. K. Zuse donated his resurrected firstborn to the Museum of Computing Technology in Paderborn for permanent keeping.

The story of H. Aiken deserves a curious epilogue. The work on MARK-1 was carried out on the production premises of IBM. The company's head at the time, Thomas Watson, who loved order in everything, insisted that the huge machine be "dressed" in glass and steel, which made it look very respectable. When the machine was moved to the university and presented to the public, Watson's name was not mentioned among its creators, which infuriated the head of IBM, who had invested half a million dollars in the project. He decided to put Aiken in his place. The result was a relay-electronic colossus whose enormous cabinets housed 23,000 relays and 13,000 vacuum tubes. In the end the machine was put on display in New York to impress the public. This giant closed the era of electromechanical digital computers.

As for H. Aiken, on returning to the university he became the first in the world to lecture on a then-new subject, now called Computer Science, the science of computers; he was also among the first to propose the use of machines in business calculations. The motive for creating MARK-1 had been Aiken's wish to help himself with the numerous calculations required for his dissertation (devoted, incidentally, to the properties of vacuum tubes).

Meanwhile the time was approaching when the volume of computational work in the developed countries began to snowball, above all in the field of military technology, a growth accelerated by the Second World War.

In 1941, employees of the Ballistic Research Laboratory at the Aberdeen Proving Ground in the United States turned to the nearby Moore School of Electrical Engineering at the University of Pennsylvania for help in compiling firing tables for artillery, counting on the school's Bush differential analyzer, a bulky mechanical analog computing device. However, a school employee, the physicist John Mauchly (1907-1986), who was keen on meteorology and had built several simple vacuum-tube digital devices for solving problems in that field, proposed something different. In August 1942 he drafted and sent to the US War Department a proposal to create a computer that was, for its time, powerful, built on vacuum tubes. Those truly historic five pages were shelved by military officials, and Mauchly's proposal would probably have come to nothing had the proving-ground staff not taken an interest in it. They secured funding for the project, and in April 1943 a contract was signed between the proving ground and the University of Pennsylvania to build a computer named the Electronic Numerical Integrator and Computer (ENIAC). 400,000 dollars were allocated for the purpose. About 200 people took part in the work, including several dozen mathematicians and engineers.

The work was led by J. Mauchly and a talented electronics engineer, Presper Eckert (1919-1995). It was Eckert who suggested using for the machine vacuum tubes that had been rejected by military inspectors (they could be obtained free of charge). Considering that the number of tubes required approached 20,000 while the funds allocated for the machine were very limited, this was a wise decision. He also proposed lowering the tubes' filament voltage, which significantly increased their operating reliability. The hard work ended at the close of 1945; ENIAC was presented for testing and passed successfully. At the beginning of 1946 the machine began computing real problems. In size it was even more impressive than MARK-1: 26 m long, 6 m high, 35 tons in weight. But it was not the size that was striking, it was the performance: 1,000 times higher than that of MARK-1. Such was the result of using vacuum tubes!

In other respects ENIAC differed little from MARK-1. It used the decimal system; the word length was 10 decimal digits, and the capacity of the electronic memory was 20 words. Programs were entered from a plugboard of switches and cables, which caused great inconvenience: changing a program took many hours or even days.

In 1945, when work on ENIAC was being completed and its creators were already designing a new electronic digital computer, EDVAC, in which they intended to place programs in RAM in order to eliminate ENIAC's main drawback, the difficulty of entering calculation programs, they were joined by the outstanding mathematician John von Neumann (1903-1957), a member of the Manhattan Project to create the atomic bomb. It should be said that the machine's developers apparently had not asked for this help; von Neumann probably took the initiative himself after hearing about ENIAC from his friend H. Goldstine, a mathematician who worked for the military department. Von Neumann immediately appreciated the prospects of the new technology and took an active part in completing the work on EDVAC. The part of the report he wrote contained a general description of EDVAC and the basic principles of the machine's construction (1945).

The report was reproduced by H. Goldstine (without the consent of J. Mauchly and P. Eckert) and sent to a number of organizations. In 1946 von Neumann, Goldstine and Burks (all three then working at the Institute for Advanced Study in Princeton) wrote another report ("Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," June 1946), which contained a detailed description of the principles of building digital electronic computers. That same year the report was circulated at the summer session of the University of Pennsylvania.

The principles set out in the report were as follows (a minimal stored-program sketch in code follows the list).

  • 1. Machines built on electronic elements should work not in the decimal but in the binary number system.
  • 2. The program should be placed in one of the machine's units, a storage device of sufficient capacity and with speeds appropriate for retrieving and writing program instructions.
  • 3. The program, like the numbers the machine operates on, is written in binary code; in form of representation, therefore, commands and numbers are of one type. This circumstance leads to the following important consequences:
    • - intermediate results of calculations, constants and other numbers can be placed in the same storage device as the program;
    • - the numerical form of the program makes it possible for the machine to perform operations on the very quantities that encode its commands.
  • 4. The difficulty of physically implementing a storage device whose speed matches the speed of the logic circuits requires a hierarchical organization of memory.
  • 5. The arithmetic unit of the machine is built around circuits that perform addition; creating special devices for the other operations is not advisable.
  • 6. The machine uses the parallel principle of organizing computation (operations on a word are performed simultaneously on all its digits).

It cannot be said that these principles of computer construction were first voiced by von Neumann and his co-authors. Their merit is that, generalizing the accumulated experience of building digital computers, they moved from schematic (engineering) descriptions of machines to a generalized, logically clear structure, and took an important step from theoretically important foundations (the Turing machine) to the practice of building real computers. Von Neumann's name drew attention to the reports, and the principles and structure of computers expressed in them came to be called von Neumann's.

Under von Neumann's leadership at the Institute for Advanced Study in Princeton, the vacuum-tube machine MANIAC was created in 1952 (for calculations connected with the hydrogen bomb), and in 1954 one more machine followed, this time without his participation. The latter, JOHNNIAC, was named after the scientist. Sadly, only three years later von Neumann fell gravely ill and died.

J. Mauchly and P. Eckert, offended that they did not figure in the Princeton report and that their hard-won decision to place programs in RAM was being attributed to von Neumann, and seeing, on the other hand, how many firms eager to capture the computer market were springing up like mushrooms after rain, decided to patent ENIAC.

However, they were refused. Meticulous rivals unearthed the fact that back in 1938-1941 John Atanasoff (1903-1995), a professor of mathematics of Bulgarian origin who worked at Iowa State College, together with his assistant Clifford Berry, had developed a prototype of a specialized digital computer (using the binary number system) for solving systems of algebraic equations. The prototype contained 300 vacuum tubes and had memory on capacitors. Thus Atanasoff proved to be the pioneer of vacuum-tube technology in the field of computers.

Moreover, as the court hearing the patent case established, J. Mauchly knew Atanasoff's work not by hearsay: he had spent five days in Atanasoff's laboratory while the prototype was being built.

As for storing programs in RAM and theoretically grounding the main properties of modern computers, here too J. Mauchly and P. Eckert were not the first. Back in 1936 Alan Turing (1912-1954), a mathematician of genius, had said as much in his remarkable paper "On Computable Numbers."

Assuming that the key feature of an algorithm (a task for processing information) is the possibility of executing it mechanically, A. Turing proposed an abstract machine for studying algorithms, now called the "Turing machine." In it he anticipated the basic properties of the modern computer. Data were to be entered into the machine from a paper tape divided into cells, each containing a symbol or left empty. The machine could not only process the symbols recorded on the tape but also change them, erasing old ones and writing new ones according to instructions stored in its internal memory. For this it was supplemented by a logic block containing a table of transitions that determined the sequence of the machine's actions. In other words, Turing provided for a storage device holding the program of the machine's actions. But not only this determines his outstanding merits.

In 1942-1943, at the height of the Second World War, the world's first specialized vacuum-tube digital computer, "Colossus," was built with his participation in the strictest secrecy at Bletchley Park near London, for decoding intercepted German radio messages. It coped with the task successfully. One participant in the machine's creation praised Turing's merits thus: "I do not want to say that we won the war thanks to Turing, but I take the liberty of saying that without him we could have lost it." After the war the scientist took part in the creation of a universal tube computer. His sudden death at the age of 41 prevented him from fully realizing his outstanding creative potential. In his memory the Turing Award was established for outstanding work in computer science. The "Colossus" has been reconstructed and is kept in the Bletchley Park museum, where it was created.

In practical terms, however, J. Mauchly and P. Eckert really were the first who, having understood the expediency of storing a program in the machine's RAM (independently of A. Turing), put it into a real machine, their second machine, EDVAC. Unfortunately its development was delayed, and it went into operation only in 1951, by which time a computer with a program stored in RAM had already been working in England for two years. In 1946, at the height of the work on EDVAC, J. Mauchly gave a course of lectures on the principles of building computers at the University of Pennsylvania. Among the listeners was a young scientist, Maurice Wilkes (1913-2010), from the University of Cambridge, the same university where a hundred years earlier C. Babbage had proposed his project for a program-controlled digital computer. On returning to England, the talented young scientist managed in a remarkably short time to build the EDSAC (Electronic Delay Storage Automatic Calculator), a serial-action computer with memory on mercury delay lines, using the binary number system and a program stored in RAM. In 1949 the machine began to work; thus M. Wilkes was the first in the world to build a computer with a program stored in RAM. In 1951 he also proposed microprogram control of operations. EDSAC became the prototype of LEO (1951), the world's first computer for commercial applications. For many years Wilkes remained the last surviving computer pioneer of the older generation, those who built the first computers. J. Mauchly and P. Eckert tried to organize their own company, but financial difficulties forced them to sell it. Their new design, the UNIVAC machine intended for commercial calculations, became the property of Remington Rand and contributed much to that company's success.

Although J. Mauchly and P. Eckert did not receive a patent for ENIAC, its creation was certainly a golden milestone in the development of digital computing, marking the transition from mechanical and electromechanical to electronic digital computers.

In 1996, at the initiative of the University of Pennsylvania, many countries of the world celebrated the 50th anniversary of informatics, linking the event to the 50th anniversary of ENIAC. There was every reason for this: neither before ENIAC nor after it did any computer cause such a resonance in the world or exert such an influence on the development of digital computing as the wonderful brainchild of J. Mauchly and P. Eckert.

In the second half of the twentieth century the development of technical means went much faster still, and the sphere of software, new methods of numerical computation and the theory of artificial intelligence developed even more rapidly.

In 1995 John Lee, an American professor of computer science, published the book Computer Pioneers. He counted among the pioneers all those who made a significant contribution to the development of hardware, software, computational methods and the theory of artificial intelligence, from the appearance of the first primitive means of information processing to the present day.

Human speech was the first carrier of knowledge about actions people performed jointly. Knowledge gradually accumulated and was passed down orally from generation to generation. Oral storytelling received its first technological support with the creation of writing on various media: at first stone, bone, clay, papyrus and silk were used for writing, then paper. The emergence of printing accelerated the pace of accumulation and dissemination of knowledge and stimulated the development of the sciences.

The first stage of IT development: "manual" information technology (until the second half of the 19th century). Tools: pen, inkwell, account book. Information was transferred by post. But already in the 17th century tools began to appear that would later make possible mechanized, and then automated, IT.

During this period the English scientist C. Babbage theoretically investigated the process of performing calculations and substantiated the foundations of computer architecture (1830); the mathematician A. Lovelace developed the first program for Babbage's machine (1843).

The second stage of IT development: "mechanical" information technology (from the end of the 19th century). Tools: typewriter, telephone, phonograph. Information was transmitted by improved postal communication, and a search was under way for convenient means of presenting and transmitting information. At the end of the 19th century the effects of electricity were harnessed, which led to the invention of the telegraph, telephone and radio and made it possible to transmit and accumulate information quickly and in any volume. Means of information communication appeared, thanks to which information could be transmitted over long distances.

During this period the English mathematician George Boole published "The Laws of Thought" (1854), which became a tool for the design and analysis of the complex circuits of which a modern computer contains many thousands; the first telephone conversations were carried over telegraph wires (1876); and the production of punched-card tabulating machines began (1896).

The third stage of IT development began in the late 1940s, with the creation of the first computers.

In this period the development of automated information technologies began; magnetic (and later optical) information carriers and silicon came into use; "electrical" information technology dominated (the 1940s-60s). Until the end of the 1950s the main construction element of computers was the vacuum tube (the first generation), and the development of programming ideology and technology was driven by the achievements of American scientists.

Tools: large computers and related software, electric typewriter, portable tape recorder, copiers.

During this period: the Z3, a programmable electromechanical computer with all the properties of a modern computer, created by the German engineer K. Zuse in 1941, was presented to the scientific community; Mark I, the first American programmable computer, was launched (1944); the first electronic machine in the USA, ENIAC, was created (1946); in the USSR, under the leadership of S. A. Lebedev, MESM, a small electronic calculating machine, was created (1951); serial production of machines began in the Soviet Union, the first of them being BESM-1 and Strela (1953); and IBM introduced RAMAC, the first 5 MB hard disk drive, in 1956.

The fourth stage of IT development: "electronic" information technology (from the early 1970s). Its tools were large computers and the automated control systems built on them, equipped with broad software. The goal was the formation of the meaningful content of information.

The invention of microprocessor technology and the appearance of the personal computer (the 1970s) made it possible to switch definitively from mechanical and electrical means of converting information to electronic ones, which led to the miniaturization of instruments and devices. Computers, computer networks and data transmission systems came to be built on microprocessors and integrated circuits.

In the 1970s-1980s minicomputers were created and spread, and an interactive mode of work by several simultaneous users appeared.

The fifth stage of IT development: computer ("new") information technology (from the mid-1980s). Its toolkit is the personal computer (PC) with a large range of software products for various purposes. Decision support systems and artificial intelligence implemented on the PC are developed, and telecommunications and microprocessors are used. The goal is rich content and general consumer availability of miniaturized technical means for domestic, cultural and other purposes.

In the 1980s-1990s a qualitative leap occurred in software development technology: the center of gravity of technological solutions shifted to creating means of interaction between users and computers in the making of a software product. An important place in IT was taken by the representation and processing of knowledge; knowledge bases and expert systems were created, and personal computers spread widely.

IT development in the 1990s-2000s: Intel presented a new processor, the 32-bit 80486SX, at 27 million operations per second (1990); Apple created the first monochrome handheld scanner (1991); NEC released the first double-speed CD-ROM drive (1992); M. Andreessen presented to the public his new web browser, called Mosaic Netscape (1994); by 1995, Microsoft software was used on 85% of personal computers, and the Windows operating system improved year by year, already offering access to the global Internet.

At the present stage, tool environments and visual programming systems for creating programs in high-level languages are being developed: Turbo Pascal, Delphi, Visual Basic, C++Builder and others. Massive distributed data processing is finding application. The Internet offers a unique opportunity, potentially allowing the creation of the largest parallel computer so as to use the network's existing potential effectively; it can be viewed as a metacomputer, the largest parallel computer, consisting of many machines (a toy sketch of the idea follows below).

Basic concepts of information technology.

Management activity in any organization is based on the processing of data and the production of output information, which presupposes a technology for converting input data into effective information.

The word "technology" comes from the Greek techne, meaning art, skill, craft; and these are nothing other than processes. A process should be understood as a certain set of actions aimed at achieving a set goal. The process is determined by the strategy chosen by a person and is implemented by a combination of various means and methods.

The technology of material production is understood as a process determined by a set of means and methods for processing, manufacturing, and changing the state, properties and form of raw materials or material. Technology changes the quality or initial state of matter in order to obtain a material product.

Information is one of society's most valuable resources, alongside such traditional material resources as oil, gas and minerals, which means that the process of its processing can, by analogy with the processing of material resources, be perceived as a technology. The following definition then holds.

Information technology is a process that uses a set of means and methods for collecting, processing and transmitting data (primary information) to obtain information of a new quality about the state of an object, process or phenomenon (an information product).

The purpose of material production technology is to produce products that satisfy the needs of a person or system.

The purpose of information technology is the production of information for analysis by a person and, on its basis, the adoption of a decision to perform some action.

Extending this concept to management activity, and taking into account the specifics of the information processes on which it is based, we may define information technology as a system of methods and means for collecting, transmitting, accumulating, processing, storing, presenting and using information on the basis of technical means.

In accordance with the differences among information processes, IT is classified into technologies for:

Ø Collection of information;

Ø Transfer of information;

Ø Accumulation of information;

Ø Information processing;

Ø Information storage;

Ø Presentation of information;

Ø Use of information.

A specific information technology requires for its implementation:

1. A set of appropriate technical means that implement the information process itself;

2. A system of means for controlling the technical complex (for computer technology, software);

3. Organizational and methodological support linking all the actions and the personnel into a single technological process, in accordance with the purpose of the specific information process within a given function of management activity.

Each specific information process can be implemented by a separate technology with its own technical base, its own system for controlling the technical means, and its own organizational and methodological support. But management activity rests on the implementation of almost all the listed types of information technology, following the sequence and content of the individual stages of the decision-making process. Therefore modern IT support for management activity is based on the integrated use of various kinds of information processes on a single technical complex, the basis of which is computer technology.

Information technology is the most important component of the process of using society's information resources. To date it has passed through several evolutionary stages, the succession of which was determined mainly by scientific and technological progress and the appearance of new technical means of processing information. In modern society the main technical means of information processing is the personal computer, which has substantially influenced both the conception of building and using technological processes and the quality of the resulting information. The introduction of the personal computer into the information sphere and the use of telecommunications defined a new stage in the development of information technology and, as a consequence, a change in its name by the addition of one of the synonyms: "new," "computer" or "modern."

The adjective "new" emphasizes the innovative rather than evolutionary nature of this technology. Its implementation is an innovative act in the sense that it significantly changes the content of various activities in organizations. The concept of new information technology also includes communication technologies that provide information transfer by various means, namely telephone, telegraph, telecommunications, fax, etc. In Table. the main characteristic features of the new information technology are given.

Main characteristics of the new information technology

New information technology is information technology with a "friendly" user interface, using personal computers and telecommunications.

The adjective "computer" emphasizes that the main technical means of its implementation is a computer.

Three basic principles of new (computer) information technology:

Interactive (dialogue) mode of work with a computer;

Integration (connection, interconnection) with other software products;

Flexibility in changing both the data and the statements of tasks.

The implementation of the technological process of material production is carried out using various technical means, which include: equipment, machines, tools, conveyor lines, etc.

By analogy, something similar should exist for information technology. The technical means of producing information comprise the hardware, software and mathematical support of this process; with their help primary information is processed into information of a new quality. Let us single out the software products among these means and call them a toolkit, or, more precisely, a software toolkit of information technology. Let us define this concept.

An information technology toolkit is one or more interrelated software products for a certain type of computer, whose technology of use allows the user to achieve the goal he has set.

The following widespread types of software products for the personal computer can serve as such a toolkit: word processors (editors), desktop publishing systems, spreadsheets, database management systems, electronic notebooks, electronic calendars, functional information systems (financial, accounting, marketing, etc.), expert systems, and so on.

Information technology, like any other, must meet the following requirements:

Provide a high degree of decomposition of the whole information processing process into stages (phases), operations and actions;

Include the entire set of elements necessary to achieve the goal;

Be regular: the stages, actions and operations of the technological process can be standardized and unified, which allows more efficient, targeted management of information processes.

The emergence and development of information technology.

Until the second half of the 19th century the basis of information technology was the pen, the inkwell and the account book. Communication was carried out by sending dispatches. The productivity of information processing was extremely low: each letter was copied separately by hand, and apart from invoices summed up manually, there was no other information for decision-making.

The appearance in the second half of the 1960s of large productive computers on the periphery of institutional activity (in computer centers) made it possible to shift the emphasis in information technology to processing not the form but the content of information. This was the beginning of the formation of "electronic," or "computer," technology. As is well known, management information technology must contain at least three major components of information processing: accounting, analysis and decision-making. These components were implemented in a "viscous" environment, a paper "sea" of documents that grows more immense every year.

From the 1970s there was a tendency to shift the center of gravity of automated control system development toward the fundamental components of information technology (especially toward analytical work), with maximum use of man-machine procedures. As before, however, all this work was carried out on powerful computers located centrally in computer centers. The construction of such systems rested on the hypothesis that the tasks of analysis and decision-making belong to the class of formalizable tasks amenable to mathematical modeling. It was assumed that such systems would improve the quality, completeness, authenticity and timeliness of the information support of decision makers, whose effectiveness would grow with the number of tasks analyzed.

With the advent of personal computers on the "crest of the microprocessor revolution," the idea of the automated control system was fundamentally modernized: from computer centers and centralized control toward distributed computing potential, greater uniformity of information processing technology, and decentralized control. This approach was embodied in decision support systems (DSS) and expert systems (ES), which characterize a new stage in the computerization of organizational management technology: the stage of its personalization. A defining feature of the DSS is the recognition that even the most powerful computer cannot replace a person. Here a structural human-machine control unit is optimized in the course of work: the computer's capabilities are expanded by the structuring of the tasks the user solves and the replenishment of its knowledge base, while the user's capabilities are expanded by the automation of tasks it was previously inexpedient to transfer to a computer for economic or technical reasons. It becomes possible to analyze the consequences of different decisions and obtain answers to questions of the kind "what will happen if...?" without wasting time on the laborious process of programming.

The history of information technology has its roots in ancient times. The first step can be considered the invention of the simplest digital device, the abacus. The abacus was invented quite independently and almost simultaneously in Ancient Greece, Ancient Rome, China, Japan and Russia.

In ancient Greece the abacus was called abax, that is, a board, or even the "Salamis board" (after the island of Salamis in the Aegean Sea). It was a board strewn with sand, with grooves in which numbers were marked by pebbles. The first groove stood for units, the second for tens, and so on. When, during counting, more than 10 pebbles accumulated in a groove, they were removed and one pebble was added to the next groove. In Rome the abacus existed in a different form: the wooden boards were replaced with marble ones, and the counters were marble balls as well.

In China, the "suan-pan" abacus was slightly different from the Greek and Roman ones. They were based not on the number ten, but on the number five. In the upper part of the "suan-pan" there were rows of five units, and in the lower part - two. If it was required, say, to reflect the number eight, one bone was placed in the lower part, and three in the units part. In Japan, there was a similar device, only the name was already “Serobyan”.

In Russia the counting device was much simpler: a heap of units and a heap of tens, made up of bones or pebbles. But by the fifteenth century the "board abacus" had become widespread: a wooden frame with horizontal cords on which the counters were strung.

The ordinary abacus was the forefather of modern digital devices. However, while some objects of the surrounding material world lent themselves to direct, piece-by-piece counting, others required the preliminary measurement of numerical quantities. Accordingly, two directions historically developed in computing and computer technology: digital and analog.

The analog direction, based on calculating an unknown physical object (process) by analogy with a model of a known one, received its greatest development from the late 19th to the mid-20th century. The founder of the analog direction was the author of the idea of logarithmic calculation, the Scottish baron John Napier, who in 1614 prepared the treatise "A Description of the Wonderful Table of Logarithms." Napier not only substantiated the logarithmic function theoretically but also compiled a practical table of logarithms.

The principle of John Napier's invention is to match each number with its logarithm, the exponent to which a base must be raised to obtain that number. The invention simplified the performance of multiplication and division, since to multiply numbers it is enough to add their logarithms.

In 1617 Napier also invented a method of multiplying numbers with rods ("Napier's bones"). The device consisted of rods divided into segments, which could be arranged in such a way that, by adding the numbers in horizontally adjacent segments, one obtained the result of multiplying the corresponding numbers.

Somewhat later the Englishman Henry Briggs compiled the first table of decimal logarithms. On the basis of the theory and tables of logarithms the first slide rules were created. In 1620 the Englishman Edmund Gunter used for calculations a special plate for the proportional compass, popular at the time, on which the logarithms of numbers and of trigonometric quantities were plotted parallel to each other (the so-called "Gunter scales"). In 1623 William Oughtred invented the rectangular slide rule, and in 1630 Richard Delamain invented the circular one. In 1775 the librarian John Robertson added a "slider" to the rule, making it easier to read numbers from different scales. Finally, in 1851-1854 the Frenchman Amédée Mannheim radically changed the design of the rule, giving it an almost modern look. The complete dominance of the slide rule continued until the 1920s-1930s, when electric arithmometers appeared that allowed simple arithmetic calculations with far greater accuracy. The slide rule gradually lost ground, but it proved indispensable for complex trigonometric calculations and therefore remained in use long afterwards.

Most people using a slide rule manage typical calculations successfully. But complex operations, such as computing integrals, differentials and moments of functions, which are carried out in several stages by special algorithms and demand good mathematical preparation, cause significant difficulty. This led in its time to the appearance of a whole class of analog devices intended for computing specific mathematical quantities by a user not overly versed in higher mathematics. In the early-to-mid 19th century the following were created: the planimeter (for computing the areas of plane figures), the curvimeter (for determining the length of curves), the differentiator, the integrator, the integraph (for graphical results of integration), the integrimeter (for integrating graphs), and other devices. The author of the first planimeter (1814) was the inventor Hermann. In 1854 the Amsler polar planimeter appeared. The first and second moments of a function were computed with a Coradi integrator. There were also universal sets of units, for example the KI-3 combined integrator, from which a user could assemble the device suited to his own needs.

The digital direction in the development of computing proved the more promising and today forms the basis of computer hardware and technology. As early as the beginning of the 16th century Leonardo da Vinci made a sketch of a 13-digit adding device with ten-toothed gears. Although a working device based on these drawings was built only in the 20th century, the feasibility of Leonardo da Vinci's design was confirmed.

In 1623 Professor Wilhelm Schickard described, in his letters to J. Kepler, the design of a calculating machine, a so-called "counting clock." That machine did not survive either, but a working model has since been built from the description.

The first mechanical digital machine actually built that could add numbers with carrying between digits was created by the French philosopher and mechanic Blaise Pascal in 1642. Its purpose was to ease the work of Pascal's father, a tax official. The machine looked like a box with numerous gears, linked through a ratchet mechanism to a lever whose deflection allowed single-digit numbers to be entered into the counter and summed. Calculations with multi-digit numbers were rather difficult on such a machine.

In 1657 two Englishmen, R. Bissaker and S. Partridge, developed a rectangular slide rule quite independently of each other. In essentially unchanged form, the slide rule survived into modern times.

In 1673 the famous German philosopher and mathematician Gottfried Wilhelm Leibniz invented a mechanical calculator, a more advanced calculating machine capable of the basic arithmetic operations: it could add, subtract, multiply, divide and extract square roots. (The machine itself worked in decimal; Leibniz's advocacy of the binary system came in his theoretical writings, discussed below.)

In 1700 Charles Perrault published his brother's book, "A Collection of a Large Number of Machines of Claude Perrault's Own Invention." The book describes an adding machine with racks instead of gears, called the "rhabdological abacus." The name combines two words: the ancient "abacus" and "rhabdology," the medieval science of performing arithmetic operations with small numbered rods.

In 1703 Gottfried Wilhelm Leibniz, continuing this line of work, wrote the treatise "Explication de l'Arithmétique Binaire" on the binary number system. Later, in 1727, Jacob Leupold's calculating machine was created on the basis of Leibniz's work.

In 1723 the German mathematician and astronomer Christian Ludwig Gersten created an arithmetic machine. It calculated the quotient and the number of successive addition operations needed when multiplying numbers, and it also made it possible to check the correctness of data entry.

In 1751 the Frenchman Perera, drawing on the ideas of Pascal and Perrault, invented an arithmetic machine. Unlike other devices it was more compact, since its counting wheels sat not on parallel axes but on a single axis running through the whole machine.

In 1820 the first industrial production of digital adding machines began; primacy here belongs to the Frenchman Thomas de Colmar. In Russia the first adding machines of this type include Bunyakovsky's self-counting devices (1867). In 1874 the St. Petersburg engineer Willgodt Odhner significantly improved the design of the adding machine, using wheels with retractable teeth (Odhner wheels) to enter numbers. Odhner's arithmometer made it possible to perform up to 250 operations on four-digit numbers in an hour.

It is quite possible that the development of digital computing would have remained at the level of small machines had it not been for the discovery of the Frenchman Joseph Marie Jacquard, who at the beginning of the 19th century used a card with punched holes (the punched card) to control a loom. Jacquard's machine was programmed with a whole deck of punched cards, each of which controlled one movement of the shuttle, so that to switch to a new pattern the operator simply replaced one deck of cards with another. Scientists later tried to use this discovery to create a fundamentally new calculating machine that would perform operations without human intervention.

In 1822 the English mathematician Charles Babbage built a working model of his calculating machine, consisting of manually rotated gears and rollers; his program-controlled design also anticipated today's input and printing peripherals.

In the late 1880s Herman Hollerith, an employee of the US Census Bureau, developed a statistical tabulator capable of automatically processing punched cards. The tabulator's creation began the production of a new class of digital punched-card (computing-analytical) machines, which differed from the class of small machines by their original system of entering data from punched cards. By the mid-20th century punched-card machines were produced by IBM and Remington Rand in the form of rather elaborate punched-card complexes. These included punches (for punching cards), verifier punches (for re-punching and checking the alignment of holes), sorters (for laying cards out into groups by given attributes), collators (for more thorough card arrangement and the compilation of function tables), tabulators (for reading cards, computing, and printing the results), and multipliers (for multiplying numbers recorded on cards). The best models of such complexes processed up to 650 cards per minute, and a multiplier could multiply 870 eight-digit numbers in an hour. The most advanced model, the IBM 604 electronic calculating punch released in 1948, had a programmable plugboard for data processing commands and could perform up to 60 operations on each punched card.

At the beginning of the 20th century adding machines with keys for entering numbers appeared. The growing automation of adding machines made it possible to create automatic counting machines, so-called small calculating machines, with an electric drive, automatically performing up to 3,000 operations on three- and four-digit numbers per hour. In the first half of the 20th century small calculating machines were produced on an industrial scale by Friden, Burroughs, Monroe and others. A variety of small machines were the bookkeeping counting-writing and counting-text machines, produced in Europe by Olivetti and in the USA by National Cash Register (NCR). In Russia in this period the "Mercedes" machines were widespread: bookkeeping machines designed for entering data and computing the final balances on synthetic accounting accounts.

Drawing on the ideas and inventions of Babbage and Hollerith, Harvard University professor Howard Aiken was able in 1937-1943 to create a computing machine of a far higher level, called "Mark-1," which worked on electromagnetic relays. In 1947 a machine of this series, "Mark-2," appeared, containing 13,000 relays.

Around the same period the theoretical prerequisites and the technical possibility arose for creating a more perfect machine on vacuum tubes. In 1943 employees of the University of Pennsylvania (USA) began developing such a machine under the leadership of John Mauchly and Presper Eckert, with the participation of the famous mathematician John von Neumann. The result of their joint efforts was the tube computer ENIAC (1946), which contained 18,000 tubes and consumed 150 kW of electricity. While working on the tube machine, John von Neumann issued a report (1945) that is one of the most important scientific documents in the theory of the development of computing. The report substantiated the principles of the design and operation of universal computers of a new generation, which absorbed the best of what many generations of scientists, theorists and practitioners had created.

This led to the creation of the computers of the so-called first generation. They are characterized by vacuum-tube technology and by memory systems on mercury delay lines, magnetic drums and Williams cathode-ray tubes. Data were entered from punched tapes, punched cards and magnetic tapes carrying stored programs; printers were used for output. The speed of first-generation computers did not exceed 20,000 operations per second.

The development of digital computing then proceeded at a rapid pace. In 1949 the English researcher Maurice Wilkes built the first computer embodying von Neumann's principles. Tube machines were produced on an industrial scale until the mid-1950s, but scientific research in electronics was opening new prospects for development, and the leading position in this field was held by the United States. In 1947 Walter Brattain and John Bardeen of Bell Labs (AT&T) invented the transistor, and in 1954 Gordon Teal of Texas Instruments used silicon to make one. From 1955 computers based on transistors were produced: smaller, faster and less power-hungry than the tube machines. The computers were assembled by hand, under a microscope.

The use of transistors marked the transition to the computers of the second generation. Transistors replaced vacuum tubes, and computers became more reliable and faster (up to 500,000 operations per second). Peripheral devices improved as well: units for working with magnetic tape and memory on magnetic disks appeared.

In 1958 the first integrated microcircuit was invented (Jack Kilby, Texas Instruments), followed by the first industrial integrated circuit (chip), whose author, Robert Noyce, later founded (in 1968) the world-famous company Intel (INTegrated ELectronics). Computers built on integrated circuits, produced from 1960, were even faster and smaller.

In the late 1960s researchers at Computer Terminal Corporation (later Datapoint) reached the important conclusion that a computer needed a single central arithmetic-logic unit that could control calculations, programs and devices; in other words, a microprocessor. The company's employees worked out fundamental technical solutions for creating a microprocessor and, together with Intel, set about its industrial development. The first results were not entirely successful: the Intel microprocessor ran much more slowly than expected, and the collaboration between Datapoint and Intel ended.

In 1964 the computers of the third generation were developed, using electronic circuits of low and medium integration (up to 1,000 components per chip). From that time on, designers planned not a single computer but a whole family of computers built around shared software. Examples of third-generation computers are the American IBM 360 of that era and the Soviet ES-1030 and ES-1060. In the late 1960s minicomputers appeared, and in 1971 the first microprocessor. A year later Intel released the first widely known microprocessor, the Intel 8008, and in April 1974 the second-generation Intel 8080.

From the mid-1970s the computers of the fourth generation were developed. They are characterized by the use of large and very large scale integrated circuits (up to a million components per chip). The first fourth-generation computers were released by Amdahl Corp. These computers used high-speed integrated-circuit memory systems with a capacity of several megabytes: when the machine was switched off, the contents of RAM were transferred to disk, and on switching on they were reloaded. The performance of fourth-generation computers is hundreds of millions of operations per second.

Also in the mid-1970s the first personal computers appeared. The further history of computers is closely connected with the development of microprocessor technology. In 1975 the first mass personal computer, the Altair, was created on the basis of the Intel 8080 processor. By the end of the 1970s, thanks to the efforts of Intel, which had developed the new Intel 8086 and Intel 8088 microprocessors, the prerequisites emerged for improving the computing and ergonomic characteristics of computers. In this period the giant IBM corporation joined the competition in the market and set out to build a personal computer based on the Intel 8088 processor. The IBM PC appeared in August 1981 and quickly gained immense popularity. Its successful design predetermined its use as the personal computer standard of the late 20th century.

From 1982 the computers of the fifth generation have been developed, oriented toward the processing of knowledge. Scientists are confident that the processing of knowledge, hitherto characteristic only of human beings, can also be carried out by a computer in order to solve posed problems and make adequate decisions.

In 1985 Microsoft introduced the first version of the Windows operating system (announced in 1983). In America this invention is still counted among the outstanding achievements of the 20th century.

An important proposal was made in March 1989 by Tim Berners-Lee, an employee of the European Organization for Nuclear Research (CERN). The essence of the idea was to create a new distributed information system called the World Wide Web. A hypertext-based information system could integrate CERN's information resources (report databases, documentation, mailing addresses, etc.). The project was accepted in 1990.


State budgetary educational institution of higher professional education "Kursk State Medical University"

INDEPENDENT WORK

In the discipline "Informatics"

The history of the emergence and development of information technology

Completed by:

1st-year student, group 2

Kurbatov Alexey Vladimirovich

Checked by:

Candidate of Pedagogical Sciences, Senior Lecturer, Department of Physics and Informatics

Goryushkin E.I.

Kursk - 2014

Introduction

1. Basic concepts of information technology

2. Stages of development of information technologies

3. Problems of using information technologies

Conclusion

List of used literature

Keywords: information, program, computer

Introduction

Information technology is "a set of methods, production processes, and software and hardware tools combined into a technological chain that provides the collection, processing, storage, transmission, and display of information." The purpose of this chain, that is, of the information technology itself, is to reduce the labor intensity of using an information resource and to increase the reliability and efficiency of that use. The effectiveness of an information technology is ultimately determined by the qualifications of the participants in informatization processes. At the same time, technologies should be as accessible to consumers as possible.

According to the definition adopted by UNESCO, information technology (IT) is "a complex of interrelated scientific, technological and engineering disciplines that study methods for effectively organizing the work of people involved in processing and storing information using computer technology, methods of organizing interaction between people and production equipment, their practical applications, and the associated social, economic and cultural problems."

The main features of modern IT:

computer processing of information;

storage of large amounts of information on machine media;

transmission of information over any distance in the shortest possible time.

Modern material production and other areas of activity increasingly need information services and the processing of huge amounts of information. The universal technical means of processing any information is the computer, which acts as an amplifier of the intellectual capabilities of individuals and of society as a whole, while communication tools based on computers serve to communicate and transmit information. The emergence and development of computers is a necessary component of the informatization of society.

Modern information technologies, with their rapidly growing potential and rapidly declining costs, open up great opportunities for new forms of labor organization and employment, both within individual corporations and in society as a whole. The range of such opportunities is expanding significantly: innovations affect all spheres of people's lives, including family, education, work, and the geographical boundaries of human communities. Today, information technology can make a decisive contribution to strengthening the relationship between the growth of labor productivity, production volumes, investment, and employment. New types of services distributed through networks are able to create many jobs, which the practice of recent years confirms.

Until the early 1980s, information technology was represented mainly by large computers and served the needs of only half of the corporate "pyramid", since its high cost made it impossible to automate the solution of managerial tasks. The automation of repetitive information processing was comparable to the automation of manual labor by machines that replaced people. It is estimated that between 1960 and 1980 over 12 million existing or potential information processing jobs were automated using traditional computers. The automation of workplaces at the lower levels of the administrative hierarchy reduced the size of enterprises but did not cause fundamental changes in the general model of labor organization.

1. Basic concepts of information technology

Technology, translated from the Greek (techne), means art, skill, craftsmanship, and these are nothing other than processes. A process should be understood as a certain set of actions aimed at achieving a set goal. A process is determined by the strategy chosen by a person and is implemented using a combination of various means and methods.

Information technology is a process that uses a set of means and methods for collecting, processing, and transmitting data (primary information) in order to obtain information of a new quality about the state of an object, process, or phenomenon (an information product).

The purpose of information technology is the production of information for analysis by a person and for making, on that basis, a decision to perform some action.

Information technology is closely related to information systems, which are its main environment.

Information technology is a process consisting of clearly regulated rules for performing operations, actions, stages of varying degrees of complexity on data stored in computers. The main goal of information technology is to obtain the information necessary for the user as a result of targeted actions for the processing of primary information.
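
To make these definitions concrete, here is a minimal, purely illustrative Python sketch of the chain "primary data - processing - information product"; the data, figures, and function name are invented for the example and do not come from the text above.

    # Illustrative sketch: primary data is turned by a regulated
    # processing operation into an "information product" for a user.
    # All names and figures are hypothetical.

    daily_sales = [120, 95, 143, 88, 156, 101, 130]  # primary data (collection)

    def process(values):
        """Processing: turn raw observations into information of a new quality."""
        total = sum(values)
        return {"total": total,
                "average": round(total / len(values), 1),
                "peak": max(values)}

    report = process(daily_sales)  # the information product
    print(report)                  # the user analyzes it and makes a decision

The point of the sketch is only the shape of the process: collection of primary data, clearly regulated processing operations, and an output intended for human analysis and decision-making.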

An information system is an environment whose constituent elements are computers, computer networks, software products, databases, people, various kinds of technical and software communications, etc. The main purpose of an information system is to organize the storage and transmission of information. An information system is a human-computer information processing system. The implementation of the functions of an information system is impossible without knowledge of the information technology oriented towards it. Information technology can also exist outside the scope of the information system.

Information technology is a set of well-defined purposeful actions of personnel for processing information on a computer.

Information system - a human-computer system for decision support and production of information products, using computer information technology.

Platforms:

Technology platform (certain type of equipment on which information technology can be installed)

Software platform (operating system)

Desktop platform (for a small team not using a server)

Enterprise platform (for a group or company using one or more servers)

Internet platform (for Internet applications that use a server)

New information technology

Information technology is the most important component of the process of using the information resources of society. To date, it has gone through several evolutionary stages, the change of which was determined mainly by the development of scientific and technological progress, the emergence of new technical means of information processing. In modern society, the main technical means of information processing technology is a personal computer, which significantly influenced both the concept of building and using technological processes, and the quality of the resulting information. The introduction of a personal computer into the information sphere and the use of telecommunication means of communication determined a new stage in the development of information technology.

New information technology - information technology with a "friendly" user interface, using personal computers and telecommunications.

Three basic principles of new (computer) information technology:

Interactive (dialog) mode of work with a computer;

Integration (docking, interconnection) with other software products;

Flexibility in the process of changing both data and task definitions.

Information technology tools. The technological process of material production is carried out using various technical means: equipment, machine tools, instruments, conveyor lines, and so on. By analogy, the technical means of information production are the hardware, software, and mathematical support of this process. With their help, primary information is processed into information of a new quality.

An information technology tool is one or more interrelated software products for a specific type of computer, whose technology of use allows the user to achieve the goal he has set.

Types of software products for a personal computer: word processors (editors), desktop publishing systems, spreadsheets, database management systems, electronic notebooks, electronic calendars, functional information systems (financial, accounting, marketing, etc.), expert systems, and so on.

Information technology requirements:

Low cost, within the reach of the individual buyer;

Autonomy in operation without special requirements for environmental conditions;

Flexibility of architecture, ensuring its adaptability to a variety of applications: in management, science, education, in everyday life;

- "friendliness" of the operating system and other software, causing the user to work with it without special professional training;

High reliability of work (more than 8000 hours between failures).

Information technology components:

Level 1 - stages: relatively long technological processes consisting of operations and actions of the subsequent levels.

Level 2 - operations: as a result of each operation, a specific object is created in the software environment chosen at level 1.

Level 3 - actions: sets of work methods, standard for each software environment, that lead to the fulfillment of the goal set in the corresponding operation. Each action changes the content of the screen.

Level 4 - elementary operations: individual mouse and keyboard operations.
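
The four-level decomposition above can be pictured as nested structures. The following Python sketch is only an illustration; the particular stage ("prepare a document"), its operations, actions, and elementary operations are hypothetical examples, not taken from any specific software environment.

    # Illustrative sketch of the four levels of an information technology.
    # The concrete stage, operations, actions, and elementary operations
    # below are hypothetical.

    technology = {
        "stage": "prepare a document",      # level 1: a long technological process
        "operations": [                     # level 2: objects created in the environment
            {
                "operation": "create the text",
                "actions": [                # level 3: standard work methods
                    {"action": "enter a heading",
                     "elementary": ["press key 'R'", "press key 'e'"]},  # level 4
                    {"action": "insert a table",
                     "elementary": ["click the menu", "click 'Table'"]},
                ],
            },
        ],
    }

    for op in technology["operations"]:
        print("Operation:", op["operation"])
        for act in op["actions"]:
            print("  Action:", act["action"],
                  "-", len(act["elementary"]), "elementary operations")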

2. Stages of development of information technologies

There are several points of view on the development of information technologies using computers, which are determined by various signs of division.

Common to all the approaches outlined below is that with the advent of the personal computer a new stage in the development of information technology began. Its main goal is to satisfy the personal information needs of people, both in the professional sphere and in everyday life.

Criterion of division - the type of tasks and information processing processes

Stage 1 (1960s-1970s) - data processing in computing centers in collective-use mode. The main direction in the development of information technology was the automation of routine human operations.

Stage 2 (from the 1980s) - the creation of information technologies aimed at solving strategic problems.

Criterion of division - the problems standing in the way of the informatization of society

Stage 1 (until the end of the 1960s) is characterized by the problem of processing large amounts of data under the constraints of limited hardware capabilities.

Stage 2 (until the end of the 1970s) is associated with the spread of IBM/360-series computers. The problem of this stage was that software lagged behind the level of hardware development.

Stage 3 (from the beginning of the 1980s) - the computer becomes a tool for the non-professional user, and information systems become a means of supporting decision-making. The problems are the fullest possible satisfaction of the user's needs and the creation of an appropriate interface for working in a computer environment.

Stage 4 (from the beginning of the 1990s) - the creation of modern technologies for interorganizational relations and information systems. The problems of this stage are very numerous.

The most significant of them are:

Development of agreements and establishment of standards, protocols for computer communications;

Organization of access to strategic information;

Organization of protection and security of information.

Criterion of division - the advantages brought by computer technology

Stage 1 (from the beginning of the 1960s) is characterized by fairly efficient processing of routine operations, oriented toward the centralized collective use of computing-center resources. The main criterion for evaluating the effectiveness of the information systems created was the difference between the funds spent on development and the funds saved as a result of implementation. The main problem at this stage was psychological: poor interaction between the users for whom the information systems were created and their developers, owing to differences in their views and in their understanding of the problems being solved. As a consequence, systems were created that users perceived poorly and, despite their rather large capabilities, did not use to the full.

Stage 2 (from the mid-1970s) is associated with the advent of personal computers. The approach to creating information systems changed: the orientation shifted toward the individual user and the support of his decision-making. The user became interested in the ongoing development, contact was established with the developer, and mutual understanding arose between the two groups of specialists. At this stage, both the centralized data processing typical of the first stage and decentralized processing, based on solving local problems and working with local databases at the user's workplace, were used.

Stage 3 (from the beginning of the 1990s) is associated with the concept of analyzing strategic advantages in business and is based on the achievements of telecommunications technology for distributed information processing. Information systems aim not merely to increase the efficiency of data processing and help the manager; the appropriate information technology should help the organization survive the competition and gain an advantage.

Criterion of division - the types of technology tools

Stage 1 (until the second half of the 19th century) - "manual" information technology, whose tools were the pen, the inkwell, and the book. Communications were carried out manually, by sending letters, packages, and dispatches through the mail. The main goal of the technology was to present information in the required form.

Stage 2 (from the end of the 19th century) - "mechanical" technology, whose tools were the typewriter, the telephone, the voice recorder, and a postal service equipped with more advanced means of delivery. The main goal of the technology was to present information in the required form by more convenient means.

Stage 3 (1940s-1960s) - "electrical" technology, whose tools were large computers and the associated software, electric typewriters, copiers, and portable voice recorders.

The purpose of the technology is changing. The emphasis in information technology is beginning to shift from the form of information presentation to the formation of its content.

Stage 4 (from the beginning of the 1970s) - "electronic" technology, whose main tools are large computers and the automated control systems (ACS) and information retrieval systems (IRS) built on their basis, equipped with a wide range of basic and specialized software packages. The center of gravity of the technology shifts even further toward forming the content side of information for management environments in various spheres of public life, especially toward the organization of analytical work. Many objective and subjective factors prevented the tasks set for this new concept of information technology from being solved in full. However, experience was gained in forming the content side of management information, and a professional, psychological, and social basis was prepared for the transition to a new stage in the development of the technology.

Stage 5 (from the mid-1980s) - "computer" ("new") technology, whose main tool is the personal computer with a wide range of standard software products for various purposes. At this stage, the personalization of automated control systems takes place, manifested in the creation of decision support systems for particular specialists. Such systems have built-in elements of analysis and intelligence for different levels of management, are implemented on personal computers, and use telecommunications. With the transition to a microprocessor base, technical devices for domestic, cultural, and other purposes also undergo significant changes. Global and local computer networks begin to be widely used in various fields.

3. Problems of using information technologies

It is natural for information technologies to become obsolete and be replaced by new ones.

Therefore, when introducing a new information technology, it must be taken into account that information products have an extremely high rate of replacement by new types or versions: replacement periods range from a few months to one year. For information technologies to be used effectively, they must be upgraded regularly.

There are the following types of information processing:

Centralized;

Decentralized.

Centralized processing of information on the computers of computing centers was historically the first established technology. Large collective-use computing centers were created, equipped with large computers, which made it possible to process large arrays of input information and to obtain on this basis various kinds of information products, which were then transmitted to users.

Advantages of the centralized technology methodology:

The user's ability to access large amounts of information in the form of databases and a wide range of information products;

Relative ease of implementation of methodological solutions for the development and improvement of information technology due to their centralized adoption.

Disadvantages of the centralized technology methodology:

The limited responsibility of lower-level personnel, which does not contribute to the user's prompt receipt of information and thereby hinders the correct development of management decisions;

Restriction of the user's capabilities in the process of obtaining and using information.

Decentralized information processing is associated with the advent of personal computers and the development of telecommunications. It gives the user ample opportunity to work with information and does not limit their initiative. Its advantages and disadvantages are listed below; a small sketch contrasting the two methodologies follows the lists.

The advantages of the decentralized information processing methodology are:

Flexibility of the structure, providing scope for user initiatives;

Strengthening the responsibility of lower-level employees;

Reducing the need to use a central computer and, accordingly, control from the computer center;

More complete realization of the user's creative potential through the use of computer communications.

But this methodology also has disadvantages:

The complexity of standardization due to the large number of unique developments;

Psychological rejection by users of the standards recommended by the computer center and ready-made software products;

The uneven development of the level of information technology at local sites, which is determined primarily by the skill level of the particular employee.
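
The difference between the two methodologies can be shown with a small, purely illustrative Python sketch: in the centralized variant all raw data is shipped to one computing center, while in the decentralized variant each workplace processes its own data and shares only the results. The branch names and figures are invented for the example.

    # Illustrative contrast between centralized and decentralized processing.
    # All names and figures are hypothetical.

    branch_data = {
        "branch_A": [10, 12, 9],
        "branch_B": [7, 14, 11],
    }

    def centralized(data):
        """All raw data goes to a single computing center for processing."""
        all_values = [v for values in data.values() for v in values]
        return {"grand_total": sum(all_values)}

    def decentralized(data):
        """Each workplace processes its own data; only results are shared."""
        local = {name: sum(values) for name, values in data.items()}
        return {"per_branch": local, "grand_total": sum(local.values())}

    print(centralized(branch_data))    # one center sees and controls everything
    print(decentralized(branch_data))  # local initiative, shared summaries

In the centralized variant the center controls the whole computation and can enforce common standards; in the decentralized variant each site computes in its own way, which is exactly the standardization difficulty noted in the list above.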

Conclusion

In our time, humanity is experiencing a scientific and technological revolution whose material basis is electronic computing technology. On the basis of this technology, a new kind of technology has appeared: information technology.

Information technology refers to the processing of information on the basis of computer systems.

Thus, information technologies have firmly entered our lives. They have opened up new opportunities for work and leisure and have made it possible to greatly ease people's work.

Information technologies comprise processes in which the "source material" and the "product" (output) are information. Of course, the information being processed is tied to material carriers, so these processes also involve the processing of matter and energy; but the latter is not essential for information technology, since the main role is played by the information itself, not by its carrier. The most widespread global network is the Internet. Numerous forecasts indicated that by the beginning of the next century the Internet would not only turn the already familiar personal computer into something fundamentally different, but would also change the way of life of the majority of the world's population.

Modern society can hardly be imagined without information technology. Even specialists find it difficult to foresee the prospects for the development of computer technology today. However, it is clear that something big awaits us in the future. And if the pace of development of information technology does not slow down (and there is no reason to doubt this), that future will arrive very soon; the main thing is to direct the development of this powerful tool in the right direction.

List of used literature

1. Makarova N.V., Volkov V.B. Informatics: A Textbook for Universities. St. Petersburg: Piter, 2011. 576 p.

2. Figurnov V.E. IBM PC for the User. 7th ed. Moscow: Infra-M, 2006. 640 p.

3. Simonovich S.V. (ed.). Informatics. St. Petersburg: Piter, 2005.

4. Mikheeva E.V., Titov O.I. Informatics: A Textbook for Students of Secondary Vocational Education Institutions. 4th ed. Moscow: Academia Publishing Center, 2010. 352 p.

5. Makarova N.V. (ed.). Informatics: A Textbook for Students of Economic Specialties of Higher Educational Institutions. 3rd rev. ed. Moscow: Finance and Statistics, 2004. 765 p.

6. Akinshina L.V., Shaker T.D. Modern Information Technologies in Education. Part 1. Vladivostok: Far Eastern State Technical University Publishing House, 2004. 211 p.

7. Batin N.V. Fundamentals of Information Technology. Minsk: Institute for the Training of Scientific Personnel, National Academy of Sciences of Belarus, 2008. 235 p.

8. Romanova Yu.A. (ed.). Informatics. Moscow: Eksmo, 2005. 322 p.

9. Ostreykovsky V.A. Informatics. Moscow: Vysshaya Shkola, 2001. 319 p.

10. Homonenko A.D. Fundamentals of Modern Computer Technologies. Moscow: Korona Print, 2009. 448 p.
