The history of the emergence and development of information technology. The concept of information technology

The history of the emergence and development of information technology.

The history of information technology has its roots in ancient times. The first and simplest digital device was the abacus: everything that could be counted piece by piece was calculated using such digital devices.

In 1949, the first tube computer was built - a universal computer of a new generation. In management activities, first-generation computers were used to solve the most labor-intensive individual tasks, for example payroll and materials accounting, as well as individual optimization tasks.

Since 1955, computers have been produced on transistors; their dimensions became smaller, power consumption decreased, and speed increased. Since 1960, the production of computers based on integrated circuits (chips) has been under way. Computer technology based on transistors and microcircuits marked the creation of second-generation computers.

In 1964, third-generation computers were created using electronic circuits of small and medium degrees of integration. In the late 1960s the first minicomputers appeared, and in 1971 the first microprocessor. From that time on, it was no longer individual computers that were developed and designed, but whole families of computer hardware built around the use of software. Software came to be considered an independent and at the same time integral part of computer technology.

In the mid-1970s, fourth-generation computers were developed, using large-scale and very-large-scale integrated circuits and memory capacities of several megabytes. When such a computer is turned off, the contents of random access memory are transferred to disk; when it is turned on, the system boots itself.

Since 1982, the development of fifth-generation computers focused on knowledge processing has been under way. Prior to this, it was believed that knowledge processing was characteristic only of humans. In management activities, fifth-generation computers are used to solve complex economic problems and provide an object-oriented approach to solving individual problems. This generation is characterized by a wide range of applications, an intelligent interface, the presence of information-advising systems and decision support systems, an interactive mode of user operation, and a network organization of information structures. With the creation of fifth-generation computers, the term NIT (new information technology) appeared, meaning the combination of computer technology, communications and office equipment.

The concept of information. Basic properties of information.

The concept of information is one of the main ones in modern science. The importance of information in the life of society is growing rapidly, methods of working with information are changing, and the scope of new information technologies is expanding.

Information is data about objects and phenomena of the environment, their parameters, properties and state, which reduces the degree of uncertainty and incompleteness of knowledge about them.

Information should be understood not as the objects and processes themselves, but as their reflection or representation in the form of numbers, formulas, descriptions, drawings, symbols and samples.

Basic properties of information: reliability and completeness; value and relevance; clarity and intelligibility.

Information is reliable if it does not distort the true state of affairs. Information is complete if it is sufficient for understanding and decision making. The value of information depends on what tasks are solved with its help. Keeping information up to date is essential when working in a constantly changing environment. Information becomes clear and useful when it is expressed in the language spoken by those for whom it is intended.

Characteristics of modern computer facilities.

Characteristics of the microprocessor. There are various models of microprocessors manufactured by different companies. The main characteristics of a microprocessor are its clock frequency and bit depth. The operating mode of the microprocessor is set by a microcircuit called the clock frequency generator. This is a kind of metronome inside the computer: a certain number of cycles is allocated for the execution of each operation by the processor. Clock frequency is measured in megahertz.

The next characteristic is the processor's bit depth. Bit depth is the maximum length of a binary code that can be processed or transmitted by the processor as a whole. Most modern PCs use 32-bit processors, while the highest-performance machines have 64-bit processors.
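As a simple illustration (a sketch, not taken from the text), the bit depth limits the largest unsigned value that fits in one processor word:

    # Sketch: largest unsigned value that fits in a processor word of a given bit depth.
    def max_word_value(bits):
        return 2 ** bits - 1

    print(max_word_value(32))   # 4294967295 for a 32-bit word
    print(max_word_value(64))   # 18446744073709551615 for a 64-bit word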

The amount of internal (RAM) memory. Computer memory is divided into random access (internal) memory and long-term (external) memory. The performance of the machine depends heavily on the amount of internal memory. If there is not enough internal memory for some programs to work, the computer begins to transfer part of the data to external memory, which drastically reduces its performance. Modern programs require tens and hundreds of megabytes of RAM to run well.

Characteristics of external memory devices. External storage devices are drives based on magnetic and optical disks. Magnetic disks embedded in the system unit are called hard drives (hard disks). Reading and writing to a hard drive is faster than to all other types of external media, but still slower than to RAM. The larger the hard drive, the better. Modern PCs have hard drives whose capacity is measured in tens and hundreds of gigabytes. When buying a computer, you get the necessary set of programs on the hard drive; usually the buyer himself specifies the composition of the computer's software.

All other external memory media are removable, that is, they can be inserted into and removed from a drive. These include floppy disks and optical discs: CD-ROM, CD-RW, DVD-ROM.

Recently, flash memory has been replacing floppy disks as the main means of transferring information from one computer to another. Flash memory is an electronic external memory device used to read and write information in file form. Like disks, flash memory is a non-volatile device. However, compared to floppy disks, flash memory has a much larger capacity (hundreds and thousands of megabytes), and the speed of reading and writing data on a flash drive approaches the speed of RAM.

All other device types are considered input/output devices. The mandatory ones are the keyboard, the monitor and a pointing device (usually a mouse). Additional devices include a printer, modem, scanner, sound system and some others. The choice of these devices depends on the needs and financial capabilities of the buyer.

The emergence of the OS

In the mid-1940s, the first tube computing devices were created. Programming was carried out exclusively in machine language. There was no system software except for libraries of mathematical and utility subroutines. Operating systems had not yet appeared; all the tasks of organizing the computing process were solved manually by each programmer from the control panel.

Since the mid-1950s, a new period in the development of computer technology began, associated with the emergence of a new technical base - semiconductor elements. The speed of processors increased, as did the amount of RAM and external memory.

To organize effective sharing of translators, library programs and loaders, operator positions were introduced into the staff of many computing centers. But most of the time the processor stood idle, waiting for the operator to start the next task. To solve this problem, the first batch processing systems were developed, which automated the entire sequence of operator actions in organizing the computational process. Early batch processing systems were the prototype of modern operating systems: they became the first system programs designed not to process data but to control the computing process.

During the implementation of batch processing systems, a formalized job control language was developed, with the help of which the programmer told the system and the operator what actions he wanted to perform on the computer and in what sequence. A typical set of directives usually included a marker for the beginning of a separate job, a translator call, a loader call, and markers for the beginning and end of the source data.

The operator compiled a package of jobs, which were later launched for execution in sequence, without his participation, by the control program - the monitor. In addition, the monitor could independently handle the most common emergency situations encountered during the operation of user programs, such as missing input data, register overflow, division by zero, access to a non-existent memory area, and so on. The package was usually a deck of punched cards, but to speed up work it could be transferred to a more convenient and capacious medium, such as magnetic tape or magnetic disk. In the first implementations the monitor program itself was also stored on punched cards or punched tape, and in later implementations on magnetic tape and magnetic disks.
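As an illustration only (the directive and job names are hypothetical, not taken from any real job control language), a small Python sketch of how such a monitor could run a package of jobs one after another and handle simple error situations itself:

    # Hypothetical sketch of a batch monitor: it takes a package of jobs and runs
    # the steps of each job in sequence, reporting simple emergencies on its own.
    jobs = [
        {"name": "JOB1", "steps": ["TRANSLATE", "LOAD", "RUN"], "data": (4, 2)},
        {"name": "JOB2", "steps": ["TRANSLATE", "LOAD", "RUN"], "data": (7, 0)},
    ]

    def run_job(job):
        for step in job["steps"]:
            print(job["name"], "->", step)
        a, b = job["data"]
        try:
            print(job["name"], "result:", a / b)
        except ZeroDivisionError:
            # the monitor handles the emergency itself and moves on to the next job
            print(job["name"], "aborted: division by zero")

    for job in jobs:        # the whole package runs without operator intervention
        run_job(job)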

Early batch processing systems significantly reduced the amount of time spent on auxiliary activities to organize the computing process, which means that another step was taken to increase the efficiency of using computers. However, at the same time, user programmers lost direct access to the computer, which reduced their efficiency - making any correction required much more time than when working interactively at the machine's console.

8. Integrated application packages. Advantages of their use in information technology.

Integrated packages are a set of several software products that functionally complement each other and support unified information technologies implemented on a common computing and operating platform.

The most common integrated packages include the following components:

Text editor;

Table processor;

Organizer;

Email support tools;

Presentation programs;

Graphics editor.

The components of integrated packages can work in isolation from each other, but the main advantages of integrated packages appear when the components are judiciously combined with each other. Users of integrated packages have a unified interface across the various components, which makes them relatively easy to master.

Distinctive features of this class of software tools are:

Completeness of information technology for end users;

The same type of end-user interface for all programs included in the integrated package - common menu commands, standard icons for the same functions, standard construction of and work with dialog windows, etc.;

Common service for programs of the integrated package (for example, dictionary and spell checker, chart builder, data converter, etc.);

Ease of exchange of and reference to objects created by programs of the integrated package (two methods are used: DDE - dynamic data exchange, and OLE - object linking and embedding), and uniform transfer of objects;

Availability of a single language platform for developing macros and user programs;

The ability to create documents that integrate the capabilities of the various programs included in the integrated package.

Integrated packages are also effective in group work on a network of many users. For example, from the application program in which the user works, you can send documents and data files to another user, while supporting standards for transferring data in the form of objects over a network or via e-mail.

The concept of style.

A style is a kind of command that allows you to apply simultaneously, to a specified part of the text, all the formatting attributes defined for that style: font; offsets from the left and right margins; line spacing; alignment; indents; permission or prohibition of hyphenation.
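As a rough sketch only (the attribute names below are hypothetical, not Word's actual settings), a style can be thought of as a named bundle of formatting attributes applied in a single operation:

    # Sketch: a style as a named set of formatting attributes applied all at once.
    heading_style = {
        "font": "Arial 14 pt bold",
        "left_indent_cm": 0,
        "line_spacing": 1.5,
        "alignment": "centered",
        "hyphenation": False,
    }

    def apply_style(paragraph, style):
        # one command sets every listed attribute of the paragraph at the same time
        paragraph.update(style)
        return paragraph

    print(apply_style({"text": "Chapter 1"}, heading_style))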

Table of contents entries can be entered manually, and tabs can be used to create dotted lines or dotted leaders between entries and their page numbers. A faster way of creating a table of contents is the automatic one. To place the table of contents in the center, select the Centered option in the Alignment group; to mark the beginning of a paragraph, press the TAB key.

Table editing.

The Word editor has two alternative ways of editing tables: using the mouse and using menu commands.

Each table consists of a certain number of cells. If the image of table separator lines is enabled using the Table / Display grid command, then all cells of the table are clearly visible. The Tab key is used to move the text cursor over the table cells.

You can select text in a table using the mouse or the keyboard. To select individual characters in a table, you can use the Shift key in combination with the cursor keys. To select an individual table cell with the mouse, you can triple-click in that cell or use the selection bar that each table cell has between the grid line and the cell text.

In order to select a separate column of the table with the mouse, you need to move the mouse pointer to the top of the table, where it will take the form of a black arrow pointing down, and then click the mouse. Selecting a table row is similar to selecting a line of text: using the selection bar to the left of the document border.

Also, to select individual rows and columns of a table, you can use the menu commands Table / Select row and Table / Select column.

To insert columns or rows, just select a column or row, open the Table / Insert menu and choose the appropriate command.

To delete rows, columns or cells, select the row, column or cell you want to delete, then choose Table / Delete Cells, Delete Rows or Delete Columns.

Editing a table also includes resizing rows, columns, and cells.

To split one cell into several, just right-click on it and select the Split Cells command, or use the menu command Table / Split Cells. Then specify how many rows and columns you want to split the selected cell into, and click OK.

To merge two or more cells into one, select these cells, then execute the Table/Merge Cells command or use a similar command from the context menu.

To adjust the width of columns, select the columns whose width you want to change, then select the menu Table / Cell Height and Width, click the Column tab, then enter the desired width value in the Column Width field, click OK.

To adjust the row height, select the rows whose height you want to change; select Table / Cell Height and Width from the menu, click the Row tab and specify the exact value in the Row Height list.

If the table spans several pages of the document, you can set the automatic repetition of the first line of the table by selecting the menu command Table / Headings.

25. Purpose and general characteristics of the Microsoft Excel spreadsheet editor.

Microsoft Excel is a powerful spreadsheet editor designed to perform the whole range of table processing tasks: from creating spreadsheet documents to calculating mathematical functions and plotting graphs based on them, as well as printing them out.

It works with many fonts, both in Russian and in any of twenty-one other languages of the world. Among the many useful features of Excel are automatic correction of cell text, automatic word wrapping and spelling correction, saving of the document at a set time interval, and wizards for standard tables, blanks and templates that allow you to create an expense report, balance sheet, timesheet, invoice, financial templates and more in a matter of minutes. Excel can search for a given word or text fragment, replace it with a specified fragment, delete it, copy it to the internal clipboard, or change its font, typeface or font size, as well as superscript or subscript characters.

In this respect, Excel is in many ways similar to the Microsoft Word text editor, but it also has its own peculiarities: for each cell you can set number formats, alignment, cell merging, text direction at any angle, and so on. With the help of Excel macros you can include graphic objects, pictures and music modules in *.wav format.

To restrict access to the document, you can set a password for tables, which Excel will ask for when the tables are loaded before any actions can be performed with them. Excel allows you to open many windows to work with several tables at the same time.

Vector graphics.

Vector graphics are images created (or rather, described) using mathematical formulas. Unlike raster graphics, which are nothing more than an array of colored pixels and store information for each of them, vector graphics are a set of graphical primitives described by mathematical formulas. For example, to draw a line on the screen you only need to know the coordinates of its start and end points and the color to draw it with, and to build a polygon - the vertex coordinates, the fill color and, if necessary, the stroke color.
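As a minimal sketch (not tied to any real vector format), this is roughly how such primitives can be described, and why scaling them loses nothing - only the stored numbers change:

    # Sketch: vector primitives are stored as coordinates and colors, not pixels.
    line = {"type": "line", "start": (0, 0), "end": (100, 50), "color": "black"}
    polygon = {
        "type": "polygon",
        "vertices": [(0, 0), (60, 0), (30, 40)],
        "fill": "red",
        "stroke": "black",
    }

    def scale(primitive, factor):
        # scaling just multiplies the stored coordinates; no detail is ever lost
        scaled = dict(primitive)
        for key in ("start", "end"):
            if key in scaled:
                scaled[key] = tuple(c * factor for c in scaled[key])
        if "vertices" in scaled:
            scaled["vertices"] = [tuple(c * factor for c in v) for v in scaled["vertices"]]
        return scaled

    print(scale(line, 10))
    print(scale(polygon, 10))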


Raster graphics.

Raster graphics are images made up of pixels - small colored squares placed in a rectangular grid. A pixel is the smallest unit of a digital image. The quality of a bitmap directly depends on the number of pixels it consists of - the more pixels, the more detail can be displayed. You cannot enlarge a raster image simply by zooming in, because the number of pixels cannot be increased. Many people have been convinced of this when trying to make out small details in a small digital photograph by enlarging it on the screen: nothing can be discerned except growing squares (those are the pixels). Only CIA agents in Hollywood films succeed in such a trick, when they recognize a car's license plate by magnifying the picture from an external surveillance camera. If you are not an employee of that agency and do not own such magical equipment, it will not work for you.

A bitmap image has several characteristics. For a stock photographer, the most important are resolution, size and color model.

Resolution is the number of pixels per inch (ppi - pixels per inch) for describing display on screen, or the number of dots per inch (dpi - dots per inch) for printing images.

Size - the total number of pixels in an image, usually measured in Mp (megapixels), is simply the result of multiplying the number of pixels in the height by the number of pixels in the width of the image.

A color model is a characteristic of an image that describes its representation based on color channels.
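For illustration (the figures are arbitrary examples, not taken from the text), the size in megapixels and the uncompressed data volume follow directly from these definitions:

    # Sketch: image size in megapixels and uncompressed file size for an RGB image.
    width, height = 4000, 3000          # pixels
    bytes_per_pixel = 3                 # 8 bits per channel in an RGB color model

    megapixels = width * height / 1_000_000
    uncompressed_mb = width * height * bytes_per_pixel / (1024 * 1024)

    print(f"{megapixels:.1f} Mp")            # 12.0 Mp
    print(f"{uncompressed_mb:.1f} MB raw")   # about 34.3 MB before compression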


Raster format

Raster images are formed in the process of scanning multi-color illustrations and photographs, as well as when using digital photo and video cameras. You can create a bitmap image directly on your computer using a bitmap graphics editor.

A bitmap image is created using dots of different colors (pixels) that form rows and columns. Each pixel can take on any color from a palette containing tens of thousands or even tens of millions of colors, so bitmap images provide high fidelity in color reproduction and halftones. The quality of a bitmap image increases with increasing spatial resolution (the number of pixels in the image horizontally and vertically) and the number of colors in the palette.

Advantages of raster graphics:

Ability to reproduce images of any level of complexity. The amount of detail reproduced in an image largely depends on the number of pixels.

Accurate reproduction of color transitions.

The presence of many programs for displaying and editing raster graphics. The vast majority of programs support the same raster graphics file formats. Raster representation is perhaps the "oldest" way to store digital images.

Disadvantages of raster graphics:

Large file size. In fact, color information has to be stored for every pixel (its position is implicit in the grid).

The impossibility of scaling (in particular, zooming in) the image without loss of quality.

Vector graphics are images created (or rather, described) using mathematical formulas. Unlike raster graphics, which are nothing more than an array of colored pixels and store information for each of them, vector graphics are a set of graphical primitives described by mathematical formulas.

Thanks to this way of presenting graphic information, a vector image can not only be scaled both up and down, but you can also rearrange the primitives and change their shape to create completely different images from the same objects.

Advantages of vector graphics:

Small file size with relatively simple image detailing.

Possibility of unlimited scaling without loss of quality.

Ability to move, rotate, stretch, group, etc. without losing quality.

Ability to position objects along an axis perpendicular to the screen plane (along the z-axis - "above", "below", "above all", "below all").

Ability to perform Boolean operations on objects - union, subtraction, intersection and exclusion.

Line thickness control at any image scale.

Disadvantages of vector graphics:

Large file size with complex image detail. (There are times when, due to many small complex details, the size of the vector image is much larger than the size of its raster copy)

Difficulty in conveying a photorealistic image (this follows from the first drawback).

Compatibility problems between programs working with vector graphics: not all programs open (or correctly display) even "common" formats (such as EPS) created in other editors.

The concept of color in graphics.

Color is an extremely difficult problem for both physics and physiology, since it has both a psychophysiological and a physical nature. The perception of color depends on the physical properties of light, i.e. electromagnetic energy, on its interaction with physical substances, and on their interpretation by the human visual system. In other words, the color of an object depends not only on the object itself, but also on the light source illuminating the object and on the human visual system. Moreover, some objects reflect light (a board, paper), while others transmit it (glass, water). If a surface that reflects only blue light is illuminated with red light, it will appear black. Similarly, if a green light source is viewed through glass that transmits only red light, it will also appear black.
In computer graphics, two primary color mixing systems are used: additive - red, green, blue (RGB) and subtractive - cyan, magenta, yellow (CMY). The colors of one system are complementary to the colors of the other: cyan to red, magenta to green, and yellow to blue. The complementary color is the difference between white and the given color.
The subtractive CMY color system is used for reflective surfaces such as printing inks, films, and non-luminous screens.
The additive RGB color system is useful for luminous surfaces such as CRT screens or color lamps.

Additive color is obtained by combining light of different colors. In this scheme, the absence of all colors gives black, and the presence of all colors gives white. The additive color scheme works with emitted light, such as that of a computer monitor. In the subtractive color scheme, the process is reversed: any color is obtained by subtracting other colors from the total beam of light. Here white is the result of the absence of all colors, while their presence gives black. The subtractive color scheme works with reflected light.
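Using the statement above that the complementary color is the difference between white and the given color, the link between RGB and CMY can be written as a one-line conversion; a sketch with channel values scaled to the range 0..1:

    # Sketch: converting between the additive RGB and subtractive CMY descriptions
    # of the same color, using "complement = white minus the given color".
    def rgb_to_cmy(r, g, b):
        # channel values in the range 0..1
        return (1 - r, 1 - g, 1 - b)

    def cmy_to_rgb(c, m, y):
        return (1 - c, 1 - m, 1 - y)

    print(rgb_to_cmy(1, 0, 0))   # red in RGB   -> (0, 1, 1): magenta + yellow inks
    print(rgb_to_cmy(1, 1, 1))   # white light  -> (0, 0, 0): no ink at all
    print(cmy_to_rgb(1, 0, 0))   # pure cyan ink -> (0, 1, 1): green + blue light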

RGB color system

A computer monitor creates color directly by emitting light and uses the RGB color scheme. If you look at the monitor screen from a close distance, you will notice that it consists of tiny dots of red, green and blue. The computer can control the amount of light emitted through each colored dot, and by combining different amounts of the three colors it can create any color. The RGB scheme, determined by the nature of computer monitors, is the most popular and widespread, but it has a drawback: computer drawings do not always stay on the monitor; sometimes they have to be printed, and then another color system must be used - CMYK.

CMYK color system

This system was widely known long before computers were used to create graphic images. Computers are used to separate image colors into the CMYK components, and special models of them have been developed for printing. Converting colors from the RGB system to the CMYK system faces a number of problems. The main difficulty is that colors can change between the two systems: they have a different nature of producing color, and what we see on the monitor screen can never be repeated exactly in print. Currently there are programs that allow you to work directly in CMYK colors. Vector graphics programs have long had this capability, and raster graphics programs have only recently begun to provide users with the means to work with CMYK colors and fine-tune how the drawing will look when printed.
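As a hedged sketch only: one commonly used naive conversion derives a black (K) channel from the common gray component of CMY; real programs rely on device color profiles, so, as noted above, printed colors may still differ from what is seen on the screen.

    # Naive RGB -> CMYK sketch (channels 0..1); real conversions rely on color profiles.
    def rgb_to_cmyk(r, g, b):
        k = 1 - max(r, g, b)            # black = what all three inks would share
        if k == 1:                      # pure black: avoid division by zero
            return (0, 0, 0, 1)
        c = (1 - r - k) / (1 - k)
        m = (1 - g - k) / (1 - k)
        y = (1 - b - k) / (1 - k)
        return (c, m, y, k)

    print(rgb_to_cmyk(0.0, 0.0, 0.0))   # black  -> (0, 0, 0, 1)
    print(rgb_to_cmyk(1.0, 0.5, 0.0))   # orange -> (0.0, 0.5, 1.0, 0.0)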

PowerPoint presentations.

The simplest and most common electronic presentation format is the PowerPoint presentation. With this program, you can use audio and video files in your presentation and create simple animations. The main advantage of this presentation format is the ability to make changes to the presentation without special knowledge and skills, adapting it to different audiences and goals.

PDF presentations

Another fairly simple type of computer presentation is a presentation in PDF format. This is a version of an electronic catalog, convenient for distribution by e-mail, posting on a website and printing on a printer. The main advantage of a presentation in PDF format is its small file size, which makes it easy to distribute by e-mail. A PDF presentation is static and suitable for any printer and operating system, but this is also its disadvantage.

Video presentation

In this type of presentation, computer graphics and other animated special effects give way to a live picture - a video image. This type of presentation is becoming a thing of the past because of the limited capabilities of video: ordinary presentations that take more than 5-7 minutes are not perceived well by the audience, and in such a period of time it is impossible to show all the necessary information using video. In addition, video is associated with boring corporate films and other tedious formats - this is another drawback of this form of presentation. The main advantage is a lively, trustworthy picture.

Multimedia presentations

Multimedia presentations are the most extensive type of presentation in terms of capabilities. This presentation format allows you to integrate sound, video files, animation, 3D objects and any other elements without sacrificing quality. The main and indisputable advantage of multimedia presentations is the possibility of incorporating virtually any other format into them - PowerPoint presentations, PDF presentations and video presentations.

Flash presentations

Almost all the best multimedia presentations are based on Flash presentations. A Flash presentation is created as a single file, without folders and accompanying documents, can start automatically when a disc is loaded, and uses the brightest, most saturated animation. Another advantage of a Flash-based presentation is its relatively small size, which makes it possible to post such presentations on the Internet or distribute them on mini-discs.

Proper structuring of the presentation makes it easier for listeners to perceive the information. During the speech, it is advisable to adhere to the well-known rule of three parts: introduction - main part - conclusion. The presentation is followed by a question-and-answer session. Thus, four functional parts are distinguished in the structure of the presentation, each of which has its own tasks and means. Let us pay attention to the "power" parts of the presentation - the conclusion and the introduction. Yes, in exactly this sequence: when preparing, the final part is written first and only then the introductory part. Why? Because the conclusion is the most important part of the presentation, the one the audience should remember most. The content of the entire presentation should be aimed precisely at a successful conclusion. People almost always make the final decision at the end of the presentation. Therefore, in the final part, recall the main idea once again, focus on the key details, and emphasize the advantages of your proposal. The introduction and conclusion are the brightest moments of the presentation; every word in them should be thought out and weighed.

PowerPoint window

When PowerPoint starts, a blank title slide is created and displayed in the program window.

As in other Microsoft Office applications, the title bar runs along the top of the PowerPoint window; below it are the main menu and the toolbars.

The main menu contains a Slide Show item, which is not available in other applications' windows. It allows you to see how the slide show will play out. At the bottom of the window is the status bar. It displays explanatory information: the number of the current slide, the total number of slides, and the presentation type.

How PowerPoint is displayed after launch is determined by the settings made on the View tab of the Options dialog box on the Tools menu. On this tab you can select the Startup Task Pane check box, which will display the Getting Started task pane on the right side of the window.

Slides can be in landscape or portrait orientation. To move between slides, you can use the scroll bar or the buttons located on it: Next Slide (Next Slide) and Previous Slide (Previous Slide). The PageUp and PageDown keys serve the same purposes. At the bottom left of the presentation window are buttons that allow you to change the view mode of your presentation.

There are five modes in PowerPoint that provide a wide range of options for creating, building and presenting presentations. In Slide View, you can work on individual slides. The slide sorter view allows you to change the order and status of the slides in your presentation. The notes page mode is intended for entering abstracts or a brief summary of the report. In the show mode, you can show the presentation on your computer. The slides take up the entire screen. Switching modes is carried out using the buttons at the bottom of the presentation window.

Modes can also be accessed using menu commands.

You can customize your presentations in Outline and Slide Views. In outline view, all slides can be viewed and edited at the same time, while in slide view, only the current slide can be adjusted.

Slide sorter mode offers another way to work with slides, where the entire presentation is presented as a set of slides laid out in a certain order on a light surface. This mode, like the structure mode, allows you to change the order of the slides in the presentation.

The history of information technology has its roots in ancient times. The first step can be considered the invention of the simplest digital device - the abacus. The abacus was invented completely independently and almost simultaneously in Ancient Greece, Ancient Rome, China, Japan and Rus'.

In ancient Greece the abacus was called the abax, that is, a board, or even the "Salamis board" (after the island of Salamis in the Aegean Sea). The abacus was a sanded board with grooves on which numbers were marked with pebbles. The first groove stood for units, the second for tens, and so on. If during counting more than 10 pebbles accumulated in any groove, one pebble was added to the next groove. In Rome the abacus existed in a different form: the wooden boards were replaced with marble ones, and the balls were also made of marble.

In China, the suan-pan abacus differed slightly from the Greek and Roman ones. It was based not on the number ten but on the number five. In the upper part of the suan-pan there were rows of beads each representing five, and in the lower part rows of unit beads. To show, say, the number eight, one bead was moved in the fives part and three in the units part. In Japan there was a similar device, known as the soroban.

In Rus', the abacus was much simpler - a pile of units and a pile of tens made of bones or pebbles. In the fifteenth century the "board abacus" became widespread: a wooden frame with horizontal ropes on which the bones were strung.

The ordinary abacus was the forefather of modern digital devices. However, while some objects of the surrounding material world lent themselves to direct counting, piece by piece, others required a preliminary measurement of numerical values. Accordingly, two directions historically developed in computing and computer technology: digital and analog.

The analog direction, based on calculating an unknown physical object (process) by analogy with a model of a known object (process), received its greatest development in the period from the late 19th to the mid-20th century. The founder of the analog direction was the author of the idea of logarithmic calculation, the Scottish baron John Napier, who in 1614 prepared the scientific tome "Description of the Amazing Table of Logarithms". John Napier not only theoretically substantiated the functions but also developed a practical table of logarithms.



The principle of John Napier's invention is to match each given number with its logarithm (the exponent to which a fixed base must be raised to obtain that number). The invention simplified multiplication and division, since to multiply it is enough to add the logarithms of the numbers.
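A worked example of this principle (base-10 logarithms chosen for illustration): to multiply two numbers, add their logarithms and raise the base back to the result.

    # Sketch: multiplication via logarithms - the idea behind Napier's tables and the slide rule.
    import math

    a, b = 37.0, 52.0
    log_sum = math.log10(a) + math.log10(b)   # add the logarithms instead of multiplying
    product = 10 ** log_sum                   # raise the base back to the sum

    print(product)   # approximately 1924 (small floating-point error), i.e. 37 * 52
    print(a * b)     # 1924.0 for comparison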

In 1617, Napier invented a method for multiplying numbers using sticks. A special device consisted of rods divided into segments, which could be arranged in such a way that when adding numbers in segments adjacent to each other horizontally, the result of multiplying these numbers was obtained.

Somewhat later, the Englishman Henry Briggs compiled the first table of decimal logarithms. Based on the theory and tables of logarithms, the first slide rules were created. In 1620, the Englishman Edmund Gunter used, for calculations with the proportional compass popular at that time, a special plate on which the logarithms of numbers and of trigonometric quantities were plotted parallel to each other (the so-called Gunter scales). In 1623, William Oughtred invented the rectangular slide rule, and in 1630 Richard Delamain invented the circular one. In 1775, the librarian John Robertson added a "slider" to the rule to make it easier to read numbers from different scales. Finally, in 1851-1854 the Frenchman Amedee Mannheim dramatically changed the design of the rule, giving it an almost modern look. The complete dominance of the slide rule continued until the 1920s-1930s, when electric arithmometers appeared, which made it possible to carry out simple arithmetic calculations with much greater accuracy. The slide rule gradually lost its position, but turned out to be indispensable for complex trigonometric calculations and has therefore been preserved and continues to be used to this day.



Most people who use a slide rule manage typical calculations successfully. However, complex operations such as calculating integrals, differentials, moments of functions and the like, which are carried out in several stages according to special algorithms and require good mathematical preparation, cause significant difficulties. All this led, at one time, to the appearance of a whole class of analog devices designed to calculate specific mathematical indicators and quantities for a user not too sophisticated in higher mathematics. In the early to mid-19th century the following devices were created: the planimeter (calculating the area of flat figures), the curvimeter (determining the length of curves), the differentiator, the integrator, the integraph (graphical results of integration), the integrimeter (integrating graphs) and others. The author of the first planimeter (1814) was the inventor Hermann. In 1854 the Amsler polar planimeter appeared. The first and second moments of a function were calculated using an integrator from Coradi. There were also universal sets of blocks, for example the combined integrator KI-3, from which the user, according to his own needs, could choose the necessary device.

The digital direction in the development of computing technology turned out to be more promising and today forms the basis of computer hardware and technology. As early as the beginning of the 16th century, Leonardo da Vinci created a sketch of a 13-digit adding device with ten-toothed rings. Although a working device based on these drawings was built only in the 20th century, the feasibility of Leonardo da Vinci's design was confirmed.

In 1623, Professor Wilhelm Schickard, in his letters to I. Kepler, described the design of a calculating machine, the so-called "clock for counting". The machine was also not built, but now a working model of it has been created based on the description.

The first mechanical digital machine actually built, capable of summing numbers with carry into higher digits, was created by the French philosopher and mechanic Blaise Pascal in 1642. The purpose of the machine was to ease the work of Pascal's father, a tax inspector. The machine looked like a box with numerous gears, chief among them the counting gear. The counting gear was connected by a ratchet mechanism to a lever whose deflection made it possible to enter single-digit numbers into the counter and sum them. Carrying out calculations with multi-digit numbers on such a machine was rather difficult.

In 1657, two Englishmen R. Bissacar and S. Patridge, completely independently of each other, developed a rectangular slide rule. In its unchanged form, the slide rule exists to this day.

In 1673, the famous German philosopher and mathematician Gottfried Wilhelm Leibniz invented a mechanical calculator - a more advanced calculating machine capable of performing basic arithmetic operations. The machine could add, subtract, multiply, divide and extract square roots; Leibniz also championed the binary number system.

In 1700, Charles Perrault published his brother's book "Collection of a Large Number of Machines of His Own Invention by Claude Perrault". The book describes an adding machine with racks instead of gears, called the "rhabdological abacus". The name of the machine comes from two words: the ancient "abacus" and "rhabdology" - the medieval science of performing arithmetic operations using small sticks with numbers.

In 1703, Gottfried Wilhelm Leibniz, continuing his series of works, wrote the treatise "Explication de l'Arithmetique Binaire" on the use of the binary number system in calculating machines. Later, in 1727, Jacob Leopold's calculating machine was created on the basis of Leibniz's work.

German mathematician and astronomer Christian Ludwig Gersten in 1723 created an arithmetic machine. The machine calculated the quotient and the number of consecutive addition operations when multiplying numbers. In addition, it was possible to control the correctness of data entry.

In 1751, the Frenchman Perera, based on the ideas of Pascal and Perrault, invents an arithmetic machine. Unlike other devices, it was more compact, since its counting wheels were not located on parallel axes, but on a single axis that passed through the entire machine.

In 1820, the first industrial production of digital adding machines began. The primacy here belongs to the Frenchman Thomas de Colmar. In Russia, the first adding machines of this type included Bunyakovsky's self-counting device (1867). In 1874, the St. Petersburg engineer Vilgodt Odhner significantly improved the design of the adding machine by using wheels with retractable teeth (Odhner wheels) to enter numbers. Odhner's arithmometer made it possible to carry out up to 250 computational operations with four-digit numbers per hour.

It is quite possible that the development of digital computing technology would have remained at the level of small machines, if not for the discovery of the Frenchman Joseph Marie Jacquard, who at the beginning of the 19th century used a card with punched holes (punched card) to control a loom. Jacquard's machine was programmed using a whole deck of punched cards, each of which controlled one shuttle move so that when switching to a new pattern, the operator replaced one deck of punched cards with another. Scientists have tried to use this discovery to create a fundamentally new calculating machine that performs operations without human intervention.

In 1822, the English mathematician Charles Babbage created a program-controlled calculating machine that was the prototype of today's input and printing peripherals. It consisted of manually rotated gears and rollers.

At the end of the 1880s, Herman Hollerith, an employee of the US Census Bureau, managed to develop a statistical tabulator capable of automatically processing punched cards. The creation of the tabulator marked the beginning of the production of a new class of digital counting-and-punching (computing-and-analytical) machines, which differed from the class of small machines in their original system of entering data from punched cards. By the middle of the 20th century, punched-card machines were produced by IBM and Remington Rand in the form of rather complex punched-card complexes. These included punches (punching cards), verifying punches (re-punching and checking for misaligned holes), sorting machines (laying punched cards into groups according to certain characteristics), collating machines (more thorough arrangement of punched cards and compilation of function tables), tabulators (reading punched cards, calculating and printing the results), and multipliers (multiplication of numbers written on punched cards). The top models of punched-card complexes processed up to 650 cards per minute, and the multiplier multiplied 870 eight-digit numbers per hour. The most advanced model, the IBM 604 electronic calculating punch released in 1948, had a programmable data-processing panel and could perform up to 60 operations on each punched card.

At the beginning of the 20th century, adding machines with keys for entering numbers appeared. The increasing automation of adding machines made it possible to create automatic counting machines, so-called small calculating machines, with an electric drive and automatic execution of up to 3 thousand operations with three- and four-digit numbers per hour. On an industrial scale, small calculating machines in the first half of the 20th century were produced by Friden, Burroughs, Monroe and others. Varieties of small machines included accounting, counting-and-writing and counting-and-text machines, produced in Europe by Olivetti and in the USA by National Cash Register (NCR). In Russia during this period, "Mercedes" machines were widespread - accounting machines designed for entering data and calculating the final balances on synthetic accounting accounts.

Based on the ideas and inventions of Babbage and Hollerith, Harvard University professor Howard Aiken was able to create, in 1937-1943, a high-level punched-card computer called Mark-1, which worked on electromagnetic relays. In 1947, a machine of this series, Mark-2, appeared, containing 13 thousand relays.

Around the same period, the theoretical prerequisites and the technical possibility of creating a more advanced machine based on vacuum tubes emerged. In 1943, employees of the University of Pennsylvania (USA) began to develop such a machine under the leadership of John Mauchly and Presper Eckert, with the participation of the famous mathematician John von Neumann. The result of their joint efforts was the ENIAC tube computer (1946), which contained 18 thousand tubes and consumed 150 kW of electricity. While working on the tube machine, John von Neumann published a report (1945) that is one of the most important scientific documents in the theory of the development of computer technology. The report substantiated the principles of design and operation of universal computers of the new generation, which absorbed all the best that had been created by many generations of scientists, theorists and practitioners.

This led to the creation of the so-called first generation of computers. They are characterized by the use of vacuum tube technology and memory systems based on mercury delay lines, magnetic drums and Williams cathode ray tubes. Data were entered using punched tapes, punched cards and magnetic tapes with stored programs; printers were used for output. The speed of first-generation computers did not exceed 20 thousand operations per second.

From then on, the development of digital computing technology proceeded at a rapid pace. In 1949, the English researcher Maurice Wilkes built the first stored-program computer according to von Neumann's principles. Until the mid-1950s, tube machines were produced on an industrial scale. However, scientific research in the field of electronics opened up new prospects for development, and the leading position in this area was occupied by the United States. In 1948, Walter Brattain and John Bardeen of AT&T invented the transistor, and in 1954 Gordon Teal of Texas Instruments used silicon to make a transistor. Since 1955, computers based on transistors have been produced, with smaller dimensions, increased speed and lower power consumption compared with tube machines. They were assembled by hand, under a microscope.

The use of transistors marked the transition to second-generation computers. Transistors replaced vacuum tubes, and computers became more reliable and faster (up to 500 thousand operations per second). Functional devices also improved: work with magnetic tapes and memory on magnetic disks.

In 1958, the first integrated microcircuit was invented (Jack Kilby of Texas Instruments), as was the first industrial integrated circuit (chip), whose author Robert Noyce later founded (in 1968) the world-famous company Intel (INTegrated ELectronics). Computers based on integrated circuits, produced since 1960, were even faster and smaller.

In 1959, researchers at Datapoint came to the important conclusion that a computer needs a central arithmetic logic unit that can control calculations, programs and devices - in other words, a microprocessor. Datapoint employees developed fundamental technical solutions for creating a microprocessor and, together with Intel, began its industrial refinement in the mid-1960s. The first results were not entirely successful: Intel's microprocessors ran much slower than expected, and the collaboration between Datapoint and Intel ended.

In 1964, third-generation computers were developed, using electronic circuits of low and medium degrees of integration (up to 1,000 components per chip). From that time on, designers began to plan not a single computer but a whole family of computers built around common software. Examples of third-generation computers are the American IBM 360 created at that time, as well as the Soviet ES-1030 and ES-1060. In the late 1960s minicomputers appeared, and in 1971 the first microprocessor. A year later, Intel released the first widely known microprocessor, the Intel 8008, and in April 1974 the second-generation Intel 8080 microprocessor.

Since the mid-1970s, fourth-generation computers have been developed. They are characterized by the use of large-scale and very-large-scale integrated circuits (up to a million components per chip). The first fourth-generation computers were released by Amdahl Corp. These computers used high-speed memory systems on integrated circuits with a capacity of several megabytes. When switched off, the contents of RAM were transferred to disk; when switched on, the machine booted itself. The performance of fourth-generation computers is hundreds of millions of operations per second.

Also in the mid-1970s, the first personal computers appeared. The further history of computers is closely connected with the development of microprocessor technology. In 1975, the first mass-produced personal computer, the Altair, was created on the basis of the Intel 8080 processor. By the end of the 1970s, thanks to the efforts of Intel, which developed the latest Intel 8086 and Intel 8088 microprocessors, the prerequisites arose for improving the computing and ergonomic characteristics of computers. During this period the largest electronics corporation, IBM, joined the competition in the market and tried to create a personal computer based on the Intel 8088 processor. In August 1981 the IBM PC appeared and quickly gained immense popularity. The successful design of the IBM PC predetermined its use as the standard for personal computers at the end of the 20th century.

Fifth-generation computers have been developed since 1982. Their basis is an orientation toward knowledge processing. Scientists are confident that knowledge processing, previously thought to be characteristic only of humans, can also be carried out by a computer in order to solve the problems posed and make adequate decisions.

In 1984, Microsoft introduced the first samples of the Windows operating system. Americans still consider this invention one of the outstanding discoveries of the 20th century.

An important proposal was made in March 1989 by Tim Berners-Lee, an employee of the European research center CERN. The essence of the idea was to create a new distributed information system called the World Wide Web. A hypertext-based information system could integrate CERN's information resources (report databases, documentation, mail addresses, etc.). The project was accepted in 1990.

Sixty-three years after C. Babbage's death, "someone" was found who took upon himself the task of creating a machine similar, in its principle of operation, to the one to which C. Babbage had devoted his life. It was a German student, Konrad Zuse (1910-1985). He began work on creating the machine in 1934, a year before receiving his engineering degree. Konrad knew nothing about Babbage's machine, nor about the work of Leibniz, nor about Boolean algebra, which is well suited to designing circuits using elements that have only two stable states.

Nevertheless, he turned out to be a worthy heir to Leibniz and George Boole, since he brought back to life the already forgotten binary number system and used something similar to Boolean algebra in designing his circuits. In 1937 the Z1 machine (which meant Zuse 1) was ready and began to work.

Like Babbage's machine, it was purely mechanical. The use of the binary system worked wonders: the machine occupied only two square meters on a table in the inventor's apartment. The word length was 22 binary digits. Operations were performed in floating point: 15 digits were allotted to the mantissa and its sign, and 7 to the exponent. The memory (also built from mechanical elements) held 64 words (versus 1,000 for Babbage, which also reduced the size of the machine). Numbers and the program were entered manually. A year later, the machine acquired a device for entering data and programs that used punched film strip, and the mechanical arithmetic unit was replaced by a sequential arithmetic unit built on telephone relays. The Austrian engineer Helmut Schreyer, a specialist in electronics, helped K. Zuse in this. The improved machine was named Z2. In 1941, Zuse, with the participation of H. Schreyer, created a program-controlled relay computer (Z3), containing 2,000 relays and repeating the main characteristics of Z1 and Z2. It became the world's first fully relay-based digital computer with program control and was operated successfully. Its dimensions only slightly exceeded those of Z1 and Z2.

Back in 1938, H. Schreyer had suggested using vacuum tubes instead of telephone relays to build the Z2, but K. Zuse did not approve of the proposal. During the Second World War, however, he himself concluded that a tube version of the machine was possible. They presented this idea to a circle of scientists and were ridiculed and dismissed: the figure they cited - 2,000 vacuum tubes needed to build the machine - could cool the hottest heads. Only one of the listeners supported their plan. They did not stop there and submitted their proposal to the military department, pointing out that the new machine could be used to decipher Allied radio messages.

But the chance to create in Germany not only the first relay, but also the world's first electronic computer was missed.

By this time, K. Zuse had organized a small company, and through its efforts two specialized relay machines, S1 and S2, were created. The first was used to calculate the wings of "flying torpedoes" - the missiles that bombarded London - and the second to control them. The S2 turned out to be the world's first control computer.

By the end of the war, K. Zuse created another relay computer, the Z4. It would be the only one of all the machines he developed to survive; the rest were destroyed during the bombing of Berlin and of the factories where they were produced.

Thus, K. Zuse set several milestones in the history of computer development: he was the first in the world to use the binary number system in building a computer (1937), and he created the world's first program-controlled relay computer (1941) and a specialized digital control computer (1943).

These truly brilliant achievements, however, did not have a significant impact on the development of computer technology in the world.

The fact is that, owing to the secrecy of the work, there were no publications about them and no publicity, so they became known only a few years after the end of the Second World War.

Events in the USA developed differently. In 1944, the Harvard University scientist Howard Aiken (1900-1973) created the relay-mechanical digital computer MARK-1, the first in the USA (at the time it was considered the first in the world). In its characteristics (performance, memory capacity) it was close to the Z3, but it differed significantly in size (length 17 m, height 2.5 m, weight 5 tons, 500 thousand mechanical parts).

The machine used the decimal number system. As in Babbage's machine, gears were used in the counters and memory registers. Control and communication between them were carried out with the help of relays, of which there were more than 3,000. H. Aiken did not hide the fact that he had borrowed much of the machine's design from C. Babbage. "If Babbage were alive, I would have nothing to do," he said. A remarkable quality of the machine was its reliability: installed at Harvard University, it worked there for 16 years.

Following MARK-1, the scientist created three more machines (MARK-2, MARK-3 and MARK-4), also using relays rather than vacuum tubes, explaining this by the unreliability of the latter.

Unlike Zuse's work, which was carried out in secrecy, the development of MARK-1 was carried out openly, and the creation of a machine unusual for those times quickly became known in many countries. K. Zuse's daughter, who worked in military intelligence and was in Norway at the time, sent her father a newspaper clipping announcing the grandiose achievement of the American scientist.

K. Zuse could triumph: he had been ahead of his newly emerged rival in many ways. Later he sent him a letter and told him so. And in 1980 the German government gave him 800 thousand marks to recreate the Z1, which he did together with the students who helped him. K. Zuse donated his resurrected firstborn to the Museum of Computing Technology in Paderborn for permanent safekeeping.

I would like to continue the story about H. Aiken with a curious episode. The work on creating MARK-1 was carried out on the production premises of IBM. Its head at that time, Tom Watson, who loved order in everything, insisted that the huge machine be "dressed" in glass and steel, which made it look very respectable. When the machine was transported to the university and presented to the public, T. Watson's name was not mentioned among its creators, which greatly angered the head of IBM, who had invested half a million dollars in the machine's creation. He decided to outdo H. Aiken. As a result, a relay-electronic monster appeared, in whose huge cabinets 23 thousand relays and 13 thousand vacuum tubes were placed. The machine was inoperable; in the end it was exhibited in New York to impress the uninitiated public. This giant ended the era of electromechanical digital computers.

As for G. Aiken, when he returned to the university he became the first person in the world to lecture on a then-new subject, now called Computer Science - the science of computers; he was also one of the first to propose the use of machines in business calculations and commerce. The motive for creating MARK-1 had been Aiken's desire to help himself with the numerous calculations he had to perform while preparing his dissertation (dedicated, incidentally, to studying the properties of vacuum tubes).

However, the time was approaching when the volume of computational work in the developed countries would begin to snowball, above all in the field of military technology, a growth driven by the Second World War.

In 1941, employees of the Ballistic Research Laboratory at the Aberdeen Proving Ground in the United States turned to the nearby technical school of the University of Pennsylvania for help in compiling firing tables for artillery pieces, counting on the Bush differential analyzer, a bulky mechanical analog computing device available at the school. However, a physicist at the school, John Mauchly (1907-1986), who was interested in meteorology and had built several simple digital devices on vacuum tubes for problems in that field, suggested something different. In August 1942 he drew up and sent to the US military department a proposal to create a computer on vacuum tubes that would be powerful for its time. These truly historic five pages were shelved by military officials, and Mauchly's proposal would probably have had no consequences had the proving-ground employees not become interested in it. They secured funding for the project, and in April 1943 a contract was signed between the proving ground and the University of Pennsylvania to build a computer called the Electronic Numerical Integrator and Computer (ENIAC). 400 thousand dollars were allocated for the purpose. About 200 people were involved in the work, including several dozen mathematicians and engineers.

The work was led by J. Mauchly and the talented electronics engineer Presper Eckert (1919 - 1995). It was Eckert who suggested using vacuum tubes rejected by the military inspectors for the machine (they could be obtained for free). Considering that the required number of tubes approached 20 thousand and the funds allocated for the machine were very limited, this was a wise decision. He also proposed reducing the tubes' filament voltage, which significantly increased their reliability. The hard work ended at the close of 1945: ENIAC was presented for testing and passed it successfully. At the beginning of 1946 the machine began to work on real problems. In size it was more impressive than MARK-1: 26 m long, 6 m high, and weighing 35 tons. But it was not the size that was striking, it was the performance - 1000 times higher than that of MARK-1. Such was the result of using vacuum tubes!

In other respects ENIAC differed little from MARK-1. It used the decimal system; the word length was 10 decimal digits; the capacity of the electronic memory was 20 words. Programs were entered from a plugboard, which caused great inconvenience: changing a program took many hours or even days.

In 1945, when work on ENIAC was being completed and its creators were already designing a new electronic digital computer, EDVAC, in which they intended to place programs in RAM in order to eliminate ENIAC's main drawback - the difficulty of entering calculation programs - the outstanding mathematician John von Neumann (1903-1957), a member of the Manhattan Project to create the atomic bomb, joined the work. It should be said that the machine's developers apparently had not asked for this help; von Neumann himself probably took the initiative after hearing about ENIAC from his friend G. Goldstein, a mathematician who worked for the military department. He immediately appreciated the prospects of the new technology and took an active part in completing the work on EDVAC. The part of the report on the machine that he wrote contained a general description of EDVAC and the basic principles of its construction (1945).

The report was reproduced by G. Goldstein (without the consent of J. Mauchly and P. Eckert) and sent to a number of organizations. In 1946 von Neumann, Goldstein and Burks (all three then working at the Princeton Institute for Advanced Study) wrote another report ("Preliminary Discussion of the Logical Design of an Electronic Computing Instrument," June 1946), which contained a detailed description of the principles of building digital electronic computers. In the same year the report was distributed at the summer session of the University of Pennsylvania.

The principles outlined in the report were as follows.

  • 1. Machines built on electronic elements should work not in the decimal but in the binary number system.
  • 2. The program must reside in one of the machine's units - a storage device of sufficient capacity and with speeds appropriate for retrieving and writing program instructions.
  • 3. The program, like the numbers the machine operates on, is written in binary code. Thus, commands and numbers have the same form of representation. This circumstance leads to the following important consequences:
    • - intermediate results of calculations, constants and other numbers can be placed in the same storage device as the program;
    • - the numerical form of the program record allows the machine to perform operations on the quantities that encode the program's commands.
  • 4. The difficulty of physically implementing a storage device whose speed matches the speed of the logic circuits requires a hierarchical organization of memory.
  • 5. The machine's arithmetic unit is built from circuits that perform the operation of addition; creating special devices for performing other operations is inadvisable.
  • 6. The machine uses a parallel organization of the computational process (operations on a word are performed simultaneously on all of its digits).

It cannot be said that the principles of computer construction listed above were first expressed by J. von Neumann and his co-authors. Their merit is that, by generalizing the accumulated experience of building digital computers, they moved from circuit-level (technical) descriptions of machines to a generalized, logically clear structure, and took an important step from theoretically important foundations (the Turing machine) toward the practice of building real computers. Von Neumann's name drew attention to the reports, and the principles and structure of computers expressed in them came to be called von Neumann's.
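To make the stored-program idea concrete, here is a minimal sketch in Python of a toy machine built on these principles: program and data share one memory, every word - instruction or number - is just a binary integer, and addition is the only arithmetic operation. The word layout, the opcodes and the sample program are invented purely for illustration and do not reproduce EDVAC or any historical machine.

```python
# A toy stored-program machine: program and data live in the same memory, and
# every word -- an instruction or a number -- is just a binary integer. The
# word layout, opcodes and the sample program are invented for illustration
# and do not reproduce EDVAC or any historical machine.

ADDR_BITS = 5                              # 5-bit addresses: 32 words of memory
LOAD, ADD, STORE, HALT = range(4)          # toy opcodes

def encode(op, addr=0):
    """Pack an opcode and an address into a single binary word."""
    return (op << ADDR_BITS) | addr

def run(memory):
    acc, pc = 0, 0                         # accumulator and program counter
    while True:
        word = memory[pc]                  # an instruction is fetched like any other word
        op, addr = word >> ADDR_BITS, word & ((1 << ADDR_BITS) - 1)
        pc += 1
        if op == LOAD:
            acc = memory[addr]
        elif op == ADD:                    # addition is the only arithmetic operation
            acc += memory[addr]
        elif op == STORE:
            memory[addr] = acc
        elif op == HALT:
            return memory

memory = [0] * 32
memory[0] = encode(LOAD, 10)               # the program ...
memory[1] = encode(ADD, 11)
memory[2] = encode(STORE, 12)
memory[3] = encode(HALT)
memory[10], memory[11] = 2, 3              # ... and the data share one store

print(run(memory)[12])                     # prints 5
```

Because commands and numbers here have the same form of representation, the running program could in principle read or rewrite its own instructions - exactly the consequence noted in point 3 above.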

Under von Neumann's leadership, another vacuum-tube machine, MANIAC (used for calculations in the creation of the hydrogen bomb), was built at the Princeton Institute for Advanced Study in 1952, and in 1954 yet another was built, this time without von Neumann's participation; the latter was named JOHNNIAC in his honor. Unfortunately, just three years later von Neumann fell seriously ill and died.

J. Mauchly and P. Eckert were offended that they did not appear in the Princeton report and that the hard-won decision to place programs in RAM had begun to be attributed to von Neumann; on the other hand, seeing how many firms, springing up like mushrooms after rain, were seeking to capture the computer market, they decided to take out patents on ENIAC.

However, they were refused. Meticulous rivals unearthed the fact that back in 1938 - 1941 the professor of mathematics John Atanasoff (1903 - 1995), a Bulgarian by birth who worked at the Iowa State Agricultural School, together with his assistant Clifford Berry, had developed a prototype of a specialized digital computer (using the binary number system) for solving systems of algebraic equations. The prototype contained 300 vacuum tubes and had memory on capacitors. Thus Atanasoff turned out to be the pioneer of vacuum-tube technology in the field of computers.

Moreover, as the court hearing the patent case established, J. Mauchly knew Atanasoff's work not merely by hearsay: he had spent five days in his laboratory while the prototype was being built.

As for storing programs in RAM and the theoretical substantiation of the main properties of modern computers, here too J. Mauchly and P. Eckert were not the first. Back in 1936 Alan Turing (1912 - 1954), a mathematician of genius, had said as much in his remarkable paper "On Computable Numbers".

Assuming that the most important feature of an algorithm (an information-processing task) is the possibility of its mechanical execution, A. Turing proposed an abstract machine for the study of algorithms, called the "Turing machine". In it he anticipated the main properties of the modern computer. Data were to be entered into the machine from a paper tape divided into cells, each of which contained a symbol or was empty. The machine could not only process the symbols recorded on the tape but also change them, erasing old ones and writing new ones in accordance with instructions stored in its internal memory. For this it was supplemented by a logic block containing a functional table that determined the sequence of the machine's actions. In other words, A. Turing provided for a storage device holding the program of the machine's actions. But his outstanding merits are not limited to this.
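The description above maps almost directly onto code. Below is a minimal sketch in Python of such a machine: a tape of cells, a head that reads and writes symbols, and a functional table of rules that plays the role of the instructions in internal memory. The rule table shown (appending one mark to a number written in unary) is chosen purely for illustration and is not taken from Turing's paper.

```python
# A minimal Turing machine: a tape of cells, a read/write head, and a table of
# rules (the "functional table") stored separately from the tape. The example
# rules append one '1' to a block of '1's -- i.e. add 1 to a unary number.

from collections import defaultdict

def run_turing_machine(rules, tape, state="start", blank=" ", max_steps=10_000):
    """rules: (state, symbol) -> (new_state, new_symbol, move), move in {-1, 0, +1}."""
    cells = defaultdict(lambda: blank, enumerate(tape))   # unbounded tape of cells
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, cells[head], move = rules[(state, cells[head])]
        head += move
    # return the visited part of the tape as a string
    lo, hi = min(cells), max(cells)
    return "".join(cells[i] for i in range(lo, hi + 1)).strip()

# Rules: scan right past the existing '1's, write one more '1' on the first blank, halt.
rules = {
    ("start", "1"): ("start", "1", +1),
    ("start", " "): ("halt",  "1",  0),
}

print(run_turing_machine(rules, "111"))    # unary 3 -> '1111' (unary 4)
```

The functional table here is the "program"; changing it, rather than the machinery, changes what the machine computes.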

In 1942 - 1943, at the height of the Second World War, the world's first specialized vacuum-tube digital computer, "Colossus", was built with his participation in the strictest secrecy at Bletchley Park near London and was used to decode secret radiograms of German radio stations. It coped with the task successfully. One of the participants in the machine's creation praised A. Turing's contribution thus: "I do not want to say that we won the war thanks to Turing, but I take the liberty of saying that without him we might have lost it." After the war the scientist took part in creating a universal vacuum-tube computer. His sudden death at the age of 41 prevented him from fully realizing his outstanding creative potential. In memory of A. Turing, a prize bearing his name was established for outstanding work in mathematics and computer science. The Colossus computer has been restored and is kept in the museum at Bletchley Park, where it was created.

In practical terms, however, J. Mauchly and P. Eckert really were the first who, having understood the expediency of storing the program in the machine's RAM (independently of A. Turing), incorporated it into a real machine - their second machine, EDVAC. Unfortunately, its development was delayed, and it was put into operation only in 1951. By that time a computer with a program stored in RAM had already been working in England for two years. The fact is that in 1946, at the height of the work on EDVAC, J. Mauchly delivered a course of lectures at the University of Pennsylvania on the principles of building computers. Among the listeners was a young scientist, Maurice Wilkes (born in 1913) from the University of Cambridge - the very place where C. Babbage had proposed a project for a program-controlled digital computer a hundred years earlier. On returning to England, the talented young scientist managed in a very short time to create EDSAC, a serial-action electronic computer with memory on mercury delay lines, using the binary number system and storing its program in RAM. The machine started working in 1949. Thus M. Wilkes was the first in the world to create a computer with a program stored in RAM. In 1951 he also proposed microprogram control of operations. EDSAC became the prototype of the world's first serial commercial computer, LEO (1953). Today M. Wilkes is the only survivor of the older generation of the world's computer pioneers, those who created the first computers. J. Mauchly and P. Eckert tried to organize their own company, but it had to be sold because of financial difficulties. Their new development, the UNIVAC machine, designed for commercial calculations, became the property of the Remington Rand company and contributed greatly to its success.

Although J. Mauchly and P. Eckert did not receive a patent for ENIAC, its creation was certainly a golden milestone in the development of digital computing, marking the transition from mechanical and electromechanical to electronic digital computers.

In 1996, at the initiative of the University of Pennsylvania, many countries of the world celebrated the 50th anniversary of informatics, linking the event with the 50th anniversary of ENIAC. There were good reasons for this: neither before nor after ENIAC did any computer cause such a resonance in the world or have such an influence on the development of digital computing as the remarkable brainchild of J. Mauchly and P. Eckert.

In the second half of the 20th century the development of hardware proceeded much faster, while software, new methods of numerical computation and the theory of artificial intelligence developed more rapidly still.

In 1995 John Lee, an American professor of computer science at the University of Virginia, published the book Computer Pioneers. He included among the pioneers those who had made a significant contribution to the development of hardware, software, computational methods, the theory of artificial intelligence and so on, from the appearance of the first primitive means of information processing to the present day.

1st stage (until the second half of the 19th century) - "manual" information technology, whose tools were the pen, the inkwell and the account book. Communications were carried out manually, by postal forwarding of letters, packages and dispatches. The main goal of the technology was to present information in the required form.

2nd stage (from the end of the 19th century) - "mechanical" technology, whose tools were the typewriter, the telephone, the phonograph, and a postal service equipped with more advanced means of delivery. The main goal of the technology was to present information in the required form by more convenient means.

3rd stage (the 40s-60s of the 20th century) - "electrical" technology, whose tools were large computers and the corresponding software, electric typewriters, copiers and portable tape recorders. The purpose of the technology begins to change: the emphasis gradually shifts from the form in which information is presented to the formation of its content.

4th stage (from the beginning of the 70s of the 20th century) - "electronic" technology, whose main tools are large computers and the automated control systems (ACS) built on them, equipped with a wide range of basic and specialized software. The center of gravity of the technology shifts significantly toward forming the content side of information.

5th stage (from the mid-80s of the 20th century) - "computer" technology, whose main tool is the personal computer with a large number of standard software products for various purposes. At this stage decision-support systems appear; such systems have built-in elements of analysis and artificial intelligence for different levels of management, are implemented on personal computers and use telecommunications. With the transition to a microprocessor base, technical devices for domestic, cultural and other purposes also change significantly. Telecommunications and local computer networks come into wide use in various fields.

Personal computers are most widely used for editing texts in the preparation of magazines, books and documentation of various kinds. Their advantages over typewriters are obvious: the number of errors and typos is reduced, the preparation of materials is accelerated, and the quality of their layout is improved.

The development of information technologies is unthinkable without the organization of e-mail, communication networks and information communications based on computer networks.

As a rule, any new use of computers requires not so much the acquisition of additional hardware as equipping the computer with the proper software tools.

There are several classifications of computer software. Consider the classification of software for a personal computer: it distinguishes game, educational and business programs, as well as information systems and tool software.

Game programs are one form of engaging activity on the computer; it was with them that the mass distribution of personal computers began. To some extent, computer games are a new technology of recreation. When playing, one should remember, first, the saying "there is a time for work and a time for play", and second, that excessive enthusiasm for any game can be harmful.

Educational programs serve to organize training sessions. They can be used for classes in logic, history, computer science, the Russian language, biology, geography, mathematics, physics and other academic disciplines. In such classes computers can serve as electronic textbooks and trainers, laboratory benches, and information and reference systems.

Business programs are intended for the preparation, accumulation and processing of various kinds of office information. They can be used to computerize office work: maintaining documentation, preparing schedules, drawing up duty rosters and other tasks. For this purpose, various text editors, spreadsheets, graphics editors, databases, library information-retrieval systems and other specialized programs are used.

Information systems are used to organize, accumulate and search for a wide variety of information on a computer. These include databases, library information-retrieval systems, and systems for selling and registering tickets in theaters and at railway and airline ticket offices.

Knowledge bases and expert systems are promising information tools. With their help it will be possible to give consultations on medical topics, provide information on various services, help inventors, and advise technologists and designers, simulating the behavior of experts in a particular field of knowledge and professional activity.

Tools are programs and software packages that programmers use to create programs and automated systems. These include text editors, interpreters, compilers, and other special software tools.

While game, business and educational programs serve as a means of organizing technologies for providing information services, tool programs form the basis of particular programming technologies.

Operating systems play a special role in the functioning of computers and in supporting software tools. The work of any computer begins with the loading and launching of the operating system, which has previously been placed on the system disk.

Basic information about the work

Introduction

Chapter 1. The development of information technology in the period from the XIV to the XVIII century

Chapter 2. The development of information technology from the XVIII to XX century

Conclusion

Glossary

List of sources used

List of abbreviations

Introduction

I chose this topic because I find it interesting and relevant. Next, I will try to explain why I made this choice and present some historical data on this topic.

In the history of mankind there are several stages through which human society has successively passed in its development. These stages differ in the main way society ensures its existence and in the type of resources used by man that play the major role in this. They are the stages of gathering and hunting, the agrarian stage and the industrial stage. In our time the most developed countries of the world are at the final phase of the industrial stage of the development of society and are making the transition to the next stage, called the "information" stage. In this society information plays a decisive role: the infrastructure of society is formed by the ways and means of collecting, processing, storing and distributing information, and information becomes a strategic resource.

Thus, since the second half of the 20th century the main determining factor in the socio-economic development of society in the civilized world has been the transition from the "economy of things" to the "economy of knowledge", and the importance and role of information in solving almost all the problems of the world community have grown significantly. This is convincing evidence that the scientific and technological revolution is gradually turning into an intellectual and information revolution: information is becoming not only a subject of communication but also a profitable commodity and an effective modern means of organizing and managing social production, science, culture, education and the socio-economic development of society as a whole.

Modern advances in informatics, computer technology, operational printing and telecommunications have given rise to a new type of high technology, namely information technology.

The results of scientific and applied research in informatics, computer technology and communications have created a solid foundation for the emergence of a new branch of knowledge and production: the information industry. The world is successfully developing the information-services industry, computer production and computerization as a technology of automated information processing; the industry and technology of telecommunications have reached unprecedented scale and made a qualitative leap, from the simplest communication line to satellite links, covering millions of consumers and offering a wide range of possibilities for transporting information and interconnecting its consumers.

This whole complex (the consumer with his tasks, computer science, all technical means of information support, information technology and the information services industry, etc.) constitutes the infrastructure and information space for the implementation of the informatization of society.

Thus, informatization is a complex process of information support for the socio-economic development of society on the basis of modern information technologies and appropriate technical means.

Thus, the informatization of society has become a priority task, and its importance is constantly growing.

Chapter 1. The development of information technology in the period from the XIV to the XVIII century

The history of the creation of digital computing devices goes back centuries. It is fascinating and instructive, and the names of outstanding scientists of the world are associated with it.

In the diaries of the brilliant Italian Leonardo da Vinci (1452 - 1519), a number of drawings were discovered in modern times that turned out to be sketches of a gear-wheel adding machine capable of adding 13-digit decimal numbers. Specialists of the well-known American company IBM reproduced the machine in metal and confirmed the full viability of the scientist's idea. His adding machine can be considered a milestone in the history of digital computing: it was the first digital adder - still mechanical, very primitive (with manual control) - a kind of embryo of the future electronic adder, the most important element of modern computers. In those distant years the brilliant scientist was probably the only person on Earth who understood the need for devices to ease the labor of performing calculations.

However, the need for such devices was so small that only more than a hundred years after the death of Leonardo da Vinci did another European appear - the German scientist Wilhelm Schickard (1592-1636), who of course had not read the diaries of the great Italian - to propose his own solution to the problem. What prompted Schickard to develop a calculating machine for summing and multiplying six-digit decimal numbers was his acquaintance with the astronomer Johannes Kepler. Having familiarized himself with the great astronomer's work, which consisted largely of calculations, Schickard was fired with the idea of helping him in this hard labor. In a letter to Kepler sent in 1623 he gave a drawing of the machine and described how it worked. Unfortunately, history has preserved no information about the further fate of the machine. Apparently an early death from the plague that swept Europe prevented the scientist from carrying out his plan.

The inventions of Leonardo da Vinci and Wilhelm Schickard became known only in our time. They were unknown to contemporaries.

In the 17th century the situation changed. In 1641 - 1642 the nineteen-year-old Blaise Pascal (1623 - 1662), then a little-known French scientist, created a working adding machine, the "Pascaline" (see Appendix A). He originally built it with one sole purpose: to help his father with the calculations performed when collecting taxes. Over the next four years he created more advanced models of the machine. They were six- and eight-digit, built on gear wheels, and could add and subtract decimal numbers. Approximately 50 machines were built, and B. Pascal received a royal privilege for their production, but the "Pascalines" found no practical application, although much was said and written about them (mainly in France).

In 1673 another great European, the German scientist Gottfried Wilhelm Leibniz (1646 - 1716), created a calculating machine (an "arithmetic instrument", as Leibniz called it) for adding and multiplying twelve-digit decimal numbers. To the gear wheels he added a stepped drum, which made multiplication and division possible. "... My machine makes it possible to perform multiplication and division over huge numbers instantly, moreover without resorting to sequential addition and subtraction," W. Leibniz wrote to one of his friends.

In electronic digital computers, which appeared more than two centuries later, the device that performs arithmetic operations (the analogue of Leibniz's "arithmetic instrument") was called the arithmetic unit. Later, as a number of logical operations were added to it, it came to be called the arithmetic-logic unit. It has become the main device of modern computers.

Thus, the two geniuses of the 17th century set the first milestones in the history of the development of digital computing.

The merits of W. Leibniz, however, are not limited to the creation of the "arithmetic instrument". From his student years to the end of his life he studied the properties of the binary number system, which later became fundamental to the creation of computers. He attached a certain mystical meaning to it and believed that on its basis a universal language could be created for explaining the phenomena of the world and for use in all the sciences, including philosophy. An image of a medal sketched by W. Leibniz in 1697, explaining the relationship between the binary and decimal number systems, has been preserved (see Appendix B).
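As a simple modern illustration of the relationship Leibniz studied (the number 13 below is an arbitrary example, not a reconstruction of the medal), the same quantity can be written in both systems:

```python
# The same number in decimal and binary notation: every integer is a sum of
# powers of two, which is the relationship between the two number systems
# that Leibniz's medal illustrated. The number 13 is an arbitrary example.

n = 13
binary = bin(n)          # '0b1101'
back = int(binary, 2)    # converts the binary string back to decimal: 13

# 1101 in binary means 1*8 + 1*4 + 0*2 + 1*1 = 13 in decimal.
print(n, binary, back)
```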

In 1799, in France, Joseph Marie Jacquard (1752 - 1834) invented a loom that used punched cards to set the pattern woven into the fabric. The initial data needed for this were recorded as holes punched at the appropriate places on the card. Thus appeared the first primitive device for storing and entering program information (in this case, information controlling the weaving process).

In 1795, also in France, the mathematician Gaspard Prony (1755 - 1839), whom the French government had commissioned to carry out work connected with the transition to the metric system of measures, developed, for the first time in the world, a technological scheme of calculation involving a division of the mathematicians' labor into three parts. A first group of several highly qualified mathematicians determined (or developed) the methods of numerical calculation needed to solve the problem, reducing the calculations to arithmetic operations: addition, subtraction, multiplication and division. Specifying the sequence of arithmetic operations and determining the initial data needed to perform them ("programming") was done by a second, somewhat larger group of mathematicians. Carrying out the compiled "program", consisting of a sequence of arithmetic operations, did not require highly qualified specialists; this most labor-intensive part of the work was entrusted to the third and most numerous group of calculators. This division of labor made it possible to obtain results much faster and to increase their reliability. But the main thing was that it gave impetus to the further automation of the most labor-intensive (but also simplest!) third part of the calculations - the transition to creating digital computing devices with program control of the sequence of arithmetic operations.

This final step in the evolution of digital computing devices (of the mechanical type) was taken by the English scientist Charles Babbage (1791 - 1871). A brilliant mathematician, expert in numerical methods of calculation and already experienced in creating technical means to ease the computational process (Babbage's difference engine for tabulating polynomials, 1812 - 1822), he immediately saw in the computational technology proposed by G. Prony the possibility of developing his work further. The Analytical Engine (as Babbage called it), whose design he developed in 1836 - 1848, was a mechanical prototype of the computers that appeared a century later. It was to have the same five main units as a computer: arithmetic, memory, control, input and output.


