Von Neumann's principles for constructing an electronic computer. Computer architecture

Also known as the von Neumann model, or Princeton architecture, it is based on a design described in 1945 by the mathematician and physicist John von Neumann in his report "First Draft of a Report on the EDVAC".

Architecture diagram

Von Neumann's report outlined an architecture for an electronic digital computer consisting of:

  • an arithmetic logic unit;
  • processor registers;
  • a control unit containing an instruction register and a program counter;
  • memory for storing data and instructions;
  • external mass storage;
  • input and output mechanisms.

A key feature of the design is that instructions and data share a single memory and a common bus, so an instruction fetch and a data operation cannot take place at the same time. Von Neumann discussed this constraint in the "First Draft"; it is now called the "von Neumann bottleneck" and often limits system performance.

A stored-program computer keeps its program instructions, together with the data they read and write, in random access memory (RAM). The principles of this architecture are likewise set out in the "First Draft". According to it, stored-program computers were an improvement over externally programmed machines such as ENIAC, which was programmed by setting switches and inserting patch cables to route data and control signals between its functional units. The vast majority of modern computers use memory in the stored-program way. At the same time, a modern processor differs from a pure von Neumann machine in a way borrowed from the Harvard architecture: it often uses separate instruction and data caches, even though main memory remains unified.

Background

The first computing machines had fixed, predetermined programs. Some very simple computers still use this design, either for simplicity or for educational purposes. A desktop calculator, for example, is a fixed-program computer: it can do basic arithmetic, but it cannot be turned into a game console. Changing the fixed program of such a machine requires rewiring, restructuring, or physically redesigning it. The earliest computers were hardly "programmed" at all; they were designed for particular, often scientific, tasks. Reprogramming, where possible, was a laborious process, starting with flowcharts and paper notes and ending with detailed engineering designs; physically rewiring the machine was especially difficult. It could take three weeks to set up a program on ENIAC and get it working.

New idea

Everything changed with the introduction of computers that stored their programs in memory. A stored program is kept in memory as a set of instructions, which means the machine can be given a new set of commands and immediately begin a different computation.

Designs of this kind also make self-modifying code possible. One early motivation for it was the need to increment or otherwise modify the address portion of instructions, which in early designs had to be done manually; this became less important once index registers and indirect addressing became standard features of von Neumann machines. Another use is embedding frequently used data in the instruction stream via immediate addressing. Self-modifying code has, however, been widely criticized: it tends to be difficult to understand and debug, and it interacts badly with the pipelining and caching schemes of modern processors.
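A toy sketch (in Python, with an invented instruction format, not any historical machine's) of the classic use of self-modifying code described above: the program increments the address field of one of its own instructions in order to step through an array, exactly the job that index registers later took over.

```python
# Memory holds both instructions and data, so a program can rewrite the
# address field of one of its own instructions to walk an array.
# Instruction format (invented): [opcode, address]
LOAD_ADD, BUMP, JUMP_IF_MORE, HALT = "LOAD_ADD", "BUMP", "JUMP_IF_MORE", "HALT"

memory = [
    [LOAD_ADD, 6],        # 0: acc += memory[6]  (this address field gets modified!)
    [BUMP, 0],            # 1: memory[0][1] += 1 -> next pass reads cell 7, then 8...
    [JUMP_IF_MORE, 0],    # 2: if instruction 0's address field <= 8, jump back to 0
    [HALT, 0],            # 3: stop
    None, None,           # 4-5: unused
    10, 20, 30,           # 6-8: the data array being summed
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]
    pc += 1
    if op == LOAD_ADD:
        acc += memory[addr]
    elif op == BUMP:                  # self-modification: rewrite another instruction
        memory[addr][1] += 1
    elif op == JUMP_IF_MORE:
        if memory[addr][1] <= 8:
            pc = addr
    elif op == HALT:
        break

print(acc)  # 60: the sum of memory[6..8], gathered by editing the address field
```

Each pass through the loop rewrites the address field of instruction 0, so the "same" instruction reads a different memory cell every time.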

Broadly, the ability to treat instructions as data is what makes assemblers, compilers, linkers, loaders, and the other tools of automated programming possible: programs that write programs, so to speak. On a smaller scale, repetitive I/O-intensive operations, such as the BitBlt image-manipulation primitive or the pixel and vertex shaders of modern 3D graphics, were found to be inefficient to run without custom hardware.
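Because instructions are just data, a translator can construct them like any other values. Below is a minimal "assembler" sketch in Python; the mnemonics, opcodes, and 8-bit word format are all invented for illustration.

```python
# A tiny "assembler": it treats a program as ordinary data, translating
# mnemonic text into numeric machine words. Opcodes here are made up.
OPCODES = {"LOAD": 0x1, "ADD": 0x2, "STORE": 0x3, "HALT": 0xF}

def assemble(source: str) -> list[int]:
    """Pack each 'MNEMONIC address' line into one 8-bit word:
    high 4 bits = opcode, low 4 bits = operand address."""
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        op = OPCODES[parts[0]]
        addr = int(parts[1]) if len(parts) > 1 else 0
        words.append((op << 4) | addr)
    return words

program = assemble("""
LOAD 10
ADD 11
STORE 12
HALT
""")
print([hex(w) for w in program])  # ['0x1a', '0x2b', '0x3c', '0xf0']
```

The output of the assembler is itself just a list of numbers that could be written into the same memory that holds data: this is the uniformity that makes such tools possible.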

Development of the concept of a stored program

The mathematician Alan Turing, who had become interested in mathematical logic after attending Max Newman's lectures at Cambridge University, wrote a paper in 1936 that was published in the Proceedings of the London Mathematical Society. In it he described a hypothetical device he called a "universal computing machine", now known as the universal Turing machine. It had an unbounded store (in modern terminology, memory) that held both instructions and data. Von Neumann met Turing while a visiting professor at Cambridge in 1935, and again during Turing's doctoral studies at the Institute for Advanced Study in Princeton, New Jersey, in 1936-1937.

Independently, J. Presper Eckert and John Mauchly, who were developing ENIAC at the Moore School of Electrical Engineering of the University of Pennsylvania, wrote about the stored-program concept in December 1943. Planning the next machine, EDVAC, Eckert wrote in January 1944 that data and programs would be stored in a new addressable memory device based on mercury delay lines. This was the first practical proposal for a machine storing its program in memory. Eckert and Mauchly were not aware of Turing's work at the time.

Computer Architecture: Von Neumann's Principle

Von Neumann was involved in the Manhattan Project at the Los Alamos laboratory, which required enormous amounts of computation. This drew him to the ENIAC project in the summer of 1944, where he joined the discussions on the design of the EDVAC computer. As part of that group he wrote the paper "First Draft of a Report on the EDVAC", based on the work of Eckert and Mauchly. It was still unfinished when his colleague Herman Goldstine circulated it with only von Neumann's name on it, to the dismay of Eckert and Mauchly. The document was read by dozens of von Neumann's colleagues in America and Europe and had a major influence on the next stage of computer development.

While the principles set out in the "First Draft" were gaining wide currency, Turing was preparing his own report on an electronic calculator, which described engineering and programming in considerable detail and presented his design for a machine he called the Automatic Computing Engine (ACE). He submitted it to the executive committee of the British National Physical Laboratory in 1946. In time there were several successful implementations of the ACE design.

The projects get under way

Both von Neumann's paper and Turing's described computers that stored programs in memory, but von Neumann's achieved wider circulation, and the architecture it described became known as the von Neumann architecture.

In 1945 von Neumann, then working with the Philadelphia engineering school where ENIAC was built, issued on behalf of a group of colleagues a report on the logical design of digital computers. The report contained a fairly detailed proposal for the design of the machine since known as EDVAC. The machine itself was completed only later in America, but the report inspired the creation of EDSAC in England.

MANIACs and Johniacs

In 1947 Burks, Goldstine, and von Neumann published another report, describing a different, parallel machine that would be extremely fast, capable of perhaps 20,000 operations per second. They noted that an unresolved problem in building it was designing a suitable memory whose entire contents must be instantly accessible. They first proposed using a special vacuum tube, the Selectron, invented at the RCA laboratory in Princeton, but such tubes proved expensive and very difficult to make. Von Neumann subsequently decided to build a machine based on Williams-tube memory. Completed at Princeton in June 1952, it became widely known as MANIAC. Its design inspired the construction of half a dozen or more similar machines then being built in America, jokingly called "Johniacs".

Creation principles

One of the most advanced digital computers of the day, embodying many developments and improvements in automatic electronic computing technique, was demonstrated at the National Physical Laboratory in Teddington, where it had been designed and built by a small team of mathematicians and electronics research engineers, assisted by a number of production engineers from the English Electric Company Ltd. The equipment installed at the laboratory is only a pilot model of a much larger installation known as the Automatic Computing Engine, but despite its relatively small size and its mere 800 thermionic valves, it is an extremely fast and versatile calculating machine.

The basic concepts and abstract principles of machine computation were formulated by Dr. Turing in a paper read before the London Mathematical Society back in 1936, but work on such machines in Great Britain was delayed by the war. In 1945 the problems of building such devices were taken up at the National Physical Laboratory by Dr. Womersley, superintendent of its Mathematics Division. He was joined by Turing and a small staff of specialists, and by 1947 preliminary planning was sufficiently advanced to justify establishing a dedicated group.

The first computers based on von Neumann architecture

The "First Draft" described a design that many universities and corporations used to build their computers. Among these machines, only ILLIAC and ORDVAC had compatible instruction sets.

The classical von Neumann architecture was first embodied in the Manchester Small-Scale Experimental Machine (SSEM), nicknamed "Baby", at the University of Manchester, which ran its first stored program successfully on June 21, 1948.

EDSAC at the University of Cambridge, the first practical stored-program electronic computer, ran successfully for the first time in May 1949.

Development of created models

The IBM SSEC had the ability to treat instructions as data and was publicly demonstrated on January 27, 1948; this ability was claimed in a US patent. However, it was a partially electromechanical machine rather than a fully electronic one, and in practice instructions were read from paper tape because of its limited memory.

Baby was the first fully electronic computer to run a stored program. On June 21, 1948, it ran a factoring program for 52 minutes, after first running a simple division and a calculation showing that two numbers were relatively prime.

ENIAC was modified to operate as a primitive read-only stored-program computer, still using its original hardware, and was demonstrated in this mode on September 16, 1948, with Adele Goldstine running a program with von Neumann's help.

BINAC ran several test programs in February, March, and April 1949, although it was not completed until September 1949. Test runs, some successful, of other electronic computers with this architecture were carried out as well. Von Neumann, meanwhile, continued his work on the Manhattan Project. A remarkably versatile man.

Evolution of bus system architecture

Decades later, in the 1960s and 1970s, computers became smaller and faster overall, which led to several evolutions of the von Neumann bus architecture. Memory-mapped input/output, for example, lets I/O devices be addressed as if they were memory locations, so their data and commands can be handled with ordinary memory instructions. A single system bus could then serve a modular system at lower cost; this is sometimes called a "streamlining" of the architecture. In subsequent decades, simple microcontrollers sometimes omitted features of the full model to reduce cost and size, while larger computers added features to the established architecture to improve performance.

At the everyday level, most people associate the term "architecture" with buildings and other engineering structures: we speak of the architecture of a Gothic cathedral, the Eiffel Tower, or an opera house. In other fields the term is used rather rarely, but for computers the concept of "computer architecture" is firmly established and has been in wide use since the 1970s. To understand how programs and scripts are executed on a computer, you must first know how each of its components works. The foundations of the study of computer architecture discussed in this lesson were laid by John von Neumann. In this lesson you can also learn more about the logical units of a computer and the backbone-modular principle of modern personal computer architecture.

The principles underlying computer architecture were formulated in 1945 by John von Neumann, who developed the ideas of Charles Babbage, conceiving of a computer's operation as the joint operation of a set of devices: processing, control, memory, and input-output.

Von Neumann's principles.

1. The principle of memory homogeneity. You can perform the same actions on commands as on data.

2. The principle of memory addressability. Main memory consists of numbered cells, and any cell is available to the processor at any time. This makes it possible to name memory areas so that the values stored in them can later be accessed or changed during program execution using the assigned names.

3. The principle of sequential program control. It assumes that a program consists of a set of commands that are executed by the processor automatically one after another in a certain sequence.

4. The principle of architectural rigidity. The topology, architecture, and instruction set do not change during operation.

Computers built on von Neumann's principles have the classical architecture, but other architectures exist as well, for example the Harvard architecture. Its distinctive features are:

  • instruction store and data store are different physical devices;
  • the instruction channel and data channel are also physically separated.

In the history of computing technology a qualitative leap has occurred roughly every 10 years, associated with the arrival of a new generation of computers. The idea of dividing machines into generations arose because, over its short history, computer technology has undergone a great evolution both in its element base (tubes, transistors, integrated circuits, etc.) and in its structure, its new capabilities, and the expansion of its areas and modes of use. The stages of computer development are shown in more detail in Fig. 2. To understand how and why one generation replaced another, one must know the meaning of such concepts as memory, speed, degree of integration, and so on.

Fig. 2. Computer generations

Among non-classical, non-von Neumann architectures we can single out so-called neurocomputers. They simulate the work of neurons, the cells of the human brain, and of some parts of the nervous system that exchange signals.

Each logical unit of the computer performs its own functions. The functions of the processor (Fig. 3) are:

- data processing (performing arithmetic and logical operations on them);

- control of all other computer devices.

Fig. 3. Computer central processing unit

The program consists of separate commands. A command includes an operation code, the addresses of the operands (the quantities involved in the operation), and the address of the result.
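Such a command can be pictured as a packed word of bit fields. The layout below (an 8-bit opcode and three 8-bit addresses, in Python) is an arbitrary choice for illustration, not any real machine's format.

```python
# Pack a three-address command word: opcode, two operand addresses,
# and the result address, each in its own 8-bit field.
def pack(opcode, addr1, addr2, result_addr):
    return (opcode << 24) | (addr1 << 16) | (addr2 << 8) | result_addr

def unpack(word):
    return (word >> 24) & 0xFF, (word >> 16) & 0xFF, (word >> 8) & 0xFF, word & 0xFF

ADD = 0x02                          # invented opcode
word = pack(ADD, 100, 101, 102)     # "ADD: mem[102] = mem[100] + mem[101]"
assert unpack(word) == (ADD, 100, 101, 102)
```

The point is only that a command, like data, is ultimately a single binary word whose fields the control unit interprets.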

Execution of the command is divided into the following stages:

  • instruction fetch;
  • generating the address of the next instruction;
  • instruction decoding;
  • calculating operand addresses;
  • operand fetch;
  • execution of the operation;
  • forming the result flag;
  • writing the result.

Not all stages are present in the execution of every instruction (it depends on the instruction type), but the stages of fetch, decoding, generating the next instruction's address, and executing the operation always take place. In certain situations two more stages are possible:

  • indirect addressing;
  • interrupt handling.
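The stages listed above can be sketched as a fetch-decode-execute loop. The instruction set and memory layout here are invented for illustration, not taken from any real machine.

```python
# A schematic fetch-decode-execute cycle. Each comment names the stage
# from the list above that the line corresponds to.
memory = {0: ("LOAD", 100), 1: ("ADD", 101), 2: ("STORE", 102), 3: ("HALT", 0),
          100: 7, 101: 5, 102: 0}
pc, acc = 0, 0

while True:
    instruction = memory[pc]            # instruction fetch
    pc += 1                             # generate the address of the next instruction
    opcode, operand_addr = instruction  # decode + operand address calculation
    if opcode == "HALT":
        break
    value = memory.get(operand_addr)    # operand fetch
    if opcode == "LOAD":                # execute the operation...
        acc = value
    elif opcode == "ADD":
        acc = acc + value
    elif opcode == "STORE":             # ...and write the result
        memory[operand_addr] = acc

print(memory[102])  # 12
```

The four-instruction program computes 7 + 5 and stores the result in cell 102.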

RAM (Fig. 4) performs the following functions:

  • receiving information from other devices;
  • storing information;
  • transferring information on request to other computer devices.

Fig. 4. RAM (Random Access Memory) of the computer

The architecture of modern computers is based on the backbone-modular principle (Fig. 5). The modular principle lets the user assemble the desired configuration and perform the necessary upgrades. It relies on the bus principle of information exchange between modules. The system bus, or computer backbone, includes several buses for different purposes; the backbone includes three multi-bit buses:

  • data bus;
  • address bus;
  • control bus.

Fig. 5. Backbone-modular principle of PC construction

The data bus transfers data between computer devices; the address bus addresses the transferred data, that is, determines its location in memory or in the input/output devices; the control bus carries control signals that coordinate in time the operation of the various devices, determine the direction of data transfer, the formats of the transferred data, and so on.
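The role of the address bus can be sketched as an address decoder: one and the same read transaction is routed either to RAM or to a memory-mapped device purely by the address it carries. The address ranges and register below are invented for illustration.

```python
# One bus transaction: the control bus says "read", the address bus carries
# the address, and the selected module drives the data bus with the result.
RAM = {0x0010: 42}
DEVICE_STATUS = {0x8000: 1}   # a pretend I/O status register mapped at 0x8000

def bus_read(address):
    if address < 0x8000:
        return RAM.get(address, 0)        # RAM responds
    return DEVICE_STATUS.get(address, 0)  # the memory-mapped device responds

assert bus_read(0x0010) == 42   # ordinary memory access
assert bus_read(0x8000) == 1    # same read cycle reaches the device
```

This is also the idea behind the memory-mapped input/output mentioned earlier: devices become reachable with ordinary memory instructions.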

This principle is valid for various computers, which can be divided into three groups:

  • stationary;
  • compact (laptops, netbooks, etc.);
  • pocket (smartphones, etc.).

The system unit of a desktop computer, or the compact case of a portable one, contains the main logical units: a motherboard with the processor, a power supply, external memory drives, and so on.

References

1. Bosova L.L. Computer Science and ICT: Textbook for 8th grade. - M.: BINOM. Knowledge Laboratory, 2012.

2. Bosova L.L. Computer Science: Workbook for 8th grade. - M.: BINOM. Knowledge Laboratory, 2010.

3. Astafieva N.E., Rakitina E.A., Computer science in schemes. - M.: BINOM. Knowledge Laboratory, 2010.

4. Tanenbaum A. Computer Architecture. - 5th ed. - St. Petersburg: Piter, 2007. - 844 p.

1. Internet portal “All Tips”

2. Internet portal “Electronic encyclopedia “Computer””

3. Internet portal “apparatnoe.narod.ru”

Homework

1. Chapter 2, §2.1, 2.2. Bosova L.L. Computer Science and ICT: Textbook for 8th grade. - M.: BINOM. Knowledge Laboratory, 2012.

2. What does the abbreviation ЭВМ ("electronic computing machine") stand for?

3. What does the term “Computer Architecture” mean?

4. Who formulated the basic principles underlying computer architecture?

5. What is the architecture of modern computers based on?

6. Name the main functions of the central processor and RAM of a PC.

3. Von Neumann's principles. Architecture of the classical computer

The functioning of a computer is based on two fundamental concepts of computing technology: the concept of an algorithm and the principle of program control. An algorithm is a uniquely defined sequence of actions, consisting of formally defined operations on initial data, that leads to a solution in a finite number of steps.

Properties of algorithms

  • discreteness of the information the algorithm works with;
  • finiteness and elementarity of the set of operations performed when implementing the algorithm;
  • determinism: reproducibility of the algorithm's results;
  • mass applicability: the possibility of applying the algorithm to various initial data from an admissible set.

A program is a description of an algorithm in any language.

The principle of program control (PPC) was first formulated by the Hungarian-American mathematician and physicist John von Neumann, with the participation of Goldstine and Burks, in 1946, and it remains dominant at the present stage of development of computing technology.

The PPC includes several architectural and functional principles.

1) The principle of binary coding. Information is encoded in binary form and divided into units (elements) of information called words. The use of the binary number system is determined by the specifics of electronic circuits. A word is an indivisible unit of information.

2) Uniformity of information coding. Different types of information words differ in how they are used, not in how they are encoded: words representing different kinds of information (data, commands) are indistinguishable, and only the order of their use determines their role. The same commands can therefore process different data.
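This uniformity can be shown in miniature: the same binary word is a number when used as an operand and a command when decoded; only context decides. The 4-bit opcode/address split below is invented for illustration.

```python
# One 8-bit memory word, read two ways.
word = 0b0001_1010

as_data = word                              # used as an operand: just the number 26
opcode, address = word >> 4, word & 0x0F    # used as a command: opcode 1, address 10

assert as_data == 26
assert (opcode, address) == (1, 10)
```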

3) Address organization of RAM. Words of information are placed in memory cells and are identified by cell numbers called addresses. This determines how information is stored and identified: the cell address serves as the machine identifier of a value or a command.

4) The computer has a limited set of commands. Each individual command defines a simple (single) step of converting information.

5) The algorithm is implemented through sequential execution of commands. The computations prescribed by the algorithm reduce to the sequential execution of commands in an order uniquely determined by the program. The address of the next command is uniquely determined during execution of the current command (conditional jumps are possible). The computation continues until the halt command is executed. Advantages:

Ease of hardware implementation.

High versatility, which is limited only by the set of processor commands.

Disadvantages:

Principle 2 requires the programmer to use data of the various types correctly; violations lead to errors that are often difficult to find. In complex computational problems this greatly increases the complexity of software development.

Principle 3 assumes a linear organization of memory, which makes it difficult to compute the layout of elements of complex data types.

Classical computer architecture

Computer structure

In 1946 J. von Neumann, H. Goldstine, and A. Burks, in a joint article, set out new principles for the construction and operation of computers. The first two generations of computers were subsequently built on these principles. Later generations saw some changes, although von Neumann's principles remain relevant today.

In fact, Neumann managed to summarize the scientific developments and discoveries of many other scientists and formulate something fundamentally new on their basis.

Von Neumann's principles

    Use of the binary number system in computers. The advantage over the decimal number system is that devices can be made quite simple, and arithmetic and logical operations in the binary number system are also performed quite simply.

    Computer software control. The operation of the computer is controlled by a program consisting of a set of commands. Commands are executed sequentially one after another. The creation of a machine with a stored program was the beginning of what we call programming today.

    Computer memory is used to store not only data but also programs. Both program commands and data are encoded in the binary number system, i.e. their method of recording is the same, so in certain situations the same actions can be performed on commands as on data.

    Computer memory cells have addresses that are numbered sequentially. At any time, you can access any memory cell by its address. This principle opened up the possibility of using variables in programming.

    Possibility of conditional jump during program execution. Despite the fact that commands are executed sequentially, programs can implement the ability to jump to any section of code.

The most important consequence of these principles is that the program is no longer a permanent part of the machine (as it is, for example, in a calculator): it became possible to change the program easily, while the equipment remains unchanged and very simple.

By comparison, the program of the ENIAC computer (which had no stored program) was determined by special jumpers on its panels, and reprogramming the machine (re-setting the jumpers) could take more than a day. And although programs for modern computers may take years to write, they run on millions of computers after a few minutes of installation on the hard drive.

How does a von Neumann machine work?

A von Neumann machine consists of a storage device (memory), an arithmetic logic unit (ALU), a control unit (CU), and input and output devices.

Programs and data enter memory from the input device through the arithmetic logic unit. All program commands are written to adjacent memory cells, while data for processing can be held in arbitrary cells. In any program, the last command must be the halt command.

The command consists of an indication of what operation should be performed (from the possible operations on a given hardware) and the addresses of memory cells where the data on which the specified operation should be performed is stored, as well as the address of the cell where the result should be written (if it needs to be saved in memory).

The arithmetic logic unit performs the operations specified by the instructions on the specified data.

From the arithmetic logic unit, results are output to memory or to an output device. The fundamental difference between memory and an output device is that memory stores data in a form convenient for processing by the computer, while output devices (printer, monitor, etc.) present it in a form convenient for a person.

The control unit controls all parts of the computer. From the control device, other devices receive signals “what to do”, and from other devices the control unit receives information about their status.

The control unit contains a special register (cell) called the program counter. After a program and its data are loaded into memory, the address of the program's first instruction is written to the program counter. The control unit reads from memory the contents of the cell whose address is in the program counter and places it in a special device, the instruction register. The control unit determines the command's operation, marks out in memory the data whose addresses the command specifies, and controls the command's execution. The operation itself is performed by the ALU or by the computer's hardware.

As each command is executed, the program counter is incremented by one and thus points to the next command of the program. When a command must be executed that is not the next one in order but lies some number of addresses away, a special jump command containing the address of the target cell transfers control there.
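The mechanism just described, a program counter that normally advances by one but can be overwritten by a jump command, can be sketched as follows. The instruction set is invented for illustration.

```python
# A minimal von Neumann-style machine: the program counter selects the next
# command, and a conditional jump overwrites it to transfer control.
memory = {
    0: ("LOAD_CONST", 3),    # acc = 3 (loop counter)
    1: ("DEC", None),        # acc -= 1
    2: ("JUMP_IF_POS", 1),   # if acc > 0: program counter = 1
    3: ("HALT", None),
}

program_counter, acc = 0, 0
while True:
    opcode, operand = memory[program_counter]  # instruction register gets the command
    program_counter += 1                       # by default: next command in order
    if opcode == "LOAD_CONST":
        acc = operand
    elif opcode == "DEC":
        acc -= 1
    elif opcode == "JUMP_IF_POS" and acc > 0:
        program_counter = operand              # the jump rewrites the counter
    elif opcode == "HALT":
        break

print(acc)  # 0: the loop ran until the counter reached zero
```

The loop body (DEC, then the conditional jump back to it) executes three times before the jump condition fails and control falls through to HALT.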


Lesson 9. Backbone-modular principle of computer construction.

Assignment: using the educational text, answer the following questions (write in your notebook).

1. Who was the founder of the backbone-modular principle of modern PC architecture?

2. Computer architecture is...

3. List the basic principles underlying the backbone-modular construction of PC architecture.

4. What parts does the backbone consist of?

5. What is the device interface for?

6. What is used to negotiate interfaces? How does this coordination work (draw a diagram)?

7. How is data processed on a computer?

8. Draw a schematic diagram of the backbone-modular principle of a PC.

9. The backbone is...

10. What is the purpose of the control bus, address bus, data bus?

12. What does the modular principle allow the PC user? List the main advantages of the modular-backbone principle.

Homework: answer the questions and be prepared to discuss the educational text.

Educational text

Backbone-modular principle of computer construction

Let's remember the information received in previous lessons:

A computer is an electronic device designed to work with information: its input, processing, storage, output, and transmission. In addition, a PC is a unity of two entities: hardware and software.

Computer architecture is a description of its logical organization, resources and operating principles of its structural elements. Includes the main computer devices and the structure of connections between them.

Usually, when describing the architecture of a computer, special attention is paid to those principles of its organization that are characteristic of most machines belonging to the family being described, and also that influence programming capabilities.

The architecture of modern computers is based on principles of John von Neumann and the backbone-modular principle.

In 1946 J. von Neumann, H. Goldstine, and A. Burks, in a joint article, set out new principles for the construction and operation of computers. The first two generations of computers were subsequently built on these principles. Later generations saw some changes, although von Neumann's principles remain relevant today.

In fact, Neumann managed to summarize the scientific developments and discoveries of many other scientists and formulate something fundamentally new on their basis.

Von Neumann's principles

1. Use of the binary number system in computers. The advantage over the decimal number system is that devices can be made quite simple, and arithmetic and logical operations in the binary number system are also performed quite simply.


2. Computer software control. The operation of the computer is controlled by a program consisting of a set of commands. Commands are executed sequentially one after another. The creation of a machine with a stored program was the beginning of what we call programming today.

3. Computer memory is used to store not only data but also programs. Both program commands and data are encoded in the binary number system, i.e. their method of recording is the same, so in certain situations the same actions can be performed on commands as on data.

4. Computer memory cells have addresses that are numbered sequentially. At any time, you can access any memory cell by its address. This principle opened up the possibility of using variables in programming.

5. Possibility of conditional jump during program execution. Despite the fact that commands are executed sequentially, programs can implement the ability to jump to any section of code.

6. Availability of information input and output devices. These devices are basic and sufficient for computer operation at the user level.

7. The open architecture principle: rules for building a computer according to which each new block is compatible with the old ones and can easily be installed in the same place in the computer. Old blocks can be replaced with new ones just as easily, wherever they are located, so that the computer's operation is not only undisturbed but becomes more productive. This principle makes it possible not to discard but to modernize a previously purchased computer, easily replacing outdated units with more advanced and convenient ones and installing new ones. The connectors for attaching them are standard and require no changes to the design of the computer itself.

The most important consequence of these principles is that the program is no longer a permanent part of the machine (as it is, for example, in a calculator): it became possible to change the program easily, while the equipment remains unchanged and very simple.

A computer is not an indivisible, integral object. It consists of a number of devices (modules), from which the user can assemble a configuration to suit his own needs. Each device in the computer is controlled by an electronic circuit called a controller, or adapter. Some controllers can control several devices at once. All controllers and adapters interact with the processor and RAM through the system bus (a set of electronic lines; physically, a bus is a cable consisting of many wires).

The backbone provides data exchange between computer devices.

The backbone consists of three parts:

1. The address bus, which carries the address of the memory cell or device with which information is to be exchanged.

2. The data bus, which carries the information itself.

3. The control bus, which regulates the process: it carries signals that determine the nature of the exchange along the backbone, indicating which operation is to be performed.
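The three-part exchange described above can be sketched in a few lines of Python. This is a toy model, not real hardware behavior: the class name, the control signals, and the `transaction` method are all invented for this illustration. One "bus cycle" combines a control signal (what to do), an address (where), and, for writes, the data itself.

```python
# Toy model of the three backbone buses: each transaction puts an address on
# the address bus, a READ or WRITE signal on the control bus, and moves data
# over the (bidirectional) data bus. All names here are invented for this sketch.

READ, WRITE = "READ", "WRITE"

class Backbone:
    def __init__(self, size):
        self.ram = [0] * size          # the memory reachable over the bus

    def transaction(self, control, address, data=None):
        """One bus cycle: the control bus says what to do, the address bus says where."""
        if control == WRITE:
            self.ram[address] = data   # data travels processor -> memory
            return None
        if control == READ:
            return self.ram[address]   # data travels memory -> processor
        raise ValueError(f"unknown control signal: {control}")

bus = Backbone(size=16)
bus.transaction(WRITE, address=3, data=42)   # processor writes cell 3
print(bus.transaction(READ, address=3))      # 42
```

Note that the same `transaction` entry point serves both directions, mirroring the fact that the data bus is bidirectional while the address and control lines always originate at the processor.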

In order for a computer to function correctly, all its devices must work together, "understand" each other, and "not conflict." This is ensured by a common interface shared by all computer devices.
An interface is a means of connecting two devices, in which all physical and logical parameters are consistent with each other.

Since data exchange between devices occurs through the bus, to coordinate interfaces, all external devices are connected to the bus not directly, but through their controllers (adapters) and ports.

Ports can be serial or parallel. Slow or remote devices (mouse, modem) are connected to serial ports, and faster ones (scanner, printer) are connected to parallel ports. The keyboard and monitor are connected to specialized ports.

In order to avoid connecting a device to someone else’s port by mistake or ignorance, each device has an individual plug shape that does not fit into the “foreign” connector.

Information presented in digital form and processed on a computer is called data.

The sequence of commands that a computer executes while processing data is called program.

Processing data on a computer:

1. The user launches a program stored in long-term memory; it is loaded into RAM and begins to execute.

2. Execution: The processor reads the instructions and executes them. The necessary data is loaded into RAM from long-term memory or entered using input devices.

3. The output (received) data is written by the processor into RAM or long-term memory, and is also provided to the user using information output devices.
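The three stages above can be sketched as a short script. This is only an illustration of the data flow: a dict stands in for long-term memory (disk), a list stands in for RAM, and the file names and one-command "program" format are invented for this sketch.

```python
# Sketch of the three processing stages: launch, execution, output.
# disk (long-term memory) and ram are toy stand-ins; names are invented.

disk = {"prog.bin": [("ADD",)], "data.bin": [2, 3]}  # long-term memory

# 1. Launch: the program is copied from long-term memory into RAM.
ram = list(disk["prog.bin"])

# 2. Execution: the needed data is loaded into RAM; the processor reads
#    the command and carries it out.
ram += disk["data.bin"]          # RAM now holds [("ADD",), 2, 3]
(op,) = ram[0]
result = ram[1] + ram[2] if op == "ADD" else None

# 3. Output: the result is written back to long-term memory and shown.
disk["result.bin"] = [result]
print(result)  # 5
```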

To ensure information exchange between different devices, some kind of backbone must be provided to move information flows.

The backbone (system bus) includes three multi-bit buses: the data bus, the address bus, and the control bus, each a multi-wire line. The processor and RAM, as well as peripheral input, output, and storage devices that exchange information in machine language (sequences of zeros and ones in the form of electrical pulses), are connected to the backbone.

Data bus. This bus transfers data between different devices. For example, data read from RAM may be sent to the processor for processing, and then the received data may be sent back to RAM for storage. Thus, data on the data bus can be transferred from device to device in any direction, i.e. the data bus is bidirectional. The main operating modes of the processor using the data bus include the following: writing/reading data from RAM, writing/reading data from external memory, reading data from an input device, sending data to an output device.

The data bus width is determined by the processor capacity, that is, the number of binary bits that can be processed or transmitted by the processor simultaneously. The capacity of processors is constantly increasing as computer technology develops.

Address bus. The choice of the device or memory cell to which data is sent or read via the data bus is made by the processor. Each device or RAM cell has its own address. The address is transmitted along the address bus, and signals are transmitted along it in one direction - from the processor to RAM and devices (unidirectional bus).

The width of the address bus determines the amount of addressable memory (address space), that is, the number of one-byte RAM cells that can have unique addresses.

The number of addressable memory cells can be calculated using the formula:

N = 2^I, where I is the address bus width.

Each bus has its own address space, i.e. the maximum amount of addressable memory:

2^16 = 64 KB

2^20 = 1 MB

2^24 = 16 MB

2^32 = 4 GB
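The table above can be reproduced directly from the formula N = 2^I. The helper names below are invented for this sketch; the arithmetic itself is exactly the formula from the text, using binary units (1 KB = 1024 bytes).

```python
# Address space sizes for common address bus widths: N = 2**I, where I is
# the address bus width in bits and N the number of addressable one-byte cells.

def address_space(width_bits):
    """Number of uniquely addressable one-byte memory cells."""
    return 2 ** width_bits

def human_readable(n_bytes):
    """Format a byte count using binary units (1 KB = 1024 bytes)."""
    for unit in ("bytes", "KB", "MB", "GB", "TB"):
        if n_bytes < 1024:
            return f"{n_bytes} {unit}"
        n_bytes //= 1024
    return f"{n_bytes} PB"

for width in (16, 20, 24, 32):
    print(f"{width}-bit address bus -> {human_readable(address_space(width))}")
# 16-bit address bus -> 64 KB
# 20-bit address bus -> 1 MB
# 24-bit address bus -> 16 MB
# 32-bit address bus -> 4 GB
```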

Control bus. The control bus transmits signals that determine the nature of information exchange along the highway. Control signals indicate what operation - reading or writing information from memory - needs to be performed, synchronize the exchange of information between devices, and so on.

Modular principle allows the consumer to assemble the computer configuration he needs and, if necessary, upgrade it. Each individual computer function is implemented by one or more modules - structurally and functionally complete electronic units in a standard design. Organizing a computer structure on a modular basis is similar to building a block house.

The backbone-modular principle has a number of advantages:

1. To work with external devices, the same processor commands are used as for working with memory.

2. Connecting additional devices to the backbone does not require changes to existing devices, processor, or memory.

3. By changing the composition of the modules, you can change the power and purpose of the computer during its operation.

Today it’s hard to believe, but computers, without which many can no longer imagine their lives, appeared only some 70 years ago. One of those who made a decisive contribution to their creation was the American scientist John von Neumann. He proposed the principles on which most computers operate to this day. Let's look at how a von Neumann machine works.

Brief biographical information

Janos Neumann was born in 1903 in Budapest, into a very wealthy Jewish family, which later managed to obtain a noble title. From childhood he was distinguished by outstanding abilities in many areas. At the age of 23, Neumann had already defended his PhD thesis in mathematics. In 1930 the young scientist was invited to work in the USA, and Neumann soon became one of the first professors of the Institute for Advanced Study, where he worked until the end of his life. Neumann's scientific interests were extensive. In particular, he is one of the creators of the mathematical apparatus of quantum mechanics and of the concept of cellular automata.

Contributions to computer science

Before finding out which principle von Neumann's architecture does not comply with, it is interesting to know how the scientist came up with the idea of creating a modern type of computing machine.

An expert in the mathematics of explosions and shock waves, von Neumann served in the early 1940s as a scientific consultant to one of the United States Army's ordnance research laboratories. In the fall of 1943 he arrived in Los Alamos to take part in the Manhattan Project at the personal invitation of its leader. He was given the task of calculating the implosive compression of an atomic bomb's charge to critical mass. Solving it required large-scale calculations, which at first were carried out on hand calculators and later on IBM mechanical tabulators using punched cards.

There he also became acquainted with the progress being made on electromechanical and fully electronic computers. He soon became involved in the development of the ENIAC and EDVAC computers, which led him to write the unfinished "First Draft of a Report on the EDVAC", in which he presented to the scientific community a completely new idea of what computer architecture should be.

Von Neumann's principles

Computer technology had reached an impasse by 1945: machines stored the numbers being processed in decimal form, and the programs for performing operations were specified by installing jumpers on a plugboard.

This significantly limited the capabilities of computers. The real breakthrough was von Neumann's principles. They can be briefly expressed in one sentence: the transition to the binary number system and the principle of a stored program.

Analysis

Let us consider in more detail the principles on which the classical structure of the von Neumann machine is based:

1. Transition to binary system from decimal

This principle of the von Neumann architecture allows the use of fairly simple logic devices.

2. Software control of an electronic computer

The operation of a computer is controlled by a set of commands executed sequentially one after another. The development of the first machines with a program stored in memory marked the beginning of modern programming.

3. Data and programs are stored together in computer memory

Both data and program commands are written in the same way, in the binary number system, so in certain situations the same actions can be performed on commands as on data.
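The joint storage of commands and data can be illustrated with a toy memory of plain integers. The encoding used here (opcode * 100 + address) is invented purely for this sketch; the point is only that a command, being just a number, can be read and even rewritten like any other data.

```python
# Toy sketch of the stored-program principle: one memory array holds both
# instruction words and data words, encoded the same way (as integers).
# The encoding op*100 + address is invented for this illustration.

memory = [
    103,   # "load the value at cell 3"   (instruction)
    204,   # "add the value at cell 4"    (instruction)
    0,     # "halt"                       (instruction)
    7,     # data
    35,    # data
]

# Because a command is just a number, it can be decoded like data:
op, addr = divmod(memory[1], 100)
print(op, addr)        # 2 4

# ...and even changed like data (self-modifying code):
memory[1] += 1         # now "add the value at cell 5"
assert divmod(memory[1], 100) == (2, 5)
```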

Consequences

In addition, the architecture of the von Neumann machine has the following features:

1. Memory cells have addresses that are numbered sequentially

Thanks to the application of this principle, the use of variables in programming became possible. In particular, at any time you can access a particular memory cell by its address.

2. Possibility of conditional jump during program execution

As already mentioned, commands in programs must be executed sequentially. However, it is possible to jump to any section of the code.

How does a von Neumann machine work?

Such a machine consists of a storage device (memory), a control unit, an arithmetic-logic unit, and input and output devices. Program commands are written in adjacent memory cells, while the data they process may lie in arbitrary cells.

Any command must contain:

  • an indication of which operation is to be performed;
  • the addresses of the memory cells holding the source data on which the operation acts;
  • the address of the cell to which the result should be written.

Operations on specific source data specified by commands are performed by the ALU, and the results are recorded in memory cells, i.e., they are stored in a form convenient for subsequent machine processing, or transmitted to an output device (monitor, printer, etc.) and become accessible to humans.

The control unit controls all parts of the computer. From it, other devices receive signals-orders “what to do,” and from other devices it receives information about what state they are in.

The control device has a special register called the program counter (PC). After the source data and the program are loaded into memory, the address of the program's first command is written to the PC. The control unit reads from memory the contents of the cell whose address is held in the PC and places it in the command register. It then determines the operation corresponding to this command and fetches from memory the data whose addresses are indicated in it. Next, the ALU executes the operation, upon completion of which the contents of the PC are increased by one, i.e., made to point to the next command.
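The cycle described above (fetch into the command register, advance the program counter, decode, execute) can be sketched as a tiny interpreter. The instruction set here (LOAD/ADD/STORE/JNZ/HALT tuples) is invented for this sketch; a real machine would encode commands as binary words, but the control flow is the same, including the conditional jump, which works simply by overwriting the PC.

```python
# Minimal sketch of the von Neumann cycle: a program counter (PC), a command
# register, and a fetch-decode-execute loop over a single shared memory.
# The tuple-based instruction set is invented for this illustration.

def run(memory):
    pc = 0          # program counter: address of the next command
    acc = 0         # single accumulator standing in for the ALU's state
    while True:
        command = memory[pc]        # fetch into the "command register"
        pc += 1                     # PC now points at the next command
        op, arg = command
        if op == "LOAD":            # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":           # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":         # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JNZ":           # conditional jump: overwrite the PC
            if acc != 0:
                pc = arg
        elif op == "HALT":
            return acc

# Program and data share one memory: cells 0..4 are commands, 5..6 are data.
memory = [
    ("LOAD", 5),    # acc = 7
    ("ADD", 6),     # acc = 7 + 35
    ("STORE", 5),   # write the result back to cell 5
    ("HALT", 0),
    ("HALT", 0),
    7,              # data
    35,             # data
]
print(run(memory))  # 42
```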

Criticism

The shortcomings of the von Neumann architecture, and its current prospects, continue to be a subject of debate. That machines built on the principles put forward by this outstanding scientist are not perfect was noticed long ago.

Therefore, in exam papers in computer science you can often find the question “what principle does the von Neumann architecture not meet and what shortcomings does it have?”

When answering the second part, be sure to point out:

  • the semantic gap between high-level programming languages and machine command systems;
  • the problem of matching the throughput of main memory and of the processor;
  • the software crisis, caused by the fact that the cost of creating software far exceeds the cost of developing hardware, and by the impossibility of fully testing a program;
  • the lack of prospects for further performance growth, since the theoretical limit has already been reached.

As for which principle the von Neumann architecture does not comply with, it is the parallel organization of a large number of data streams and command streams, characteristic of multiprocessor architectures.

Conclusion

Now you know which principle the von Neumann architecture does not comply with. It is obvious that science and technology do not stand still, and perhaps very soon a completely new type of computer will appear in every home, thanks to which humanity will reach a new level of development. By the way, the training program “Von Neumann Architecture” will help you prepare for the exam. Such digital educational resources make it easier to learn the material and provide an opportunity to evaluate your knowledge.


