When and how did the first programs appear? Who was the first programmer?

Countess Ada Lovelace

At a technological exhibition in 1834, Charles Babbage first publicly presented his new invention - the great-grandmother of the modern computer.

Naturally, his speech was full of mathematical terms and logical arguments that were difficult for an unprepared listener to understand.

But Ada Lovelace (1815-1852) not only understood everything; she also bombarded Charles with questions that went straight to the heart of the matter.

Babbage was struck by the sharpness of the girl's mind; besides, Ada was almost the same age as his daughter, who had died young.

Who was this girl?

Ada Augusta Lovelace, née Byron, was born on December 10, 1815, into the family of the famous English poet Lord Byron and his wife Annabella. A month after the child's birth, Lord Byron left the family and never saw his daughter again.

Annabella did her best to ensure that her daughter would never become a poet. She hired outstanding teachers of the time to interest the girl in mathematics and music, and largely succeeded. Even during a serious illness that left Ada unable to walk for three years, she continued her studies.

In 1834, at that technological exhibition, the young lady's passion for mathematics found its outlet. Here was a great new opportunity: with the help of mathematics, to make a machine help people solve mathematical problems! Subsequently, Babbage supervised Ada's scientific studies, sent her articles and books of interest, and introduced her to his work.

Looking ahead, I can say from my own experience that when I wrote my first programs on a computer as a student, I was also quite astonished by the machine's capabilities in mathematical calculation. In the volume of calculations, in their speed, and in the absence of errors, the computer handled everything brilliantly.

In 1835, Ada married Lord King, who later received the title of Earl of Lovelace. They had two sons and a daughter, but neither children, nor husband, nor social life could tear Ada away from her beloved mathematics. It is not for nothing that she is called "The Lady of Numbers"!

In 1842, the Italian mathematician Luigi Menabrea, a lecturer in ballistics at the Turin Artillery Academy, published "An Essay on the Analytical Engine Invented by Charles Babbage". The work was written in French, and Babbage asked Ada Augusta to translate it into English.

The Countess of Lovelace, reasonably judging that her mother and a large staff of domestic servants were quite enough to look after the grandchildren, happily returned to the world of mathematics. Ada Augusta decided to devote herself entirely to her beloved science, to work on Babbage's machine, and to its wide popularization.

By the way, her husband fully supported her. Perhaps that is why his name went down in the history of computing.

For nine months the Countess worked on the text of the book, supplementing it along the way with her own comments and remarks. It was these comments and remarks that made her famous in the world of science and, at the same time, wrote her into history.

In one of these notes she wrote, entirely on her own, the first computer program in the history of mankind: an algorithm - a list of operations - for calculating Bernoulli numbers.

Anticipating the “stages” of computer programming, Ada Lovelace, like modern mathematicians, starts with a problem statement, then chooses a method of calculation that is convenient for programming, and only then proceeds to compose a program.
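
To make the idea concrete, here is a minimal modern sketch (in Python, not Lovelace's notation for the Analytical Engine's operation cards) that follows the same stages: the problem is to compute Bernoulli numbers, the chosen method is the classical recurrence over binomial coefficients, and only then comes the program itself.

```python
from fractions import Fraction
from math import comb

def bernoulli_numbers(n):
    """Return B_0 .. B_n as exact fractions, using the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1."""
    b = [Fraction(1)]                       # B_0 = 1
    for m in range(1, n + 1):
        s = sum(comb(m + 1, k) * b[k] for k in range(m))
        b.append(-s / (m + 1))              # B_m = -s / (m + 1)
    return b

if __name__ == "__main__":
    for i, value in enumerate(bernoulli_numbers(8)):
        print(f"B_{i} = {value}")           # B_2 = 1/6, B_4 = -1/30, ...
```

Lovelace's own table of operations expressed the same kind of calculation with the Analytical Engine's working variables and operation cards rather than with named variables and a built-in fraction type.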

Lovelace's "notes" laid the foundation for modern programming. One of the most important concepts of programming is the concept of a cycle, which she defines as follows:

“A cycle of operations is to be understood as any group of operations that is repeated more than once.”

Organizing loops in a program significantly reduces its size. Without such a reduction, practical use of the Analytical Engine would have been unrealistic: it worked with punched cards, and an enormous number of them would have been required for every task being solved.
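
A tiny illustration of that saving in modern terms (Python, invented example): without a loop every repetition must be written out - on the Analytical Engine, punched onto its own cards - while a loop states the repeated group of operations only once.

```python
# Without a loop: each repetition of the operation written out separately
total = 0
total += 1 ** 2
total += 2 ** 2
total += 3 ** 2
total += 4 ** 2
total += 5 ** 2

# With a loop: the repeated group of operations is stated only once
total_with_loop = 0
for n in range(1, 6):          # repeat the same operations for n = 1..5
    total_with_loop += n ** 2

assert total == total_with_loop == 55
```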

“It can be said with good reason that the Analytical Engine weaves algebraic patterns in the same way that the Jacquard loom reproduces flowers and leaves”

wrote the Countess of Lovelace. She was one of the few who understood how the machine worked and what its prospects were.

Already at that time, Ada Lovelace was fully aware of the colossal possibilities of the universal computer.

However, she was well aware of the limits of these possibilities:

“It is advisable to warn against exaggerating the possibilities of the Analytical Engine. The Analytical Engine does not claim to create anything truly new. The machine can do everything that we know how to prescribe to it. It can follow analysis; but it cannot foresee any analytic relations or truths. The function of the machine is to help us obtain what we are already familiar with.”

At the same time, already in the 1840s she saw in the machine what its inventor Babbage was afraid even to think about: “The essence and purpose of the machine will change depending on what information we put into it. The machine will be able to write music, paint pictures, and show science ways that we have never seen before.”

In her first and, unfortunately, only scientific work, Ada Lovelace considered a large number of issues that remain relevant to modern programming. Countess Lovelace's notes to Luigi Menabrea's book run to only 52 pages. In essence, that is all Ada Lovelace left to history. But such brevity is the sister of great talent: even 52 pages can change the world beyond recognition.

At the end of the 19th century, Herman Hollerith in America invented tabulating (punched-card) machines, which used punched cards to store numerical information.

Each such machine could execute only one specific program, manipulating punched cards and the numbers punched on them.

These counting-and-punching machines punched, sorted, summed, and printed numerical tables. On them it was possible to solve many typical tasks of statistical processing, accounting, and the like.

Hollerith founded a company to produce counting-and-punching machines; it was later transformed into IBM, now the most famous computer manufacturer in the world.

The immediate forerunners of computers were relay computing machines.

By the 1930s, relay automation had developed considerably, which made it possible to encode information in binary form.

During the operation of a relay machine, thousands of relays switch from one state to another.

Radio technology developed rapidly in the first half of the 20th century. The main elements of radio receivers and transmitters at that time were vacuum tubes.

Vacuum tubes became the technical basis of the first electronic computers.

The first computer - a universal machine built on vacuum tubes - was created in the USA in 1945.

This machine was called ENIAC (Electronic Numerical Integrator and Computer). Its designers were J. Mauchly and J. Eckert.

The counting speed of this machine exceeded the speed of relay machines of that time by a thousand times.

The first electronic computer, ENIAC, was programmed by plugging and switching: the program was built by connecting the machine's individual blocks with conductors on a patch board.

This complex and tedious procedure for preparing the machine for work made it inconvenient to operate.

The main ideas along which computing technology developed for many years were formulated by the famous American mathematician John von Neumann.

In 1946, J. von Neumann, H. Goldstine, and A. Burks published the report "Preliminary Discussion of the Logical Design of an Electronic Computing Instrument".

This paper outlined the principles of the design and operation of computers. Chief among them is the stored-program principle, according to which the data and the program are placed in the machine's common memory.
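
A toy sketch of the stored-program principle (Python; the four-instruction machine is invented purely for illustration): the program's instructions and its data sit in the same memory array, and the processor fetches both from that one memory.

```python
# A toy von Neumann machine: program and data share one memory array.
# The instruction set (LOAD/ADD/STORE/HALT) is invented for illustration.
memory = [
    ("LOAD", 8),     # 0: acc = memory[8]
    ("ADD", 9),      # 1: acc += memory[9]
    ("STORE", 10),   # 2: memory[10] = acc
    ("HALT", None),  # 3: stop
    None, None, None, None,
    2,               # 8: first operand (data)
    3,               # 9: second operand (data)
    0,               # 10: result is written here
]

acc, pc = 0, 0                    # accumulator and program counter
while True:
    op, addr = memory[pc]         # fetch the next instruction from memory
    pc += 1
    if op == "LOAD":
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[10])                 # -> 5
```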

The principal description of the structure and operation of a computer is called its architecture. The ideas set out in the paper mentioned above became known as the "von Neumann architecture".

In 1949, the first computer with the von Neumann architecture was built - the English EDSAC machine.

A year later, the American computer EDVAC appeared. Both machines existed only as single copies. Serial production of computers began in the developed countries of the world in the 1950s.

In our country, the first computer was created in 1951. It was called MESM - the Small Electronic Calculating Machine. The designer of MESM was Sergei Alekseevich Lebedev.

Under the leadership of S.A. Lebedev, the serial vacuum-tube computers BESM-1 (Large Electronic Calculating Machine), BESM-2, and M-20 were built in the 1950s.

At that time, these machines were among the best in the world.

In the 1960s, S.A. Lebedev led the development of the semiconductor computers BESM-3M, BESM-4, M-220, and M-222.

The outstanding achievement of that period was the BESM-6 machine. It was the first domestic computer, and one of the first in the world, with a speed of one million operations per second. Lebedev's subsequent ideas and developments contributed to the creation of more advanced machines of the following generations.

Electronic computing technology is usually divided into generations.

Generational changes were most often associated with a change in the element base of computers, with the progress of electronic technology.

This has always led to an increase in the computing power of computers, that is, speed and memory.

But this is not the only consequence of a change of generations. With such transitions, the architecture of the computer changed significantly, the range of tasks solved on computers expanded, and the way users interacted with the computer changed.

The first generation of computers were the vacuum-tube machines of the 1950s. The counting speed of the fastest first-generation machines reached 20 thousand operations per second (the M-20 computer).

To enter programs and data, punched tapes and punched cards were used.

Since the internal memory of these machines was small (it could hold only a few thousand numbers and program instructions), they were used mainly for engineering and scientific calculations not involving large amounts of data.

They were rather bulky installations containing thousands of vacuum tubes, sometimes occupying hundreds of square meters and consuming hundreds of kilowatts of electricity.

Programs for such machines were written in machine instruction languages, which was quite labor-intensive work.

Therefore, programming in those days was accessible to few.

In 1949, the first semiconductor device replacing the vacuum tube was created in the United States. It was called the transistor. Transistors quickly took root in radio engineering.

The second generation of computers

In the 1960s, transistors became the element base of second-generation computers.

The transition to semiconductor elements improved the quality of computers in all respects: they became more compact, more reliable, and less power-hungry.

The speed of most machines reached tens and hundreds of thousands of operations per second.

The volume of internal memory increased hundreds of times compared with first-generation computers.

External (magnetic) memory devices were greatly developed: magnetic drums and magnetic tape drives.

Thanks to this, it became possible to create information-reference and information-retrieval systems on computers.

Such systems require storing large amounts of information on magnetic media for a long time.

During the second generation, high-level programming languages appeared. The first of them were FORTRAN, ALGOL, and COBOL.

Programming ceased to depend on a particular machine model and became simpler, clearer, and more accessible.

Programming as an element of literacy has become widespread, mainly among people with higher education.

The third generation of computers was built on a new element base - integrated circuits. Using a very complex technology, specialists learned to mount quite complex electronic circuits on a small wafer of semiconductor material with an area of less than 1 cm².

These were called integrated circuits (ICs).

The first ICs contained dozens, then hundreds of elements (transistors, resistors, and so on).

When the degree of integration (the number of elements) approached a thousand, they began to be called large-scale integrated circuits (LSI); later, very-large-scale integrated circuits (VLSI) appeared.

Third-generation computers began to be produced in the second half of the 1960s, when the American company IBM started manufacturing the IBM-360 system of machines. These were IC-based machines.

A little later, machines of the IBM-370 series, built on LSI, began to be produced.

In the Soviet Union, production of the ES EVM series (Unified System of Computers), modeled on the IBM-360/370, began in the 1970s.

The transition to the third generation was associated with significant changes in computer architecture.

It became possible to run several programs on the same machine at the same time. This mode of operation is called multiprogramming.

The speed of the most powerful computer models has reached several million operations per second.

On third-generation machines, a new type of external storage device appeared - magnetic disks.

Like magnetic tapes, disks can store practically unlimited amounts of information.

But magnetic disk drives work much faster than tape drives.

New types of input/output devices came into wide use: displays and plotters.

During this period, the areas of application of computers expanded significantly. Databases, the first artificial intelligence systems, computer-aided design (CAD) systems, and automated control systems (ACS) began to be created.

In the 1970s, the line of small (mini) computers developed powerfully. The PDP-11 series machines of the American company DEC became something of a standard here.

In our country, the SM EVM series (System of Small Computers) was created on this model. These machines were smaller, cheaper, and more reliable than the big machines.

Machines of this type are well adapted for the purpose of controlling various technical objects: production plants, laboratory equipment, vehicles. For this reason they are called control machines.

In the second half of the 1970s, the production of minicomputers exceeded the production of large machines.

Fourth generation of computers

Another revolutionary event in electronics occurred in 1971, when the American company Intel announced the creation of the microprocessor.

The microprocessor is a very-large-scale integrated circuit capable of performing the functions of the main unit of a computer - the processor.

A microprocessor is a miniature "brain" that works according to a program stored in its memory.

Initially, microprocessors were built into various technical devices: machine tools, cars, aircraft. Such microprocessors automatically control the operation of this equipment.

By connecting a microprocessor to input/output devices and external memory, a new type of computer was obtained: the microcomputer.

Microcomputers belong to the fourth generation of machines.

A significant difference between microcomputers and their predecessors is their small size (about that of a household TV set) and their comparative cheapness.

This was the first type of computer to appear in retail sale.

The most popular type of computer today is the personal computer.

The emergence of the phenomenon of personal computers is associated with the names of two American specialists: Steve Jobs and Steve Wozniak.

In 1976, their first production PC, the Apple-1, was born, and in 1977, the Apple-2.

The essence of what a personal computer is can be summarized as follows:

A PC is a microcomputer with user-friendly hardware and software.

The PC hardware includes:

    a color graphic display,

    a mouse,

    a joystick,

    a convenient keyboard,

    easy-to-use compact disks (magnetic and optical).

The software allows a person to communicate with the machine easily, quickly learn the basic techniques of working with it, and get the benefits of a computer without resorting to programming.

Communication between a person and a PC can take the form of a game, with colorful pictures on the screen and sound accompaniment.

It is not surprising that machines with such properties quickly gained popularity, and not only among specialists.

The PC has become as common a household appliance as a radio or TV. PCs are produced in huge quantities and sold in stores.

Since 1980, the American company IBM has been the "trendsetter" of the PC market.

Its designers managed to create an architecture that has become the de facto international standard for professional PCs. The machines of this series were called IBM PC (Personal Computer).

In the late 1980s and early 1990s, the Macintosh machines of Apple Corporation became very popular. In the US, they are widely used in the education system.

The emergence and spread of the PC in terms of its significance for social development is comparable to the emergence of book printing.

It was the PC that made computer literacy a mass phenomenon.

With the development of this type of machine, the concept of "information technology" appeared, without which it is becoming impossible to get by in most areas of human activity.

There is another line in the development of fourth-generation computers: the supercomputer. Machines of this class have speeds of hundreds of millions and even billions of operations per second.

The first fourth-generation supercomputer was the American machine ILLIAC-4, followed by CRAY, CYBER, etc.

Of the domestic machines, the ELBRUS multiprocessor computer complex belongs to this series.

Fifth-generation computers are the machines of the near future. Their main quality should be a high intellectual level.

Fifth-generation machines are intended to implement artificial intelligence.

Much has already been practically done in this direction.

Computers and other computing devices occupy a huge part of our lives. With their help we not only look for necessary information and use useful programs, but also make purchases, communicate with friends and relatives, do our work, spend our leisure time, and much more. Today it is no trouble to scan a document or, say, download a favorite tune. But until quite recently, humanity knew no such possibilities.

Modern users may complain that a video file takes a few minutes longer to load than it should. Yet only 30-40 years ago, to watch a new film you had to go to the cinema at the appointed time. To hear a beautiful melody 100 years ago, you would have had to invite a musician and pay good money for it. And that is just entertainment. It is hard to imagine how much time was spent on calculations and paperwork, on communication and obtaining important information. Today machines do all of this for us thanks to one key activity - programming. Even a modern washing machine or multicooker is equipped with a simple, but nonetheless artificial, intelligence. We use such devices almost every day, yet we rarely think about who made all this possible. Today we will talk about the people who have made our lives much easier and opened up the incredible world of program code to us - programmers. You will find out who the first programmer in history was and how it all began.

First steps towards the program

It is commonly assumed that only men have the passion and aptitude for programming. If you look at a list of the most outstanding programmers, only male names catch the eye. Yet few people know that the first programmer in the history of mankind was a woman. Who was this remarkable person?

Many of us have heard of the famous English poet George Gordon Byron. His daughter, Ada Augusta Lovelace (née Byron), was the first programmer in the world. A love of mathematics was instilled in the girl by her mother from childhood, and the best scientists of her circle worked with her. Her first teacher was Augustus De Morgan, considered an outstanding mathematician and logician. It is these two disciplines - mathematics and logic - that form the basis of programming, and they served her well in her later scientific work.

The world's first programmer - Ada Augusta Byron

In the history of information technology, the name of Charles Babbage stands among the first. He worked on the theory of functions and on the mechanization of computation. Babbage is rightfully considered the progenitor of the first computers and is called the "father of the computer". He designed the first digital machine and called it the Analytical Engine. A significant event in Ada Augusta's life was her acquaintance with this outstanding inventor. The girl's mother knew him well, and Babbage himself sincerely rejoiced at each of Ada's new achievements in mathematics.

Introduction to the Analytical Engine

The young talent had the chance to visit the workshop of the "father of the computer". She paid the visit in the company of Mrs. De Morgan, the wife of her mathematics teacher and a family friend. In her memoirs of this visit, Mrs. De Morgan noted that all the guests gazed at the machine with great amazement; for them it was something unusual and entirely strange.

Only Ada Augusta, according to Mrs. De Morgan, saw nothing supernatural before her. She carefully examined the machine, grasped the principle of its operation, and appreciated the invention. Thus the first woman programmer became acquainted with computing technology for the first time. After this visit, the girl became even more passionate about scientific work. She knew and believed that this invention was a step into the future and only the beginning of achievements that could mechanize any process. And, as we can see today, she was not mistaken.

The first programmer and her everyday life

At the age of nineteen, Ada Augusta married. Her chosen one was Lord King, later Earl of Lovelace. The lord was then 29 years old, and Ada's family life proceeded happily and calmly. Her husband supported all her scientific endeavors and admired her intellect. The couple quite often attended society receptions, but the young lady's interests lay elsewhere. Even after her marriage, her communication with Charles Babbage grew ever closer and more cordial. The girl reminded Babbage of his daughter, who had died young, especially since Ada was almost her age. The "father of the computer" admired the girl's abilities in turn; they often exchanged interesting ideas and showed each other their calculations. Over time they became not only colleagues but also good friends. Ada could not stand superficial society or foolish people. She was demanding of herself and of those around her. With her mathematical mindset, she was drawn to things then considered uncharacteristic of women. The girl became a true genius of her time and devoted her life to science.

Ada Augusta does not give up her scientific work

Over time, the first programmer was forced to step away from science a little. The reason was the birth of three children: Ada had to devote all her time to her family. But her love for mathematics was so strong that she was not prepared to sacrifice science for a quiet family life with her husband and children. When she realized she could no longer live without mathematics, she asked Babbage to find her a good teacher so she could continue her studies. At that moment she was more confident in her abilities than ever before and was ready to go far in her work. Babbage replied with a letter in which he said that for the time being he could not find a worthy teacher for her but was continuing the search. He also noted that her knowledge of mathematics was simply brilliant, and that he doubted whether she needed a teacher at all.

Exploring Babbage's machines

A little later, Ada Augusta began to study in detail the computing machines designed by Babbage. She asked the inventor to send her detailed information, calculations, and drawings of the device. She seriously believed that their cooperation could be more than productive.

The Italian scientist Menabrea published his article on Babbage's machine, and the first programmer undertook to translate it. Together with the "father of the computer", she prepared detailed comments on the publication, which would later make her famous in certain circles.

First programs

Her first programs were written for calculating Bernoulli numbers. In her writings, Ada Augusta explained in most detail the solution of a system of two linear equations. There, for the first time, appeared the notion of working variables and their successive modification in a program - a technique that is still an integral part of even the most complex modern programs. The second program, described in the comments to Menabrea's article, was composed by Ada Augusta for calculating the values of a trigonometric function and included a loop. Recurrent nested loops formed the basis of her third program.
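
As a rough modern analogue (Python, not Lovelace's notation), here is what solving a system of two linear equations with successively updated working variables can look like: each step overwrites intermediate quantities, much as her table of operations successively changed the machine's working variables.

```python
def solve_2x2(a11, a12, b1, a21, a22, b2):
    """Solve
        a11*x + a12*y = b1
        a21*x + a22*y = b2
    by elimination, updating working variables step by step."""
    # Working variables: eliminate x from the second equation
    factor = a21 / a11
    a22_new = a22 - factor * a12
    b2_new = b2 - factor * b1

    # Back-substitution
    y = b2_new / a22_new
    x = (b1 - a12 * y) / a11
    return x, y

# Example: x + 2y = 5, 3x - y = 1  ->  x = 1, y = 2
print(solve_2x2(1, 2, 5, 3, -1, 1))
```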

The name of the first programmer, however, is rarely found in publications about the history of technological progress. For the most part this is because, during Ada's lifetime, not a single one of her programs was ever run. That happened only after the death of this outstanding woman.

The last years of the scientist's life

Ada died at the age of 36 - the same age at which her father had died, weakened by bloodletting; Ada's own illness was cancer. Even though she sought treatment, the last years of her life were agonizing. Each new calculation tired her more and more, but she did not stop doing science until her death. The programming language Ada, two small towns in America, and a college are named after her.

It may seem surprising that the first programmer in the world was a woman. But this young lady gave the world work that became the foundation of modern programming.

When did the first computers appear? It is not so easy to answer this question, since there is no single accepted classification of electronic computers, nor an agreed definition of what can be counted as one and what cannot.

First mention

The very word "computer" was first documented in 1613 and meant a person who performs calculations. But in the 19th century, people realized that the machine never gets tired of working, and it can do the job much faster and more accurately.

The era of computers is most often counted from 1822. The first computer was invented by the English mathematician Charles Babbage. He conceived and began to build the Difference Engine, which is considered the first automatic computing device. It could process multiple sets of numbers and print out the results. But unfortunately, due to funding problems, Babbage was never able to complete a full-fledged version.

But the mathematician did not give up, and in 1837 he introduced the first mechanical computer, called the Analytical Engine. It was the very first general-purpose computer. At the same time, his collaboration with Ada Lovelace began. She translated and supplemented his works, and also wrote the first programs for his invention.

The Analytical Engine consisted of the following parts: an arithmetic logic unit, an integrated memory unit, and a device for controlling the movement of data. Due to financial difficulties, it too was not completed during the scientist's lifetime. But Babbage's schemes and designs helped other scientists who built the first computers.

Almost 100 years later

Oddly enough, for a whole century computers hardly advanced in their development. In 1936-1938 the German scientist Konrad Zuse created the Z1, the first electromechanical programmable binary computer. In 1936, Alan Turing described the Turing machine.

It became the basis for further theories about computers. The machine emulated the actions of a person following a list of logical instructions and printed the result of its work on a paper tape. The Zuse and Turing machines are the first computers in the modern sense, without which the computers we are used to today would not have appeared.

Everything for the front

The Second World War also influenced the development of computers. In December 1943, Tommy Flowers introduced a secret machine called Colossus, which helped British agents break the ciphers of German messages. It was the first fully electronic programmable computer. The general public learned of its existence only in the 1970s. Since then, computers have attracted the attention not only of scientists but also of defense ministries, which actively supported and financed their development.

As for which digital computer should be considered the first, there are disputes. In 1937-1942, Iowa State University professor John Vincent Atanasoff and graduate student Cliff Berry developed their ABC computer. And in 1943-1946, J. Presper Eckert and J. Mauchly, scientists at the University of Pennsylvania, built the far more powerful ENIAC, weighing 50 tons. Thus Atanasoff and Berry built their machine earlier, but since it was never fully functional, the title of "very first computer" often goes to ENIAC.

First commercial models

Given their huge dimensions and design complexity, computers were available only to military departments and large universities, which assembled them on their own. But already in 1942, K. Zuse began work on the fourth version of his brainchild, the Z4, and in July 1950 he sold it to the Swiss mathematician Eduard Stiefel.

The first mass-produced computers were the models with the laconic name 701, which IBM began producing on April 7, 1953. A total of 19 of them were sold. Of course, these were still machines intended only for large institutions. To become truly mass-market, computers needed a few more important improvements.

Then, on March 8, 1955, the Whirlwind was launched - a computer originally conceived during the Second World War as a flight simulator for pilots, but completed only in time for the beginning of the Cold War. It later became the basis for the development of SAGE, an air-defense system designed for automatic targeting of interceptor aircraft. The key features of Whirlwind were its 512 bytes of random-access memory and its output of graphic information to a screen in real time.

Technology to the masses

The TX-0 computer, introduced in 1956 at MIT, was the first to use transistors. This greatly reduced the cost and size of the equipment.

The team of scientists who developed the TX-0 then left the institute, founded the Digital Equipment Corporation, and in 1960 introduced the PDP-1 computer, which began the era of minicomputers. These took up no more than a single room, or even a closet, and were intended for a wider range of customers.

The first desktop computers began to be produced by Hewlett-Packard in 1968.


The first computer program was written by a woman, a mother of three children and an aristocrat. And she wrote it even before the world's first computer appeared.

Countess Lovelace, or Ada A. Byron-King, was the daughter of the great British poet Lord Byron. Her father left her mother when Ada was little. Her mother was extremely happy that her little daughter took a keen interest in mathematics, although there were also attempts to follow in her father's footsteps and write poetry. Once, at the age of 12, she showed her mother sheets of paper covered with drawings: young Ada had sketched a design for a flying machine.

At the age of 17, presented at court, the girl did not look for a suitor but instead attached herself to the mathematician and researcher Charles Babbage. She was so fascinated by the idea of an automatic calculating machine - considered insane at the time - that she devoted all her energy to its design. Babbage, for his part, was inspired by the fact that Napoleon had once commissioned something similar, and that his court scientists had been unable to complete the invention because of the outbreak of war.

Babbage came up with a name for his future machine and called it the Difference Engine. In the early 1820s, the scientist won the interest of the British Admiralty, which became the sponsor of his work. The machine was to be huge, taking up an entire room and calculating to the 10th decimal place. In ten years, the scientist built only one block of his device. Then the idea of the Analytical Engine captured Babbage: he essentially offered the world the scheme of an almost modern computer. The processor he called the "mill"; there were punched cards and instruction programs. The machine consisted of many gears and was to be powered by steam. In 1871 Charles Babbage died, and the government of England decided that no one else was capable of building such a machine and closed the project.

Nevertheless, on July 13, 1843, Ada sent the mathematician a letter in which she set out an algorithm for machine calculation of Bernoulli numbers. Ada believed it a delusion to think that a machine's processing of data must be purely analytical or arithmetic: a machine can handle numbers just as it can handle letters or other symbols. The Countess believed that in the future machines would be able to compose music and even write poetry.

She had her own pet amusement - the search for a formula that would always let her win at the racetrack sweepstakes. Ada died at the age of 36, having lived exactly as long as her father, and was buried in the same tomb as Lord Byron. On her birthday, December 10, programmers are honored in many countries, and in the 1970s the Pentagon named the Ada programming language in her honor.


