Getting Started with Amazon Web Services. Free VPN with Amazon: Setting Up a VPN Server in the AWS Cloud

Amazon Web Services (AWS) is a secure cloud services platform that provides computing power, storage, databases, content delivery, and other functionality to help you scale and grow your business.

Sounds cool =) But in practice it can be intimidating: there are a lot of services. Which one does what, and why? Here's a little cheat sheet for Amazon's services.

Services "Run an App:

EC2

Should be called
Amazon Virtual Servers
Scope of application
Host things that you think are computers.
Look like
VPS provided by Linode, DigitalOcean and Rackspace

IAM

Should be called
Users, Keys and Certs
Scope of application
Set up users, add new AWS Keys and certificates.

S3

Should be called
Amazon Unlimited FTP Server
Scope of application
Store photos and other assets for websites. Keep backups and publicly shared files. Host static sites. By the way, many other services also store their data in S3.

VPC

Should be called
Amazon Virtual Colocated Rack
Scope of application
Add an extra layer of protection to everything you store online. Make it look like all of your AWS services are in one small network, rather than scattered across a huge one.
Look like
VLANs if you are network savvy

Lambda

Should be called
AWS App Scripts
Scope of application
Run small snippets in JS, Java or Python to perform individual tasks.

Services for a web developer

API Gateways

Should be called
API Proxy
Scope of application
Proxy your application's APIs through this service to process traffic, test new versions, and more.
Look like
3Scale

RDS

Should be called
Amazon SQL
Scope of application
A managed MySQL, Postgres, or Oracle database for your application.
Look like
Heroku Postgres

Route53

Should be called
Amazon DNS + Domains
Scope of application
Buy a new domain and set up DNS records.
Look like
DNSimple, GoDaddy, Gandi

SES

Should be called
Amazon Transactional Email
Scope of application
Send transactional emails (password resets, notifications, etc.). You could also use it for newsletters, but it's better not to.
Look like
SendGrid, Mandrill, Postmark

CloudFront

Should be called
Amazon CDN
Scope of application
Speed up site loading by optimally distributing the delivery of static files to users.
Look like
MaxCDN, Akamai

CloudSearch

Should be called
Amazon Fulltext Search
Scope of application
Get all the data from S3 or RDS and look for what you need in them.
Look like
Sphinx, Solr, Elasticsearch

DynamoDB

Should be called
Amazon NoSQL
Scope of application
A scalable key-value store for your application.
Look like
MongoLab

ElastiCache

Should be called
Amazon Memcached
Scope of application
Memcached or Redis for your application.
Look like
Redis to Go, Memcachier

Elastic Transcoder

Should be called
Amazon Beginning Cut Pro
Scope of application
Handle video processing chores (format conversion, compression, etc.).

SQS

Should be called
Amazon Queue
Scope of application
Store data for further processing in a queue.
Look like
RabbitMQ, Sidekiq

WAF

Should be called
AWS Firewall
Scope of application
Block dangerous requests to sites protected by Cloudfront (don't let people try to guess 10,000 passwords for /wp-admin).
Look like
Sophos, Kaspersky

Services for mobile developers

Cognito

Should be called
Amazon OAuth as a Service
Scope of application
Let users log in with Google, Facebook, etc.
Look like
OAuth.io

Device Farm

Should be called
Amazon Drawer of Old Android Devices
Scope of application
Test your app on multiple iOS and Android devices at the same time.
Look like
MobileTest, iOS emulator

Mobile Analytics

Should be called
Spot on Name, Amazon Product Managers take note
Scope of application
Keep track of what users are doing in your applications.
Look like
Flurry

SNS

Should be called
Amazon Messenger
Scope of application
Send push notifications, emails, and/or SMS.
Look like
UrbanAirship, Twilio

Services for deploying code

CodeCommit

Should be called
Amazon GitHub
Scope of application
Version control of your code.
Look like
Github, BitBucket

Code Deploy

Should be called
The name is fine as it is
Scope of application
Deploy your code from a CodeCommit (or GitHub) repository to multiple EC2 instances.
Look like
Heroku, Capistrano

CodePipeline

Should be called
Amazon Continuous Integration
Scope of application
Run automated tests of your code and make the necessary changes.
Look like
CircleCI, Travis CI

EC2 Container Service

Should be called
Amazon Docker as a Service
Scope of application
Run a Dockerfile on an EC2 instance to host your site.

Elastic Beanstalk

Should be called
Amazon Platform as a Service
Scope of application
Move your app from Heroku to AWS when Heroku gets too expensive.
Look like
Heroku, BlueMix, Modulus

Corporate Services

AppStream

Should be called
Amazon Citrix
Scope of application
Host a copy of a Windows application on a Windows machine and give users remote access to it.
Look like
Citrix RDP

Direct Connect

Should be called
Scope of application
Pay for dedicated line access from your data center or network to AWS.
Look like
Tunnel to bypass the traffic jam

Directory Service

Should be called
Actually a very accurate name.
Scope of application
Bring together applications that require Microsoft Active Directory for management.

WorkDocs

Should be called
Amazon Unstructured Files
Scope of application
Share Word documents with colleagues.
Look like
Dropbox, Data Anywhere

WorkMail

Should be called
Amazon Company Email
Scope of application
Give everyone in the company the same email system and calendar.
Look like
Google Apps for Domains

Workspaces

Should be called
Amazon Remote Computer
Scope of application
An interface for controlling a computer remotely.

Service Catalog

Should be called
Amazon Setup Already
Scope of application
Give all AWS users in your group access to the applications you write so they don't have to read guides like this.

Storage Gateway

Should be called
S3 that considers itself part of your corporate network
Scope of application
Stop buying storage for Word documents. Make it easy to move files from your network to S3.

Big Data services

Data Pipeline

Should be called
Amazon ETL
Scope of application
Extract, Process and Store all data from AWS, as well as set up schedules and receive error messages.

Elastic Map Reduce

Should be called
Amazon Hadooper
Scope of application
Handle large amounts of text or raw data stored in S3.
Look like
Treasure Data

Glacier

Should be called
Very slow Amazon S3
Scope of application
Make backups of backups stored in S3. Reserves for a rainy day.

Kinesis

Should be called
Amazon High Throughput
Scope of application
Quickly save large amounts of data (like analytics or a list of people retweeting Kanye West) to analyze later.
Look like
Kafka

RedShift

Should be called
Amazon Data Warehouse
Scope of application
Store analytical data, process it and upload it.

Machine Learning

Should be called
Skynet
Scope of application
Predict behavior based on current data to solve various problems.

SWF

Should be called
Amazon EC2 Queue
Scope of application
Create a service of "thinkers" and "workers" on top of EC2 to complete the task. Unlike SQS, it has built-in logic.
Look like
IronWorker

Snowball

Should be called
AWS Big Old Portable Storage
Scope of application
AWS Snowmobile in miniature.
Look like
Shipping a Network Attached Storage device to AWS

Services for managing AWS

CloudFormation

Should be called
Amazon Services Setup
Scope of application
Set up multiple related services in one go.

CloudTrail

Should be called
Amazon Logging
Scope of application
Log your AWS stack activity.

CloudWatch

Should be called
Amazon Status Pager
Scope of application
Get notified when your AWS services are misbehaving.
Look like
PagerDuty, Statuspage

Config

Should be called
Amazon Configuration Management
Scope of application
Don't go crazy when you need to keep an eye on a large AWS system.

OpsWorks

Should be called
Amazon Chef
Scope of application
Control how your app runs with things like autoscaling.

Trusted Advisor

Should be called
Amazon Pennypincher
Scope of application
Find out what you are paying for.

Inspector

Should be called
Amazon Auditor
Scope of application
Check your AWS system for security issues.
Look like
Alert Logic

A year of free use of cloud services: that is what Amazon offers every new user. The resources provided are, of course, limited, but they are enough to get to know the platform and, for example, set up your own VPN server.

The problem with cloud computing is that many people still do not understand what it is or what to do with it. The fashionable word "cloud" is on everyone's lips, but that is about it. In the article "Amazon S3 for ordinary mortals" we already talked about cloud storage that provides as much file space as you need and withstands any load (even a huge influx of users). S3 is just one of a whole range of innovative technologies from Amazon Web Services (AWS for short). Starting in November, the provider lets you get to know its services without charging a fee (subject to certain conditions). That only reinforced our desire to cover them in more detail.

Amazon Web Services

Amazon's arsenal of cloud services is quite large, but three are the most widely used: Amazon Elastic Compute Cloud (EC2), Amazon Elastic Block Store (EBS), and Amazon Simple Storage Service (S3). Today we are primarily interested in the first one. It is, in effect, the concept of cloud computing put into practice. With EC2 you can launch any number of computers in the "cloud" with the configuration and operating system you need, and all of it within a few minutes. Each such virtual computer is called an instance. After it starts (usually a couple of minutes), you immediately get root access via SSH, or desktop access via RDP, depending on the operating system. Amusingly, an instance is billed by the hour: you can stop the virtual server at any time and no money is charged, or turn it on only when needed, in which case the cost is measured in cents. Besides "computer time", you also pay for traffic, both incoming and outgoing.
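
For illustration, pausing and resuming an instance from the command line looks roughly like this with the modern AWS CLI (which did not exist when this article was written; the instance ID below is a placeholder):

# Stop the instance: billing for compute time stops, the attached EBS volume is kept
aws ec2 stop-instances --instance-ids i-0123456789abcdef0

# Start it again when it is needed
aws ec2 start-instances --instance-ids i-0123456789abcdef0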

Depending on the instance type, it comes with a corresponding processor and amount of RAM. A "disk drive", however, is not included in this configuration. Another Amazon technology, EBS, handles hard drive virtualization. You can say "I want a 25 GB drive" and it will provide one, of whatever size you ask for. Such a drive is called a Volume and is attached to the instance, so the system gets its HDD. Everything written to it is preserved regardless of the lifetime of the instance itself. The last technology, S3, is also designed for storing files, but in a completely different way. It is essentially a bottomless container for files which, if desired, can be made available over the web. You get exactly as much storage as you need: 10 MB, 1 GB or even 5000 GB, with no restrictions except a maximum size of 5 GB per file.
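
As a rough sketch of the same Volume workflow with the current AWS CLI (the availability zone, volume ID, instance ID and device name are placeholders):

# Create a 25 GB EBS volume in the instance's availability zone
aws ec2 create-volume --size 25 --availability-zone us-east-1a

# Attach it to the instance; it shows up as a block device inside the OS
aws ec2 attach-volume --volume-id vol-0123456789abcdef0 --instance-id i-0123456789abcdef0 --device /dev/sdf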

As I said, under the "AWS Free Usage Tier" promotion, every new user gets to try these services for free. "Try" means the free resources are limited; if you want more, you pay. In short, we get 750 hours of EC2 instance usage every month (enough to run a virtual server around the clock), 10 GB of EBS (enough to install, say, Ubuntu on the server), and 5 GB of S3. You can also try other Amazon technologies, but we will not touch on them in this article. The main thing is that we effectively get a free server for experiments, one that also runs on cloud technology. What you do with it is limited only by your imagination, but one of the most interesting options is to set up your own VPN located in the States!

Registration in the service

Before you can use any Amazon service, you need to create an account. Go to the AWS home page (aws.amazon.com) and click "Sign Up Now". On the registration page, select "I am a new user" and proceed to create an Amazon account. You will definitely need a payment card; that is the only requirement. Don't worry: as long as you stay within the special offer for newcomers, no fee is charged. Amazon will hold 1 or 2 dollars to check that the "plastic" is valid and then return them. A Visa or MasterCard will do; you don't even have to get one from a bank, since a virtual card can be bought at Qiwi terminals. This Amazon account is mostly financial in nature and is used for billing.

To access the cloud services, you must additionally subscribe to the ones you need (EC2, EBS, S3, etc.). The security system requires phone verification: at one of the registration steps the service automatically calls you and asks for a 4-digit PIN code shown on the screen at that moment. An important step is obtaining access keys. To work with EC2 and S3 you will need two kinds of credentials: an Access Key ID with a Secret Access Key, and an X.509 certificate. To take advantage of the free test drive you do not need to specify anything else anywhere: Amazon will sign you up for all the necessary services. Once registered, you get access to the AWS Management Console. Our task is to bring up a virtual server, so head straight to the EC2 management section.

Getting started with EC2

The technology is designed so that you can start and stop any number of instances (Amazon EC2 instances) within a couple of minutes. The service level agreement guarantees 99.95% availability, which is an impressive figure. To begin, click the "Launch Instance" button in the EC2 management console. You can choose from several types of virtual servers with different configurations.

A standard instance has the following characteristics: "Small Instance (Default): 1.7 GB of memory, 1 EC2 Compute Unit (1 virtual core with 1 EC2 Compute Unit), 160 GB of instance storage, 32-bit platform" and costs $0.10 per hour on Unix and $0.125 on Windows. In addition, you pay $0.10 per gigabyte of incoming traffic and $0.17 per gigabyte of outgoing traffic. Make a note of this (the freebie will end eventually). What actually interests us is the instance type Amazon created specifically for the trial period: the Micro Instance. Using it is free.

The cloud platform offers a choice of operating systems to install. An OS image is called an AMI (Amazon Machine Image) and, besides the system files themselves, it can include the software you need (for example Apache, MySQL, Memcached, etc.) as well as any other files. Later you will be able to build such images yourself. There are a large number of ready-made AMIs from Amazon itself and from enthusiasts, based on both Linux and Windows; the Community AMIs catalog lists more than 6067 of them. What matters to us is picking a convenient distribution, so let it be Ubuntu. Searching by keyword turns up quite a few Ubuntu AMIs, but almost all of them assume 15 GB of EBS, which does not fit into the free 10 GB limit. Fortunately, enthusiasts have put together an Ubuntu 10.04 build, ami-c2a255ab, that takes up exactly 10 GB. Find it in the list and launch it. A wizard will ask for various parameters; you can leave everything at the defaults. But it is very important, as I said earlier, to set the instance type to Micro Instance. Otherwise Amazon will charge you for every hour the server runs.
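
For reference, the same launch can be sketched from the command line rather than the web wizard (here with today's AWS CLI; the key pair name is whatever you created, the security group is the default one, and the AMI is the one found above):

# Launch one free-tier Micro Instance from the Ubuntu AMI
aws ec2 run-instances --image-id ami-c2a255ab --instance-type t1.micro --key-name my-keypair --security-groups default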

Launching an instance

After completing all the steps of the wizard, you will have a ready instance. On the Instances tab you can watch it start up; wait for the "running" flag to appear in the State column, which means the server is ready. The parameters of the running server are shown there too. Public DNS is the external name of the server. There is a nuance: both the domain name and the IP address of the virtual server change every time the instance is started. But! On the "Elastic IPs" tab you can obtain a static IP address and bind it to the instance. It is important to make this binding right away: until you do, the service charges you for the address. This is deliberate, so that users do not hoard static IPs they do not really need. If you try to ping the host or connect via SSH right now, you are in for a disappointment. The reason is simple: by default the firewall drops all connections. This is easy to fix by editing the security policy in the Security Group section. Do as shown in the screenshot.
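
The same Elastic IP and firewall steps, sketched with the command-line tools instead of the console (the addresses and instance ID are placeholders; besides SSH, PPTP needs TCP port 1723, plus GRE, IP protocol 47, if the instance runs inside a VPC):

# Allocate a static address and bind it to the instance
aws ec2 allocate-address
aws ec2 associate-address --instance-id i-0123456789abcdef0 --public-ip 203.0.113.10

# Open SSH and the PPTP control port in the default security group
aws ec2 authorize-security-group-ingress --group-name default --protocol tcp --port 22 --cidr 0.0.0.0/0
aws ec2 authorize-security-group-ingress --group-name default --protocol tcp --port 1723 --cidr 0.0.0.0/0

# Allow GRE (protocol 47), required for PPTP tunnels in a VPC
aws ec2 authorize-security-group-ingress --group-name default --protocol 47 --cidr 0.0.0.0/0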

From now on we have a working EC2 instance and can start configuring the Ubuntu installed on it. To do this, connect to the server over SSH. Good old PuTTY is ideal for this. True, the server gave us the key in pem format, while PuTTY needs ppk. No matter: the PuTTYgen utility quickly converts the key to the right format. First load the key ("Load private key file"), then save it where you want via the "File" menu. If you have not set up key-based SSH connections before, it is done as follows:

  • in the "Sessions" section, enter the IP of our instance (Elastic IP) in the Host Name field;
  • in the "Connection -> Data" section, in the "Auto-Login" field, specify the username "ubunta", which will be used for authorization in the system;
  • in the "Connection -> SSH -> Auth" section, specify the path to our private key;
  • in the "Session" section, enter the name of the session and save it using the "Save" button.

From now on, all that is needed to connect is to select the desired session and click the "Open" button. You will be asked to enter a passphrase for your key.
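
If you are connecting from Linux or macOS rather than Windows, PuTTY and the key conversion are unnecessary: the pem file works directly with plain OpenSSH (the key file name here is just an example):

chmod 400 my-aws-key.pem
ssh -i my-aws-key.pem ubuntu@<elastic-ip>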

Setting up PPTP

If you did everything right, your virtual server's console appears in the PuTTY window, namely the Ubuntu welcome message. So we already have a working virtual server in the cloud and SSH access to it. We could now host a website on it, or, say, set up SSH forwarding and securely tunnel application traffic. Anything is possible: it is a dedicated server, only in the cloud. As planned, though, we will set up a full-fledged VPN server on the instance. There are options here: you can configure OpenVPN, or you can use an ordinary PPTP daemon. Both approaches have drawbacks. OpenVPN requires a separate client to connect. PPTP needs no client, but the connection may fail if your ISP filters GRE packets. For me, the second option is more convenient.

Given that we have a handy Ubuntu at our disposal, raising a PPTP daemon is a piece of cake. Let's start by installing the service:

sudo aptitude install pptpd

The next step is to configure the daemon a bit. First you need to add ranges of IP addresses that will be issued to connected clients. To do this, uncomment and correct the last 2 lines in the /etc/pptpd.conf file:

localip 192.168.242.1
remoteip 192.168.242.2-5

With these settings, the PPTP daemon itself will receive the address 192.168.242.1, and there will be 4 possible addresses for clients: from 192.168.242.2 to 192.168.242.5. It is also worth specifying DNS server addresses. These can be Amazon's own server (172.16.0.23) or, for example, the Google Public DNS servers. They are written in the /etc/ppp/pptpd-options file:

ms-dns 8.8.8.8
ms-dns 8.8.4.4

The last step is to add users to connect to the PPTP daemon:

echo "<username> pptpd <password> *" | sudo tee -a /etc/ppp/chap-secrets

Replace <username> and <password> with the desired credentials. If necessary, there may be several such users. As soon as new entries are added to the /etc/ppp/chap-secrets file, restart the PPTP daemon:

sudo /etc/init.d/pptpd restart
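
Before trying to connect, it is worth a quick check that the daemon really is listening on the PPTP control port (1723):

sudo netstat -tlnp | grep 1723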

In principle, you can already try to connect to the server. The connection will be established, but you will not be able to access the Internet through this VPN connection. This is because we haven't enabled packet forwarding and NAT yet. We fix this situation by uncommenting the following line in the /etc/sysctl.conf file:

net.ipv4.ip_forward=1

Reload the config so the change takes effect:

sudo sysctl -p

And enable NAT by adding a new firewall rule:

sudo iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE

True, after a reboot this rule will disappear. 🙂 Therefore, it is better to add this command to the /etc/rc.local config right away, placing the following line above the "exit 0" line:

iptables -t nat -A POSTROUTING -o eth0 -j MASQUERADE

That's it, the VPN is now fully operational. You can establish a connection and visit some IP-checking service to make sure your address is now in the USA. A resource like speedtest.net will give you a benchmark of the bandwidth; my VPN works quite fast. Amazon includes 15 gigabytes of incoming traffic and the same amount outgoing. Going over the limit costs "astronomical" money: 10 cents per gigabyte. 🙂
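
To confirm from the command line which external address your traffic leaves with while the tunnel is up, any "what is my IP" service will do; for example, from a connected client:

curl http://checkip.amazonaws.com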

Instead of a conclusion

Clearly, Amazon's freebie is aimed primarily at attracting new customers and developers and will end after a while. But think about it: even when you have to pay for an EC2 instance, you can turn it on strictly as needed. With occasional use of the server you can easily stay within a few dollars a month, which is in any case cheaper than any commercial VPN service. Such a flexible approach lets you do even more interesting things, for example spin up a cluster of a dozen servers that are switched on only when needed to perform some resource-intensive task. This idea became even more attractive after Amazon introduced instance types with powerful GPUs supporting CUDA. And isn't it great to get a feel for the cutting-edge cloud technologies that major online projects run on?

Convenient EC2 control

The AWS web console, while providing everything you need, is not always convenient. For more comfortable work it is better to install the Elasticfox plugin for Firefox. Configuring the add-on comes down to entering, in its settings, the AWS Access Key and AWS Secret Access Key received during registration. In addition, Amazon itself provides a set of console utilities (s3.amazonaws.com/ec2-downloads/ec2-api-tools.zip) for interacting with EC2; they require the Java Runtime Environment to be installed.
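
The console utilities expect environment variables pointing at the X.509 certificate and private key obtained during registration (plus JAVA_HOME and EC2_HOME); after that, listing your instances is a one-liner. A rough sketch with example paths:

export EC2_PRIVATE_KEY=~/.ec2/pk-XXXXXXXX.pem
export EC2_CERT=~/.ec2/cert-XXXXXXXX.pem
ec2-describe-instances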

DVD

The disk contains the necessary files for working with AWS.


Google, Microsoft, and IBM are all technology giants, leaders in many fields with billions of users. Yet there is one business segment in which they have trailed Amazon for years.

And no, that segment is not e-commerce, the field most tightly associated with the Amazon name. Few people know that the company also operates in another area that affects billions of users, and holds the number one position in the world there.

Amazon's cloud computing infrastructure market share at the start of 2016 was even larger than that of Microsoft, IBM, and Google combined.

That field is cloud computing, under the name Amazon Web Services (AWS), first announced in 2006. At the time, many investors doubted the decision of CEO Jeff Bezos, since the area had nothing to do with e-commerce, Amazon's core business was seeing declining profits, and the cost of staff and infrastructure for AWS was far from small.

In 2015, Bezos was proven right when he first reported "huge" profits from AWS. In 2016, AWS brought in $12.2 billion in revenue, with a profit of $3.1 billion.

So what is Amazon Web Services? Amazon describes its product as a "cloud computing service". The main functions of AWS are computing, storage, security, data analytics, artificial intelligence services, an Internet of Things platform, and more.

The two most popular AWS services are Amazon Elastic Compute Cloud (Amazon EC2) and Amazon Simple Storage Service (Amazon S3). Both provide users with practically unlimited storage and computing resources and the ability to quickly scale the workload. That is the common strong point of cloud computing: flexibility in resource use and cost lets individual developers work with infrastructure equivalent to that of large technology corporations.

Many well-known names are now AWS customers, such as Netflix, Twitter, The New York Times, and Nasdaq. For these companies, renting Amazon's infrastructure instead of building their own lets them focus on other technology and business problems, as Netflix's CTO once put it. He also praised cloud computing as a good, flexible solution for companies like Netflix, where user demand can force the infrastructure to expand rapidly.

It is no coincidence that Amazon got involved in this seemingly unrelated field. In the early 2000s the company had a system that helped build websites for online sales, but along the way Amazon ran into problems scaling its infrastructure and with the complexity of the software. That was the moment its engineers built a freer and simpler solution.

They then realized they could provide such services beyond Amazon itself. Benjamin Black, a member of the group, pitched the idea to Jeff Bezos. The CEO liked it a lot and envisioned a platform that would let anyone, even students living in dorms, work with the tools needed to start a technology company.

So far the idea has held up; it has simply been extended to the whole world.

Google and Microsoft certainly could not ignore this potential market. Both tech giants now have products competing with AWS, but they cannot match Amazon's pace of shipping new features. Amazon's "cloud computing" market share is almost equal to the shares of Google, Microsoft, and Salesforce put together.

Amazon's business continues to grow rapidly, but the growth rate is declining. This was to be expected given the high sales volumes the company has achieved, notes Quartz.

The AWS division brought Amazon more than 11% of its revenue and more than half of its profit at the end of 2018. Profit in the cloud business amounted to $7.3 billion in 2018, compared to $4.33 billion a year earlier.

Profit keeps increasing despite rising costs: in 2018 expenses amounted to $18.36 billion, against $13.13 billion in 2017.

In 2018, Amazon's total revenue jumped 31%, and AWS contributed 47% of that rise. This is more than the contribution of the North American market (33%) and other countries combined (21%).

At Amazon's quarterly and annual earnings conference call, Amazon's CFO Brian Olsavsky said spending on buildings and equipment used to run cloud services will increase in 2019.

In 2018, AWS expanded to two new regions, and by the first half of 2020 it plans to add four more regions and 12 new availability zones within them.

AWS added many new customers in 2018, including Ellie Mae, Korean Air, Santander Openbank, and Pac-12. Mobileye and Guardian Life Insurance have named AWS as their cloud provider of choice, while the National Bank of Australia has selected AWS as its long-term cloud adoption partner.

Revenue in the segment of AI-based software platform solutions grew by 106.3%

History

2019

Amazon copies open source software and sells it in its cloud

In mid-December 2019, The New York Times (NYT) published an article claiming that Amazon takes open source software and sells it on its cloud infrastructure. The company responded to the publication and denied the claims.

During 2019 there were several publications in the media about how the business of software makers was being undermined: the Amazon Web Services (AWS) division took free versions of their products and began offering them as fully managed cloud services.

As an example, the NYT cites Elastic. According to the article, Amazon copied its open source software and built its own Elasticsearch service on it. Amazon thus ended up competing with Elastic by offering a service built on open source software that Elastic had spent time and money developing and supporting. Elastic sued Amazon, alleging that its competitor was using its trademark illegally.

This behavior by Amazon and other vendors has led open source companies to split their products into freely distributed and licensed editions, the latter intended for organizations that want to offer the product as a cloud service. MongoDB and Redis, for example, have switched to such a business model, according to the ComputerWeekly portal.


Although open source software is generally free, vendors make money from such products by providing support services to organizations, providing updates and bug fixes on a commercial basis.

Previously, software mostly had to be installed on client servers, but with the advent of public cloud services such as AWS, this need has disappeared. The fact is that the software is hosted in the cloud, and its manufacturer offers customers a full product management service.

NYT journalists interviewed several industry experts, who described Amazon's complicated relationship with the open source community. Developers' main gripe is that the company takes an open source product and embeds its code into the AWS platform. Amazon then offers software that it claims is fully compatible with the original open source project, but distributes it as a service on AWS.

"In recent years there has been a shift in the monetization of open projects. As major cloud providers began offering the same services as open source software vendors, the original open source business models came under threat," says Aiven CTO and co-founder Heikki Nousiainen. "However, in the end it is the users who dictate what they will work with in the future. And these users increasingly prefer true open source licenses and look for licenses that reflect the traditions of the open source community."

Andi Gutmans, vice president of analytics and ElastiCache at AWS, said in response to the NYT publication that open projects allow any company to use the software on its own computers or in the cloud, as well as to run services based on it.

Launching Veeam Backup for AWS

ARIS Cloud Solution Availability

On August 1, 2019 it became known that the ARIS Cloud solution from Software AG became available in the Amazon Web Services (AWS) Marketplace catalog. Customers can now purchase it directly as software as a service (SaaS). Read more.

2018: The number of vacancies of specialists in Amazon Web Services grew by 107.15% in three years

In December 2018, job search portal Indeed published a study reporting on the rapidly growing demand for cloud computing professionals.

So, by the end of 2018, the number of vacancies that require Google Cloud knowledge soared by 1082% compared to three years ago. In the case of Amazon Web Services (AWS) and Microsoft Azure clouds, growth rates were 107.15% and 165.9%, respectively. Read more.

2017

Record revenue of $17.5 billion

In 2017, Amazon Web Services (AWS) revenue was $17.5 billion, up from $12.2 billion a year earlier. The profit of Amazon's cloud business over the same period grew from $3.1 billion to $4.3 billion.

The division responsible for cloud services brings Amazon virtually all of its profit and about 10% of its revenue. In 2016 the latter figure stood at 9%.

At the same time, the division's expenses continue to grow: in 2017 they exceeded $13 billion, whereas in 2016 they came to $9.1 billion.

According to Amazon CFO Brian Olsavsky, AWS reached $20 billion in 12-month revenue during 2017.


As GeekWire points out, Amazon's cloud business grew in 2017 despite strong competition from Microsoft and other companies. At the same time, Amazon's competitors do not disclose revenue from services for deploying cloud infrastructure, the area in which the American Internet giant specializes. Judging by AWS's results, the division is unlikely to have lost any share of this market, the publication adds.

During a conference on the publication of financial statements, Brian Olsavsky rejected the possibility of a spin-off of Amazon Web Services. According to the top manager, such a restructuring can be effective on the one hand, but on the other hand it will become a big problem. Both Amazon and AWS get more out of working together, although to some extent they function separately from each other, Olsavsky said.

Several major companies became AWS customers in 2017, including Expedia, Ellucian, DigitalGlobe, and The Walt Disney Company.

1300 new features

At the AWS re:Invent conference, which took place in Las Vegas from November 27 to December 1, 2017, Amazon announced a massive upgrade to its cloud infrastructure.

Amazon Web Services (AWS) solutions received 1,300 new features, up from just over 1,000 in 2016.

Among the cloud innovations presented as part of AWS re:Invent 2017, it is worth highlighting the SageMaker service, designed to create a machine learning model. SageMaker implements ten well-known supervised and unsupervised learning algorithms. Processes can be run in parallel on dozens of instances, which greatly speeds up model building.

Solutions such as Amazon Rekognition (identifies objects and faces in user videos), Amazon Transcribe (speech-to-text), Amazon Translate (language translator) and Amazon Comprehend (analyzes text for key phrases and emotional coloring) were also presented.

In addition, new tools for Amazon Web Services allow you to record phone calls and study them, determining, for example, whether the client is satisfied with the communication with the employee of the company or not. Thanks to these technologies, leaders of organizations can monitor the activities of their subordinates and train them.

Launch of the Secret Region service

Main article: Amazon Secret Region

According to an official statement from Amazon, Amazon S3 technical support staff were performing routine maintenance on the billing system, which required shutting down several payment-system servers. A simple typo resulted in "more servers than planned" going down, among them the servers supporting two other S3 subsystems.

The AWS Service Health Dashboard (SHD), which displays the status of all S3 services, also malfunctioned: while almost nothing was working, SHD showed that everything was fine. Amazon had to keep customers informed about the real situation via Twitter.

Disabled systems could not be restored for five hours. As it turned out, some of the deactivated servers have not been restarted for many years. And because S3 has grown significantly in recent years, "the process of restarting these services and performing the necessary security and metadata integrity checks has taken longer than expected."

As a result of the Amazon cloud outage, major sites and services hosted on S3 experienced disruptions. In particular, there were problems with the Apple Music service. Popular Western news outlets, including The Verge and Business Insider, had trouble serving images on their sites. The US Securities and Exchange Commission (SEC) was also affected.


AWS S3 also powers popular services such as Netflix, Spotify, and Airbnb. While none of these services went offline as a result of the crash, users have complained about bugs and slowdowns.

By approximately 1:00 a.m. on Wednesday, March 1, 2017, Amazon had fully restored S3, after which there were no problems. Media called this incident the largest Internet collapse since 2015.

After this failure, Amazon announced plans to implement new systems to ensure business continuity. The new measures are expected to minimize the likelihood that tens of thousands of network resources "collapse" because of a simple typo.

It is also planned to make changes to the overall architecture of S3 to speed up the recovery of servers after a planned or unplanned outage.

More than 148,000 websites and 122,000 unique domains use Amazon S3, mostly located in the United States, according to SimilarTech. Amazon cloud services are preferred by 0.8% of the top million sites on the Internet. By comparison, CloudFlare is used by 6.2% of sites.

Amazon increases the cost of services in Russia

2016

Record revenue of $12 billion

After the release of financial statements, Amazon shares fell in price by more than 4% in electronic trading after the close of the exchange on February 2, 2017. The decline came as the company's fourth-quarter earnings and first-quarter guidance came in below Wall Street's expectations.

3 Reasons Why AWS Is Successful

At the end of July 2016, Amazon management spoke about the main factors due to which the online retailer came close to becoming the first company in the world with annual cloud revenue of $10 billion.

In the first half of 2016, Amazon Web Services (AWS) generated $5.5 billion in revenue, which means that Amazon Web Services could generate $11 billion in revenue for the full year.


1. Functionality and pace of innovation. In January-June 2016, the AWS infrastructure gained 422 new services and features, against 722 for the whole of 2015. Olsavsky noted that Amazon continues to develop important areas such as data analytics and machine learning.

2. Partners and ecosystem. In the second quarter of 2016, AWS attracted a cloud-industry giant in Salesforce.com and also signed contracts with a number of large customers (GE Oil & Gas, Kellogg's, Brooks Brothers, etc.) who decided to run SAP software in the AWS cloud. In addition, AWS expanded its infrastructure by opening a new data center in India, bringing the number of regions with Amazon data centers to 13.

3. Experience. Brian Olsavsky said Amazon is a pioneer of the public cloud market. Its head start allowed the company to pull away from competitors and, as of the end of July 2016, earn four times more in the cloud market than its closest pursuer, Microsoft Azure.

At the same time, Amazon's CFO does not rule out that competitors may expand their presence, given the large size of the public cloud market. However, AWS will remain the leader, he assured.

Salesforce.com leases Amazon data centers for $400 million for 4 years

2015

Growth of revenue to $8 billion

Q1: Cloud business revenue up 50% to $1.57 billion

Revenue from the cloud business, compared to the previous year, grew by almost 50% and reached $1.57 billion, which accounted for about 7% of the total income of the online retailer. The division's operating income rose 8% to $265 million, a solid margin of 16.9%.

Jeff Bezos, CEO of Amazon, said: "Amazon Web Services is a $5 billion business and growing at an ever-increasing rate."

Frankfurt has a well-developed network infrastructure: it is easier to get a dedicated fiber-optic line from Paris to Frankfurt than to Ireland, Amazon notes. At the same time, service in the Frankfurt center will be somewhat more expensive than in Ireland due to higher electricity prices, salaries, and other costs. The center will operate two availability zones with independent power, cooling, and security, connected by a high-speed network.

Gartner Magic Quadrant

Strengths: AWS has built its cloud to handle almost any workload, making it popular with small businesses, mission-critical organizations, and web developers. No one disputes the vendor's market share - according to Gartner (2014), Amazon sells 5 times more computing power than the other 14 companies in the quadrant combined.

What to consider: the biggest complaint about AWS is that all "add-ons" are paid for separately; Amazon does not bundle prepackaged services the way some of its competitors do. This lack of included service and support puts some users off, since those items mean an extra line of expense on their bills.

2011

In February 2011 it became known that an agreement had been concluded to provide technical support and optimize the operation of Citrix products and Windows applications running on Amazon Web Services (AWS). The agreement will improve the compatibility and performance of Windows applications on AWS while keeping the Xen virtualization platform continuously up to date. Citrix plans to streamline the deployment of Citrix XenServer, its commercial server virtualization platform, giving users an easy transition from enterprise data centers to the Amazon Elastic Compute Cloud (Amazon EC2) infrastructure. As a result, AWS customers will benefit from Citrix's deep experience in virtualization and Windows application delivery optimization. The collaboration between Citrix and Amazon Web Services will bring users the following benefits:

    • Expanded Interoperability – Through collaboration, XenServer customers will be able to take advantage of the scalable and flexible AWS infrastructure and pay-per-service cloud computing. XenServer users can now easily connect, migrate and manage virtual machines both based on AWS and based on a local installation of XenServer.
    • Optimized for Windows – As an expert in Windows virtualization and delivery, Citrix will leverage AWS technologies and optimize Windows images for enterprise deployments of Windows applications on the AWS platform.
    • Enhanced Cloud Solutions – Citrix will enhance essential cloud solutions for enterprise users such as disaster recovery, on-demand application delivery, improved security and regulatory compliance.

Amazon Web Services (AWS) cloud users will be able to use Oracle Database 11g from the second quarter of 2011 as part of the Amazon Relational Database Service (Amazon RDS). Holders of valid Oracle licenses will be able to start using the service immediately at no additional cost; it will also be possible to rent Oracle with hourly payment for the computing power actually used. Oracle will be fully integrated into the existing AWS infrastructure, and users will be able to request the required resources in self-service mode through a unified web interface. They will also be able to reduce the cost of maintaining the DBMS somewhat, since backups, updates, and some other administrative tasks will be performed on Amazon's side. Oracle is not yet available under Amazon RDS, but Amazon is inviting interested users to try the service with a free IaaS database. Amazon has also been handed the dubious status of having the worst quality-of-service agreement; however, a similar agreement for the recently launched HP Compute Cloud could be even worse.

Both Amazon's and HP's agreements include a set of strict conditions that customers must meet for the QoS guarantees to take effect. For example, AWS requires applications to be hosted in at least two availability zones (separate data centers), and the agreement is considered violated only if both zones fail; HP guarantees compensation only when all zones are unavailable. Thus, the analyst believes, it is almost impossible for customers to obtain compensation for cloud outages. The Amazon and HP agreements do not even apply in the case of storage failures, CRN notes.

According to Kaspersky Lab expert Dmitry Bestuzhev, the Amazon cloud contains many instances of malicious code designed to steal financial data. Some also believe that hackers used the Elastic Compute Cloud (EC2) service to launch one of the attacks on Sony's online entertainment network in April and May.

"Recently there have been reports that the Amazon cloud has served as a platform for successful attacks on Sony," Bestuzhev writes in a blog post about Amazon's troubles. “Well, today I discovered that Amazon Web Services [the cloud] is now being used to seed code that steals financial information.”

He discovered that the cybercriminals behind these attacks were located in Brazil and used several previously registered accounts. Bestuzhev writes that he alerted Amazon to the presence of malicious code, but 12 hours later, the dangerous links were still there and active.

These attacks on Sony, writes Bestuzhev, and the discovery of malicious links in the Amazon cloud indicate that cybercriminals are increasingly using official cloud services as a springboard for their attacks.

The financial-data-stealing code he discovered comes in several forms and, once delivered to the victim's computer, acts in different ways, Bestuzhev writes. In one case it acts as a rootkit: it looks for four different antivirus products and blocks them from running, along with GBPlugin, a protection program used by many Brazilian banks for online transactions. The code can steal financial information for nine Brazilian and two international banks, as well as Microsoft Live Messenger credentials, digital certificates used by eToken devices, and CPU information, hard disk volume numbers, PC names and other data that some banks use for login authentication.

The malicious code hosted on Amazon transmits the stolen data in two ways: by e-mail to the cybercriminal's Google Gmail account, or through a special PHP file that inserts it into a remote database. Moreover, Bestuzhev writes, the malicious code is protected by a legitimate anti-piracy tool called The Enigma Protector to make it harder to decompile.

All this shows, writes Bestuzhev, that cybercriminals will find new ways to use the cloud to launch their attacks, and cloud providers should increase their protection measures.

"I believe that official cloud services will continue to be used by criminals for various kinds of cyberattacks," writes Bestuzhev. "Cloud vendors should consider improving their monitoring systems and expanding their security personnel to effectively mitigate attack attempts from and through their clouds."

2010

Improving the toolkit for developing mobile applications

In 2010, Amazon announced the release of tools to make it easier for developers to create applications with access to the Amazon Web Services (AWS) platform, the company reported in its blog. AWS is a set of cloud services that includes a storage service (Amazon S3), a database hosting service, and a messaging platform. All of this can now be used with the appropriate development kit, the AWS SDK for PHP, which simplifies developing PHP applications that run in the Amazon cloud. The toolkit is designed for PHP 5.2 and higher. With it, developers can build applications that use various elements of the Amazon cloud: Simple Storage Service (S3), Elastic Compute Cloud (EC2), and the SimpleDB database. The development kit is based on the CloudFusion toolkit and includes an API library, code samples, and documentation. Amazon also offers guidance on migrating from CloudFusion 2.5 to the AWS SDK for PHP and, as the company emphasized, CloudFusion will continue to exist as an open project. The APIs that accompany Amazon's cloud services can be used from any programming language, but for convenience the company offers ready-made development kits for a number of platforms: Java, Microsoft .NET, and now PHP.


