Transfer Data Fast with the New AWS Snowball and Kinesis Firehose
|Stuart Parkerson in Mobile Tech Sunday, October 11, 2015|
Amazon Web Services has released AWS Snowball and Amazon Kinesis Firehose, two new services that help AWS users transfer data of all types and sizes to the AWS Cloud more quickly and cost-effectively.
AWS Snowball is a petabyte-scale data transport appliance that can securely transfer 50 TB per appliance of data into and out of AWS. Amazon Kinesis Firehose is a fully managed service for loading streaming data into AWS, available now for Amazon S3 and Amazon Redshift, with other AWS data stores coming soon.
These new services support the growing number of applications and the large volumes of data being moved into the AWS Cloud, including app log files, digital media, genomes, and petabytes of sensor data from connected devices. While AWS Direct Connect provides customers with a dedicated, fast connection to the AWS network, AWS Snowball and Amazon Kinesis Firehose were created for AWS users that need to transfer data in large batches, have data located in distributed locations, or require continuous loading of streaming data.
AWS Snowball offers a durable, tamper-resistant, encrypted, and portable storage appliance that customers can use to transfer large volumes of data into and out of AWS. For example, users can move 100 TB of data to AWS in less than a week, at as little as one-fifth of the cost of using high-speed Internet.
Users create a job in the AWS Management Console, AWS ships the appliance directly to the customer, and upon receiving it the customer simply plugs it into their local network. AWS Snowball provides a simple data transfer client that customers use to encrypt and transfer up to 50 TB of data to each appliance.
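The job-creation step can also be scripted. The sketch below assumes the boto3 Snowball API mirrors the console workflow described above; the bucket ARN, address ID, and IAM role ARN are placeholders you would replace with your own:

```python
# Placeholder identifiers -- substitute your own bucket, shipping address, and IAM role.
BUCKET_ARN = "arn:aws:s3:::example-import-bucket"
ADDRESS_ID = "ADID00000000-0000-0000-0000-000000000000"
ROLE_ARN = "arn:aws:iam::123456789012:role/example-snowball-role"

def build_import_job_params(bucket_arn, address_id, role_arn):
    """Assemble the parameters for a 50 TB Snowball import job."""
    return {
        "JobType": "IMPORT",
        "Resources": {"S3Resources": [{"BucketArn": bucket_arn}]},
        "AddressId": address_id,
        "RoleARN": role_arn,
        "SnowballCapacityPreference": "T50",  # the 50 TB appliance
        "ShippingOption": "SECOND_DAY",
    }

def create_import_job():
    import boto3  # imported lazily so the parameter builder works without AWS installed
    client = boto3.client("snowball")
    params = build_import_job_params(BUCKET_ARN, ADDRESS_ID, ROLE_ARN)
    response = client.create_job(**params)
    return response["JobId"]
```

Calling `create_import_job()` (with real identifiers and AWS credentials configured) returns the job ID you would then track while AWS ships the appliance.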
Customers can use multiple AWS Snowball appliances in parallel to transfer larger data sets within the same time frame. Once a customer’s data is completely loaded onto an AWS Snowball, its E Ink shipping label is automatically updated with the AWS shipping address, and customers can track the status of the transfer job using Amazon Simple Notification Service (SNS), text messages, or the AWS Management Console.
AWS Snowball appliances use multiple layers of security to protect customer data. In addition to tamper-resistant enclosures, AWS Snowball also employs end-to-end, 256-bit encryption, along with an industry-standard Trusted Platform Module (TPM) designed to ensure both security and full chain-of-custody for customer data.
Once a customer's data has been transferred from the AWS Snowball to AWS's data stores (initially Amazon S3), AWS erases all data from the AWS Snowball appliance, following the standards defined by the National Institute of Standards and Technology (NIST) guidelines for media sanitization. Customers can get a report and confirm that all of their data has successfully been loaded into AWS before deleting the local copy of their data.
Amazon Kinesis Firehose
Amazon Kinesis Firehose provides the ability to load streaming data into AWS. It can capture and automatically load streaming data into Amazon S3 and Amazon Redshift, enabling near real-time analytics with existing business intelligence tools and dashboards. It is a fully managed service that automatically scales to match the throughput of data and requires no ongoing administration.
Amazon Kinesis Streams has been available for a couple of years now, allowing customers to build applications that collect, process, and analyze streaming data with very high throughput. Many companies use Amazon Kinesis Streams to capture streaming data and load it into Amazon S3 or Amazon Redshift.
Until now, this required customers to manage the Amazon Kinesis data streams and write custom code to load the data. Now, Amazon Kinesis Firehose makes this as simple as an API call. Amazon Kinesis Firehose captures data from hundreds of thousands of different sources and loads it directly into AWS in near real time.
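Loading a record really does reduce to a single API call. A minimal sketch using the boto3 Firehose client (the stream name is a placeholder; the newline-delimited JSON encoding is one common convention, not a Firehose requirement):

```python
import json

def encode_record(event):
    """Serialize an event as a newline-delimited JSON record for Firehose."""
    return (json.dumps(event) + "\n").encode("utf-8")

def put_event(stream_name, event):
    import boto3  # imported lazily so encode_record is usable without AWS credentials
    firehose = boto3.client("firehose")
    # Firehose buffers incoming records and delivers them to the
    # destination (e.g. an S3 bucket) in batches.
    return firehose.put_record(
        DeliveryStreamName=stream_name,
        Record={"Data": encode_record(event)},
    )
```

There is no shard management or consumer application to write; the service handles buffering and delivery.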
Companies can create an Amazon Kinesis Firehose Delivery Stream in the AWS Management Console and specify the target Amazon S3 bucket or Amazon Redshift table, and the time frequency at which they want fresh data delivered to the destination. Users can also configure Amazon Kinesis Firehose to batch, compress, and encrypt streaming data before delivery at specified time intervals.
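The batching, compression, and delivery-frequency settings described above map onto the delivery stream's buffering hints. A sketch with boto3 (bucket and role ARNs and the `firehose/` prefix are placeholder assumptions):

```python
def build_s3_destination(bucket_arn, role_arn,
                         interval_seconds=300, size_mb=5):
    """S3 destination config: buffer by time or size, GZIP-compress on delivery."""
    return {
        "RoleARN": role_arn,
        "BucketARN": bucket_arn,
        "Prefix": "firehose/",  # placeholder key prefix for delivered objects
        "BufferingHints": {
            "IntervalInSeconds": interval_seconds,  # deliver at least this often...
            "SizeInMBs": size_mb,                   # ...or once this much data accrues
        },
        "CompressionFormat": "GZIP",
    }

def create_stream(name, bucket_arn, role_arn):
    import boto3  # imported lazily; the config builder above needs no AWS access
    firehose = boto3.client("firehose")
    return firehose.create_delivery_stream(
        DeliveryStreamName=name,
        S3DestinationConfiguration=build_s3_destination(bucket_arn, role_arn),
    )
```

Whichever buffering threshold is reached first (the interval or the size) triggers delivery of the batch to the bucket.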
Read more: https://aws.amazon.com/new/reinvent/data-transfer/