It is hard to imagine life today without video: it has become one of the main sources of information, and information delivered through video is among the easiest to digest. Video can now be watched everywhere and on every device. Consumers are becoming more demanding, expecting higher quality and faster services such as streaming. Nobody wants their content interrupted by buffering, skipped frames or a crashed player, so it is important that video plays at equally high quality on every possible device. This is easy to achieve with the help of AWS Elastic Transcoder. Let’s have a look at it. Continue reading “Video transcoding”
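As a rough sketch of what submitting a job to Elastic Transcoder looks like: the pipeline ID, S3 keys and preset ID below are placeholders for illustration, not real resources, and the actual boto3 call is left commented out since it requires AWS credentials and an existing pipeline.

```python
def build_transcode_job(pipeline_id, input_key, output_key, preset_id):
    """Assemble the request body for elastictranscoder.create_job()."""
    return {
        "PipelineId": pipeline_id,
        "Input": {"Key": input_key},  # object key in the pipeline's input S3 bucket
        "Outputs": [{
            "Key": output_key,        # object key for the transcoded file
            "PresetId": preset_id,    # preset chooses resolution, codec and bitrate
        }],
    }

job = build_transcode_job(
    pipeline_id="1111111111111-example",  # hypothetical pipeline ID
    input_key="uploads/talk.mov",
    output_key="transcoded/talk-720p.mp4",
    preset_id="1351620000001-000010",     # assumed ID of a 720p system preset
)

# With credentials configured, the job could then be submitted:
# import boto3
# boto3.client("elastictranscoder").create_job(**job)
print(job["Outputs"][0]["Key"])
```

The point of the preset is that the same input can be transcoded into several outputs (one per target device class) simply by listing more entries under `Outputs`.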
DynamoDB is an Amazon service which differs from other Amazon services by allowing developers to purchase capacity based on throughput rather than storage. Although the database does not scale automatically, administrators can request more throughput, and DynamoDB will spread the data and traffic over a number of servers backed by solid-state drives, delivering predictable performance. It also offers integration with Hadoop via Elastic MapReduce.
Continue reading “Amazon Dynamo DB”
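Since DynamoDB is purchased by throughput, it helps to estimate how many capacity units a workload needs. A minimal sketch of that arithmetic, based on DynamoDB's published rules (one read capacity unit covers one strongly consistent read per second of an item up to 4 KB, one write capacity unit covers one write per second of an item up to 1 KB, and eventually consistent reads cost half):

```python
import math

def required_capacity_units(item_size_kb, reads_per_sec, writes_per_sec,
                            eventually_consistent=False):
    """Estimate provisioned read/write capacity units for a DynamoDB table."""
    # Each read touches ceil(size / 4 KB) read capacity units.
    rcu = math.ceil(item_size_kb / 4) * reads_per_sec
    if eventually_consistent:
        rcu = math.ceil(rcu / 2)  # eventually consistent reads cost half
    # Each write touches ceil(size / 1 KB) write capacity units.
    wcu = math.ceil(item_size_kb / 1) * writes_per_sec
    return rcu, wcu

# e.g. 6 KB items, 100 strongly consistent reads/s, 10 writes/s
rcu, wcu = required_capacity_units(6, 100, 10)
print(rcu, wcu)  # → 200 60
```

The item size and request rates above are made-up numbers for illustration; the formulas themselves are the standard capacity-unit rules.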
Nowadays, more and more companies are starting to implement Real Time Big Data Analytics in their organisations. In fact, it is no longer just a popular trend, but a necessary part of doing business. Here are some examples of how big companies have succeeded with Real Time Big Data Analytics. Continue reading “Real time Big Data Analytics”
There are plenty of use cases where AWS Data Pipeline can save a fortune and speed up business decisions. First of all, it is serverless, which means you do not have to run anything on your own servers if you do not want to: everything runs entirely on the AWS side, and you pay only per execution. Typical use cases include:
- Analysing daily user behaviour by extracting data from logs
- Analysing transactions for a payment system
- Analysing stock exchange reports, and many more
In short, Data Pipeline allows you to spin up the entire infrastructure needed for a Hadoop cluster, run whatever logic you need to process your data, and then shut everything down. The main steps are:
Continue reading “Data Pipeline use cases”
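The spin-up/process/shut-down flow above can be sketched as a minimal pipeline definition. The bucket names, script URI and pipeline ID below are hypothetical placeholders; the object structure follows Data Pipeline's key/value field syntax, and the real `put_pipeline_definition` call is left commented out since it needs credentials and an existing pipeline.

```python
def pipeline_definition(log_uri, emr_step):
    """Minimal pipeline: start an EMR cluster, run one activity, shut down.

    Data Pipeline terminates the EmrCluster resource once the activities
    that run on it have finished, so shutdown is implicit.
    """
    return [
        {"id": "Default", "name": "Default", "fields": [
            {"key": "scheduleType", "stringValue": "ondemand"},
            {"key": "pipelineLogUri", "stringValue": log_uri},
        ]},
        {"id": "MyEmrCluster", "name": "MyEmrCluster", "fields": [
            {"key": "type", "stringValue": "EmrCluster"},
            {"key": "terminateAfter", "stringValue": "2 Hours"},  # safety cap
        ]},
        {"id": "ProcessLogs", "name": "ProcessLogs", "fields": [
            {"key": "type", "stringValue": "EmrActivity"},
            {"key": "runsOn", "refValue": "MyEmrCluster"},  # runs on the cluster above
            {"key": "step", "stringValue": emr_step},       # your processing logic
        ]},
    ]

objects = pipeline_definition("s3://my-logs/datapipeline/",            # hypothetical bucket
                              "s3://my-scripts/process-logs.jar,arg1")  # hypothetical step
# With credentials configured:
# import boto3
# boto3.client("datapipeline").put_pipeline_definition(
#     pipelineId="df-EXAMPLE", pipelineObjects=objects)
print(len(objects))
```

Because `scheduleType` is `ondemand`, nothing runs (and nothing is billed for) until the pipeline is activated, which is what makes the pay-per-execution model work.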
In the good old days, every party connected to the internet that wished to distribute data had to have its own servers. This could blow out budgets and prevent a business from being efficient: the business literally had to take care of infrastructure that most of the time sat doing nothing while remaining extremely expensive. Then AWS came on stage and offered a quite unique kind of service. Now, if a business has a big CPU demand, it can be met through a pay-as-you-go model: AWS rents the capacity to you and lets you process all of your data while paying only a fraction of the cost of owning the infrastructure.
Continue reading “How to start using AWS to improve your business performance.”
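The "fraction of the cost" claim comes down to simple arithmetic: owned hardware costs the same whether it is busy or idle, while on-demand capacity is billed only for the hours it runs. A back-of-the-envelope comparison, using entirely made-up prices for illustration:

```python
def monthly_cost_on_demand(hours_used, hourly_rate):
    """Pay-as-you-go: billed only for the hours actually used."""
    return hours_used * hourly_rate

def monthly_cost_owned(purchase_price, amortisation_months, upkeep_per_month):
    """Owned hardware: amortised purchase price plus fixed upkeep, busy or idle."""
    return purchase_price / amortisation_months + upkeep_per_month

# Hypothetical numbers: a nightly 3-hour batch job (~90 hours/month) on a
# $0.50/hour instance, vs a $6,000 server amortised over 36 months with
# $150/month for power, hosting and maintenance.
cloud = monthly_cost_on_demand(90, 0.50)
owned = monthly_cost_owned(6000, 36, 150)
print(f"cloud ${cloud:.2f}/month vs owned ${owned:.2f}/month")
```

The gap grows as utilisation falls: the lower the duty cycle of the workload, the more of the owned server's fixed cost is wasted on idle time.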