Analysing large-scale data with Apache Hadoop

Apache Hadoop is a Java-based framework for large-scale, distributed batch processing that runs on commodity hardware. Its biggest advantage is the ability to scale out to hundreds or thousands of computers: Hadoop is designed to distribute large amounts of work efficiently across a set of machines and process it in parallel.
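To make the "distribute and process in parallel" idea concrete, here is a minimal sketch of the classic MapReduce word-count job written against Hadoop's org.apache.hadoop.mapreduce API (Hadoop 2.x or later is assumed; the class name, input and output paths are illustrative, not taken from the article). The mapper runs on each node against its local slice of the input, and the reducer aggregates the partial results that the framework shuffles to it.

```java
import java.io.IOException;
import java.util.StringTokenizer;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {

  // Mapper: each node tokenises its own split of the input and emits (word, 1).
  public static class TokenizerMapper
      extends Mapper<Object, Text, Text, IntWritable> {

    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(Object key, Text value, Context context)
        throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer: all counts for the same word are shuffled to one reducer and summed.
  public static class IntSumReducer
      extends Reducer<Text, IntWritable, Text, IntWritable> {

    private final IntWritable result = new IntWritable();

    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) {
        sum += val.get();
      }
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);   // optional local pre-aggregation on each mapper node
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));    // e.g. an HDFS input directory (assumed)
    FileOutputFormat.setOutputPath(job, new Path(args[1]));  // output directory; must not already exist
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```

Packaged into a jar, such a job would typically be launched with something like `hadoop jar wordcount.jar WordCount /input /output`; the framework then splits the input across the cluster and reruns failed tasks on other nodes without any changes to the code.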

[…]

Salil Kalia

Salil has 7 years of experience with various Java-based platforms, including mobile, desktop and web application development. In his recent projects, he has used various bleeding-edge technologies, including Hadoop, with which he has processed thousands of gigabytes of data on the Amazon cloud. He loves to learn the latest technologies. He follows agile methodologies and […]