What do you mean by Map-Reduce programming? MapReduce is a programming model designed for processing large volumes of data in parallel by dividing the work into a set of independent tasks. The MapReduce programming model is inspired by functional languages…
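The model above can be illustrated with a minimal in-process sketch in Python (this simulates the map, shuffle/sort, and reduce phases on one machine; it is not Hadoop itself, and the word-count example is just an illustration):

```python
from itertools import groupby
from operator import itemgetter

def map_phase(documents):
    # Map: independently turn each input record into (key, value) pairs.
    for doc in documents:
        for word in doc.split():
            yield (word, 1)

def reduce_phase(pairs):
    # Shuffle/sort: group the pairs by key, then reduce each group.
    for key, group in groupby(sorted(pairs, key=itemgetter(0)), key=itemgetter(0)):
        yield (key, sum(value for _, value in group))

counts = dict(reduce_phase(map_phase(["big data", "big jobs"])))
# counts == {"big": 2, "data": 1, "jobs": 1}
```

Because each map call and each reduce group is independent, the framework can run them as parallel tasks on different nodes, which is the core of the MapReduce model.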
How do you set up Hadoop on a single node and on multiple nodes?
This answer describes Hadoop setup and configuration on a single node and on multiple nodes in detail. First, download the following software (RPM packages): the Java JDK RPM and the Apache Hadoop 0.20.204.0 RPM. A) Single…
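For a single-node (pseudo-distributed) setup on a 0.20.x-era Hadoop, the core configuration usually amounts to pointing the file system at the local host and setting the HDFS replication factor to 1. A sketch of the two config files; the host name and port here are placeholders:

```xml
<!-- conf/core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

<!-- conf/hdfs-site.xml -->
<configuration>
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>
```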
What is Apache Sqoop and how to use it to import/export data from Hadoop Distributed File System?
Apache Sqoop is a tool for transferring data to and from the Hadoop Distributed File System (HDFS). The Hadoop architecture can process big data and store it in HDFS, but if we want to move that data in or out of Hadoop, we need a tool…
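Sqoop is driven from the command line; a typical import pulls a relational table into an HDFS directory. As a hedged sketch, the helper below builds the argument list for a basic `sqoop import` invocation (the `--connect`, `--table`, and `--target-dir` flags are real Sqoop options; the JDBC URL, table name, and HDFS path are hypothetical examples):

```python
def sqoop_import_args(jdbc_url, table, target_dir):
    # Build the argument list for a basic `sqoop import` call;
    # in practice this list would be passed to subprocess.run().
    return [
        "sqoop", "import",
        "--connect", jdbc_url,       # JDBC connection string for the source RDBMS
        "--table", table,            # table to import
        "--target-dir", target_dir,  # destination directory in HDFS
    ]

# Hypothetical database, table, and HDFS path for illustration only.
args = sqoop_import_args("jdbc:mysql://dbhost/sales", "orders", "/user/hadoop/orders")
```

The corresponding `sqoop export` reverses the direction, reading files from an HDFS directory and writing rows back into a database table.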
What is Hadoop Streaming?
Ans: Hadoop Streaming is a powerful utility that comes with the Hadoop distribution. The basic concept of the Hadoop framework is to split the job, process the pieces in parallel, and then join the results back together to get the end result. So there are two main…
What is Map/Reduce in Hadoop?
Ans: Processing vast amounts of data (multi-terabyte datasets) is a major concern in real-life projects. As the size of data increases day by day, applications find it difficult to process it in a reliable, secure, and fault-tolerant way.…