Apache Hadoop Java Framework by Apache

Apache Hadoop is a framework for running applications on large clusters of commodity hardware. The framework transparently provides applications with both reliability and data motion through a computational paradigm named Map/Reduce, in which an application is divided into many small fragments of work, each of which can be executed or re-executed on any node in the cluster. The framework also provides a distributed file system (HDFS) that stores data on the compute nodes themselves, delivering very high aggregate bandwidth across the cluster.
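
As a rough illustration of the Map/Reduce paradigm described above, the sketch below follows the canonical word-count job from the Hadoop MapReduce tutorial: the map step tokenizes each input fragment independently, and the reduce step sums the counts per word. Class and path names here are illustrative; the input and output locations are assumed to be HDFS paths supplied on the command line.

    import java.io.IOException;
    import java.util.StringTokenizer;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class WordCount {

      // Map step: each input fragment is processed independently, emitting
      // (word, 1) pairs; the framework can run or re-run this on any node.
      public static class TokenizerMapper
          extends Mapper<Object, Text, Text, IntWritable> {

        private final static IntWritable one = new IntWritable(1);
        private final Text word = new Text();

        public void map(Object key, Text value, Context context)
            throws IOException, InterruptedException {
          StringTokenizer itr = new StringTokenizer(value.toString());
          while (itr.hasMoreTokens()) {
            word.set(itr.nextToken());
            context.write(word, one);
          }
        }
      }

      // Reduce step: all counts emitted for a given word are summed.
      public static class IntSumReducer
          extends Reducer<Text, IntWritable, Text, IntWritable> {

        private final IntWritable result = new IntWritable();

        public void reduce(Text key, Iterable<IntWritable> values, Context context)
            throws IOException, InterruptedException {
          int sum = 0;
          for (IntWritable val : values) {
            sum += val.get();
          }
          result.set(sum);
          context.write(key, result);
        }
      }

      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        Job job = Job.getInstance(conf, "word count");
        job.setJarByClass(WordCount.class);
        job.setMapperClass(TokenizerMapper.class);
        job.setCombinerClass(IntSumReducer.class);
        job.setReducerClass(IntSumReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(IntWritable.class);
        // Input and output paths are expected to live on HDFS.
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
      }
    }

A job like this is typically packaged into a jar and submitted to the cluster with a command along the lines of "hadoop jar wordcount.jar WordCount <input dir> <output dir>", where both directories are HDFS paths; the framework then schedules the map and reduce tasks close to the data blocks HDFS has stored on the compute nodes.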