Hadoop Framework
Hadoop is a framework that allows for the distributed processing of large data sets across clusters of computers using simple programming models.
It is designed to scale up from a single server to thousands of machines, each offering local computation and storage.
Rather than relying on hardware to deliver high availability, the library itself is designed to detect and handle failures at the application layer, so it delivers a highly available service on top of a cluster of computers, each of which may be prone to failure.
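The "simple programming models" mentioned above are typified by MapReduce: inputs are transformed into key-value pairs by a map function, grouped by key in a shuffle step, and aggregated by a reduce function. As a rough illustrative sketch only (plain Python standing in for Hadoop's actual Java API, with word count as the canonical example), the model looks like this:

```python
from collections import defaultdict

def map_phase(records, mapper):
    # Apply the mapper to every input record, yielding (key, value) pairs.
    for record in records:
        yield from mapper(record)

def shuffle(pairs):
    # Group intermediate values by key, as the framework does between
    # the map and reduce phases.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups, reducer):
    # Apply the reducer to each key and its grouped values.
    return {key: reducer(key, values) for key, values in groups.items()}

# Word count: emit (word, 1) for each word, then sum the counts per word.
def mapper(line):
    for word in line.split():
        yield word, 1

def reducer(word, counts):
    return sum(counts)

lines = ["hello world", "hello hadoop"]
result = reduce_phase(shuffle(map_phase(lines, mapper)), reducer)
# result == {"hello": 2, "world": 1, "hadoop": 1}
```

In a real Hadoop job the map and reduce functions run in parallel on many machines, and the shuffle moves data across the network; the sketch above only shows the programming model, not the distribution.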