Hadoop

Deploy Spark-Hadoop in less than 30 minutes

Suppose that we want to install an Apache Hadoop cluster with three Hadoop slave nodes. On top of that, we deploy Spark, which can be used to process the data stored on HDFS, and Zeppelin to visualize...
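As a taste of the kind of processing the full post covers, here is a minimal sketch of a Spark job (in Scala) that reads a file from HDFS and counts words. It assumes a running Spark and HDFS cluster; the NameNode address and file path (`hdfs://namenode:9000/data/sample.txt`) are placeholders you would replace with your own.

```scala
import org.apache.spark.sql.SparkSession

object HdfsWordCount {
  def main(args: Array[String]): Unit = {
    // Create (or reuse) a SparkSession; cluster settings come from spark-submit.
    val spark = SparkSession.builder()
      .appName("HdfsWordCount")
      .getOrCreate()

    // Read a text file stored on HDFS and count word occurrences.
    val counts = spark.sparkContext
      .textFile("hdfs://namenode:9000/data/sample.txt")
      .flatMap(_.split("\\s+"))
      .map(word => (word, 1))
      .reduceByKey(_ + _)

    // Print a small sample of the results on the driver.
    counts.take(10).foreach(println)
    spark.stop()
  }
}
```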
