Hadoop is indispensable for processing big data: as necessary to understanding your information as servers are to storing it. In this course, cloud architect Lynn Langit provides a thorough introduction to Hadoop. Find out how to set up a cloud-based Hadoop environment and learn about core components such as JVMs, the HDFS file system, AWS S3 storage, and the overall cluster architecture. Step through the process of setting up and verifying your development environment. Explore ways to use MapReduce with Hadoop and learn how to tune each MapReduce job. Go over scaling VM-based Hadoop clusters and HDFS on GCP Dataproc. Learn how to select appropriate NoSQL and query options for Hadoop, including Hive, HBase, and Pig. Plus, dive into the Apache Spark architecture and learn how to run an Apache Spark job on a Hadoop cluster.
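To give a flavor of that final topic, here is a minimal sketch of a Spark word-count job of the kind that can be submitted to a Hadoop cluster's YARN resource manager; the file paths and application name are illustrative assumptions, not examples taken from the course.

# Minimal PySpark sketch: counts words in a text file stored on HDFS.
# The paths below are placeholders; a real job would be launched with
# spark-submit against the cluster's YARN resource manager.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("wordcount-sketch").getOrCreate()

lines = spark.read.text("hdfs:///user/example/input.txt")    # assumed input path
counts = (
    lines.rdd.flatMap(lambda row: row.value.split())   # split each line into words
    .map(lambda word: (word, 1))                       # pair each word with a count of 1
    .reduceByKey(lambda a, b: a + b)                    # sum the counts per word
)
counts.saveAsTextFile("hdfs:///user/example/wordcount-output")    # assumed output path
spark.stop()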