HADOOP QUIZ DESCRIPTION

What was Hadoop named after?

  • Creator Doug Cutting’s favorite circus act
  • Cutting’s high school rock band
  • The toy elephant of Cutting’s son
  • A sound Cutting’s laptop made during Hadoop development

_______ is the most popular high-level Java API in the Hadoop ecosystem.

  • Scalding
  • HCatalog
  • Cascalog
  • Cascading

Mapper implementations are passed the JobConf for the job via the ________ method.

  • JobConfigure.configure
  • JobConfigurable.configure
  • JobConfigurable.configurable
  • None of the mentioned

Hadoop achieves reliability by replicating the data across multiple hosts and hence does not require ________ storage on hosts.

  • RAID
  • Standard RAID levels
  • ZFS
  • Operating system

________ is the primary interface for a user to describe a MapReduce job to the Hadoop framework for execution.

  • Map Parameters
  • JobConf
  • MemoryConf
  • None of the mentioned

Point out the wrong statement.

  • Reducer has 2 primary phases
  • Increasing the number of reduces increases the framework overhead, but increases load balancing and lowers the cost of failures
  • It is legal to set the number of reduce tasks to zero if no reduction is desired
  • The framework groups Reducer inputs by keys (since different mappers may have output the same key) in the sort stage

Point out the wrong statement.

  • Hadoop’s processing capabilities are huge, and its real advantage lies in the ability to process terabytes and petabytes of data
  • Hadoop uses a programming model called “MapReduce”; all programs should conform to this model in order to work on the Hadoop platform
  • The programming model, MapReduce, used by Hadoop is difficult to write and test
  • All of the mentioned

Point out the correct statement.

  • Hive is not a relational database, but a query engine that supports the parts of SQL specific to querying data
  • Hive is a relational database with SQL support
  • Pig is a relational database with SQL support
  • All of the mentioned

______ is a framework for performing remote procedure calls and data serialization.

  • Drill
  • BigTop
  • Avro
  • Chukwa

What license is Hadoop distributed under?

  • Apache License 2.0
  • Mozilla Public License 2.0
  • Shareware
  • Commercial

Which of the following platforms does Hadoop run on?

  • Bare metal
  • Debian
  • Cross-platform
  • Unix-like

_______ is a platform for constructing data flows for extract, transform, and load (ETL) processing and analysis of large datasets.

  • Pig Latin
  • Oozie
  • Pig
  • Hive

_________ maps input key/value pairs to a set of intermediate key/value pairs.

  • Mapper
  • Reducer
  • Both Mapper and Reducer
  • None of the mentioned
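The Mapper’s role can be illustrated outside Hadoop with a minimal plain-Java sketch (not the Hadoop API — class and method names here are hypothetical): each input value (a line of text) is mapped to intermediate (word, 1) pairs, as in the classic word count.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.ArrayList;
import java.util.List;
import java.util.Map.Entry;

public class MapSketch {
    // Emits one (word, 1) intermediate pair per token, mimicking a word-count Mapper.
    static List<Entry<String, Integer>> map(String value) {
        List<Entry<String, Integer>> out = new ArrayList<>();
        for (String word : value.toLowerCase().split("\\s+")) {
            if (!word.isEmpty()) out.add(new SimpleEntry<>(word, 1));
        }
        return out;
    }

    public static void main(String[] args) {
        System.out.println(map("to be or not to be"));
        // prints [to=1, be=1, or=1, not=1, to=1, be=1]
    }
}
```

In real Hadoop, the framework calls the Mapper once per input key/value pair and collects these intermediate pairs for the shuffle and sort phases.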

Which of the following phases occur simultaneously?

  • Shuffle and Sort
  • Reduce and Sort
  • Shuffle and Map
  • All of the mentioned

Point out the correct statement.

  • Hadoop does need specialized hardware to process the data
  • Hadoop 2.0 allows live stream processing of real-time data
  • In the Hadoop programming framework, output files are divided into lines or records
  • None of the mentioned

The output of the _______ is not sorted in the MapReduce framework for Hadoop.

  • Mapper
  • Cascader
  • Scalding
  • None of the mentioned

Mapper and Reducer implementations can use the ________ to report progress or just indicate that they are alive.

  • Partitioner
  • OutputCollector
  • Reporter
  • All of the mentioned

IBM and ________ have announced a major initiative to use Hadoop to support university courses in distributed computer programming.

  • Google Latitude
  • Android (operating system)
  • Google Variations
  • Google

According to analysts, for what can traditional IT systems provide a foundation when they’re integrated with big data technologies like Hadoop?

  • Big data management and data mining
  • Data warehousing and business intelligence
  • Management of Hadoop clusters
  • Collecting and storing unstructured data

As companies move past the experimental phase with Hadoop, many cite the need for additional capabilities, including _______________

  • Improved data storage and information retrieval
  • Improved extract, transform and load features for data integration
  • Improved data warehousing functionality
  • Improved security, workload management, and SQL support

The _______ function is responsible for consolidating the results produced by each of the Map() functions/tasks.

  • Reduce
  • Map
  • Reducer
  • All of the mentioned
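That consolidation step can be sketched in plain Java (again, not the Hadoop API — the names are hypothetical): the framework groups the intermediate pairs by key, and the reduce step sums each group.

```java
import java.util.AbstractMap.SimpleEntry;
import java.util.List;
import java.util.Map;
import java.util.Map.Entry;
import java.util.TreeMap;

public class ReduceSketch {
    // Consolidates intermediate (word, 1) pairs the way a word-count reduce would:
    // values are grouped by key, then each group is summed.
    static Map<String, Integer> reduce(List<Entry<String, Integer>> intermediate) {
        Map<String, Integer> counts = new TreeMap<>();
        for (Entry<String, Integer> pair : intermediate) {
            counts.merge(pair.getKey(), pair.getValue(), Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Entry<String, Integer>> pairs = List.of(
                new SimpleEntry<>("be", 1),
                new SimpleEntry<>("to", 1),
                new SimpleEntry<>("be", 1));
        System.out.println(reduce(pairs)); // prints {be=2, to=1}
    }
}
```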

The right number of reduces seems to be ____________

  • 0.90
  • 0.80
  • 0.36
  • 0.95
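For context on those factors: the Hadoop MapReduce tutorial suggests setting the number of reduces to 0.95 or 1.75 multiplied by (number of nodes × maximum reduce slots per node). A small arithmetic sketch, using hypothetical cluster figures:

```java
public class ReduceCount {
    // Suggested reduce count per the Hadoop tutorial heuristic:
    // factor (0.95 or 1.75) * nodes * max reduce slots per node.
    static int suggestedReduces(double factor, int nodes, int reduceSlotsPerNode) {
        return (int) (factor * nodes * reduceSlotsPerNode);
    }

    public static void main(String[] args) {
        // Assumed cluster: 10 nodes, 2 reduce slots each.
        System.out.println(suggestedReduces(0.95, 10, 2)); // prints 19
        System.out.println(suggestedReduces(1.75, 10, 2)); // prints 35
    }
}
```

With 0.95 all reduces launch in a single wave; with 1.75 faster nodes finish a first wave and start a second, improving load balancing.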

Point out the correct statement.

  • Applications can use the Reporter to report progress
  • The Hadoop MapReduce framework spawns one map task for each InputSplit generated by the InputFormat for the job
  • The intermediate, sorted outputs are always stored in a simple (key-len, key, value-len, value) format
  • All of the mentioned

Point out the wrong statement.

  • Elastic MapReduce (EMR) is Facebook’s packaged Hadoop offering
  • Amazon Web Service Elastic MapReduce (EMR) is Amazon’s packaged Hadoop offering
  • Scalding is a Scala API on top of Cascading that removes most Java boilerplate
  • All of the mentioned

The __________ part of MapReduce is responsible for processing one or more chunks of data and producing the output results.

  • Maptask
  • Mapper
  • Task execution
  • All of the mentioned

Point out the correct statement.

  • Hadoop is an ideal environment for extracting and transforming small volumes of data
  • Hadoop stores data in HDFS and supports data compression/decompression
  • The Giraph framework is less useful than a MapReduce job for solving graph and machine learning problems
  • None of the mentioned

The Pig Latin scripting language is not only a higher-level data flow language but also has operators similar to ____________

  • SQL
  • JSON
  • XML
  • All of the mentioned

The number of maps is usually driven by the total size of ____________

  • outputs
  • inputs
  • tasks
  • None of the mentioned
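Concretely, the number of maps roughly equals the total input size divided by the HDFS block size, since each block typically becomes one InputSplit. A sketch with assumed figures (the method name and block size are illustrative, not Hadoop API):

```java
public class MapCount {
    // Estimated map-task count: total input bytes divided by the HDFS
    // block size, rounded up (ceiling division).
    static long estimatedMaps(long totalInputBytes, long blockSizeBytes) {
        return (totalInputBytes + blockSizeBytes - 1) / blockSizeBytes;
    }

    public static void main(String[] args) {
        long tenGb = 10L * 1024 * 1024 * 1024;   // assumed input size: 10 GB
        long block = 128L * 1024 * 1024;         // assumed HDFS block size: 128 MB
        System.out.println(estimatedMaps(tenGb, block)); // prints 80
    }
}
```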

Point out the wrong statement.

  • A MapReduce job usually splits the input data-set into independent chunks which are processed by the map tasks in a completely parallel manner
  • The MapReduce framework operates exclusively on <key, value> pairs
  • Applications typically implement the Mapper and Reducer interfaces to provide the map and reduce methods
  • None of the mentioned

Which of the following genres does Hadoop produce?

  • Distributed file system
  • JAX-RS
  • Java Message Service
  • Relational Database Management System