Apache Flink MCQs
Apache Flink MCQs: This section contains multiple-choice questions (MCQs) on Apache Flink. Each question includes the correct answer and an explanation. These MCQs will help students and professionals test their skills and enhance their knowledge of Apache Flink.
List of Apache Flink MCQs
1. Which of the following is true about Apache Flink?
- Apache Flink is a real-time processing framework.
- Apache Flink is a stream-processing and batch-processing framework.
- Both
- None
Answer: C) Both
Explanation:
Apache Flink is a real-time processing framework, which can process both stream data and batch data.
2. Apache Flink works on which of the following architectures?
- Kappa
- Event-driven architecture
- Peer-to-peer architecture
- Lambda architecture
Answer: A) Kappa
Explanation:
Apache Flink works on the Kappa architecture.
3. In the Kappa architecture, data is ingested as ____.
- Batches
- Continuous stream
- Both
- None
Answer: B) Continuous stream
Explanation:
In the Kappa architecture, data is consumed as a continuous stream instead of being processed in batches.
4. The Kappa architecture consists of how many main components?
- 5
- 4
- 3
- 2
Answer: C) 3
Explanation:
The architecture consists of three main components:
- Ingestion layer
- Processing layer
- Storage layer
5. Is Apache Flink scalable?
- Yes
- No
Answer: A) Yes
Explanation:
Apache Flink is extremely scalable, with applications spread across several containers in a cluster.
6. What is a bounded stream?
- Bounded streams are datasets with a specified beginning and end point.
- Bounded streams are datasets with a specified start but no defined finish.
Answer: A) Bounded streams are datasets with a specified beginning and end point.
Explanation:
Bounded streams are datasets with a specified beginning and end point.
7. Which of the following is considered a building block of streaming applications?
- Stream
- State
- Time
- All of the above
Answer: D) All of the above
Explanation:
There are three building blocks of streaming applications: streams, state, and time.
8. What is an unbounded stream?
- Unbounded streams are datasets with a specified beginning and end point.
- Unbounded streams are datasets with a specified start but no defined finish.
Answer: B) Unbounded streams are datasets with a specified start but no defined finish.
Explanation:
Unbounded streams are datasets with a specified start but no defined finish.
9. Which of the following APIs handles data in a continuous stream?
- DataSet API
- DataStream API
- Table API
- CEP API
Answer: B) DataStream API
Explanation:
DataStream API handles data in a continuous stream.
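For reference, here is a minimal, hedged sketch of a DataStream job in Java. The input values and job name are illustrative; a real pipeline would read from a continuous source such as Kafka.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class DataStreamSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // A small in-memory source stands in for a real continuous source such as Kafka.
        DataStream<String> lines = env.fromElements("flink", "processes", "streams");

        lines.print(); // print() is a simple built-in sink, handy for demos

        env.execute("DataStream API sketch");
    }
}
```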
10. Do the Java and Scala programming languages support the DataStream API?
- Yes
- No
Answer: A) Yes
Explanation:
Yes, both Java and Scala support the DataStream API.
11. Which of the following APIs is used to perform batch operations on data collected over time?
- DataSet API
- DataStream API
- Table API
- CEP API
Answer: A) DataSet API
Explanation:
The Apache Flink DataSet API is used to perform batch operations on data collected over time.
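A minimal sketch of a batch job with the DataSet API in Java (illustrative values). Note that in recent Flink releases the DataSet API is deprecated in favor of batch execution on the DataStream and Table APIs.

```java
import org.apache.flink.api.java.DataSet;
import org.apache.flink.api.java.ExecutionEnvironment;

public class DataSetSketch {
    public static void main(String[] args) throws Exception {
        ExecutionEnvironment env = ExecutionEnvironment.getExecutionEnvironment();

        // A bounded, in-memory dataset; real jobs would typically read files, JDBC sources, etc.
        DataSet<Integer> numbers = env.fromElements(1, 2, 3, 4);

        numbers.map(n -> n * 2).print(); // print() also triggers execution for DataSet jobs
    }
}
```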
12. Which of the following is a relational API with an SQL-like expression language?
- Database API
- Relation API
- Table API
- SQL-like API
Answer: C) Table API
Explanation:
The Table API is a relational API with an SQL-like expression language.
13. Can the Table API do both batch and stream processing?
- Yes
- No
Answer: A) Yes
Explanation:
Yes, the Table API can do both batch and stream processing.
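A hedged sketch of the Table API in Java. The column names and values are illustrative, and the program assumes the Flink table planner dependencies are on the classpath; switching EnvironmentSettings.inStreamingMode() to inBatchMode() gives the same program batch semantics.

```java
import org.apache.flink.table.api.EnvironmentSettings;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.TableEnvironment;

import static org.apache.flink.table.api.Expressions.$;
import static org.apache.flink.table.api.Expressions.row;

public class TableApiSketch {
    public static void main(String[] args) {
        TableEnvironment tEnv = TableEnvironment.create(EnvironmentSettings.inStreamingMode());

        // Illustrative in-memory table with two columns.
        Table orders = tEnv.fromValues(row("books", 10), row("games", 25), row("books", 5))
                           .as("category", "amount");

        // SQL-like, composable relational expressions.
        Table result = orders
                .filter($("amount").isGreater(7))
                .select($("category"), $("amount"));

        result.execute().print();
    }
}
```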
14. What do you mean by CEP?
- Complex engine processing
- Complex event processing
- Clear engine processing
- Complex engine program
Answer: B) Complex event processing
Explanation:
CEP stands for Complex event processing.
15. Which of the following APIs is used for analyzing event patterns in continuous streaming data?
- DataSet API
- Stream API
- Gelly API
- CEP API
Answer: D) CEP API
Explanation:
The CEP API is used for analyzing event patterns in continuous streaming data.
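A hedged sketch of Flink CEP in Java (it assumes the flink-cep dependency; the event values and the pattern itself are illustrative). It matches two consecutive "error" strings arriving within ten seconds and prints the match.

```java
import java.util.List;
import java.util.Map;

import org.apache.flink.cep.CEP;
import org.apache.flink.cep.PatternSelectFunction;
import org.apache.flink.cep.PatternStream;
import org.apache.flink.cep.pattern.Pattern;
import org.apache.flink.cep.pattern.conditions.SimpleCondition;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.windowing.time.Time;

public class CepSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> events = env.fromElements("ok", "error", "error", "ok");

        // Pattern: two consecutive "error" events within 10 seconds.
        Pattern<String, ?> pattern = Pattern.<String>begin("errors")
                .where(new SimpleCondition<String>() {
                    @Override
                    public boolean filter(String value) {
                        return value.equals("error");
                    }
                })
                .times(2).consecutive()
                .within(Time.seconds(10));

        // Processing time keeps the demo self-contained (no watermarks needed).
        PatternStream<String> matches = CEP.pattern(events, pattern).inProcessingTime();

        matches.select(new PatternSelectFunction<String, String>() {
            @Override
            public String select(Map<String, List<String>> match) {
                return "matched: " + match.get("errors");
            }
        }).print();

        env.execute("CEP sketch");
    }
}
```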
16. Which of the following is Apache Flink's Graph API?
- Graph QL API
- Graphy API
- Gelly API
- All
Answer: C) Gelly API
Explanation:
Gelly is Apache Flink's Graph API. It is a set of methods and utilities used to perform graph analysis in Flink applications.
17. Apache Flink's Machine Learning library is called ____.
- Apache ML
- Apache machine
- Flink Learning
- FlinkML
Answer: D) FlinkML
Explanation:
Apache Flink's Machine Learning library is called FlinkML.
18. Which of the following processes makes use of a bounded data stream?
- Stream
- Batch
Answer: B) Batch
Explanation:
Batch processing makes use of a bounded data stream.
19. How many notions of time does Flink support?
- 5
- 4
- 3
- 2
Answer: C) 3
Explanation:
Flink explicitly supports three different notions of time:
- event time
- ingestion time
- processing time
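A hedged Java sketch of working with event time (the tuple layout and the 5-second out-of-orderness bound are illustrative). Ingestion time and processing time instead take timestamps from the clock at the source or at the processing operator.

```java
import java.time.Duration;

import org.apache.flink.api.common.eventtime.WatermarkStrategy;
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class EventTimeSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        // Each element is (value, event-time timestamp in milliseconds).
        DataStream<Tuple2<String, Long>> events = env.fromElements(
                Tuple2.of("a", 1_000L), Tuple2.of("b", 2_000L), Tuple2.of("c", 1_500L));

        // Event time: timestamps come from the data itself; watermarks tolerate
        // up to 5 seconds of out-of-order records.
        DataStream<Tuple2<String, Long>> withEventTime = events.assignTimestampsAndWatermarks(
                WatermarkStrategy.<Tuple2<String, Long>>forBoundedOutOfOrderness(Duration.ofSeconds(5))
                        .withTimestampAssigner((event, previousTimestamp) -> event.f1));

        withEventTime.print();
        env.execute("Event time sketch");
    }
}
```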
20. Which of the following methods is used to convert one data stream into another by applying a function to each stream element?
- Stream()
- Transform()
- Filter()
- Map()
Answer: D) Map()
Explanation:
The map() method is used to convert one data stream into another by applying a function to each stream element.
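A minimal Java sketch of map() (the input values are illustrative), mapping each word to its length:

```java
import org.apache.flink.api.common.functions.MapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class MapSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> words = env.fromElements("flink", "map", "example");

        // map(): exactly one output element per input element.
        DataStream<Integer> lengths = words.map(new MapFunction<String, Integer>() {
            @Override
            public Integer map(String value) {
                return value.length();
            }
        });

        lengths.print();
        env.execute("map() sketch");
    }
}
```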
21. Which of the following functions removes unwanted elements from a data stream based on a criterion?
- Clear()
- Filter()
- PassBy()
- Intersect()
Answer: B) Filter()
Explanation:
The filter() function removes unwanted elements from a data stream based on a criterion.
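A minimal Java sketch of filter() (illustrative values), keeping only the even numbers:

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class FilterSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<Integer> numbers = env.fromElements(1, 2, 3, 4, 5, 6);

        // filter(): keeps only the elements for which the predicate returns true.
        DataStream<Integer> evens = numbers.filter(n -> n % 2 == 0);

        evens.print();
        env.execute("filter() sketch");
    }
}
```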
22. Which of the following architectural components of Apache Flink is in charge of accepting the program code and creating a job dataflow graph, which is then passed to the JobManager?
- Client
- TaskManager
- ConfigManager
Answer: A) Client
Explanation:
The Client component is in charge of accepting the program code and creating a job dataflow graph, which is then passed to the JobManager.
23. Which of the following methods is used to aggregate data in a stream by merging several elements into a single element based on a supplied function?
- Minimise()
- Lower()
- Dec()
- Reduce()
Answer: D) Reduce()
Explanation:
The reduce() method is used to aggregate data in a stream by merging several elements into a single element based on a supplied function.
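A hedged Java sketch of reduce() (the tuple layout is illustrative). In the DataStream API, reduce() is applied to a keyed stream; here it sums counts per word:

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class ReduceSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Tuple2<String, Integer>> counts = env.fromElements(
                Tuple2.of("a", 1), Tuple2.of("b", 1), Tuple2.of("a", 1));

        // reduce(): merges elements of the same key into a single, rolling result.
        counts.keyBy(t -> t.f0)
              .reduce((t1, t2) -> Tuple2.of(t1.f0, t1.f1 + t2.f1))
              .print();

        env.execute("reduce() sketch");
    }
}
```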
24. Each element of a data stream is transformed into zero, one, or many items of another stream using the ____ function.
- Map()
- Element()
- flatMap()
- StreamMap()
Answer: C) flatMap()
Explanation:
Each element of a data stream is transformed into zero, one, or many items of another stream using the flatMap() function.
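A minimal Java sketch of flatMap() (illustrative input), splitting each line into zero or more words:

```java
import org.apache.flink.api.common.functions.FlatMapFunction;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.util.Collector;

public class FlatMapSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> lines = env.fromElements("hello flink", "flat map");

        // flatMap(): zero, one, or many output elements per input element.
        DataStream<String> words = lines.flatMap(new FlatMapFunction<String, String>() {
            @Override
            public void flatMap(String line, Collector<String> out) {
                for (String word : line.split(" ")) {
                    out.collect(word);
                }
            }
        });

        words.print();
        env.execute("flatMap() sketch");
    }
}
```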
25. Which of the following architectural components of Apache Flink is responsible for building the execution graph after receiving the job dataflow graph from the Client?
- JobManager
- TaskManager
- Preschedule
Answer: A) JobManager
Explanation:
JobManager is responsible for building the execution graph after receiving the Job Dataflow Graph from the Client.
26. Which method writes data to an external system, such as a database or file system?
- Write()
- DB_Write()
- Sink()
- Faucet()
Answer: C) Sink()
Explanation:
Sink() writes data to an external system, such as a database or file system.
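In the DataStream API, a sink is attached to a stream with a method such as addSink(); a hedged Java sketch follows, using the built-in print sink as a stand-in for an external system (a real job would use a database, file system, or Kafka sink connector instead):

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.sink.PrintSinkFunction;

public class SinkSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> results = env.fromElements("write", "me", "out");

        // A sink is where the stream leaves Flink; the print sink stands in
        // for an external system such as a database or file system.
        results.addSink(new PrintSinkFunction<>());

        env.execute("sink sketch");
    }
}
```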
27. Which of the following methods is used to build custom data processing logic on a stream?
- Stream()
- Process()
- Handler()
- Transform()
Answer: B) Process()
Explanation:
To build custom data processing logic on a stream, use the process() function.
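A hedged Java sketch of process() with a ProcessFunction (the filtering logic is illustrative). ProcessFunction gives low-level, per-element access, and also exposes timestamps, state, and timers when the stream is keyed.

```java
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.ProcessFunction;
import org.apache.flink.util.Collector;

public class ProcessSketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();
        DataStream<String> input = env.fromElements("keep", "drop", "keep");

        // process(): arbitrary per-element logic with access to the runtime context.
        DataStream<String> kept = input.process(new ProcessFunction<String, String>() {
            @Override
            public void processElement(String value, Context ctx, Collector<String> out) {
                if (!value.equals("drop")) {
                    out.collect(value.toUpperCase());
                }
            }
        });

        kept.print();
        env.execute("process() sketch");
    }
}
```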
28. Apache Flink is written in which of the following languages?
- Java
- Scala
- Both
- None
Answer: C) Both
Explanation:
Apache Flink is written in Java and Scala.
29. The TaskManager is in charge of carrying out the dataflow task and reporting the results to the ____.
- Client
- JobManager
- Program
Answer: B) JobManager
Explanation:
The TaskManager is in charge of carrying out the dataflow task and reporting the results to the JobManager.
30. The ____ function partitions a data stream based on a given key or key field.
- Key()
- KeyStream()
- KeyBy()
- KeyField()
Answer: C) KeyBy()
Explanation:
The keyBy() function partitions a data stream based on a given key or key field.
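A minimal Java sketch of keyBy() (illustrative tuples), partitioning the stream by the word field and summing the counts per key:

```java
import org.apache.flink.api.java.tuple.Tuple2;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;

public class KeyBySketch {
    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

        DataStream<Tuple2<String, Integer>> words = env.fromElements(
                Tuple2.of("a", 1), Tuple2.of("b", 1), Tuple2.of("a", 1));

        // keyBy(): logically partitions the stream so that all elements with the
        // same key are processed together; sum(1) then aggregates tuple field 1.
        words.keyBy(t -> t.f0)
             .sum(1)
             .print();

        env.execute("keyBy() sketch");
    }
}
```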