Confluent CCDAK - Confluent Certified Developer for Apache Kafka Exam

Question #1 (Topic: Exam A)
You need to consume messages from Kafka using the command-line interface (CLI).
Which command should you use?
A. kafka-console-consumer
B. kafka-consumer
C. kafka-get-message
D. kafka-consume
Answer: A
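For reference, a minimal console-consumer invocation; the broker address and topic name below are placeholders (assuming a local broker on port 9092 and a topic named my-topic):

kafka-console-consumer --bootstrap-server localhost:9092 --topic my-topic --from-beginning

The --from-beginning flag replays the topic from the start of the log rather than only reading new messages.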
Question #2 (Topic: Exam A)
You are experiencing low throughput from a Java producer. You monitor the Kafka metrics and notice that the producer's metrics show a low I/O thread ratio and a low I/O thread wait ratio.
What is the most likely cause of the slow producer performance?
A. Compression is enabled.
B. The producer is sending large batches of messages.
C. There is a bad data link layer (layer 2) connection from the producer to the cluster.
D. The producer code has an expensive callback function.
Answer: D
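The symptom points to option D because the Callback passed to KafkaProducer.send() runs on the producer's single background I/O (sender) thread. A minimal Java sketch, with the broker address and topic name as placeholder assumptions:

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.Properties;

public class CallbackCostSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("orders", "key", "value"), (metadata, exception) -> {
                // Callbacks execute on the producer's I/O thread, so keep them cheap:
                // log, count, or hand the work off to another executor.
                if (exception != null) {
                    exception.printStackTrace();
                }
                // Anti-pattern: blocking I/O or heavy computation here keeps the sender
                // thread busy, which shows up as low io-ratio and low io-wait-ratio.
            });
        }
    }
}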
Question #3 (Topic: Exam A)
Which configuration is valid for deploying a JDBC Source Connector that reads all rows from the orders table and writes them to the db1-orders topic?
A. {
"name":"jdbc-source",
"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
"tasks.max": "1",
"connection.url": "jdbc:mysql://mysql:3306/db1?user=user&password=password&useSSL=false",
"topic.prefix": "db1-",
"table.whitelist": "orders"
}
B. {
"name":"db1-orders".
"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
"tasks.max": "1",
"connection.url": "jdbc:mysql://mysql:3306/db1?user=user&password=password&useSSL=false",
"topic.prefix": "db1-",
"table.blacklist": "ord*"
}
C. {
"name":"orders-connect",
"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
"tasks.max": "1",
"connection.url": "jdbc:mysql://mysql:3306/db1",
"topic.whitelist": "orders",
"auto.create": "true"
}
D. {
"name":"jdbc-source",
"connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
"tasks.max": "1",
"connection.url": "jdbc:mysql://mysql:3306/db1?user=user&useAutoAuthentication=true", "topic.prefix": "db1-",
"table.whitelist": "orders"
}
Answer: A
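Option A is valid because table.whitelist limits the source to the orders table and topic.prefix "db1-" makes the connector write to a topic named db1-orders. As a usage sketch (assuming a Connect worker on localhost with its default REST port 8083, and option A saved as jdbc-source.json), this flat key/value config could be submitted to the Connect REST API; note that when the "name" key is present in the body it must match the connector name in the URL:

curl -X PUT -H "Content-Type: application/json" \
     --data @jdbc-source.json \
     http://localhost:8083/connectors/jdbc-source/config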
Question #4 (Topic: Exam A)
You are developing a Kafka Streams application with a complex topology that has multiple sources, processors, sinks, and sub-topologies. You are working in a development environment and do not have access to a real Kafka cluster or topic.
You need to unit test your Kafka Streams application.
Which should you use?
A. TopologyTestDriver
B. MockProducer, MockConsumer
C. TestProducer, TestConsumer
D. KafkaUnitTestDriver
Answer: A
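TopologyTestDriver (from the kafka-streams-test-utils artifact) feeds records through the full topology in-process, with no broker at all. A minimal sketch; the topology, topic names, and values below are invented for illustration:

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.TestInputTopic;
import org.apache.kafka.streams.TestOutputTopic;
import org.apache.kafka.streams.TopologyTestDriver;
import java.util.Properties;

public class UppercaseTopologyTest {
    public static void main(String[] args) {
        // Topology under test: copy "input-topic" to "output-topic", uppercasing values.
        StreamsBuilder builder = new StreamsBuilder();
        builder.<String, String>stream("input-topic")
               .mapValues(v -> v.toUpperCase())
               .to("output-topic");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "unit-test-app");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "dummy:1234"); // never contacted
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        try (TopologyTestDriver driver = new TopologyTestDriver(builder.build(), props)) {
            TestInputTopic<String, String> input =
                driver.createInputTopic("input-topic", new StringSerializer(), new StringSerializer());
            TestOutputTopic<String, String> output =
                driver.createOutputTopic("output-topic", new StringDeserializer(), new StringDeserializer());

            input.pipeInput("key", "hello");
            System.out.println(output.readValue()); // prints HELLO
        }
    }
}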
Question #5 (Topic: Exam A)
Which two statements are correct about transactions in Kafka? (Choose two.)
A. All messages from a failed transaction will be deleted from a Kafka topic.
B. Transactions are only possible when writing messages to a topic with a single partition.
C. Consumers can consume both committed and uncommitted transactions.
D. Information about producers and their transactions is stored in the '__transaction_state' topic.
E. Transactions guarantee at least once delivery of messages.
Answer: CD
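For context on D, the broker tracks each transactional producer and the state of its transactions in the internal __transaction_state topic, keyed by the producer's transactional.id. C holds because the consumer's default isolation.level is read_uncommitted; only consumers configured with read_committed filter out records from aborted or still-open transactions. A minimal transactional-producer sketch in Java (broker address, topic names, and transactional.id are placeholders):

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.KafkaException;
import org.apache.kafka.common.serialization.StringSerializer;
import java.util.Properties;

public class TransactionalProducerSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // assumed broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
        // Registering a transactional.id is what creates an entry in __transaction_state.
        props.put(ProducerConfig.TRANSACTIONAL_ID_CONFIG, "orders-tx-1"); // assumed id

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.initTransactions();
            try {
                producer.beginTransaction();
                // A transaction can span multiple topics and partitions (contradicting B).
                producer.send(new ProducerRecord<>("orders", "k1", "order-created"));
                producer.send(new ProducerRecord<>("payments", "k1", "payment-requested"));
                producer.commitTransaction();
            } catch (KafkaException e) {
                // Aborted records remain in the log (contradicting A) but are skipped by
                // consumers using isolation.level=read_committed. A fenced producer would
                // need to be closed instead; error handling is simplified for brevity.
                producer.abortTransaction();
            }
        }
    }
}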