The Connect Service is part of the Confluent Platform and comes with the platform's distribution along with Apache Kafka. A Kafka Connect cluster supports running and scaling out connectors (components that read from and/or write to external systems). The JDBC driver itself can be downloaded directly from Maven, and this can be done as part of the container's startup.

The JDBC source connector works with multiple data sources in the database: tables, views, or a custom query. It seems, though, that `table.whitelist` and `topic.prefix` are tightly coupled: the connector derives each output topic name by appending the table name to the prefix, so you cannot map multiple tables to arbitrary topic names. On the sink side, auto-creation of tables and limited auto-evolution are also supported, and the JDBC sink can operate in upsert mode, exchanging an INSERT for an UPDATE when a row with the same key already exists.

The sink also expects a flat record structure. To handle arrays and nested structures, one approach is to run a Kafka Streams topology beforehand to "flatten" out the schema, then use this "simple" schema as input for the Kafka JDBC sink connector.

Note: most Apache Kafka™ systems store all messages in the same format, and Kafka Connect workers are configured with a single converter class each for keys and values.

The example below does two things:

- Use Kafka Connect to read data from a Postgres DB source that has multiple tables into distinct Kafka topics.
- Use Kafka Connect to write that PG data to a sink (we'll use the file sink in this example).

Setup:

```bash
mkdir kafka-connect-source-example
cd kafka-connect-source-example/
mkdir data
touch data/data.txt
touch docker-compose.yml
```
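Next, fill in docker-compose.yml. The following is a minimal single-broker sketch, not a production layout: the image tags, connector plugin version, and Postgres driver version/URL are all illustrative, so pin whatever you actually use. The Connect container installs the JDBC connector plugin and then pulls the Postgres JDBC driver straight from Maven Central as part of startup; the worker-wide converter classes are set here as well:

```yaml
version: "3"
services:
  zookeeper:
    image: confluentinc/cp-zookeeper:7.5.0
    environment:
      ZOOKEEPER_CLIENT_PORT: 2181

  kafka:
    image: confluentinc/cp-kafka:7.5.0
    depends_on: [zookeeper]
    environment:
      KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092
      KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1

  postgres:
    image: postgres:14
    environment:
      POSTGRES_USER: postgres
      POSTGRES_PASSWORD: postgres
      POSTGRES_DB: mydb

  connect:
    image: confluentinc/cp-kafka-connect:7.5.0
    depends_on: [kafka, postgres]
    ports:
      - "8083:8083"
    volumes:
      - ./data:/data                     # the file sink writes here
    environment:
      CONNECT_BOOTSTRAP_SERVERS: kafka:9092
      CONNECT_REST_ADVERTISED_HOST_NAME: connect
      CONNECT_GROUP_ID: connect-cluster
      CONNECT_CONFIG_STORAGE_TOPIC: _connect-configs
      CONNECT_OFFSET_STORAGE_TOPIC: _connect-offsets
      CONNECT_STATUS_STORAGE_TOPIC: _connect-status
      CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR: 1
      CONNECT_STATUS_STORAGE_REPLICATION_FACTOR: 1
      # a single converter class each for keys and values, worker-wide
      CONNECT_KEY_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_VALUE_CONVERTER: org.apache.kafka.connect.json.JsonConverter
      CONNECT_PLUGIN_PATH: /usr/share/java,/usr/share/confluent-hub-components
    command:
      - bash
      - -c
      - |
        # install the JDBC connector plugin, then fetch the Postgres JDBC
        # driver directly from Maven Central during container startup
        confluent-hub install --no-prompt confluentinc/kafka-connect-jdbc:10.7.6
        curl -fsSL https://repo1.maven.org/maven2/org/postgresql/postgresql/42.7.3/postgresql-42.7.3.jar \
          -o /usr/share/confluent-hub-components/confluentinc-kafka-connect-jdbc/lib/postgresql-42.7.3.jar
        /etc/confluent/docker/run
```

Run `docker-compose up -d`, then `curl http://localhost:8083/connector-plugins` to confirm the JDBC connector is available.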
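With the stack up, register the source connector. This config is a sketch; the connection details are placeholders for whatever database you point it at:

```json
{
  "name": "pg-source",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "table.whitelist": "customers,orders",
    "mode": "incrementing",
    "incrementing.column.name": "id",
    "topic.prefix": "postgres-"
  }
}
```

Submit it with `curl -X POST -H "Content-Type: application/json" --data @pg-source.json http://localhost:8083/connectors`. The coupling mentioned above is visible here: with this prefix and whitelist, rows land in `postgres-customers` and `postgres-orders`, with no per-table topic mapping. If you use a custom `query` instead of `table.whitelist`, `topic.prefix` is instead used as the full name of the single output topic.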
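For the sink half of the example we use the FileStreamSink connector that ships with Apache Kafka. It is intended for demos (and, depending on the Connect image, may need to be added to `plugin.path`); the `/data/data.txt` path assumes the `./data` volume mount from the compose file above:

```json
{
  "name": "file-sink",
  "config": {
    "connector.class": "org.apache.kafka.connect.file.FileStreamSinkConnector",
    "topics": "postgres-customers,postgres-orders",
    "file": "/data/data.txt"
  }
}
```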
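When the source rows carry arrays or nested structures that the JDBC sink cannot map to columns, the flattening step described above sits between the source topic and the sink. The sketch below assumes JSON string values with a nested `address` object; the topic names, field name, and `address_` prefix are all illustrative:

```java
import java.util.Properties;

import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class FlattenTopology {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // read the nested records, flatten them, and write them to a new
        // topic the JDBC sink connector can consume with a "simple" schema
        KStream<String, String> nested = builder.stream("postgres-customers");
        nested.mapValues(FlattenTopology::flatten).to("postgres-customers-flat");

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "flatten-customers");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        new KafkaStreams(builder.build(), props).start();
    }

    // lift the fields of a nested "address" object up to the top level,
    // e.g. {"id":1,"address":{"city":"Oslo"}} -> {"id":1,"address_city":"Oslo"}
    private static String flatten(String json) {
        try {
            ObjectNode root = (ObjectNode) MAPPER.readTree(json);
            JsonNode address = root.remove("address");
            if (address != null && address.isObject()) {
                address.fields().forEachRemaining(field ->
                        root.set("address_" + field.getKey(), field.getValue()));
            }
            return MAPPER.writeValueAsString(root);
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

For nested structs without arrays, the built-in `Flatten` SMT (`org.apache.kafka.connect.transforms.Flatten$Value`) configured on the sink may be enough; it does not handle arrays, which is why a Streams pre-processing step comes up for arrays and nested arrays.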
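Finally, to land the flattened data in a relational target instead of a file, the same registration pattern applies with the JDBC sink connector. This sketch shows upsert mode together with the auto-create/auto-evolve options mentioned earlier; it assumes keyed records whose key carries an `id` field:

```json
{
  "name": "pg-sink",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSinkConnector",
    "connection.url": "jdbc:postgresql://postgres:5432/mydb",
    "connection.user": "postgres",
    "connection.password": "postgres",
    "topics": "postgres-customers-flat",
    "insert.mode": "upsert",
    "pk.mode": "record_key",
    "pk.fields": "id",
    "auto.create": "true",
    "auto.evolve": "true"
  }
}
```

With `insert.mode` set to `upsert`, a record whose key already exists in the table is exchanged for an UPDATE rather than failing as a duplicate INSERT, while `auto.create` builds the destination table from the record schema and `auto.evolve` issues limited ALTERs as new fields appear.

Hope you have enjoyed this read.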