Here's a simple tutorial for streaming data from CockroachDB directly to Confluent Cloud using CockroachDB Change Data Capture (CDC), typically referred to as a changefeed.
First, list your Kafka clusters; the cluster ID shown here will be needed in the steps below.
ccloud kafka cluster list
Next, create an API Key for the cluster; the API Key and API Secret are needed for creating the CockroachDB Changefeed.
ccloud api-key create --resource <RESOURCE ID>
The endpoint, reported as something like SASL_SSL://<host>:9092, is needed to connect the Changefeed to Kafka.
ccloud kafka cluster describe <RESOURCE ID>
Create a Kafka topic for the changefeed and start a consumer on it:
ccloud kafka topic create demo_t --partitions 6
ccloud kafka topic consume demo_t
Leave the Kafka consumer running in that terminal and open a new terminal window for the next steps.
cockroach sql ...
create table t (k int default unique_rowid() primary key, v string);
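One note before creating the changefeed: changefeeds depend on rangefeeds being enabled on the CockroachDB cluster. If the CREATE CHANGEFEED statement below errors out, this cluster setting is the usual culprit:

```sql
-- Run once per cluster (requires admin privileges)
SET CLUSTER SETTING kv.rangefeed.enabled = true;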
When creating the changefeed, notice that the URL scheme is kafka:// rather than https:// or SASL_SSL://. Also, be sure to include your API Key and API Secret in the changefeed URL; the query parameter values must be URL-encoded if they contain special characters.
CREATE CHANGEFEED FOR TABLE t INTO 'kafka://<CONFLUENT CLOUD URL>:9092?sasl_enabled=true&sasl_password=<API SECRET>&sasl_user=<API KEY>&tls_enabled=true&topic_prefix=demo_' WITH updated, key_in_value, format = json;
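Confluent Cloud API Secrets often contain characters like +, /, or = that are not safe to paste directly into the sasl_password query parameter. A quick way to URL-encode the secret before building the changefeed URL (the secret shown here is made up for illustration):

```python
from urllib.parse import quote

# Hypothetical API secret containing URI-unsafe characters
api_secret = "abc+def/123="

# safe="" ensures every reserved character gets percent-encoded
encoded = quote(api_secret, safe="")
print(encoded)  # abc%2Bdef%2F123%3D
```

Paste the encoded value into the sasl_password parameter of the CREATE CHANGEFEED statement.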
insert into t (v) values ('one');
insert into t (v) values ('two');
insert into t (v) values ('three');
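Back in the consumer terminal, each insert should produce a JSON message on the demo_t topic. With the updated and key_in_value options, the messages will look roughly like this (the row ID and timestamp values here are made up):

```json
{"after": {"k": 711533920856436737, "v": "one"}, "key": [711533920856436737], "updated": "1626143425153051000.0000000000"}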