3,868
questions
0
votes
0
answers
12
views
Single trace is not propagated from Mongo Kafka sink connector to MongoDB
I am using the MongoDB Kafka sink connector to send messages from a Kafka topic to MongoDB.
I have integrated the OpenTelemetry agent with the Kafka connector using the following environment variable:
export ...
-1
votes
1
answer
22
views
Unable to send records to DB from Kafka topic using Microsoft SQL Server sink connector
Summary: I need to send records to a Kafka topic and, using the Microsoft SQL Server sink connector, fetch the records from the topic and write them to the database.
Approaches I tried for sending records to ...
0
votes
0
answers
16
views
Kafka Connect sink connector filter for nested JSON
I have a sink connector to push data from a Kafka topic to a Snowflake DB; my problem is that I need to add a transformation to filter out certain records based on a nested JSON field.
A typical record ...
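For reference, the stock Apache Kafka `Filter` SMT can only drop records via predicates on the topic name, headers, or tombstone status; filtering on a nested field typically needs Confluent's (commercially licensed) `Filter$Value` SMT with a JSONPath-style condition. A sketch, assuming a hypothetical nested field `payload.status`:

```json
{
  "transforms": "filterActive",
  "transforms.filterActive.type": "io.confluent.connect.transforms.Filter$Value",
  "transforms.filterActive.filter.condition": "$[?(@.payload.status == 'ACTIVE')]",
  "transforms.filterActive.filter.type": "include",
  "transforms.filterActive.missing.or.null.behavior": "exclude"
}
```

The field path and the include/exclude choice are assumptions to adapt to the actual record shape.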
0
votes
0
answers
33
views
Is the Snowflake Kafka connector's encrypted AES-256 key FIPS 140-2 compliant?
I followed the URL below to set up the Snowflake Kafka connector:
snowflake kafka connector
I then use the private key and passphrase in the connector config and deploy it to the Connect instance.
I see the ...
0
votes
1
answer
14
views
Extracting Key and Appending It to the Message Value in Kafka Connect
I want to extract the key and append it to the value using Kafka Connect. I read about SMTs and tested a few, but I could not achieve this.
I send this record value:
{"name":"ali"}
with this key:
Person
...
1
vote
0
answers
17
views
Logs not printed for Strimzi Kafka Connect message transformer
I am using Strimzi Kafka Connect. I have created a message transformer, but its logs are not getting printed to the console. How can I get those logs?
The following is the log4j.properties file inside opt/kafka/...
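A common cause is that no logger is configured for the transformer's package, so its output is filtered at the root level. A minimal log4j.properties sketch, assuming a hypothetical package `com.example.transforms` for the custom transformer:

```properties
log4j.rootLogger=INFO, stdout
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %m (%c)%n
# Assumed package name; replace with the transformer's actual package
log4j.logger.com.example.transforms=DEBUG
```

Note that with Strimzi, logging is usually set through `spec.logging` in the KafkaConnect custom resource rather than by editing files inside the image, which the operator may overwrite.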
1
vote
0
answers
31
views
Kafka Debezium connector stuck at snapshot of large data
I set up Elasticsearch, Kibana, MongoDB, and Kafka on the same Linux server for development purposes. The server has 30 GB of memory and enough disk space. I'm using a Debezium connector and I'm trying ...
1
vote
0
answers
27
views
Data filtering in Debezium source connector (PostgreSQL)
{
"connector.class": "io.debezium.connector.postgresql.PostgresConnector",
"database.hostname": "<host>",
"database.port": "<port&...
1
vote
0
answers
19
views
I'm working on enabling change data capture from MySQL to Apache Kafka with Debezium, but I'm getting an error in the sink connector
The error message is as follows:
[2024-09-03 06:57:22,110] ERROR [jdbc-connector|task-0] Failed to process record: 'org.apache.kafka.clients.consumer.ConsumerRecord org.apache.kafka....
0
votes
0
answers
34
views
About camel kafka source connector
I want to use the Camel HTTP Source Connector as the HTTP source connector among the Kafka connectors.
The JSON returned by the REST endpoint is as follows:
{
"list": [
{
...
0
votes
0
answers
14
views
Problem with TimeBasedPartitioner in S3 sink connector
I want to use TimeBasedPartitioner with RecordField for partitioning.
Here are my configs:
"partition.duration.ms": "86400000",
"partitioner.class": "io.confluent....
1
vote
1
answer
28
views
Can we attach an init-container to the Strimzi Kafka Connect container?
Can we attach an init-container to the Strimzi Kafka Connect container? I want to copy some JARs from another image (docker/my-sdk:9.1) into the Strimzi Kafka Connect container at runtime using an init-container.
...
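Rather than an init-container, Strimzi's usual mechanism for adding JARs is the `spec.build` section of the KafkaConnect resource, which bakes plugin artifacts into a new image at deploy time. A sketch, with a hypothetical registry and artifact URL standing in for wherever the JARs are published:

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect
spec:
  # ... bootstrapServers, replicas, etc.
  build:
    output:
      type: docker
      image: my-registry.example.com/my-connect:latest   # hypothetical registry
    plugins:
      - name: my-sdk-jars
        artifacts:
          - type: jar
            url: https://example.com/artifacts/my-sdk-9.1.jar   # hypothetical URL
```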
0
votes
1
answer
20
views
ERROR Uncaught exception in REST call to /metrics (org.apache.kafka.connect.runtime.rest.errors.ConnectExceptionMapper:61)
I am running a Kafka connector in distributed mode and have explicitly disabled JMX metrics exposure, yet the /metrics endpoint is still being hit every 45.12 seconds and returning 404 ...
0
votes
1
answer
41
views
How to receive tombstone messages in the Debezium connector?
I'm using Kafka and the Debezium Postgres connector.
I wanted to extract only the record value without other details, so I am using the "transforms.unwrap.type": "io.debezium.transforms....
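The `ExtractNewRecordState` SMT referenced here drops tombstones by default; keeping them is a documented option. A sketch (option names per the classic Debezium configuration; newer releases rename the delete-handling setting):

```json
{
  "transforms": "unwrap",
  "transforms.unwrap.type": "io.debezium.transforms.ExtractNewRecordState",
  "transforms.unwrap.drop.tombstones": "false",
  "transforms.unwrap.delete.handling.mode": "rewrite"
}
```

With `drop.tombstones` set to `false`, delete events still produce a tombstone record downstream for log-compacted topics.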
2
votes
1
answer
25
views
How to configure Apache Kafka File Sink Connector to write JSON data as pipe-separated string and rotate files hourly?
I'm working with Apache Kafka and need to configure the File Sink Connector to achieve the following:
Convert JSON data to pipe-separated strings: The Kafka topic contains messages in JSON format, ...