All Questions
5 questions
5 votes · 1 answer · 2k views
Kafka to BigQuery, best way to consume messages
I need to load messages into my BigQuery tables, and I want to know the best way to consume them.
My Kafka servers, which run on AWS, produce AVRO messages, and from what I saw ...
0 votes · 0 answers · 409 views
Kafka Connect -> Pub/Sub -> Dataflow pipeline data loss
I'm creating a streaming data pipeline that streams data from a Kafka cluster to Google BigQuery.
Currently, I am pushing this data to Pub/Sub through the Pub/Sub Kafka connector, then from Pub/Sub to BigQuery through ...
3 votes · 2 answers · 2k views
Google Cloud PubSub: How to read only latest records
In Kafka there are two settings, earliest and latest, where you either read from the earliest offset (0) of the topic or from the latest event (useful for real-time).
I am now using Pub/Sub with Dataflow and Beam, ...
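For reference, the Kafka-side behavior the question describes is controlled by the consumer's `auto.offset.reset` property. A minimal, illustrative consumer configuration fragment (broker address and group id are placeholders) might look like:

```properties
# Kafka consumer configuration (illustrative values)
bootstrap.servers=broker1:9092      # hypothetical broker address
group.id=my-consumer-group          # hypothetical consumer group

# Applies only when the group has no committed offset:
# 'earliest' replays the partition from the beginning;
# 'latest' delivers only records produced after the consumer joins.
auto.offset.reset=latest
```

Pub/Sub has no direct equivalent of this setting: a subscription only receives messages published after it is created, which is part of what the question is asking about.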
-2 votes · 1 answer · 2k views
Google Pubsub vs Kafka comparison on the restart of pipeline
I am trying to write an ingestion application on GCP using Apache Beam. It should work in a streaming fashion, reading data from Kafka or Pub/Sub topics and then ingesting it into the data source.
While it seems ...
0 votes · 3 answers · 3k views
Is it possible to use Kafka with Google Cloud Dataflow?
I have two questions:
1) I want to use Kafka with a Google Cloud Dataflow pipeline program. In my pipeline program, I want to read data from Kafka; is that possible?
2) I created an instance with BigQuery ...