
All Questions

5 votes
1 answer
2k views

Kafka to BigQuery, best way to consume messages

I need to receive messages into my BigQuery tables and want to know the best way to consume them. My Kafka servers, which are on AWS, produce Avro messages, and from what I saw ...
João Maia
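One common route for this question is the Kafka Connect BigQuery sink connector; another is a plain consumer that decodes the Avro payload and streams rows with the BigQuery client's `insert_rows_json`. As a minimal sketch of the transform step such a consumer would need (the `to_bq_row` helper and the `_kafka_*` column names are hypothetical, not from any library):

```python
import datetime

def to_bq_row(avro_record, topic, partition, offset):
    """Flatten a decoded Avro record into a BigQuery streaming-insert row.

    Carrying the Kafka coordinates along is useful because
    (topic, partition, offset) is unique and enables downstream dedup.
    """
    row = dict(avro_record)
    row["_kafka_topic"] = topic
    row["_kafka_partition"] = partition
    row["_kafka_offset"] = offset
    row["_ingested_at"] = datetime.datetime.utcnow().isoformat()
    return row

# The resulting dict would then be passed to something like
# bigquery.Client().insert_rows_json(table, [row]).
```

The actual Avro decoding would typically be handled by a schema-registry-aware deserializer before this step.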
0 votes
0 answers
409 views

Kafka Connect -> Pub/Sub -> Dataflow pipeline data loss

I'm creating a streaming data pipeline that streams data from a Kafka cluster to Google BQ. Currently, I push this data to Pub/Sub through the Pub/Sub Kafka connector, then from Pub/Sub to BQ through ...
Nicholas Leong Zhi Hao
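A point worth checking on this question: Pub/Sub is at-least-once, so apparent "loss" is often duplicates or unacked redeliveries rather than drops, and idempotent writes need a stable key to dedupe on. A minimal sketch of that dedup idea (the `dedupe` function and the `kafka_key` attribute name are assumptions for illustration):

```python
def dedupe(messages, seen=None):
    """Drop redundant redeliveries by a stable per-message id.

    Pub/Sub delivers at-least-once, so the same payload can arrive
    more than once; a key such as the original Kafka
    topic/partition/offset makes the downstream write idempotent.
    """
    seen = set() if seen is None else seen
    unique = []
    for msg in messages:
        key = msg["attributes"]["kafka_key"]
        if key not in seen:
            seen.add(key)
            unique.append(msg)
    return unique
```

Note that Dataflow's BigQuery streaming inserts can also dedupe on an `insertId`, which serves the same purpose at the sink.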
3 votes
2 answers
2k views

Google Cloud PubSub: How to read only latest records

In Kafka there are two settings, earliest and latest, where you read either from the earliest offset (0) of the topic or from the latest event (useful for realtime). I am now using PubSub with Dataflow and Beam,...
bp2010
  • 2,482
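For context on what this question is asking: in Pub/Sub there is no offset to choose at read time; a subscription only receives messages published after it was created (roughly Kafka's "latest"), and replaying older data requires seeking the subscription to a snapshot or timestamp. The Kafka semantics being compared can be illustrated on an in-memory log (the `read_from` helper is purely illustrative):

```python
def read_from(log, position):
    """Illustrate Kafka's auto.offset.reset semantics on an in-memory log.

    position="earliest" replays the whole log from offset 0;
    position="latest" yields only records appended after attach time.
    """
    start = 0 if position == "earliest" else len(log)
    return log[start:]

log = ["e0", "e1", "e2"]
assert read_from(log, "earliest") == ["e0", "e1", "e2"]
assert read_from(log, "latest") == []  # only future appends would be seen
```

In Beam terms, a fresh Pub/Sub subscription attached by the pipeline behaves like the "latest" case by construction.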
-2 votes
1 answer
2k views

Google Pub/Sub vs Kafka comparison on pipeline restart

I am trying to write an ingestion application on GCP using Apache Beam. I need to write it in a streaming way to read data from Kafka or Pub/Sub topics and then ingest into a data source. While it seems ...
Mat
  • 37
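On the restart comparison this question raises: both systems resume without loss, just via different mechanisms. Kafka persists the consumer group's committed offset broker-side; Pub/Sub retains unacknowledged messages on the subscription until acked. A minimal sketch of the shared idea, commit-after-process (the `Checkpoint` class is an illustration, not either system's API):

```python
class Checkpoint:
    """Sketch of why a restarted consumer does not lose data.

    The position is advanced only after a record is successfully
    processed, so a crash mid-batch means redelivery, not loss.
    """

    def __init__(self):
        self.committed = 0  # last acknowledged position

    def consume(self, log, crash_after=None):
        out = []
        for offset in range(self.committed, len(log)):
            if crash_after is not None and len(out) == crash_after:
                return out  # simulated crash: uncommitted records redelivered
            out.append(log[offset])
            self.committed = offset + 1  # commit after successful processing
        return out
```

The practical difference is retention: Kafka keeps the log for its configured retention period regardless of consumption, while Pub/Sub's retention applies to unacked messages on a subscription.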
0 votes
3 answers
3k views

Is it possible to use Kafka with Google Cloud Dataflow

I have two questions. 1) I want to use Kafka with a Google Cloud Dataflow pipeline program; in my pipeline program I want to read data from Kafka. Is that possible? 2) I created an instance with BigQuery ...
Jerome
  • 13
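The short answer to this last question is yes: Beam ships a Kafka connector (`KafkaIO` in Java, `apache_beam.io.kafka.ReadFromKafka` in Python) that a Dataflow pipeline can combine with `WriteToBigQuery`. Since the read step needs a live cluster, here is only a sketch of the per-record transform that would sit between the two; the `kafka_record_to_row` name is made up, and the value is assumed to be UTF-8 JSON (an Avro deserializer would slot into the same place):

```python
import json

def kafka_record_to_row(kv):
    """Turn a (key, value) pair, as ReadFromKafka emits them (bytes),
    into a BigQuery row dict ready for WriteToBigQuery."""
    key, value = kv
    row = json.loads(value.decode("utf-8"))
    row["kafka_key"] = key.decode("utf-8") if key else None
    return row

# In the pipeline this would be applied roughly as:
#   p | ReadFromKafka(consumer_config={"bootstrap.servers": "..."},
#                     topics=["events"])
#     | beam.Map(kafka_record_to_row)
#     | WriteToBigQuery("project:dataset.table", ...)
```

On Dataflow, the Python `ReadFromKafka` runs as a cross-language transform backed by the Java `KafkaIO`, so the Kafka brokers must be reachable from the Dataflow workers.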