
stream-processing

Here are 782 public repositories matching this topic...

pkaske
pkaske commented Dec 29, 2020

Previously I figured out a way to get the (x, y, z) data points for each frame from one hand, but I'm not sure how to do that for the new holistic model that they released. I am trying to get all the landmark data points for both hands as well as parts of the chest and face. Does anyone know how to extract the holistic landmark data / print it to a text file? Or at least give me some directions as to h

good first issue type:research solution:holistic stat:awaiting googler
flink-learning

Flink learning blog. http://www.54tianzhisheng.cn/ Covers Flink fundamentals, concepts, principles, hands-on practice, performance tuning, source-code analysis, and more. Includes worked examples for Flink Connector, Metrics, Library, DataStream API, Table API & SQL, etc., plus shared case studies of large production Flink deployments (PV/UV, log storage, real-time deduplication of tens of billions of records, monitoring and alerting). Please support my column "Big Data Real-Time Computing Engine Flink: Practice and Performance Optimization".

  • Updated Mar 23, 2022
  • Java
j4freeman
j4freeman commented Feb 14, 2022

I have a use case where I need to create a new stream containing the bearing between two consecutive points in a pre-existing lat/lon stream. Normally bearing would be available in a standard lib, but in a pinch it can easily be implemented through the sin, cos, and atan2 functions, none of which are currently available in ksql.
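
For reference, the math involved is small. Here is a minimal sketch in Go (not ksql; the names and sample coordinates are illustrative) of the standard initial-bearing formula, built from nothing but sin, cos, and atan2:

package main

import (
	"fmt"
	"math"
)

// bearing returns the initial great-circle bearing in degrees (0-360)
// from (lat1, lon1) to (lat2, lon2), all given in degrees, using only
// sin, cos, and atan2.
func bearing(lat1, lon1, lat2, lon2 float64) float64 {
	const toRad = math.Pi / 180
	phi1 := lat1 * toRad
	phi2 := lat2 * toRad
	dLon := (lon2 - lon1) * toRad

	y := math.Sin(dLon) * math.Cos(phi2)
	x := math.Cos(phi1)*math.Sin(phi2) - math.Sin(phi1)*math.Cos(phi2)*math.Cos(dLon)

	deg := math.Atan2(y, x) / toRad
	return math.Mod(deg+360, 360) // normalize from (-180, 180] to [0, 360)
}

func main() {
	// Two consecutive points from a hypothetical lat/lon stream
	// (roughly London -> Paris).
	fmt.Printf("%.1f\n", bearing(51.5074, -0.1278, 48.8566, 2.3522))
}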

Basic trig functions have a range of use cases in geometric and geographic co

enhancement good first issue user-defined-functions streaming-engine
benthos
heikkilamarko
heikkilamarko commented Jan 3, 2022

Under the hood, Benthos' csv input uses the standard encoding/csv package's csv.Reader struct.

The current implementation of csv input doesn't allow setting the LazyQuotes field.

We have a use case where we need to set the LazyQuotes field in order to make things work correctly.
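
As a standalone illustration of what that field changes, here is a minimal sketch using plain Go encoding/csv rather than Benthos itself (the sample data is made up):

package main

import (
	"encoding/csv"
	"fmt"
	"strings"
)

func main() {
	// A bare quote inside an unquoted field; the default strict reader
	// rejects this with a `bare " in non-quoted field` parse error.
	data := "id,name\n1,John \"Johnny\" Doe\n"

	r := csv.NewReader(strings.NewReader(data))
	r.LazyQuotes = true // tolerate stray quotes instead of failing

	records, err := r.ReadAll()
	if err != nil {
		panic(err)
	}
	fmt.Println(records) // [[id name] [1 John "Johnny" Doe]]
}

A csv input option that exposes LazyQuotes would map directly onto this field.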

enhancement inputs good first issue effort: lower
watermill
xorcare
xorcare commented Nov 22, 2021

This comment says that the message ID is optional, but for the SQL transport it is a mandatory attribute, which in turn causes misunderstanding.

Is it possible to fix this, or did I get something wrong?

https://github.com/ThreeDotsLabs/watermill/blob/b9928e750ba673cf93d442db88efc04706f67388/message/message.go#L20

help wanted good first issue S
danfojs
kylemcdonald
kylemcdonald commented Mar 2, 2022

I would like to convert a DataFrame to a JSON object the same way that Pandas does with to_dict().

toJSON() treats rows as elements in an array, and ignores the index labels. But to_dict() uses the index as keys.

Here is an example of what I have in mind:

function to_dict(df) {
    const rows = df.toJSON();
    const entries = df.index.map((e, i) => ({ [e]: rows[i] }));
    // merge the per-row objects into a single index-keyed object
    return Object.assign({}, ...entries);
}
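
For a frame whose index is, say, ['a', 'b'], this hypothetical to_dict(df) would return { a: {...first row...}, b: {...second row...} }, matching Pandas' index-oriented to_dict('index') output rather than its column-oriented default.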
enhancement good first issue
nisanharamati
nisanharamati commented Jul 24, 2018

It can be very difficult to piece together a reasonable estimate of the history of events from the current workers' logs because none of them have timestamps.

To that end, I think we should add timestamps to the logs.

This has some cons:

  1. We can't just use @printf like we have been until now. We need to either include a timestamp in every @printf call (laborious and error-prone) or c
hazelcast-jet
jdormit
jdormit commented Aug 18, 2019

The mapcat function seems to choke if you pass in a mapping function that returns a stream instead of a sequence:

user> (s/stream->seq (s/mapcat (fn [x] (s/->source [x])) (s/->source [1 2 3])))
()
Aug 18, 2019 2:23:39 PM clojure.tools.logging$eval5577$fn__5581 invoke
SEVERE: error in message propagation
java.lang.IllegalArgumentException: Don't know how to create ISeq from: manifold.
yomo

🦖 Serverless streaming framework for low-latency edge computing applications, running atop the QUIC protocol; built as Metaverse infrastructure, engaging 5G technology and geo-distributed systems.

  • Updated Mar 26, 2022
  • Go
