To use Daft for streaming processing (micro-batch or similar), it should be possible to use the time travel feature of Iceberg/Delta to do checkpointing. Are there any plans for this? To summarize: something like `spark.readStream.table(delta_table)`.
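For concreteness, here is a rough sketch of the kind of loop I have in mind today: poll a Delta table, use its version number as the "checkpoint", and process each new snapshot as a micro-batch with Daft. This is just an illustration, not an existing Daft feature; the table path and checkpoint file are made up, and it naively re-reads the whole snapshot each batch rather than doing a true incremental read.

```python
# Sketch only: Delta table versions as a poor man's streaming checkpoint.
import json
import pathlib
import time

import daft
from deltalake import DeltaTable

TABLE_URI = "/tmp/events_delta"                              # hypothetical table path
CHECKPOINT_FILE = pathlib.Path("/tmp/daft_stream_checkpoint.json")  # hypothetical checkpoint store


def load_checkpoint() -> int:
    """Return the last Delta version we finished processing (-1 if none)."""
    if CHECKPOINT_FILE.exists():
        return json.loads(CHECKPOINT_FILE.read_text())["version"]
    return -1


def save_checkpoint(version: int) -> None:
    CHECKPOINT_FILE.write_text(json.dumps({"version": version}))


def process_batch(df: daft.DataFrame) -> None:
    # Placeholder "sink": in practice this would aggregate or write out results.
    print(df.count_rows())


while True:
    latest = DeltaTable(TABLE_URI).version()
    done = load_checkpoint()
    if latest > done:
        # Time-travel read of the exact snapshot we want to process.
        # NOTE: this re-reads the full snapshot; real incremental processing
        # would need the change data feed / commit diffs, which this sketch skips.
        snapshot = DeltaTable(TABLE_URI, version=latest)
        df = daft.from_arrow(snapshot.to_pyarrow_table())
        process_batch(df)
        save_checkpoint(latest)      # advance the "checkpoint"
    time.sleep(30)                   # micro-batch polling interval
```

Something like `spark.readStream.table(delta_table)` would replace this whole loop with a single call.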
Hey @vibh3s! Could you elaborate on your use case here? Streaming means quite different things depending on who you ask 😛 Daft today is mostly a batch engine, but micro-batching is definitely on the roadmap. Interested to learn more about how you're thinking about this problem though.
Hi @vibh3s, we're currently building out checkpoint functionality in the newest execution engine. We should be able to snapshot the state of our execution engine and therefore recover from, or resume at, prior checkpoints.