Posted to user@beam.apache.org by Sachin Mittal <sj...@gmail.com> on 2023/07/21 06:45:47 UTC

Can we use RedisIO to write records from an unbounded collection

Hi,
I was planning to use the RedisIO write/writeStreams functions in a
streaming pipeline.

https://beam.apache.org/releases/javadoc/current/org/apache/beam/sdk/io/redis/RedisIO.html

The pipeline would read an unbounded collection from Kinesis and update
Redis: it would update the value where a key already exists and add a new
key/value pair where it does not.
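
Roughly, I have something like the sketch below in mind (just a sketch, not
working code: the Redis endpoint, the stream name and the record-to-key/value
mapping are placeholders, and the Kinesis client/credentials configuration is
omitted):

import java.nio.charset.StandardCharsets;

import org.apache.beam.sdk.Pipeline;
import org.apache.beam.sdk.io.kinesis.KinesisIO;
import org.apache.beam.sdk.io.kinesis.KinesisRecord;
import org.apache.beam.sdk.io.redis.RedisIO;
import org.apache.beam.sdk.transforms.MapElements;
import org.apache.beam.sdk.values.KV;
import org.apache.beam.sdk.values.TypeDescriptors;

import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream;

Pipeline p = Pipeline.create();

p.apply("ReadFromKinesis",
        KinesisIO.read()
            .withStreamName("my-stream")   // placeholder stream name
            .withInitialPositionInStream(InitialPositionInStream.LATEST))
            // AWS client provider / credentials configuration omitted here
 .apply("ToKeyValue",
        MapElements
            .into(TypeDescriptors.kvs(
                TypeDescriptors.strings(), TypeDescriptors.strings()))
            // placeholder mapping: real code would derive key/value from the payload
            .via((KinesisRecord r) -> KV.of(
                r.getPartitionKey(),
                new String(r.getDataAsBytes(), StandardCharsets.UTF_8))))
 .apply("WriteToRedis",
        RedisIO.write()
            .withEndpoint("localhost", 6379)         // placeholder endpoint
            .withMethod(RedisIO.Write.Method.SET));  // SET overwrites existing keys

p.run().waitUntilFinish();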

Please let me know if this IO is suitable for this purpose.

I was reading up on this IO here
https://beam.apache.org/documentation/io/connectors/ and it states that it
only supports batch and not streaming.

If this works, which is the better option to use: the write or the
writeStreams function?

Thanks
Sachin

Re: Can we use RedisIO to write records from an unbounded collection

Posted by Alexey Romanenko <ar...@gmail.com>.
Hi Sachin,


> On 21 Jul 2023, at 08:45, Sachin Mittal <sj...@gmail.com> wrote:
> 
> I was reading up on this IO here https://beam.apache.org/documentation/io/connectors/ and it states that it only supports batch and not streaming.

I believe that page refers only to read support. For writing, it mostly depends on the type of your pipeline (bounded or unbounded) and, if I'm not mistaken, every connector can be used with both types of pipelines.

Though, please be aware that Beam doesn’t guarantee the order of processed records, so this should be taken into account when, for example, the same keys are updated more than once.

> If this works, which is the better option to use: the write or the writeStreams function?

I think it depends on how you are going to use the data in Redis afterwards.

RedisIO.write() is the general way to write data into Redis and supports different write methods (APPEND, SET, LPUSH, RPUSH, etc.).
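
For example, something like this (just a sketch, the endpoint is a placeholder) writes KV<String, String> pairs with the SET method, which overwrites the value of an existing key and creates the key if it doesn’t exist:

// keyValues is a PCollection<KV<String, String>>
keyValues.apply("WriteToRedis",
    RedisIO.write()
        .withEndpoint("localhost", 6379)         // placeholder endpoint
        .withMethod(RedisIO.Write.Method.SET));  // SET acts as an upsert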

RedisIO.writeStreams() is supposed to be used with Redis streams. 
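
If I’m not mistaken, it takes KV<String, Map<String, String>> elements, where the key is the stream key and the map holds the field/value pairs that are appended with XADD, so roughly (again, just a sketch with a placeholder endpoint):

// streamEntries is a PCollection<KV<String, Map<String, String>>>
streamEntries.apply("WriteToRedisStream",
    RedisIO.writeStreams()
        .withEndpoint("localhost", 6379));       // placeholder endpoint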

—
Alexey