Spark Streaming (Scala): window length by number of objects


I am using Spark with Scala, and I want to create a window operation whose length is set by a number of objects, i.e. the window starts empty, and as the stream begins, objects are stored in the window until it holds 10 objects; when the 11th arrives, the first is dropped.

Is this possible, or do I have to use another structure such as a list or an array? The documentation (http://spark.apache.org/docs/latest/streaming-programming-guide.html#window-operations) and my googling only refer to time-based windows (length and interval).
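Spark Streaming has no built-in count window, but the semantics described above can be modelled with a plain bounded FIFO buffer. The `CountWindow` class below is a hypothetical helper (not a Spark API) that illustrates the fill-then-evict behavior; such state could, for example, be maintained per key with `updateStateByKey` or `mapWithState`:

```scala
import scala.collection.immutable.Queue

// Hypothetical helper (not part of Spark): a count-based window modelled as
// a bounded FIFO buffer. It fills up to `size` elements; once full, each new
// element evicts the oldest one.
final case class CountWindow[A](size: Int, elems: Queue[A] = Queue.empty[A]) {
  def add(a: A): CountWindow[A] = {
    val grown = elems.enqueue(a)
    // Keep at most `size` elements: drop the oldest when we overflow.
    copy(elems = if (grown.size > size) grown.dequeue._2 else grown)
  }
}

object CountWindowDemo extends App {
  // Feed 11 elements into a window of 10: element 1 gets dropped.
  val window = (1 to 11).foldLeft(CountWindow[Int](10))(_ add _)
  println(window.elems.toList)   // prints List(2, 3, ..., 11)
}
```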

Thanks in advance.

A window in Spark Streaming is characterized by a windowDuration and an optional slideDuration, so it is a time window. You could consider using Apache Flink instead: it supports both count windows and time windows. Compared to Spark, Flink has a streaming-first ideology: it processes incoming events as they arrive, whereas Spark processes events in micro-batches. As a result, Flink may have fewer restrictions for this use case. Give it a try and see if it suits your needs.
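As a sketch of the Flink approach (assuming Flink's Scala DataStream API, whose `countWindowAll` operator creates non-keyed count windows), a sliding count window of size 10 that advances by one element reproduces the behavior asked for; note that the code needs a Flink runtime on the classpath to execute:

```scala
import org.apache.flink.streaming.api.scala._

object CountWindowSketch {
  def main(args: Array[String]): Unit = {
    val env = StreamExecutionEnvironment.getExecutionEnvironment

    val numbers: DataStream[Int] = env.fromElements(1 to 12: _*)

    // countWindowAll(size = 10, slide = 1): each window holds the 10 most
    // recent elements; when the 11th arrives, the oldest is evicted and a
    // new window fires. Here we just sum each window's contents.
    numbers
      .countWindowAll(10, 1)
      .reduce(_ + _)
      .print()

    env.execute("count-window sketch")
  }
}
```

The first window fires only once 10 elements have arrived, and every subsequent element triggers a new window covering the latest 10 elements, which matches the drop-the-oldest behavior described in the question.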

