A Java-friendly interface to org.apache.spark.streaming.dstream.DStream, the basic abstraction in Spark Streaming that represents a continuous stream of data. DStreams can either be created from live data (such as data from TCP sockets, Kafka, etc.) or generated by transforming existing DStreams using operations such as map and window. For operations applicable to key-value pair DStreams, see org.apache.spark.streaming.api.java.JavaPairDStream.
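
As a minimal sketch of the above, the following Java program creates a DStream from a socket source and derives new DStreams with map and window. The host, port, batch interval, window sizes, and local[2] master are placeholder assumptions for illustration, not part of the API description.

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class DStreamTransformExample {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("DStreamTransformExample").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));

        // Create a DStream from live data: lines of text received over a TCP socket.
        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);

        // Derive new DStreams by transforming existing ones.
        JavaDStream<Integer> lineLengths = lines.map(String::length);        // map
        JavaDStream<Integer> recent =
            lineLengths.window(Durations.seconds(30), Durations.seconds(10)); // 30s window, sliding every 10s
        recent.print();

        jssc.start();
        jssc.awaitTermination();
      }
    }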
A Java-friendly interface to org.apache.spark.streaming.dstream.InputDStream.
:: Experimental ::
A DStream representing the stream of data generated by the mapWithState operation on a JavaPairDStream. It also gives access to the stream of state snapshots, that is, the state data of all keys after a batch has updated them. Its type parameters are KeyType (the class of the keys), ValueType (the class of the values), StateType (the class of the state data), and MappedType (the class of the mapped data).
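
The sketch below shows one way this fits together, assuming the Spark 2.x Java API (where Optional is org.apache.spark.api.java.Optional and flatMap returns an Iterator): mapWithState keeps a running count per key, and stateSnapshots() exposes the state of all keys after each batch. The socket source, checkpoint directory, and class name are illustrative assumptions.

    import java.util.Arrays;

    import scala.Tuple2;

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.Optional;
    import org.apache.spark.api.java.function.Function3;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.State;
    import org.apache.spark.streaming.StateSpec;
    import org.apache.spark.streaming.api.java.JavaMapWithStateDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class MapWithStateExample {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("MapWithStateExample").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));
        jssc.checkpoint("/tmp/spark-checkpoint");  // placeholder directory; stateful ops need checkpointing

        // (word, 1) pairs from words read off a socket; host/port are placeholders.
        JavaPairDStream<String, Integer> wordOnes = jssc
            .socketTextStream("localhost", 9999)
            .flatMap(line -> Arrays.asList(line.split(" ")).iterator())
            .mapToPair(word -> new Tuple2<>(word, 1));

        // Mapping function: add the new value to the running count kept in State.
        Function3<String, Optional<Integer>, State<Integer>, Tuple2<String, Integer>> mappingFunc =
            (word, one, state) -> {
              int sum = one.orElse(0) + (state.exists() ? state.get() : 0);
              state.update(sum);
              return new Tuple2<>(word, sum);
            };

        // The mapped stream: running counts emitted as keys are updated in each batch.
        JavaMapWithStateDStream<String, Integer, Integer, Tuple2<String, Integer>> counts =
            wordOnes.mapWithState(StateSpec.function(mappingFunc));
        counts.print();

        // The snapshot stream: the state of all keys after each batch has updated them.
        JavaPairDStream<String, Integer> snapshots = counts.stateSnapshots();
        snapshots.print();

        jssc.start();
        jssc.awaitTermination();
      }
    }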
A Java-friendly interface to a DStream of key-value pairs, which provides extra methods like reduceByKey and join.
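
A minimal sketch of those key-value operations, again with a placeholder socket source and batch interval: words are mapped to (word, 1) pairs, reduced by key, and then joined back against the original pairs.

    import java.util.Arrays;

    import scala.Tuple2;

    import org.apache.spark.SparkConf;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaPairDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class PairDStreamExample {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("PairDStreamExample").setMaster("local[2]");
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));

        JavaDStream<String> words = jssc
            .socketTextStream("localhost", 9999)                         // placeholder source
            .flatMap(line -> Arrays.asList(line.split(" ")).iterator());

        // Turn the plain DStream into a DStream of (word, 1) pairs to get
        // access to the key-value operations on JavaPairDStream.
        JavaPairDStream<String, Integer> pairs = words.mapToPair(w -> new Tuple2<>(w, 1));

        // reduceByKey: per-batch word counts.
        JavaPairDStream<String, Integer> counts = pairs.reduceByKey((a, b) -> a + b);

        // join: combine two keyed streams on their keys.
        JavaPairDStream<String, Tuple2<Integer, Integer>> joined = counts.join(pairs);
        joined.print();

        jssc.start();
        jssc.awaitTermination();
      }
    }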
A Java-friendly interface to org.apache.spark.streaming.dstream.InputDStream of key-value pairs.
A Java-friendly interface to org.apache.spark.streaming.dstream.ReceiverInputDStream, the abstract class for defining any input stream that receives data over the network.
A Java-friendly version of org.apache.spark.streaming.StreamingContext, which is the main entry point for Spark Streaming functionality. It provides methods to create org.apache.spark.streaming.api.java.JavaDStream and org.apache.spark.streaming.api.java.JavaPairDStream from input sources. The internal org.apache.spark.api.java.JavaSparkContext (see core Spark documentation) can be accessed using context.sparkContext. After creating and transforming DStreams, the streaming computation can be started and stopped using context.start() and context.stop(), respectively. context.awaitTermination() allows the current thread to wait for the termination of the context by stop() or by an exception. The JavaStreamingContext object also contains a number of utility functions.
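
A minimal end-to-end sketch of that lifecycle; the socket source, batch interval, log level, and 60-second timeout are assumptions chosen only for illustration.

    import org.apache.spark.SparkConf;
    import org.apache.spark.api.java.JavaSparkContext;
    import org.apache.spark.streaming.Durations;
    import org.apache.spark.streaming.api.java.JavaDStream;
    import org.apache.spark.streaming.api.java.JavaStreamingContext;

    public class StreamingContextLifecycle {
      public static void main(String[] args) throws Exception {
        SparkConf conf = new SparkConf().setAppName("StreamingContextLifecycle").setMaster("local[2]");

        // Main entry point: a streaming context with a 1-second batch interval.
        JavaStreamingContext jssc = new JavaStreamingContext(conf, Durations.seconds(1));

        // The internal JavaSparkContext is available through the streaming context.
        JavaSparkContext sc = jssc.sparkContext();
        sc.setLogLevel("WARN");

        // Create and transform DStreams before starting the computation.
        JavaDStream<String> lines = jssc.socketTextStream("localhost", 9999);  // placeholder source
        lines.count().print();

        jssc.start();                             // start the streaming computation
        jssc.awaitTerminationOrTimeout(60_000);   // wait up to 60s for stop() or an exception
        jssc.stop();                              // stop the context (and the underlying SparkContext)
      }
    }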
Spark Streaming's Java API.