| Interface | Description |
|---|---|
| AccumulableParam<R,T> | Deprecated. Use AccumulatorV2 instead (see the sketch after this table). |
| AccumulatorParam<T> | Deprecated. Use AccumulatorV2 instead (see the sketch after this table). |
| CleanupTask | Classes that represent cleaning tasks. |
| FutureAction<T> | A future for the result of an action to support cancellation. |
| JobSubmitter | Handle via which a "run" function passed to a ComplexFutureAction can submit jobs for execution. |
| Partition | An identifier for a partition in an RDD. |
| SparkExecutorInfo | Exposes information about Spark Executors. |
| SparkJobInfo | Exposes information about Spark Jobs. |
| SparkStageInfo | Exposes information about Spark Stages. |
| TaskEndReason | :: DeveloperApi :: Various possible reasons why a task ended. |
| TaskFailedReason | :: DeveloperApi :: Various possible reasons why a task failed. |
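The deprecated accumulator interfaces above all point to AccumulatorV2. As a minimal sketch of the replacement pattern, using the built-in LongAccumulator exposed through SparkContext.longAccumulator (the app name, sample data, and the `errorCount` name are illustrative, not from this page):

```scala
import org.apache.spark.{SparkConf, SparkContext}

object AccumulatorV2Demo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(
      new SparkConf().setAppName("AccumulatorV2Demo").setMaster("local[*]"))

    // longAccumulator returns a built-in AccumulatorV2 implementation,
    // replacing the deprecated Accumulator/AccumulatorParam pair.
    val errorCount = sc.longAccumulator("errorCount")

    sc.parallelize(Seq(1, -2, 3, -4)).foreach { x =>
      if (x < 0) errorCount.add(1) // updated on executors
    }

    println(errorCount.value) // read back on the driver: 2
    sc.stop()
  }
}
```

User-defined accumulators follow the same pattern: subclass AccumulatorV2 and register the instance with SparkContext.register before use.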
| Class | Description |
|---|---|
| Accumulable<R,T> | Deprecated. Use AccumulatorV2 instead. |
| Accumulator<T> | Deprecated. Use AccumulatorV2 instead. |
| AccumulatorParam.DoubleAccumulatorParam$ | Deprecated. Use AccumulatorV2 instead. |
| AccumulatorParam.FloatAccumulatorParam$ | Deprecated. Use AccumulatorV2 instead. |
| AccumulatorParam.IntAccumulatorParam$ | Deprecated. Use AccumulatorV2 instead. |
| AccumulatorParam.LongAccumulatorParam$ | Deprecated. Use AccumulatorV2 instead. |
| AccumulatorParam.StringAccumulatorParam$ | Deprecated. Use AccumulatorV2 instead. |
| Aggregator<K,V,C> | :: DeveloperApi :: A set of functions used to aggregate data. |
| CleanAccum | |
| CleanBroadcast | |
| CleanCheckpoint | |
| CleanRDD | |
| CleanShuffle | |
| CleanupTaskWeakReference | A WeakReference associated with a CleanupTask. |
| ComplexFutureAction<T> | A FutureAction for actions that could trigger multiple Spark jobs. |
| Dependency<T> | :: DeveloperApi :: Base class for dependencies. |
| ExceptionFailure | :: DeveloperApi :: Task failed due to a runtime exception. |
| ExecutorLostFailure | :: DeveloperApi :: The task failed because the executor that it was running on was lost. |
| ExecutorRegistered | |
| ExecutorRemoved | |
| ExpireDeadHosts | |
| FetchFailed | :: DeveloperApi :: Task failed to fetch shuffle data from a remote node. |
| HashPartitioner | A Partitioner that implements hash-based partitioning using Java's Object.hashCode (see the sketches after this table). |
| InternalAccumulator | A collection of fields and methods concerned with internal accumulators that represent task-level metrics. |
| InternalAccumulator.input$ | |
| InternalAccumulator.output$ | |
| InternalAccumulator.shuffleRead$ | |
| InternalAccumulator.shuffleWrite$ | |
| InterruptibleIterator<T> | :: DeveloperApi :: An iterator that wraps around an existing iterator to provide task-killing functionality. |
| NarrowDependency<T> | :: DeveloperApi :: Base class for dependencies where each partition of the child RDD depends on a small number of partitions of the parent RDD. |
| OneToOneDependency<T> | :: DeveloperApi :: Represents a one-to-one dependency between partitions of the parent and child RDDs. |
| Partitioner | An object that defines how the elements in a key-value pair RDD are partitioned by key. |
| RangeDependency<T> | :: DeveloperApi :: Represents a one-to-one dependency between ranges of partitions in the parent and child RDDs. |
| RangePartitioner<K,V> | A Partitioner that partitions sortable records by range into roughly equal ranges. |
| Resubmitted | :: DeveloperApi :: A org.apache.spark.scheduler.ShuffleMapTask that completed successfully earlier, but we lost the executor before the stage completed. |
| SerializableWritable<T extends org.apache.hadoop.io.Writable> | |
| ShuffleDependency<K,V,C> | :: DeveloperApi :: Represents a dependency on the output of a shuffle stage. |
| SimpleFutureAction<T> | A FutureAction holding the result of an action that triggers a single job. |
| SparkConf | Configuration for a Spark application. |
| SparkContext | Main entry point for Spark functionality. |
| SparkEnv | :: DeveloperApi :: Holds all the runtime environment objects for a running Spark instance (either master or worker), including the serializer, RpcEnv, block manager, map output tracker, etc. |
| SparkExecutorInfoImpl | |
| SparkFiles | Resolves paths to files added through SparkContext.addFile(). |
| SparkFirehoseListener | Class that allows users to receive all SparkListener events. |
| SparkJobInfoImpl | |
| SparkMasterRegex | A collection of regexes for extracting information from the master string. |
| SparkStageInfoImpl | |
| SparkStatusTracker | Low-level status reporting APIs for monitoring job and stage progress (see the sketches after this table). |
| SpillListener | A SparkListener that detects whether spills have occurred in Spark jobs. |
| StopMapOutputTracker | |
| Success | :: DeveloperApi :: Task succeeded. |
| TaskCommitDenied | :: DeveloperApi :: Task requested the driver to commit, but was denied. |
| TaskContext | Contextual information about a task which can be read or mutated during execution (see the sketches after this table). |
| TaskKilled | :: DeveloperApi :: Task was killed intentionally and needs to be rescheduled. |
| TaskResultLost | :: DeveloperApi :: The task finished successfully, but the result was lost from the executor's block manager before it was fetched. |
| TaskSchedulerIsSet | An event that SparkContext uses to notify HeartbeatReceiver that SparkContext.taskScheduler is created. |
| TaskState | |
| TestUtils | Utilities for tests. |
| UnknownReason | :: DeveloperApi :: We don't know why the task ended, for example because of a ClassNotFound exception when deserializing the task result. |
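HashPartitioner and RangePartitioner are the two concrete Partitioner implementations listed above, and constructing them requires a SparkConf and SparkContext. A minimal sketch of key-based repartitioning, assuming local mode and made-up data:

```scala
import org.apache.spark.{HashPartitioner, SparkConf, SparkContext}

object PartitionerDemo {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("PartitionerDemo").setMaster("local[*]")
    val sc = new SparkContext(conf)

    val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3), ("c", 4)))

    // HashPartitioner buckets each record by its key's Object.hashCode,
    // so equal keys always land in the same partition.
    val partitioned = pairs.partitionBy(new HashPartitioner(4))

    println(partitioned.partitioner)      // Some(HashPartitioner)
    println(partitioned.getNumPartitions) // 4
    sc.stop()
  }
}
```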
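SparkStatusTracker ties together the SparkJobInfo and SparkStageInfo interfaces from the first table; a job's status() returns the JobExecutionStatus enum listed below. A sketch of polling it on the driver, assuming an active SparkContext named `sc` as in the previous sketch:

```scala
val tracker = sc.statusTracker

for (jobId <- tracker.getActiveJobIds()) {
  // getJobInfo returns None if the job's information has been garbage-collected.
  tracker.getJobInfo(jobId).foreach { job =>
    println(s"job $jobId status: ${job.status()}")
    for (stageId <- job.stageIds(); stage <- tracker.getStageInfo(stageId)) {
      println(s"  stage $stageId: ${stage.numCompletedTasks()} of ${stage.numTasks()} tasks done")
    }
  }
}
```

These APIs are intentionally low-level and best-effort; for richer monitoring, SparkFirehoseListener (above) receives every SparkListener event instead.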
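TaskContext, by contrast, is only meaningful inside code running on executors; TaskContext.get() returns the context of the currently executing task. A brief sketch, again assuming an active SparkContext `sc`:

```scala
import org.apache.spark.TaskContext

// foreachPartition runs on executors, where TaskContext.get() is non-null.
sc.parallelize(1 to 8, numSlices = 4).foreachPartition { _ =>
  val ctx = TaskContext.get()
  println(s"stage ${ctx.stageId()}, partition ${ctx.partitionId()}, attempt ${ctx.attemptNumber()}")
}
```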
| Enum | Description |
|---|---|
| JobExecutionStatus | |
| Exception | Description |
|---|---|
| SparkException | |
| TaskKilledException | :: DeveloperApi :: Exception thrown when a task is explicitly killed (i.e., task failure is expected). |
Some classes here, such as Accumulator and StorageLevel, are also used in Java, but the org.apache.spark.api.java package contains the main Java API.