Trait org.apache.spark.sql.sources.MutableRelation

trait MutableRelation extends BaseRelation with NativeTableRelation

API for updates and deletes to a relation.
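
A minimal sketch of an implementer's skeleton, with member bodies elided via ???. The class name, constructor shape, and exact import locations (notably for ConnectionProperties and SnappySession) are assumptions, not part of this API:

  import java.sql.Connection

  import org.apache.spark.sql.{SQLContext, SnappySession}
  import org.apache.spark.sql.catalyst.expressions.{Attribute, Expression}
  import org.apache.spark.sql.execution.SparkPlan
  import org.apache.spark.sql.execution.datasources.LogicalRelation
  import org.apache.spark.sql.jdbc.JdbcDialect
  import org.apache.spark.sql.sources.{ConnectionProperties, MutableRelation}
  import org.apache.spark.sql.types.StructType

  // Hypothetical relation supporting UPDATE and DELETE (name and
  // constructor are illustrative only).
  class ExampleMutableRelation(
      override val table: String,
      override val sqlContext: SQLContext) extends MutableRelation {

    override def schema: StructType = ???
    override def isRowTable: Boolean = true
    override def partitionColumns: Seq[String] = Nil

    // Connection handling declared by NativeTableRelation.
    override protected val connFactory: () => Connection = ???
    override def connProperties: ConnectionProperties = ???
    override protected def dialect: JdbcDialect = ???

    // "Key" columns that UPDATE/DELETE plans project to locate rows.
    override def getKeyColumns: Seq[String] = ???
    override def getPrimaryKeyColumns(session: SnappySession): Seq[String] = ???

    // Plans whose execution should return the count of affected rows.
    override def getUpdatePlan(relation: LogicalRelation, child: SparkPlan,
        updateColumns: Seq[Attribute], updateExpressions: Seq[Expression],
        keyColumns: Seq[Attribute]): SparkPlan = ???
    override def getDeletePlan(relation: LogicalRelation, child: SparkPlan,
        keyColumns: Seq[Attribute]): SparkPlan = ???

    // Cleanup operations declared by DestroyRelation.
    override def truncate(): Unit = ???
    override def destroy(ifExists: Boolean): Unit = ???
  }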

Annotations
@DeveloperApi()
Linear Supertypes
NativeTableRelation, DestroyRelation, BaseRelation, AnyRef, Any

Abstract Value Members

  1. abstract val connFactory: () ⇒ Connection

    Attributes
    protected
    Definition Classes
    NativeTableRelation
  2. abstract def connProperties: ConnectionProperties

    Definition Classes
    NativeTableRelation
  3. abstract def destroy(ifExists: Boolean): Unit

    Destroy and cleanup this relation. This may include, but is not limited to, dropping the external table that this relation represents.

    Definition Classes
    DestroyRelation
  4. abstract def dialect: JdbcDialect

    Attributes
    protected
    Definition Classes
    NativeTableRelation
  5. abstract def getDeletePlan(relation: LogicalRelation, child: SparkPlan, keyColumns: Seq[Attribute]): SparkPlan

    Get a SparkPlan to delete rows from the relation. The result of the SparkPlan execution should be a count of the number of deleted rows.

  6. abstract def getKeyColumns: Seq[String]

    Get the "key" columns of the table that need to be projected out by UPDATE and DELETE operations in order to locate the selected rows.

  7. abstract def getPrimaryKeyColumns(session: SnappySession): Seq[String]

    Get the "primary key" columns of a row table, or the "key" columns of a column table.

  8. abstract def getUpdatePlan(relation: LogicalRelation, child: SparkPlan, updateColumns: Seq[Attribute], updateExpressions: Seq[Expression], keyColumns: Seq[Attribute]): SparkPlan

    Get a SparkPlan to update rows in the relation. The result of the SparkPlan execution should be a count of the number of updated rows.
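
    Illustratively, these plans come into play when DML SQL is issued against the relation. The session construction and table name below are hypothetical:

      import org.apache.spark.sql.SnappySession

      // Illustrative only: UPDATE/DELETE SQL on a mutable relation is
      // ultimately planned through getUpdatePlan and getDeletePlan.
      // Assumes a SparkContext named sparkContext is in scope.
      val session = new SnappySession(sparkContext)
      session.sql("UPDATE my_table SET amount = amount * 2 WHERE id = 1")
      session.sql("DELETE FROM my_table WHERE id = 2")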

  9. abstract def isRowTable: Boolean

    Definition Classes
    NativeTableRelation
  10. abstract def partitionColumns: Seq[String]

    Get the partitioning columns for the table, if any.

  11. abstract def schema: StructType

    Definition Classes
    BaseRelation
  12. abstract def sqlContext: SQLContext

    Definition Classes
    BaseRelation
  13. abstract val table: String

    Name of this table as stored in the catalog.

    Definition Classes
    NativeTableRelation
  14. abstract def truncate(): Unit

    Truncate the table represented by this relation.

    Definition Classes
    DestroyRelation

Concrete Value Members

  1. final def !=(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  2. final def ##(): Int

    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  4. final def asInstanceOf[T0]: T0

    Definition Classes
    Any
  5. def clone(): AnyRef

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  6. final def eq(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  7. def equals(arg0: Any): Boolean

    Definition Classes
    AnyRef → Any
  8. def executeUpdate(sql: String, defaultSchema: String): Int

    Execute a DML SQL statement and return the number of rows affected.
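
    For example (the table and schema names are hypothetical):

      // Run a DML statement against the relation's native table and
      // get back the affected-row count.
      val deleted = relation.executeUpdate(
        "DELETE FROM orders WHERE status = 'CANCELLED'", defaultSchema = "app")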

    Definition Classes
    NativeTableRelation
  9. def finalize(): Unit

    Attributes
    protected[java.lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  10. final def getClass(): Class[_]

    Definition Classes
    AnyRef → Any
  11. def hashCode(): Int

    Definition Classes
    AnyRef → Any
  12. final def isInstanceOf[T0]: Boolean

    Definition Classes
    Any
  13. final def ne(arg0: AnyRef): Boolean

    Definition Classes
    AnyRef
  14. def needConversion: Boolean

    Whether the objects in Row need to be converted to the internal representation, for example: java.lang.String to UTF8String, java.math.BigDecimal to Decimal.

    If needConversion is false, buildScan() should return an RDD of InternalRow.

    Definition Classes
    BaseRelation
    Since

    1.4.0

    Note

    The internal representation is not stable across releases and thus data sources outside of Spark SQL should leave this as true.
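
    For example, a sketch of a source whose scans already produce InternalRow:

      // Rows from buildScan() are already InternalRow, so ask Spark SQL
      // to skip the Row-to-InternalRow conversion.
      override def needConversion: Boolean = false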

  15. final def notify(): Unit

    Definition Classes
    AnyRef
  16. final def notifyAll(): Unit

    Definition Classes
    AnyRef
  17. def sizeInBytes: Long

    Returns an estimated size of this relation in bytes. This information is used by the planner to decide when it is safe to broadcast a relation and can be overridden by sources that know the size ahead of time. By default, the system will assume that tables are too large to broadcast. This method will be called multiple times during query planning and thus should not perform expensive operations for each invocation.

    Definition Classes
    BaseRelation
    Since

    1.3.0

    Note

    It is always better to overestimate the size than to underestimate it, because underestimation could lead to suboptimal execution plans (e.g. broadcasting a very large table).
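
    A sketch of an override, assuming the source keeps cheap statistics (estimatedRowCount and avgRowSizeInBytes are hypothetical fields):

      // Derive the estimate from cached statistics; per the note above,
      // overestimating is safer than underestimating.
      override def sizeInBytes: Long = estimatedRowCount * avgRowSizeInBytes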

  18. final def synchronized[T0](arg0: ⇒ T0): T0

    Definition Classes
    AnyRef
  19. def toString(): String

    Definition Classes
    AnyRef → Any
  20. def unhandledFilters(filters: Array[Filter]): Array[Filter]

    Returns the list of Filters that this datasource may not be able to handle. These returned Filters will be evaluated by Spark SQL after data is output by a scan. By default, this function will return all filters, as it is always safe to double evaluate a Filter. However, specific implementations can override this function to avoid double filtering when they are capable of processing a filter internally.

    Definition Classes
    BaseRelation
    Since

    1.6.0
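
    A sketch of an override for a source that can evaluate equality filters itself (the choice of EqualTo is illustrative):

      import org.apache.spark.sql.sources.{EqualTo, Filter}

      // Claim EqualTo filters as handled internally; Spark SQL re-evaluates
      // everything returned here after the scan.
      override def unhandledFilters(filters: Array[Filter]): Array[Filter] =
        filters.filterNot(_.isInstanceOf[EqualTo])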

  21. final def wait(): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  22. final def wait(arg0: Long, arg1: Int): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  23. final def wait(arg0: Long): Unit

    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  24. def withKeyColumns(relation: LogicalRelation, keyColumns: Seq[String]): LogicalRelation

    If required, inject the key columns into the original relation.
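
    A trivial sketch for a relation that already exposes its key columns:

      // Nothing to inject; return the relation unchanged.
      override def withKeyColumns(relation: LogicalRelation,
          keyColumns: Seq[String]): LogicalRelation = relation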
