A generic encoder for JVM objects.
The schema after converting T to a Spark SQL row.
A set of expressions, one for each top-level field, that can be used to extract the values from a raw object into an InternalRow.
An expression that will construct an object given an InternalRow.
A ClassTag for T.
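Taken together, these pieces can be inspected on a concrete encoder. A minimal sketch using the internal Catalyst ExpressionEncoder API (the Person class is hypothetical, and this API is internal and may differ between Spark versions):

```scala
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Hypothetical class used only for illustration.
case class Person(name: String, age: Int)

// Reflectively derives the schema, the serializer expressions, the
// deserializer expression, and the ClassTag described above.
val personEnc = ExpressionEncoder[Person]()

println(personEnc.schema)   // a struct with fields name and age
println(personEnc.clsTag)   // the ClassTag for Person
```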
A factory for constructing encoders that convert objects and primitives to and from the internal row format using catalyst expressions and code generation. By default, the expressions used to retrieve values from an input row when producing an object will be created as follows:
 - Classes will have their sub fields extracted by name using UnresolvedAttribute and UnresolvedExtractValue expressions.
 - Tuples will have their subfields extracted by position using BoundReference expressions.
 - Primitives will have their values extracted from the first ordinal with a schema that defaults to the name "value".
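These defaults are visible in the schemas of freshly derived encoders. A small sketch (internal Catalyst API; behavior assumed from Spark 3.x):

```scala
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Primitives: a single column whose name defaults to "value".
val intEnc = ExpressionEncoder[Int]()
println(intEnc.schema.fieldNames.toSeq)     // Seq("value")

// Tuples: subfields are positional, named _1, _2, ...
val tupleEnc = ExpressionEncoder[(String, Int)]()
println(tupleEnc.schema.fieldNames.toSeq)   // Seq("_1", "_2")
```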
A factory for constructing encoders that convert external rows to/from the Spark SQL internal binary representation.
The following is a mapping between Spark SQL types and their allowed external types:

 BooleanType -> java.lang.Boolean
 ByteType -> java.lang.Byte
 ShortType -> java.lang.Short
 IntegerType -> java.lang.Integer
 FloatType -> java.lang.Float
 DoubleType -> java.lang.Double
 StringType -> String
 DecimalType -> java.math.BigDecimal or scala.math.BigDecimal or Decimal
 DateType -> java.sql.Date
 TimestampType -> java.sql.Timestamp
 BinaryType -> byte array
 ArrayType -> scala.collection.Seq or Array
 MapType -> scala.collection.Map
 StructType -> org.apache.spark.sql.Row
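The mapping can be exercised by round-tripping an external Row through the encoder. A sketch using the internal RowEncoder factory as it appears in Spark 3.x (the entry point has moved in newer versions, so treat it as an assumption):

```scala
import org.apache.spark.sql.Row
import org.apache.spark.sql.catalyst.encoders.RowEncoder
import org.apache.spark.sql.types._

val schema = StructType(Seq(
  StructField("name", StringType),
  StructField("scores", ArrayType(IntegerType))))

// Builds an ExpressionEncoder[Row] for the given schema; external
// values must follow the type mapping (StringType -> String,
// ArrayType -> scala.collection.Seq or Array, and so on).
val rowEnc = RowEncoder(schema)

// External Row -> InternalRow, then back to an external Row.
val toInternal = rowEnc.createSerializer()
val fromInternal = rowEnc.resolveAndBind().createDeserializer()

val internal = toInternal(Row("alice", Seq(1, 2, 3)))
val roundTripped = fromInternal(internal)
println(roundTripped)
```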
Returns an internal encoder object that can be used to serialize/deserialize JVM objects into Spark SQL rows. The implicit encoder should always be unresolved (i.e. have no attribute references from a specific schema). This requirement allows us to preserve whether a given object type is being bound by name or by ordinal when doing resolution.
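This unresolved-then-bound lifecycle can be sketched as follows (internal Catalyst API; behavior assumed from Spark 3.x): the derived encoder carries unresolved references, and resolveAndBind binds them to a concrete schema before deserialization.

```scala
import org.apache.spark.sql.catalyst.encoders.ExpressionEncoder

// Freshly derived: the serializer is usable, but the deserializer
// still holds unresolved attribute references.
val enc = ExpressionEncoder[(Int, String)]()
val toRow = enc.createSerializer()

// Resolution binds the by-name/by-ordinal references to a concrete
// schema (here, the encoder's own), yielding a usable deserializer.
val fromRow = enc.resolveAndBind().createDeserializer()

val out = fromRow(toRow((1, "a")))
println(out)   // (1,a)
```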