Internal implementation of the user-facing Catalog.
Wraps plan-specific query hints (like joinType). This extends Spark's BroadcastHint so that filters, projections, etc. can be pushed below it by the optimizer.
A class that enables the setting and getting of mutable config parameters/hints. In the presence of a SQLContext, these can be set and queried by passing SET commands into Spark SQL's query functions (i.e. sql()); otherwise, users of this class can modify the hints programmatically through its setters and getters. SQLConf is thread-safe (internally synchronized, so it is safe to use from multiple threads).
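The thread-safety contract above can be illustrated with a minimal sketch (the class name `SimpleSQLConf` is hypothetical, not SnappyData's actual implementation): every access goes through the internally synchronized settings map.

```scala
import scala.collection.mutable

// Minimal sketch of a thread-safe mutable config holder: all reads and
// writes synchronize on the single underlying settings map.
class SimpleSQLConf {
  private val settings = mutable.Map.empty[String, String]

  def set(key: String, value: String): Unit = settings.synchronized {
    settings(key) = value
  }

  def get(key: String, default: String): String = settings.synchronized {
    settings.getOrElse(key, default)
  }

  def unset(key: String): Unit = settings.synchronized {
    settings.remove(key)
  }
}
```

A SET command arriving through sql() would ultimately funnel into the same kind of setter that a programmatic caller uses directly.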
::DeveloperApi:: Catalog that uses Hive for persistence and adds Snappy extensions, like stream/topK tables, returning a LogicalPlan to materialize these entities.
A helper class that enables substitution using syntax like ${var}, ${system:var} and ${env:var}. Variable substitution is controlled by SQLConf.variableSubstituteEnabled.
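The three substitution forms can be sketched in plain Scala (this is an illustrative stand-in, not the actual helper class; the object name `VarSubstitution` is hypothetical): plain `${var}` resolves against the session config, `${system:var}` against JVM system properties, and `${env:var}` against environment variables.

```scala
// Sketch of ${var}, ${system:var} and ${env:var} substitution over a
// config map. Unresolvable variables are left untouched.
object VarSubstitution {
  private val pattern = """\$\{(?:(system|env):)?([\w.-]+)\}""".r

  def substitute(text: String, conf: Map[String, String]): String =
    pattern.replaceAllIn(text, m => {
      val value = m.group(1) match {
        case "system" => sys.props.getOrElse(m.group(2), m.matched)
        case "env"    => sys.env.getOrElse(m.group(2), m.matched)
        case _        => conf.getOrElse(m.group(2), m.matched)
      }
      // Escape '$' and '\' so the result is inserted literally.
      java.util.regex.Matcher.quoteReplacement(value)
    })
}
```

With substitution enabled, a query like `SELECT * FROM ${table}` would be rewritten before parsing.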
Helper object for PutInto operations on column tables. This object takes the logical plans produced by SnappyParser and converts them into another plan.
A utility class that stores jar file references with their individual class loaders, so that class changes are reflected on the driver side; e.g. if a UDF definition changes, the driver should pick up the correct UDF class. This class cannot reinitialize itself after a driver failure, so callers must ensure that the class loader is initialized after driver startup, usually by adding the class loader at query time.
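A per-jar loader registry of this kind can be sketched as follows (the object name `JarLoaderRegistry` and its methods are hypothetical, shown only to illustrate the idea): each jar gets its own URLClassLoader, and dropping a cached loader forces the next lookup to re-read the jar, which is how an updated UDF class gets picked up.

```scala
import java.net.URLClassLoader
import scala.collection.mutable

// Sketch: one URLClassLoader per jar path, cached in a synchronized map.
object JarLoaderRegistry {
  private val loaders = mutable.Map.empty[String, URLClassLoader]

  def loaderFor(jarPath: String): URLClassLoader = loaders.synchronized {
    loaders.getOrElseUpdate(jarPath,
      new URLClassLoader(Array(new java.io.File(jarPath).toURI.toURL),
        getClass.getClassLoader))
  }

  // Drop the cached loader so the next call creates a fresh one,
  // e.g. after a UDF definition in the jar has changed.
  def invalidate(jarPath: String): Unit = loaders.synchronized {
    loaders.remove(jarPath).foreach(_.close())
  }
}
```

After a driver restart the map starts empty, which mirrors the caveat above: the caller must re-register loaders (typically at query time) before classes from those jars can be resolved.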
Deals with any escape characters in the LIKE pattern during optimization. Does not handle the startsAndEndsWith equivalent of Spark's LikeSimplification, so a pattern like 'a%b' with additional escaped characters will not be optimized.
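The core idea can be sketched in isolation (the function `simplifyLike` is a hypothetical illustration, not the optimizer rule itself): strip escape characters from the pattern, and if no unescaped wildcard remains, the LIKE can be replaced by a plain equality comparison.

```scala
// Sketch: returns the literal comparison value if the LIKE pattern,
// after removing escapes, contains no unescaped '%' or '_' wildcards;
// returns None when a wildcard remains and the LIKE must be kept.
def simplifyLike(pattern: String, escape: Char = '\\'): Option[String] = {
  val sb = new StringBuilder
  var i = 0
  while (i < pattern.length) {
    val c = pattern.charAt(i)
    if (c == escape && i + 1 < pattern.length) {
      sb.append(pattern.charAt(i + 1)); i += 2 // escaped char taken literally
    } else if (c == '%' || c == '_') {
      return None                              // unescaped wildcard remains
    } else {
      sb.append(c); i += 1
    }
  }
  Some(sb.toString)
}
```

So `a\%b` simplifies to an equality against `a%b`, while `a%b` keeps its unescaped wildcard and, as noted above, is left to Spark's own LikeSimplification (which this pass does not replicate).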
Static SQL configuration is a cross-session, immutable Spark configuration. External users can see the static SQL configs via SparkSession.conf, but can NOT set/unset them.
All classes in this package are considered an internal API to Spark and are subject to change between minor releases.