package shim
A collection of functions whose behaviour may vary across Spark versions. Should the actual implementations fracture, they will be implemented in ShimUtils, but this interface will remain to proxy the calls.
Type Members
- class AbstractInjectableParser extends ParserInterface with Logging
- case class ShowParams(numRows: Int = 1000, truncate: Int = 0, vertical: Boolean = false) extends Product with Serializable
  Parameters to pass into showString for debugging / validation
  - numRows
    defaults to 1000
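As a quick illustration of the defaults, here is a minimal sketch that mirrors the documented `ShowParams` signature (a standalone copy for demonstration only; the real class lives in this `shim` package):

```scala
// Standalone copy of the documented signature, for illustration only.
case class ShowParams(numRows: Int = 1000, truncate: Int = 0, vertical: Boolean = false)

// All defaults: up to 1000 rows, no truncation, horizontal layout.
val defaults = ShowParams()

// Override individual fields, e.g. a short vertical dump for debugging.
val verticalDump = ShowParams(numRows = 5, vertical = true)
```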
Value Members
- def deriveUnitLiteral: Expression
-
def
ifIsNull(dataType: DataType, path: Expression, nonNullExpr: Expression): Expression
If the path is null then uses a null literal with dataType, if it's not null it uses the nonNullExpr
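The contract can be modelled on plain Scala values, setting aside Catalyst Expression trees (a hedged sketch of the semantics only, not the actual implementation, which returns an Expression built over the path):

```scala
// Model of ifIsNull's dispatch on plain values: a null path yields a typed
// null (here just null), a non-null path yields the alternative value.
def ifIsNullModel(path: AnyRef, nonNullValue: AnyRef): AnyRef =
  if (path == null) null else nonNullValue

val whenNull    = ifIsNullModel(null, "fallback")   // null
val whenPresent = ifIsNullModel("row", "fallback")  // "fallback"
```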
- def registerSessionPlan(logicalPlan: Rule[LogicalPlan])(isPresentFilter: (Rule[LogicalPlan]) ⇒ Boolean): Boolean
  Registers a session-only plan via experimental methods when isPresentFilter is not true.
  - isPresentFilter
    a filter that should return true when an identical plan is already registered and the plan should not be added
  - returns
    true if the plan has been added
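The add-unless-present contract can be sketched with plain Scala in place of Spark's `Rule[LogicalPlan]` (a hedged model of the documented behaviour, not the actual experimental-methods registration):

```scala
import scala.collection.mutable.ListBuffer

// Model: add `rule` only when isPresentFilter matches no already-registered
// rule; the Boolean result mirrors "true if the plan has been added".
def registerModel[R](registered: ListBuffer[R], rule: R)(isPresentFilter: R => Boolean): Boolean =
  if (registered.exists(isPresentFilter)) false
  else { registered += rule; true }

val rules = ListBuffer.empty[String]
val added = registerModel(rules, "myRule")(_ == "myRule") // true: not yet present
val again = registerModel(rules, "myRule")(_ == "myRule") // false: already present
```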
- def toCatalyst(any: Any): Any
  Works around a Scala 2.13 (and possibly Spark 4) issue: for a DataFrame containing a Seq, the encoder generates Seq -> ArraySeq$ofRef, but CatalystTypeConverters looks for immutable.Seq rather than collection.Seq, which differ in 2.13, so the value must be checked and converted. RowEncoder is responsible for generating that, and the wrapped array disappears in 2.13.
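The kind of normalisation toCatalyst hints at can be sketched as follows (a hedged illustration, assuming the goal is to coerce any `collection.Seq`, such as a mutable ArraySeq surfaced by the encoder, into the `immutable.Seq` that CatalystTypeConverters matches on):

```scala
import scala.collection.{Seq => CSeq}

// Recursively coerce any collection.Seq (e.g. mutable.ArraySeq on 2.13)
// into immutable.Seq; other values pass through unchanged.
def normalizeSeq(any: Any): Any = any match {
  case s: CSeq[_] => s.toSeq.map(normalizeSeq) // toSeq yields immutable.Seq on 2.13
  case other      => other
}

val normalized = normalizeSeq(scala.collection.mutable.ArraySeq(1, 2, 3))
```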
- object LambdaFunctions