
org.apache.spark.sql

QualitySparkUtils

object QualitySparkUtils

Set of utilities to reach into private functions

Linear Supertypes
AnyRef, Any

Type Members

  1. case class Batch(name: String, strategy: Strategy, rules: Rule[LogicalPlan]*) extends Product with Serializable
  2. case class FakePlan(expr: Expression, child: LogicalPlan) extends LogicalPlan with UnaryNode with Product with Serializable
  3. case class Strategy(maxIterations: Int, errorOnExceed: Boolean = false, maxIterationsSetting: String = null) extends Product with Serializable
  4. implicit class UnresolvedFunctionOps extends AnyRef

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def add(left: Expression, right: Expression, dataType: DataType): Expression

    DBR 11.2 broke the contract for add and cast, so this provides a version-compatible add (see the example sketch after the member list)

  5. def arguments(unresolvedFunction: UnresolvedFunction): Seq[Expression]

    Returns the arguments of an UnresolvedFunction; applies to all Spark versions above 2.4

  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def cast(child: Expression, dataType: DataType): Expression

    DBR 11.2 broke the contract for add and cast, so this provides a version-compatible cast (see the example sketch after the member list)

  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def execute(logicalPlan: LogicalPlan, batch: Batch): LogicalPlan
  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. def hashCalendarInterval(c: CalendarInterval, hashlongs: InterpretedHashLongsFunction, digest: Digest): Digest

    Provides the Spark 3 specific version of hashing a CalendarInterval

  15. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  16. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  17. def isPrimitive(dataType: DataType): Boolean
  18. def mismatch(errorSubClass: String, messageParameters: Map[String, String]): TypeCheckResult

    The type signature changed in Spark 3.4 to a more detailed setup; DBR 12.2 already uses it (see the example sketch after the member list)

  19. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  20. def newParser(): SparkSqlParser

    Creates a new parser; introduced in 0.4 for Spark 3.2.0 support, where the SparkSqlParser constructor takes no parameters (see the example sketch after the member list)

  21. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  23. def registerFunction(funcReg: FunctionRegistry)(name: String, builder: (Seq[Expression]) ⇒ Expression): Unit

    Registers functions with Spark. Introduced in 0.4 for Spark 3.2.0 support due to the extra source parameter; "built-in" is used as no other option is remotely close (see the example sketch after the member list)

  24. def registerFunctionViaBuiltin(name: String, builder: (Seq[Expression]) ⇒ Expression): Unit

    Used by the SparkSessionExtensions mechanism but registers via the built-in registry

  25. def registerFunctionViaExtension(extensions: SparkSessionExtensions)(name: String, builder: (Seq[Expression]) ⇒ Expression): Unit

    Used by the SparkSessionExtensions mechanism (see the example sketch after the member list)

  26. def resolution(analyzer: Analyzer, sparkSession: SparkSession, plan: LogicalPlan): Batch
  27. def resolveExpression(dataFrame: DataFrame, expr: Expression): Expression

    Resolves expressions against a DataFrame, which allows them to be swapped out after name checking; Spark cannot then simply optimise the tree, so certain things like constant folding won't show up. (See the example sketch after the member list.)

    dataFrame

    resolution must be against a given DataFrame to keep names matching

    expr

    the expression to resolve

  28. def resolveWithOverride(orig: Option[DataFrame]): Option[DataFrame]

    Where resolveWith is not possible (e.g. 10.x DBRs) it is disabled here. In the 10.x DBR case this is due to the class files for UnaryNode (FakePlan) being radically different, causing an IncompatibleClassChangeError: Implementing class

  29. def rowEncoder(structType: StructType): ExpressionEncoder[Row]
  30. def sparkOrdering(dataType: DataType): Ordering[_]
  31. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  32. def tableOrViewNotFound(e: Exception): Option[Either[Exception, Set[String]]]
  33. def toSQLExpr(value: Expression): String
  34. def toSQLType(dataType: DataType): String
  35. def toSQLValue(value: Any, dataType: DataType): String
  36. def toString(dataFrame: DataFrame, showParams: ShowParams = ShowParams()): String
  37. def toString(): String
    Definition Classes
    AnyRef → Any
  38. def tryResolveReferences(sparkSession: SparkSession)(expr: Expression, child: LogicalPlan): Expression
    Attributes
    protected
  39. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  40. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  41. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
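Example sketch for add and cast: a minimal illustration of building a version-appropriate add over two literal expressions and casting the result; the literal values and types are illustrative only.

    import org.apache.spark.sql.QualitySparkUtils
    import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
    import org.apache.spark.sql.types.{IntegerType, LongType}

    // build an add over two literal expressions, then cast the result;
    // QualitySparkUtils chooses constructors compatible with the running Spark/DBR
    val left: Expression  = Literal(1)
    val right: Expression = Literal(2)
    val sum    = QualitySparkUtils.add(left, right, IntegerType)
    val asLong = QualitySparkUtils.cast(sum, LongType)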
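Example sketch for mismatch: a hedged illustration of producing a TypeCheckResult; the error sub-class and message parameter keys below are illustrative only and not values defined by this library.

    import org.apache.spark.sql.QualitySparkUtils
    import org.apache.spark.sql.catalyst.analysis.TypeCheckResult

    // report a type-check failure, e.g. from a custom expression's
    // checkInputDataTypes; the sub-class name and parameter keys are illustrative
    val result: TypeCheckResult = QualitySparkUtils.mismatch(
      "UNEXPECTED_INPUT_TYPE",
      Map("paramIndex" -> "1", "requiredType" -> "INT", "inputType" -> "STRING"))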
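Example sketch for newParser: the returned SparkSqlParser implements Spark's ParserInterface, so the standard parse methods are available; the SQL snippets are illustrative only.

    import org.apache.spark.sql.QualitySparkUtils

    val parser = QualitySparkUtils.newParser()
    val expr   = parser.parseExpression("a + 1")              // catalyst Expression
    val plan   = parser.parsePlan("SELECT * FROM some_table") // LogicalPlan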
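Example sketch for registerFunction (registerFunctionViaBuiltin is analogous but takes no registry argument), assuming a FunctionRegistry is already at hand; the function name my_len and the use of Length as the builder are illustrative only.

    import org.apache.spark.sql.QualitySparkUtils
    import org.apache.spark.sql.catalyst.analysis.FunctionRegistry
    import org.apache.spark.sql.catalyst.expressions.{Expression, Length}

    // registers "my_len" so that my_len(col) resolves to Length(col)
    def registerMyLen(registry: FunctionRegistry): Unit =
      QualitySparkUtils.registerFunction(registry)("my_len",
        (args: Seq[Expression]) => Length(args.head))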
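Example sketch for registerFunctionViaExtension, written as a SparkSessionExtensions hook of the kind configured through spark.sql.extensions; the class and function names are illustrative only.

    import org.apache.spark.sql.{QualitySparkUtils, SparkSessionExtensions}
    import org.apache.spark.sql.catalyst.expressions.{Expression, Length}

    // installed via the spark.sql.extensions configuration
    class MyLenExtension extends (SparkSessionExtensions => Unit) {
      override def apply(extensions: SparkSessionExtensions): Unit =
        QualitySparkUtils.registerFunctionViaExtension(extensions)("my_len",
          (args: Seq[Expression]) => Length(args.head))
    }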
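Example sketch for resolveExpression: parse an expression with newParser and resolve it against a DataFrame so its attribute references bind to that frame's columns; the local session and sample data are illustrative only.

    import org.apache.spark.sql.{QualitySparkUtils, SparkSession}

    val spark = SparkSession.builder().master("local[*]").getOrCreate()
    import spark.implicits._

    val df       = Seq((1, "a"), (2, "b")).toDF("id", "name")
    val parsed   = QualitySparkUtils.newParser().parseExpression("id + 1")
    val resolved = QualitySparkUtils.resolveExpression(df, parsed)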
