org.apache.spark.sql

QualitySparkUtils

object QualitySparkUtils

Set of utilities to reach into private functions

Linear Supertypes
AnyRef, Any

Type Members

  1. case class Batch(name: String, strategy: Strategy, rules: Rule[LogicalPlan]*) extends Product with Serializable
  2. case class FakePlan(expr: Expression, child: LogicalPlan) extends LogicalPlan with UnaryNode with Product with Serializable
  3. case class Strategy(maxIterations: Int, errorOnExceed: Boolean = false, maxIterationsSetting: String = null) extends Product with Serializable

Value Members

  1. final def !=(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  2. final def ##(): Int
    Definition Classes
    AnyRef → Any
  3. final def ==(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  4. def add(left: Expression, right: Expression, dataType: DataType): Expression

    DBR 11.2 broke the contract for add and cast; this wrapper provides a stable entry point across versions.
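
    A minimal usage sketch, assuming the library is on the classpath; Literal and IntegerType are standard Catalyst/Spark types:

      import org.apache.spark.sql.QualitySparkUtils
      import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
      import org.apache.spark.sql.types.IntegerType

      // Build left + right with the result type pinned explicitly, keeping
      // callers insulated from the DBR 11.2 Add/Cast constructor changes.
      val sum: Expression = QualitySparkUtils.add(Literal(1), Literal(2), IntegerType)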

  5. def arguments(unresolvedFunction: UnresolvedFunction): Seq[Expression]

    Retrieves the function's arguments consistently on all Spark versions above 2.4.
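
    A short sketch, assuming an active SparkSession; expr parses to an unanalysed Catalyst tree, so the function reference is still an UnresolvedFunction:

      import org.apache.spark.sql.QualitySparkUtils
      import org.apache.spark.sql.catalyst.analysis.UnresolvedFunction
      import org.apache.spark.sql.functions.expr

      // "upper(name)" parses to an UnresolvedFunction over an unresolved
      // attribute; arguments() extracts its children uniformly per version.
      val fn = expr("upper(name)").expr.asInstanceOf[UnresolvedFunction]
      val args = QualitySparkUtils.arguments(fn)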

  6. final def asInstanceOf[T0]: T0
    Definition Classes
    Any
  7. def cast(child: Expression, dataType: DataType): Expression

    DBR 11.2 broke the contract for add and cast; this wrapper provides a stable entry point across versions.
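
    As with add, a minimal sketch assuming the library is on the classpath:

      import org.apache.spark.sql.QualitySparkUtils
      import org.apache.spark.sql.catalyst.expressions.{Expression, Literal}
      import org.apache.spark.sql.types.StringType

      // Cast through the version-stable wrapper rather than constructing
      // Cast directly, avoiding the DBR 11.2 constructor change.
      val asString: Expression = QualitySparkUtils.cast(Literal(42), StringType)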

  8. def clone(): AnyRef
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
  9. final def eq(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  10. def equals(arg0: Any): Boolean
    Definition Classes
    AnyRef → Any
  11. def execute(logicalPlan: LogicalPlan, batch: Batch): LogicalPlan
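
    A sketch under the assumption that execute applies the batch's rules to the plan until it stabilises or Strategy.maxIterations is reached; NoOpRule is illustrative only:

      import org.apache.spark.sql.QualitySparkUtils
      import org.apache.spark.sql.QualitySparkUtils.{Batch, Strategy}
      import org.apache.spark.sql.catalyst.plans.logical.LogicalPlan
      import org.apache.spark.sql.catalyst.rules.Rule

      // A rule that changes nothing, so the batch converges immediately.
      object NoOpRule extends Rule[LogicalPlan] {
        override def apply(plan: LogicalPlan): LogicalPlan = plan
      }

      def runNoOp(plan: LogicalPlan): LogicalPlan =
        QualitySparkUtils.execute(plan, Batch("no-op", Strategy(maxIterations = 10), NoOpRule))
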
  12. def finalize(): Unit
    Attributes
    protected[lang]
    Definition Classes
    AnyRef
    Annotations
    @throws( classOf[java.lang.Throwable] )
  13. final def getClass(): Class[_]
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  14. def hashCalendarInterval(c: CalendarInterval, hashlongs: InterpretedHashLongsFunction, digest: Digest): Digest

    Provides a Spark 3 specific version of hashing CalendarInterval

  15. def hashCode(): Int
    Definition Classes
    AnyRef → Any
    Annotations
    @native()
  16. final def isInstanceOf[T0]: Boolean
    Definition Classes
    Any
  17. def isPrimitive(dataType: DataType): Boolean
  18. final def ne(arg0: AnyRef): Boolean
    Definition Classes
    AnyRef
  19. def newParser(): SparkSqlParser

    Creates a new parser. Introduced in 0.4 for Spark 3.2.0 support, as SparkSqlParser's constructor no longer takes parameters.
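
    A sketch; parseExpression is inherited from AbstractSqlParser:

      import org.apache.spark.sql.QualitySparkUtils

      // Obtain a parser without caring whether this Spark version's
      // SparkSqlParser constructor takes parameters.
      val parser = QualitySparkUtils.newParser()
      val parsed = parser.parseExpression("a + 1")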

  20. final def notify(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  21. final def notifyAll(): Unit
    Definition Classes
    AnyRef
    Annotations
    @native()
  22. def registerFunction(funcReg: FunctionRegistry)(name: String, builder: (Seq[Expression]) ⇒ Expression): Unit

    Registers functions with Spark. Introduced in 0.4 for 3.2.0 support, where registration takes an extra source parameter; "built-in" is used as no other option is remotely close.
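
    A hedged sketch, assuming a live SparkSession whose session-state registry should receive the function; "my_len" and the Length-based builder are illustrative only:

      import org.apache.spark.sql.{QualitySparkUtils, SparkSession}
      import org.apache.spark.sql.catalyst.expressions.{Expression, Length}

      val spark = SparkSession.builder().master("local[*]").getOrCreate()

      // Register "my_len" for SQL use; the 3.2.0+ source parameter is
      // supplied internally as "built-in".
      QualitySparkUtils.registerFunction(spark.sessionState.functionRegistry)(
        "my_len", (args: Seq[Expression]) => Length(args.head))

      spark.sql("select my_len('quality')").show()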

  23. def resolution(analyzer: Analyzer, sparkSession: SparkSession, plan: LogicalPlan): Batch
  24. def resolveExpression(dataFrame: DataFrame, expr: Expression): Expression

    Resolves expressions against a dataframe, allowing them to be swapped out after name checking; Spark cannot then simply optimise the tree, so certain things like constant folding won't show up.

    dataFrame

    resolution must be performed against a given dataframe to keep names matching

    expr

    the expression to resolve
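
    A sketch, assuming an active SparkSession named spark:

      import org.apache.spark.sql.QualitySparkUtils
      import org.apache.spark.sql.catalyst.expressions.Expression
      import org.apache.spark.sql.functions.expr

      val df = spark.range(5).toDF("id")

      // Bind the unresolved attribute "id" against df's schema; the
      // returned expression carries resolved references.
      val resolved: Expression =
        QualitySparkUtils.resolveExpression(df, expr("id + 1").expr)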

  25. def resolveWithOverride(orig: Option[DataFrame]): Option[DataFrame]

    Where resolveWith is not possible (e.g. 10.x DBRs) it is disabled here. In the 10.x DBR case this is because the class files for UnaryNode (FakePlan) are radically different, causing an IncompatibleClassChangeError: Implementing class.

  26. final def synchronized[T0](arg0: ⇒ T0): T0
    Definition Classes
    AnyRef
  27. def toString(dataFrame: DataFrame, showParams: ShowParams = ShowParams()): String
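
    Presumably a show()-style rendering returned as a String rather than printed; a sketch assuming a DataFrame df and that ShowParams() defaults mirror show():

      // Capture the tabular rendering instead of printing it.
      val rendered: String = QualitySparkUtils.toString(df)
      println(rendered)
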
  28. def toString(): String
    Definition Classes
    AnyRef → Any
  29. def tryResolveReferences(sparkSession: SparkSession)(expr: Expression, child: LogicalPlan): Expression
    Attributes
    protected
  30. final def wait(): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  31. final def wait(arg0: Long, arg1: Int): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... )
  32. final def wait(arg0: Long): Unit
    Definition Classes
    AnyRef
    Annotations
    @throws( ... ) @native()
