
CreateViewCommand Logical Command

CreateViewCommand is a logical command for creating or replacing a view.

CreateViewCommand is created to represent the following:

  • CREATE VIEW AS SQL statements

  • Dataset operators: createTempView, createOrReplaceTempView, createGlobalTempView and createOrReplaceGlobalTempView
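
The Dataset operators above can be sketched as follows (assuming an active SparkSession as `spark`; the view names are illustrative):

[source, scala]
----
val df = spark.range(5)

// Session-scoped temporary views (LocalTempView)
df.createTempView("nums")                 // fails if the view already exists
df.createOrReplaceTempView("nums")        // replaces an existing view

// Application-scoped temporary views (GlobalTempView)
df.createGlobalTempView("nums_global")          // fails if the view already exists
df.createOrReplaceGlobalTempView("nums_global") // replaces an existing view
----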

CAUTION: FIXME What's the difference between CreateViewCommand and CreateTempViewUsing?

CreateViewCommand works with different view types.

[[viewType]]
.CreateViewCommand Behaviour Per View Type
[options="header",cols="1m,2",width="100%"]
|===
| View Type | Description / Side Effect

| LocalTempView | [[LocalTempView]] A session-scoped local temporary view that is available until the session that created it is stopped.

When executed, CreateViewCommand requests the current SessionCatalog to create a temporary view.

| GlobalTempView | [[GlobalTempView]] A cross-session global temporary view that is available until the Spark application stops.

When executed, CreateViewCommand requests the current SessionCatalog to create a global view.

| PersistedView | [[PersistedView]] A cross-session persisted view that is available until dropped.

When executed, CreateViewCommand checks whether the table exists. If it does and the replace flag is enabled, CreateViewCommand requests the current SessionCatalog to alter the table. Otherwise, when the table does not exist, CreateViewCommand requests the current SessionCatalog to create it.
|===
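
The scope difference shows up in how the views are queried; global temporary views live in the global_temp database (a sketch, assuming an active `spark` session):

[source, scala]
----
// LocalTempView: visible only in the session that created it
spark.range(3).createOrReplaceTempView("local_nums")
spark.sql("SELECT * FROM local_nums").show()

// GlobalTempView: shared by all sessions of this Spark application
// and always qualified with the global_temp database
spark.range(3).createOrReplaceGlobalTempView("global_nums")
spark.sql("SELECT * FROM global_temp.global_nums").show()
spark.newSession().sql("SELECT * FROM global_temp.global_nums").show()
----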

[source, scala]
----
/* CREATE [OR REPLACE] [[GLOBAL] TEMPORARY]
VIEW [IF NOT EXISTS] tableIdentifier
[identifierCommentList] [COMMENT STRING]
[PARTITIONED ON identifierList]
[TBLPROPERTIES tablePropertyList] AS query */

// Demo table for "AS query" part
spark.range(10).write.mode("overwrite").saveAsTable("t1")

// The "AS" query
val asQuery = "SELECT * FROM t1"

// The following queries should all work fine
val q1 = "CREATE VIEW v1 AS " + asQuery
sql(q1)

val q2 = "CREATE OR REPLACE VIEW v1 AS " + asQuery
sql(q2)

val q3 = "CREATE OR REPLACE TEMPORARY VIEW v1 AS " + asQuery
sql(q3)

val q4 = "CREATE OR REPLACE GLOBAL TEMPORARY VIEW v1 AS " + asQuery
sql(q4)

val q5 = "CREATE VIEW IF NOT EXISTS v1 AS " + asQuery
sql(q5)
----

// The following queries should all fail
// the number of user-specified columns does not match the schema of the AS query
val qf1 = "CREATE VIEW v1 (c1 COMMENT 'comment', c2) AS " + asQuery
scala> sql(qf1)
org.apache.spark.sql.AnalysisException: The number of columns produced by the SELECT clause (num: `1`) does not match the number of column names specified by CREATE VIEW (num: `2`).;
  at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:134)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:641)
  ... 49 elided

// CREATE VIEW ... PARTITIONED ON is not allowed
val qf2 = "CREATE VIEW v1 PARTITIONED ON (c1, c2) AS " + asQuery
scala> sql(qf2)
org.apache.spark.sql.catalyst.parser.ParseException:
Operation not allowed: CREATE VIEW ... PARTITIONED ON(line 1, pos 0)

// Use the same name of t1 for a new view
val qf3 = "CREATE VIEW t1 AS " + asQuery
scala> sql(qf3)
org.apache.spark.sql.AnalysisException: `t1` is not a view;
  at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:156)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:641)
  ... 49 elided

// View already exists
val qf4 = "CREATE VIEW v1 AS " + asQuery
scala> sql(qf4)
org.apache.spark.sql.AnalysisException: View `v1` already exists. If you want to update the view definition, please use ALTER VIEW AS or CREATE OR REPLACE VIEW AS;
  at org.apache.spark.sql.execution.command.CreateViewCommand.run(views.scala:169)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult$lzycompute(commands.scala:70)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.sideEffectResult(commands.scala:68)
  at org.apache.spark.sql.execution.command.ExecutedCommandExec.executeCollect(commands.scala:79)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$6.apply(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$$anonfun$52.apply(Dataset.scala:3254)
  at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:77)
  at org.apache.spark.sql.Dataset.withAction(Dataset.scala:3253)
  at org.apache.spark.sql.Dataset.<init>(Dataset.scala:190)
  at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:75)
  at org.apache.spark.sql.SparkSession.sql(SparkSession.scala:641)
  ... 49 elided

[[innerChildren]] CreateViewCommand returns the child logical plan when requested for the inner nodes (that should be shown as an inner nested tree of this node).

[source, scala]
----
val sqlText = "CREATE VIEW v1 AS " + asQuery
val plan = spark.sessionState.sqlParser.parsePlan(sqlText)
scala> println(plan.numberedTreeString)
00 CreateViewCommand v1, SELECT * FROM t1, false, false, PersistedView
01 +- 'Project [*]
02    +- 'UnresolvedRelation t1
----


=== [[prepareTable]] Creating CatalogTable -- prepareTable Internal Method

[source, scala]
----
prepareTable(session: SparkSession, analyzedPlan: LogicalPlan): CatalogTable
----

prepareTable...FIXME

NOTE: prepareTable is used exclusively when CreateViewCommand logical command is executed.

=== [[run]] Executing Logical Command -- run Method

[source, scala]
----
run(sparkSession: SparkSession): Seq[Row]
----

NOTE: run is part of the RunnableCommand contract to execute (run) a logical command.

run requests the input SparkSession for the SessionState that is in turn requested to "execute" the child logical plan (which simply creates a QueryExecution).

[NOTE]
====
run uses a common idiom in Spark SQL to make sure that a logical plan can be analyzed, i.e.

[source, scala]
----
val qe = sparkSession.sessionState.executePlan(child)
qe.assertAnalyzed()
val analyzedPlan = qe.analyzed
----
====

run then verifies that the view definition does not reference temporary objects (verifyTemporaryObjectsNotExists).

run requests the input SparkSession for the SessionState that is in turn requested for the SessionCatalog.

run then branches off per the ViewType:

** If the table exists and the allowExisting flag is on, run simply does nothing (and exits)

** If the table exists and the replace flag is on, run requests the SessionCatalog for the table metadata and replaces the table, i.e. run requests the SessionCatalog to drop the table followed by re-creating it (with a CatalogTable built using prepareTable)

** If however the table does not exist, run simply requests the SessionCatalog to create it (with a CatalogTable built using prepareTable)
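
The branching above can be sketched as follows (a simplified, hypothetical outline for the persisted-view case, not the actual Spark source; `catalog`, `name`, `session` and `analyzedPlan` stand in for the values run works with):

[source, scala]
----
// Hypothetical sketch of run's PersistedView branch
if (catalog.tableExists(name)) {
  if (allowExisting) {
    // CREATE VIEW IF NOT EXISTS on an existing view: do nothing
  } else if (replace) {
    // CREATE OR REPLACE VIEW: replace the existing view definition
    catalog.dropTable(name, ignoreIfNotExists = false, purge = false)
    catalog.createTable(prepareTable(session, analyzedPlan), ignoreIfExists = false)
  } else {
    throw new AnalysisException(s"View $name already exists. ...")
  }
} else {
  // The view does not exist yet: create it
  catalog.createTable(prepareTable(session, analyzedPlan), ignoreIfExists = false)
}
----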

run throws an AnalysisException for persisted views when they already exist, the allowExisting flag is off and the existing table is not a view.

[name] is not a view

run throws an AnalysisException for persisted views when they already exist and the allowExisting and replace flags are off.

View [name] already exists. If you want to update the view definition, please use ALTER VIEW AS or CREATE OR REPLACE VIEW AS

run throws an AnalysisException if the user-specified columns are defined and their number is different from the number of output schema attributes of the analyzed logical plan.

The number of columns produced by the SELECT clause (num: `[output.length]`) does not match the number of column names specified by CREATE VIEW (num: `[userSpecifiedColumns.length]`).

=== [[creating-instance]] Creating CreateViewCommand Instance

CreateViewCommand takes the following when created:

  • [[name]] TableIdentifier
  • [[userSpecifiedColumns]] User-defined columns (as Seq[(String, Option[String])])
  • [[comment]] Optional comment
  • [[properties]] Properties (as Map[String, String])
  • [[originalText]] Optional DDL statement
  • [[child]] Child logical plan
  • [[allowExisting]] allowExisting flag
  • [[replace]] replace flag
  • ViewType (one of LocalTempView, GlobalTempView or PersistedView)
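
The fields can be inspected on a parsed (unanalyzed) plan, e.g. (a sketch, assuming the demo table `t1` from earlier and an active `spark` session):

[source, scala]
----
import org.apache.spark.sql.execution.command.CreateViewCommand

val plan = spark.sessionState.sqlParser.parsePlan(
  "CREATE OR REPLACE VIEW v1 AS SELECT * FROM t1")
val cmd = plan.asInstanceOf[CreateViewCommand]

cmd.name          // the TableIdentifier, i.e. v1
cmd.allowExisting // false (no IF NOT EXISTS)
cmd.replace       // true (OR REPLACE)
cmd.viewType      // PersistedView
----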

=== [[verifyTemporaryObjectsNotExists]] verifyTemporaryObjectsNotExists Internal Method

[source, scala]
----
verifyTemporaryObjectsNotExists(sparkSession: SparkSession): Unit
----

verifyTemporaryObjectsNotExists...FIXME

NOTE: verifyTemporaryObjectsNotExists is used exclusively when CreateViewCommand logical command is executed.

=== [[aliasPlan]] aliasPlan Internal Method

[source, scala]
----
aliasPlan(session: SparkSession, analyzedPlan: LogicalPlan): LogicalPlan
----

aliasPlan...FIXME

NOTE: aliasPlan is used when CreateViewCommand logical command is executed (and requested to prepareTable).
