Class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces & org.apache.spark.sql.connector.catalog.ViewCatalog>

java.lang.Object
org.apache.iceberg.spark.SparkSessionCatalog<T>
Type Parameters:
T - CatalogPlugin class to avoid casting to TableCatalog, FunctionCatalog, SupportsNamespaces and ViewCatalog.
All Implemented Interfaces:
HasIcebergCatalog, SupportsReplaceView, org.apache.spark.sql.connector.catalog.CatalogExtension, org.apache.spark.sql.connector.catalog.CatalogPlugin, org.apache.spark.sql.connector.catalog.FunctionCatalog, org.apache.spark.sql.connector.catalog.StagingTableCatalog, org.apache.spark.sql.connector.catalog.SupportsNamespaces, org.apache.spark.sql.connector.catalog.TableCatalog, org.apache.spark.sql.connector.catalog.ViewCatalog, ProcedureCatalog

public class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces & org.apache.spark.sql.connector.catalog.ViewCatalog> extends Object implements org.apache.spark.sql.connector.catalog.CatalogExtension
A Spark catalog that can also load non-Iceberg tables.
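For orientation, a minimal configuration sketch (not part of this Javadoc): registering this class as Spark's built-in session catalog so that Iceberg and pre-existing non-Iceberg tables resolve through the same catalog. The "hive" catalog type and the local master are illustrative assumptions.

    import org.apache.spark.sql.SparkSession;

    public class SessionCatalogConfigExample {
      public static void main(String[] args) {
        // Route Spark's built-in "spark_catalog" through Iceberg's SparkSessionCatalog.
        // "hive" is an assumed catalog type; "hadoop" or a REST catalog would also work.
        SparkSession spark = SparkSession.builder()
            .master("local[*]")
            .config("spark.sql.catalog.spark_catalog", "org.apache.iceberg.spark.SparkSessionCatalog")
            .config("spark.sql.catalog.spark_catalog.type", "hive")
            .getOrCreate();

        // Iceberg and non-Iceberg tables are both visible through the same session catalog.
        spark.sql("SHOW TABLES").show();
      }
    }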
  • Field Summary

    Fields inherited from interface org.apache.spark.sql.connector.catalog.SupportsNamespaces

    PROP_COMMENT, PROP_LOCATION, PROP_OWNER

    Fields inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog

    OPTION_PREFIX, PROP_COMMENT, PROP_EXTERNAL, PROP_IS_MANAGED_LOCATION, PROP_LOCATION, PROP_OWNER, PROP_PROVIDER

    Fields inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog

    PROP_COMMENT, PROP_CREATE_ENGINE_VERSION, PROP_ENGINE_VERSION, PROP_OWNER, RESERVED_PROPERTIES
  • Constructor Summary

    Constructors
    Constructor
    Description
    SparkSessionCatalog()
     
  • Method Summary

    Modifier and Type
    Method
    Description
    void
    alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes)
     
    org.apache.spark.sql.connector.catalog.Table
    alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes)
     
    org.apache.spark.sql.connector.catalog.View
    alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes)
     
    protected org.apache.spark.sql.connector.catalog.TableCatalog
    buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
    Build a SparkCatalog to be used for Iceberg operations.
    void
    createNamespace(String[] namespace, Map<String,String> metadata)
     
    org.apache.spark.sql.connector.catalog.Table
    createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties)
     
    org.apache.spark.sql.connector.catalog.View
    createView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String,String> properties)
     
    String[]
    defaultNamespace()
     
    boolean
    dropNamespace(String[] namespace, boolean cascade)
     
    boolean
    dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    boolean
    dropView(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    Catalog
    icebergCatalog()
    Returns the underlying Catalog backing this Spark Catalog
    final void
    initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
     
    void
    invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    boolean
    isExistingNamespace(String[] namespace)
     
    boolean
    isFunctionNamespace(String[] namespace)
     
    default org.apache.spark.sql.connector.catalog.Identifier[]
    listFunctions(String[] namespace)
     
    String[][]
    listNamespaces()
     
    String[][]
    listNamespaces(String[] namespace)
     
    org.apache.spark.sql.connector.catalog.Identifier[]
    listTables(String[] namespace)
     
    org.apache.spark.sql.connector.catalog.Identifier[]
    listViews(String... namespace)
     
    org.apache.spark.sql.connector.catalog.functions.UnboundFunction
    loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    Map<String,String>
    loadNamespaceMetadata(String[] namespace)
     
    Procedure
    loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident)
    Load a stored procedure by identifier.
    org.apache.spark.sql.connector.catalog.Table
    loadTable(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    org.apache.spark.sql.connector.catalog.Table
    loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp)
     
    org.apache.spark.sql.connector.catalog.Table
    loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version)
     
    org.apache.spark.sql.connector.catalog.View
    loadView(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    String
    name()
     
    boolean
    namespaceExists(String[] namespace)
     
    boolean
    purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
     
    void
    renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to)
     
    void
    renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier)
     
    org.apache.spark.sql.connector.catalog.View
    replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String,String> properties)
    Replace a view in the catalog
    void
    setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
     
    org.apache.spark.sql.connector.catalog.StagedTable
    stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties)
     
    org.apache.spark.sql.connector.catalog.StagedTable
    stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties)
     
    org.apache.spark.sql.connector.catalog.StagedTable
    stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties)
     
    boolean
    useNullableQuerySchema()
     

    Methods inherited from class java.lang.Object

    clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait

    Methods inherited from interface org.apache.spark.sql.connector.catalog.FunctionCatalog

    functionExists

    Methods inherited from interface org.apache.spark.sql.connector.catalog.StagingTableCatalog

    stageCreate, stageCreateOrReplace, stageReplace

    Methods inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog

    capabilities, createTable, loadTable, tableExists, useNullableQuerySchema

    Methods inherited from interface org.apache.spark.sql.connector.catalog.ViewCatalog

    invalidateView, viewExists
  • Constructor Details

    • SparkSessionCatalog

      public SparkSessionCatalog()
  • Method Details

    • buildSparkCatalog

      protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
      Build a SparkCatalog to be used for Iceberg operations.

      The default implementation creates a new SparkCatalog with the session catalog's name and options.

      Parameters:
      name - catalog name
      options - catalog options
      Returns:
      a SparkCatalog to be used for Iceberg tables
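      As an extension sketch (a hypothetical subclass, not part of the Iceberg API), buildSparkCatalog can be overridden to control how the Iceberg-side catalog is constructed; the class name CustomSessionCatalog is illustrative only.

          // Hypothetical subclass of SparkSessionCatalog.
          public class CustomSessionCatalog<
                  T extends org.apache.spark.sql.connector.catalog.TableCatalog
                      & org.apache.spark.sql.connector.catalog.FunctionCatalog
                      & org.apache.spark.sql.connector.catalog.SupportsNamespaces
                      & org.apache.spark.sql.connector.catalog.ViewCatalog>
              extends org.apache.iceberg.spark.SparkSessionCatalog<T> {

            @Override
            protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(
                String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options) {
              // Inspect or log the catalog options here, then fall back to the default SparkCatalog.
              // A real override could return a differently configured TableCatalog instead.
              return super.buildSparkCatalog(name, options);
            }
          }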
    • defaultNamespace

      public String[] defaultNamespace()
      Specified by:
      defaultNamespace in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
    • listNamespaces

      public String[][] listNamespaces() throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • listNamespaces

      public String[][] listNamespaces(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • namespaceExists

      public boolean namespaceExists(String[] namespace)
      Specified by:
      namespaceExists in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
    • loadNamespaceMetadata

      public Map<String,String> loadNamespaceMetadata(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      loadNamespaceMetadata in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • createNamespace

      public void createNamespace(String[] namespace, Map<String,String> metadata) throws org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
      Specified by:
      createNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
    • alterNamespace

      public void alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      alterNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • dropNamespace

      public boolean dropNamespace(String[] namespace, boolean cascade) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
      Specified by:
      dropNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
    • listTables

      public org.apache.spark.sql.connector.catalog.Identifier[] listTables(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      listTables in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • loadTable

      public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      Specified by:
      loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    • loadTable

      public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      Specified by:
      loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    • loadTable

      public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      Specified by:
      loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    • invalidateTable

      public void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
      Specified by:
      invalidateTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
    • createTable

      public org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      createTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • stageCreate

      public org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      stageCreate in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • stageReplace

      public org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      Specified by:
      stageReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    • stageCreateOrReplace

      public org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      stageCreateOrReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • alterTable

      public org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      Specified by:
      alterTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
    • dropTable

      public boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
      Specified by:
      dropTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
    • purgeTable

      public boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
      Specified by:
      purgeTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
    • renameTable

      public void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException, org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
      Specified by:
      renameTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchTableException
      org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
    • initialize

      public final void initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
      Specified by:
      initialize in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
    • setDelegateCatalog

      public void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
      Specified by:
      setDelegateCatalog in interface org.apache.spark.sql.connector.catalog.CatalogExtension
    • name

      public String name()
      Specified by:
      name in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
    • icebergCatalog

      public Catalog icebergCatalog()
      Description copied from interface: HasIcebergCatalog
      Returns the underlying Catalog backing this Spark Catalog
      Specified by:
      icebergCatalog in interface HasIcebergCatalog
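      A usage sketch, not taken from this Javadoc: retrieving the underlying Iceberg Catalog from a session configured with this class. It goes through Spark's CatalogManager, whose accessors are internal and may vary across Spark versions; the catalog name "spark_catalog" is the Spark default.

          import org.apache.iceberg.catalog.Catalog;
          import org.apache.iceberg.spark.SparkSessionCatalog;
          import org.apache.spark.sql.SparkSession;
          import org.apache.spark.sql.connector.catalog.CatalogPlugin;

          public class IcebergCatalogAccessExample {
            static Catalog underlyingCatalog(SparkSession spark) {
              // Resolve the session catalog plugin and unwrap the Iceberg catalog it delegates to.
              CatalogPlugin plugin = spark.sessionState().catalogManager().catalog("spark_catalog");
              return ((SparkSessionCatalog<?>) plugin).icebergCatalog();
            }
          }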
    • loadFunction

      public org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
      Specified by:
      loadFunction in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
    • listViews

      public org.apache.spark.sql.connector.catalog.Identifier[] listViews(String... namespace)
      Specified by:
      listViews in interface org.apache.spark.sql.connector.catalog.ViewCatalog
    • loadView

      public org.apache.spark.sql.connector.catalog.View loadView(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException
      Specified by:
      loadView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchViewException
    • createView

      public org.apache.spark.sql.connector.catalog.View createView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      createView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
    • replaceView

      public org.apache.spark.sql.connector.catalog.View replaceView(org.apache.spark.sql.connector.catalog.Identifier ident, String sql, String currentCatalog, String[] currentNamespace, org.apache.spark.sql.types.StructType schema, String[] queryColumnNames, String[] columnAliases, String[] columnComments, Map<String,String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchViewException
      Description copied from interface: SupportsReplaceView
      Replace a view in the catalog
      Specified by:
      replaceView in interface SupportsReplaceView
      Parameters:
      ident - a view identifier
      sql - the SQL text that defines the view
      currentCatalog - the current catalog
      currentNamespace - the current namespace
      schema - the view query output schema
      queryColumnNames - the query column names
      columnAliases - the column aliases
      columnComments - the column comments
      properties - the view properties
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - If the identifier namespace does not exist (optional)
      org.apache.spark.sql.catalyst.analysis.NoSuchViewException - If the view doesn't exist or is a table
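      For context, a sketch of how view replacement usually reaches this method (not from this Javadoc): Spark SQL's CREATE OR REPLACE VIEW against the session catalog. The catalog, namespace, table, and view names below are assumptions.

          import org.apache.spark.sql.SparkSession;

          public class ReplaceViewExample {
            static void replaceRecentEvents(SparkSession spark) {
              // Assumes an existing table db.events; replaces (or creates) the view db.recent_events.
              spark.sql("CREATE OR REPLACE VIEW spark_catalog.db.recent_events AS "
                  + "SELECT * FROM spark_catalog.db.events WHERE event_ts >= date_sub(current_date(), 7)");
            }
          }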
    • alterView

      public org.apache.spark.sql.connector.catalog.View alterView(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.ViewChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, IllegalArgumentException
      Specified by:
      alterView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchViewException
      IllegalArgumentException
    • dropView

      public boolean dropView(org.apache.spark.sql.connector.catalog.Identifier ident)
      Specified by:
      dropView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
    • renameView

      public void renameView(org.apache.spark.sql.connector.catalog.Identifier fromIdentifier, org.apache.spark.sql.connector.catalog.Identifier toIdentifier) throws org.apache.spark.sql.catalyst.analysis.NoSuchViewException, org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
      Specified by:
      renameView in interface org.apache.spark.sql.connector.catalog.ViewCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchViewException
      org.apache.spark.sql.catalyst.analysis.ViewAlreadyExistsException
    • loadProcedure

      public Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) throws NoSuchProcedureException
      Description copied from interface: ProcedureCatalog
      Load a stored procedure by identifier.
      Specified by:
      loadProcedure in interface ProcedureCatalog
      Parameters:
      ident - a stored procedure identifier
      Returns:
      the stored procedure's metadata
      Throws:
      NoSuchProcedureException - if there is no matching stored procedure
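      For context (not part of this Javadoc), procedures resolved by loadProcedure are typically invoked with Spark SQL's CALL syntax; the expire_snapshots procedure and the table name db.events below are illustrative assumptions.

          import org.apache.spark.sql.SparkSession;

          public class CallProcedureExample {
            static void expireSnapshots(SparkSession spark) {
              // Invokes a stored procedure through the session catalog; the procedure's
              // result rows (e.g. deleted file counts) are returned as a DataFrame.
              spark.sql("CALL spark_catalog.system.expire_snapshots(table => 'db.events')").show();
            }
          }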
    • isFunctionNamespace

      public boolean isFunctionNamespace(String[] namespace)
    • isExistingNamespace

      public boolean isExistingNamespace(String[] namespace)
    • useNullableQuerySchema

      public boolean useNullableQuerySchema()
      Specified by:
      useNullableQuerySchema in interface org.apache.spark.sql.connector.catalog.TableCatalog
    • listFunctions

      default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
      Specified by:
      listFunctions in interface org.apache.spark.sql.connector.catalog.FunctionCatalog
      Throws:
      org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException