Package org.apache.iceberg.spark
Class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces>
java.lang.Object
org.apache.iceberg.spark.SparkSessionCatalog<T>
- Type Parameters:
 T - CatalogPlugin class to avoid casting to TableCatalog, FunctionCatalog and SupportsNamespaces.
- All Implemented Interfaces:
 HasIcebergCatalog, org.apache.spark.sql.connector.catalog.CatalogExtension, org.apache.spark.sql.connector.catalog.CatalogPlugin, org.apache.spark.sql.connector.catalog.FunctionCatalog, org.apache.spark.sql.connector.catalog.StagingTableCatalog, org.apache.spark.sql.connector.catalog.SupportsNamespaces, org.apache.spark.sql.connector.catalog.TableCatalog, ProcedureCatalog
public class SparkSessionCatalog<T extends org.apache.spark.sql.connector.catalog.TableCatalog & org.apache.spark.sql.connector.catalog.FunctionCatalog & org.apache.spark.sql.connector.catalog.SupportsNamespaces>
extends Object
implements org.apache.spark.sql.connector.catalog.CatalogExtension
A Spark catalog that can also load non-Iceberg tables.
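The typical way to use this class is to register it as the replacement for Spark's built-in session catalog. A minimal sketch, assuming a Hive metastore-backed Iceberg catalog; the application name, database, and table names are illustrative:

import org.apache.spark.sql.SparkSession;

public class SessionCatalogExample {
  public static void main(String[] args) {
    // Register SparkSessionCatalog as the session catalog so that Iceberg tables
    // and existing non-Iceberg tables resolve through the same catalog name.
    SparkSession spark = SparkSession.builder()
        .appName("iceberg-session-catalog")
        .config("spark.sql.extensions",
            "org.apache.iceberg.spark.extensions.IcebergSparkSessionExtensions")
        .config("spark.sql.catalog.spark_catalog",
            "org.apache.iceberg.spark.SparkSessionCatalog")
        .config("spark.sql.catalog.spark_catalog.type", "hive") // assumption: Hive-backed catalog
        .getOrCreate();

    // Iceberg tables created here are handled by the wrapped SparkCatalog;
    // other tables fall through to the delegate session catalog.
    spark.sql("CREATE TABLE IF NOT EXISTS db.events (id BIGINT, data STRING) USING iceberg");
  }
}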
- 
Field Summary
Fields inherited from interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
 PROP_COMMENT, PROP_LOCATION, PROP_OWNER
Fields inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog
 OPTION_PREFIX, PROP_COMMENT, PROP_EXTERNAL, PROP_IS_MANAGED_LOCATION, PROP_LOCATION, PROP_OWNER, PROP_PROVIDER
- 
Constructor Summary
Constructors
 SparkSessionCatalog()
- 
Method Summary
void alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes)
org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes)
protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options) - Build a SparkCatalog to be used for Iceberg operations.
void createNamespace(String[] namespace, Map<String, String> metadata)
org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
String[] defaultNamespace()
boolean dropNamespace(String[] namespace, boolean cascade)
boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident)
org.apache.iceberg.catalog.Catalog icebergCatalog() - Returns the underlying Catalog backing this Spark Catalog
final void initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident)
boolean isExistingNamespace(String[] namespace)
boolean isFunctionNamespace(String[] namespace)
default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(String[] namespace)
String[][] listNamespaces()
String[][] listNamespaces(String[] namespace)
org.apache.spark.sql.connector.catalog.Identifier[] listTables(String[] namespace)
org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident)
Map<String, String> loadNamespaceMetadata(String[] namespace)
Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) - Load a stored procedure by identifier.
org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident)
org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version)
org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp)
String name()
boolean namespaceExists(String[] namespace)
boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident)
void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to)
void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog)
org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties)
boolean useNullableQuerySchema()
Methods inherited from class java.lang.Object
 clone, equals, finalize, getClass, hashCode, notify, notifyAll, toString, wait, wait, wait
Methods inherited from interface org.apache.spark.sql.connector.catalog.FunctionCatalog
 functionExists
Methods inherited from interface org.apache.spark.sql.connector.catalog.StagingTableCatalog
 stageCreate, stageCreateOrReplace, stageReplace
Methods inherited from interface org.apache.spark.sql.connector.catalog.TableCatalog
 capabilities, createTable, tableExists, useNullableQuerySchema 
- 
Constructor Details
- 
SparkSessionCatalog
public SparkSessionCatalog() 
 - 
 - 
Method Details
- 
buildSparkCatalog
protected org.apache.spark.sql.connector.catalog.TableCatalog buildSparkCatalog(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options)
Build a SparkCatalog to be used for Iceberg operations. The default implementation creates a new SparkCatalog with the session catalog's name and options.
- Parameters:
 name - catalog name
 options - catalog options
- Returns:
 - a SparkCatalog to be used for Iceberg tables
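Subclasses can override this hook to supply a different Iceberg-facing catalog. A minimal sketch; the CustomSessionCatalog class name is hypothetical, and the body simply mirrors what the default implementation is described to do:

import org.apache.iceberg.spark.SparkCatalog;
import org.apache.iceberg.spark.SparkSessionCatalog;
import org.apache.spark.sql.connector.catalog.FunctionCatalog;
import org.apache.spark.sql.connector.catalog.SupportsNamespaces;
import org.apache.spark.sql.connector.catalog.TableCatalog;
import org.apache.spark.sql.util.CaseInsensitiveStringMap;

// Hypothetical subclass: any TableCatalog that should handle the Iceberg side
// of the session catalog could be returned here.
public class CustomSessionCatalog<T extends TableCatalog & FunctionCatalog & SupportsNamespaces>
    extends SparkSessionCatalog<T> {

  @Override
  protected TableCatalog buildSparkCatalog(String name, CaseInsensitiveStringMap options) {
    // Same behavior as the default: a SparkCatalog initialized with the
    // session catalog's name and options.
    SparkCatalog catalog = new SparkCatalog();
    catalog.initialize(name, options);
    return catalog;
  }
}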
 
 - 
defaultNamespace
public String[] defaultNamespace()
- Specified by:
 defaultNamespace in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 - 
listNamespaces
public String[][] listNamespaces() throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
listNamespaces
public String[][] listNamespaces(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 listNamespaces in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
namespaceExists
public boolean namespaceExists(String[] namespace)
- Specified by:
 namespaceExists in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces
 - 
loadNamespaceMetadata
public Map<String,String> loadNamespaceMetadata(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 loadNamespaceMetadata in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
createNamespace
public void createNamespace(String[] namespace, Map<String, String> metadata) throws org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException - Specified by:
 createNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NamespaceAlreadyExistsException
 - 
alterNamespace
public void alterNamespace(String[] namespace, org.apache.spark.sql.connector.catalog.NamespaceChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 alterNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
dropNamespace
public boolean dropNamespace(String[] namespace, boolean cascade) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException - Specified by:
 dropNamespace in interface org.apache.spark.sql.connector.catalog.SupportsNamespaces - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 org.apache.spark.sql.catalyst.analysis.NonEmptyNamespaceException
 - 
listTables
public org.apache.spark.sql.connector.catalog.Identifier[] listTables(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 listTables in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException - Specified by:
 loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 - 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, String version) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException - Specified by:
 loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 - 
loadTable
public org.apache.spark.sql.connector.catalog.Table loadTable(org.apache.spark.sql.connector.catalog.Identifier ident, long timestamp) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException - Specified by:
 loadTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
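The three loadTable overloads back Spark's time-travel reads. A hedged usage sketch; the table name, snapshot id, and timestamp are placeholders:

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class TimeTravelExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // Resolves through loadTable(ident): current state of the table.
    Dataset<Row> current = spark.table("db.events");

    // Resolves through loadTable(ident, version): a specific snapshot id (placeholder value).
    Dataset<Row> byVersion =
        spark.sql("SELECT * FROM db.events VERSION AS OF 4348995880949099993");

    // Resolves through loadTable(ident, timestamp): table state as of a point in time.
    Dataset<Row> byTimestamp =
        spark.sql("SELECT * FROM db.events TIMESTAMP AS OF '2024-01-01 00:00:00'");

    current.show();
    byVersion.show();
    byTimestamp.show();
  }
}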
 - 
invalidateTable
public void invalidateTable(org.apache.spark.sql.connector.catalog.Identifier ident) - Specified by:
 invalidateTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 - 
createTable
public org.apache.spark.sql.connector.catalog.Table createTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 createTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
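Table creation with the iceberg provider is handled by the wrapped Iceberg catalog, while other providers fall through to the delegate session catalog. A short sketch of both paths; database, table names, and partitioning are illustrative:

import org.apache.spark.sql.SparkSession;

public class CreateTableExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // Handled by the Iceberg side of the session catalog (provider = iceberg).
    spark.sql("CREATE TABLE IF NOT EXISTS db.iceberg_events (id BIGINT, data STRING) "
        + "USING iceberg PARTITIONED BY (bucket(16, id))");

    // Handled by the delegate session catalog (a non-Iceberg provider, e.g. parquet).
    spark.sql("CREATE TABLE IF NOT EXISTS db.plain_events (id BIGINT, data STRING) USING parquet");
  }
}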
 - 
stageCreate
public org.apache.spark.sql.connector.catalog.StagedTable stageCreate(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException, org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 stageCreate in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 - 
stageReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException, org.apache.spark.sql.catalyst.analysis.NoSuchTableException - Specified by:
 stageReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 - 
stageCreateOrReplace
public org.apache.spark.sql.connector.catalog.StagedTable stageCreateOrReplace(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.types.StructType schema, org.apache.spark.sql.connector.expressions.Transform[] partitions, Map<String, String> properties) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 stageCreateOrReplace in interface org.apache.spark.sql.connector.catalog.StagingTableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
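The stage* methods let Spark run CTAS, RTAS, and CREATE OR REPLACE TABLE ... AS SELECT atomically, committing the new table metadata only if the write succeeds. A sketch, assuming the planner routes these statements through the staging APIs; table names are placeholders:

import org.apache.spark.sql.SparkSession;

public class StagedWriteExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // CTAS goes through stageCreate, RTAS through stageReplace, and
    // CREATE OR REPLACE ... AS SELECT through stageCreateOrReplace.
    spark.sql("CREATE OR REPLACE TABLE db.events_copy USING iceberg AS SELECT * FROM db.events");
  }
}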
 - 
alterTable
public org.apache.spark.sql.connector.catalog.Table alterTable(org.apache.spark.sql.connector.catalog.Identifier ident, org.apache.spark.sql.connector.catalog.TableChange... changes) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException - Specified by:
 alterTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 - 
dropTable
public boolean dropTable(org.apache.spark.sql.connector.catalog.Identifier ident) - Specified by:
 dropTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 - 
purgeTable
public boolean purgeTable(org.apache.spark.sql.connector.catalog.Identifier ident) - Specified by:
 purgeTable in interface org.apache.spark.sql.connector.catalog.TableCatalog
 - 
renameTable
public void renameTable(org.apache.spark.sql.connector.catalog.Identifier from, org.apache.spark.sql.connector.catalog.Identifier to) throws org.apache.spark.sql.catalyst.analysis.NoSuchTableException, org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException - Specified by:
 renameTable in interface org.apache.spark.sql.connector.catalog.TableCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchTableException
 org.apache.spark.sql.catalyst.analysis.TableAlreadyExistsException
 - 
initialize
public final void initialize(String name, org.apache.spark.sql.util.CaseInsensitiveStringMap options) - Specified by:
 initialize in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 - 
setDelegateCatalog
public void setDelegateCatalog(org.apache.spark.sql.connector.catalog.CatalogPlugin sparkSessionCatalog) - Specified by:
 setDelegateCatalog in interface org.apache.spark.sql.connector.catalog.CatalogExtension
 - 
name
public String name()
- Specified by:
 name in interface org.apache.spark.sql.connector.catalog.CatalogPlugin
 - 
icebergCatalog
public org.apache.iceberg.catalog.Catalog icebergCatalog()
Description copied from interface: HasIcebergCatalog
Returns the underlying Catalog backing this Spark Catalog
- Specified by:
 icebergCatalog in interface HasIcebergCatalog
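When API-level access to the backing Iceberg catalog is needed, the registered plugin can be cast to HasIcebergCatalog. A sketch that goes through Spark's internal CatalogManager; the catalog name spark_catalog and the table identifier are assumptions:

import org.apache.iceberg.Table;
import org.apache.iceberg.catalog.Catalog;
import org.apache.iceberg.catalog.TableIdentifier;
import org.apache.iceberg.spark.HasIcebergCatalog;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.connector.catalog.CatalogPlugin;

public class IcebergCatalogAccessExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // Look up the registered session catalog plugin by name ("spark_catalog" is
    // the conventional name when SparkSessionCatalog replaces the built-in catalog).
    CatalogPlugin plugin = spark.sessionState().catalogManager().catalog("spark_catalog");

    // Drop down to the underlying Iceberg Catalog for API-level operations.
    Catalog icebergCatalog = ((HasIcebergCatalog) plugin).icebergCatalog();
    Table table = icebergCatalog.loadTable(TableIdentifier.of("db", "iceberg_events"));
    System.out.println(table.currentSnapshot());
  }
}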
 - 
loadFunction
public org.apache.spark.sql.connector.catalog.functions.UnboundFunction loadFunction(org.apache.spark.sql.connector.catalog.Identifier ident) throws org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException - Specified by:
 loadFunction in interface org.apache.spark.sql.connector.catalog.FunctionCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchFunctionException
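Function resolution backs Iceberg's SQL functions in the catalog's system namespace. A hedged sketch; iceberg_version and bucket reflect functions the Iceberg Spark runtime is expected to register, and the literal arguments are placeholders:

import org.apache.spark.sql.SparkSession;

public class SystemFunctionExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // Function identifiers in the "system" namespace resolve through loadFunction.
    spark.sql("SELECT spark_catalog.system.iceberg_version()").show();
    spark.sql("SELECT spark_catalog.system.bucket(16, 42)").show();
  }
}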
 - 
loadProcedure
public Procedure loadProcedure(org.apache.spark.sql.connector.catalog.Identifier ident) throws NoSuchProcedureException
Description copied from interface: ProcedureCatalog
Load a stored procedure by identifier.
- Specified by:
 loadProcedure in interface ProcedureCatalog
- Parameters:
 ident - a stored procedure identifier
- Returns:
 - the stored procedure's metadata
 - Throws:
 NoSuchProcedureException - if there is no matching stored procedure
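Stored procedures are invoked from SQL with CALL and resolved through this method. A sketch using two commonly documented Iceberg procedures; the table name, snapshot id, and timestamp are placeholders:

import org.apache.spark.sql.SparkSession;

public class ProcedureCallExample {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder().getOrCreate();

    // CALL statements resolve through loadProcedure; procedures live in the "system" namespace.
    spark.sql("CALL spark_catalog.system.rollback_to_snapshot('db.iceberg_events', 4348995880949099993)");
    spark.sql("CALL spark_catalog.system.expire_snapshots("
        + "table => 'db.iceberg_events', older_than => TIMESTAMP '2024-01-01 00:00:00')");
  }
}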
 - 
isFunctionNamespace
public boolean isFunctionNamespace(String[] namespace)
 - 
isExistingNamespace
public boolean isExistingNamespace(String[] namespace)
 - 
useNullableQuerySchema
public boolean useNullableQuerySchema() - Specified by:
 useNullableQuerySchema in interface org.apache.spark.sql.connector.catalog.TableCatalog
 - 
listFunctions
default org.apache.spark.sql.connector.catalog.Identifier[] listFunctions(String[] namespace) throws org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException - Specified by:
 listFunctions in interface org.apache.spark.sql.connector.catalog.FunctionCatalog - Throws:
 org.apache.spark.sql.catalyst.analysis.NoSuchNamespaceException
 
 -