Add protocols to support Iceberg catalogs #13
andrea-gioia started this conversation in Ideas · Replies: 0 comments
We can use the same properties used by Flink SQL...
Flink SQL: CREATE CATALOG Examples

Introduction

In Apache Flink, the CREATE CATALOG command is used to define a new catalog that provides access to tables, databases, and functions stored in an external system (such as Apache Iceberg, Hive, JDBC databases, or a Kafka Schema Registry).

1. Iceberg Catalog
Apache Iceberg supports multiple catalog backends, such as Hadoop, Hive, and REST.
1.1 Iceberg with Hadoop Catalog (HDFS/S3/Unitstore)
➡ Queries:
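The example statements for this section did not survive the page load. A minimal sketch of a Hadoop-backed Iceberg catalog, assuming placeholder hostnames and warehouse paths:

```sql
-- Iceberg catalog backed by a Hadoop-compatible file system.
-- The warehouse path is a placeholder; an s3a:// URI works the same way.
CREATE CATALOG iceberg_hadoop WITH (
  'type' = 'iceberg',
  'catalog-type' = 'hadoop',
  'warehouse' = 'hdfs://namenode:8020/warehouse/iceberg'
);

-- Example queries against the new catalog
USE CATALOG iceberg_hadoop;
CREATE DATABASE IF NOT EXISTS db;
CREATE TABLE db.orders (id BIGINT, amount DOUBLE);
INSERT INTO db.orders VALUES (1, 9.99);
SELECT * FROM db.orders;
```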
1.2 Iceberg with Hive Metastore
➡ Queries:
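The statements here are also missing. A sketch of an Iceberg catalog whose metadata lives in a Hive Metastore; the thrift URI and warehouse path are placeholders:

```sql
-- Iceberg catalog tracked by a Hive Metastore
CREATE CATALOG iceberg_hive WITH (
  'type' = 'iceberg',
  'catalog-type' = 'hive',
  'uri' = 'thrift://metastore-host:9083',
  'warehouse' = 'hdfs://namenode:8020/warehouse/iceberg'
);

USE CATALOG iceberg_hive;
SHOW DATABASES;
SELECT * FROM db.orders;
```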
1.3 Iceberg with REST Metastore
➡ Queries:
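A sketch of an Iceberg catalog served by a REST catalog service; the endpoint is a placeholder, and authentication properties depend on the server:

```sql
-- Iceberg catalog served over the REST catalog protocol
CREATE CATALOG iceberg_rest WITH (
  'type' = 'iceberg',
  'catalog-type' = 'rest',
  'uri' = 'http://rest-catalog:8181'
);

USE CATALOG iceberg_rest;
SHOW TABLES;
```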
2. Hive Catalog
Connects Flink directly to a Hive Metastore, making existing Hive tables queryable.
➡ Queries:
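A sketch of Flink's native Hive catalog; the configuration directory is a placeholder and must contain a valid hive-site.xml:

```sql
-- Flink HiveCatalog; 'hive-conf-dir' must point at the Hive client config
CREATE CATALOG my_hive WITH (
  'type' = 'hive',
  'hive-conf-dir' = '/opt/hive/conf',
  'default-database' = 'default'
);

USE CATALOG my_hive;
SHOW TABLES;
```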
3. JDBC Catalog (MySQL, PostgreSQL, etc.)
JDBC Catalog connects Flink to relational databases.
3.1 JDBC (MySQL Example)
➡ Queries:
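A sketch of a JDBC catalog pointing at MySQL; host, database name, and credentials are placeholders:

```sql
-- JDBC catalog backed by a MySQL server
CREATE CATALOG mysql_catalog WITH (
  'type' = 'jdbc',
  'base-url' = 'jdbc:mysql://mysql-host:3306',
  'default-database' = 'mydb',
  'username' = 'flink',
  'password' = 'secret'
);

USE CATALOG mysql_catalog;
SELECT * FROM mydb.customers;
```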
3.2 JDBC (PostgreSQL Example)
➡ Queries:
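The PostgreSQL variant differs only in the base URL, plus PostgreSQL's extra schema level when addressing tables; again, host, database, and credentials are placeholders:

```sql
-- JDBC catalog backed by a PostgreSQL server
CREATE CATALOG pg_catalog WITH (
  'type' = 'jdbc',
  'base-url' = 'jdbc:postgresql://pg-host:5432',
  'default-database' = 'mydb',
  'username' = 'flink',
  'password' = 'secret'
);

USE CATALOG pg_catalog;
-- With the PostgreSQL dialect, tables are addressed as <db>.`<schema>.<table>`
SELECT * FROM mydb.`public.customers`;
```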
4. Generic In-Memory Catalog
This is the default built-in catalog in Flink; objects registered in it live only for the lifetime of the session.
➡ Queries:
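A sketch of the in-memory catalog; the catalog and table names are placeholders, and the datagen connector is used only to have something queryable:

```sql
-- Session-scoped catalog; nothing is persisted
CREATE CATALOG my_memory_catalog WITH (
  'type' = 'generic_in_memory'
);

USE CATALOG my_memory_catalog;
CREATE TABLE t (id INT) WITH ('connector' = 'datagen');
SELECT * FROM t;
```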
5. Kafka Schema Registry Catalog
This catalog integrates with a Kafka Schema Registry (Avro/Protobuf/JSON schemas).
➡ Queries:
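Note that vanilla Apache Flink does not ship a Schema Registry catalog; this typically comes from a vendor distribution (e.g. Confluent's). The `'type' = 'kafka'` value follows this page's summary, but the remaining property keys below are illustrative assumptions, not a documented API:

```sql
-- Hypothetical catalog definition; property keys beyond 'type' depend on
-- the catalog implementation your distribution actually provides.
CREATE CATALOG kafka_catalog WITH (
  'type' = 'kafka',
  'bootstrap-servers' = 'broker:9092',
  'schema-registry-url' = 'http://schema-registry:8081'
);

USE CATALOG kafka_catalog;
SHOW TABLES;  -- topics with registered schemas would surface as tables
```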
6. Dropping a Catalog
If you need to remove a catalog, you can use the DROP CATALOG command, e.g. `DROP CATALOG my_catalog;`.

7. Summary
- Iceberg + Hadoop: 'catalog-type' = 'hadoop'
- Iceberg + Hive Metastore: 'catalog-type' = 'hive'
- Iceberg + REST: 'catalog-type' = 'rest'
- Hive: 'type' = 'hive'
- JDBC: 'type' = 'jdbc'
- Generic in-memory: 'type' = 'generic_in_memory'
- Kafka Schema Registry: 'type' = 'kafka'