Flink Table Store#
Flink Table Store is a unified storage to build dynamic tables for both streaming and batch processing in Flink, supporting high-speed data ingestion and timely data query.
Tip
This article assumes that you have mastered the basic knowledge and operation of Flink Table Store. For anything about Flink Table Store not covered in this article, please refer to its Official Documentation.
By using Kyuubi, we can run SQL queries against Flink Table Store in a way that is more convenient, easier to understand, and easier to extend than manipulating Flink Table Store through Hive directly.
Flink Table Store Integration#
To enable the integration of the Kyuubi Hive SQL engine and Flink Table Store, you need to:
Referencing the Flink Table Store dependencies
Setting the environment variables
Dependencies#
The classpath of the Kyuubi Hive SQL engine with Flink Table Store supported consists of
kyuubi-hive-sql-engine-1.7.0_2.12.jar, the engine jar deployed with Kyuubi distributions
a copy of the Hive distribution
flink-table-store-hive-connector-<flink-table-store.version>_<hive.binary.version>.jar (example: flink-table-store-hive-connector-0.4.0_3.1.jar), which can be found in the Installation section of the Table Store in Hive documentation
In order to make the Flink Table Store packages visible to the runtime classpath of the Hive engine, we can use one of these methods:
You can create an auxlib folder under the root directory of Hive, and copy flink-table-store-hive-connector-0.4.0_3.1.jar into auxlib.
Execute the ADD JAR statement through Kyuubi to add dependencies to Hive’s auxiliary classpath. For example:
ADD JAR /path/to/flink-table-store-hive-connector-0.4.0_3.1.jar;
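Hive also supports a generic LIST JARS statement that shows the resources added to the current session; assuming it is available through your engine session, you can use it to double-check that the connector jar has actually been registered:
-- shows the jars added to the current Hive session classpath
LIST JARS;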
Warning
The second method is not recommended. If you are using the MR execution engine and running a join statement, you may encounter the exception
org.apache.hive.com.esotericsoftware.kryo.KryoException: unable to find class.
Warning
Please mind the compatibility of different Flink Table Store and Hive versions, which can be confirmed on the Flink Table Store multi engine support page.
Configurations#
If you are using HDFS, make sure that the environment variable HADOOP_HOME or HADOOP_CONF_DIR is set.
Flink Table Store Operations#
Flink Table Store only supports reading table store tables through Hive.
A common scenario is to write data with Flink and read data with Hive.
You can follow the Flink Table Store Quick Start document to write data to a table store table,
and then use the Kyuubi Hive SQL engine to query the table with the SQL SELECT
statements below.
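For context, the Flink-side write could look roughly like the following Flink SQL sketch. The catalog name, warehouse path, and table schema here are illustrative assumptions; please check the Flink Table Store Quick Start for the exact syntax of your version.
-- Run in a Flink SQL client, not through the Kyuubi Hive engine.
-- Catalog name, warehouse path, and schema are assumptions for illustration.
CREATE CATALOG table_store_catalog WITH (
  'type' = 'table-store',
  'warehouse' = 'hdfs:///path/to/table/store/warehouse'
);

USE CATALOG table_store_catalog;

-- A simple table matching the columns queried below.
CREATE TABLE test_table (
  a INT,
  b STRING
);

INSERT INTO test_table VALUES (1, 'aaa'), (2, 'bbb');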
Taking Query Data
as an example,
SELECT a, b FROM test_table ORDER BY a;
Taking Query External Table
as an example,
CREATE EXTERNAL TABLE external_test_table
STORED BY 'org.apache.flink.table.store.hive.TableStoreHiveStorageHandler'
LOCATION '/path/to/table/store/warehouse/default.db/test_table';
SELECT a, b FROM external_test_table ORDER BY a;
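To verify that the external table is wired to the expected storage handler and location, the standard Hive DESCRIBE FORMATTED statement can be used; this is generic Hive SQL rather than anything specific to Flink Table Store:
-- prints the storage handler, location, and other table metadata
DESCRIBE FORMATTED external_test_table;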