

Flink Table Store

  • Flink Table Store Integration
    • Dependencies
    • Configurations
  • Flink Table Store Operations

Flink Table Store

Flink Table Store is a unified storage layer for building dynamic tables for both streaming and batch processing in Flink, supporting high-speed data ingestion and timely data queries.

Tip

This article assumes that you are familiar with the basic concepts and operations of Flink Table Store. For anything about Flink Table Store not covered here, refer to its Official Documentation.

By using Kyuubi, we can run SQL queries against Flink Table Store in a way that is more convenient, easier to understand, and easier to extend than manipulating Flink Table Store directly through Spark.

Flink Table Store Integration

To enable the integration of the Kyuubi Spark SQL engine and Flink Table Store through the Apache Spark DataSource V2 and Catalog APIs, you need to:

  • Reference the Flink Table Store dependencies

  • Set the Spark extension and catalog configurations

Dependencies

The classpath of the Kyuubi Spark SQL engine with Flink Table Store support consists of:

  1. kyuubi-spark-sql-engine-<version>_2.12.jar, the engine jar deployed with the Kyuubi distributions

  2. a copy of the Spark distribution

  3. flink-table-store-spark-<version>.jar (for example, flink-table-store-spark-0.2.jar), which can be found in Maven Central

To make the Flink Table Store packages visible to the runtime classpath of the engines, we can use one of these methods (see the sketch after this list):

  1. Put the Flink Table Store packages into $SPARK_HOME/jars directly

  2. Set spark.jars=/path/to/flink-table-store-spark
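
For example, a minimal sketch of the second method in $KYUUBI_HOME/conf/kyuubi-defaults.conf; the jar path and the 0.2 version below are illustrative placeholders, so adjust them to match your environment:

spark.jars=/path/to/flink-table-store-spark-0.2.jar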

Warning

Please mind the compatibility between Flink Table Store and Spark versions, which can be confirmed on the Flink Table Store multi engine support page.

Configurations

To activate the functionality of Flink Table Store, we can set the following configurations:

spark.sql.catalog.tablestore=org.apache.flink.table.store.spark.SparkCatalog
spark.sql.catalog.tablestore.warehouse=file:/tmp/warehouse
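
Once the engine starts with these settings, a quick sanity check can be run from any Kyuubi client session; the catalog name tablestore below comes from the configuration above:

show namespaces in tablestore;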

Flink Table Store Operations

Flink Table Store supports reading table store tables through Spark. A common scenario is to write data with Flink and read it with Spark. You can follow the Flink Table Store Quick Start document to write data to a table store table, and then use the Kyuubi Spark SQL engine to query the table with the following SQL SELECT statement.

select * from tablestore.default.word_count;
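
A few more read-side statements that can run in the same session; a sketch that assumes the word_count table created in the quick start, with columns word and cnt:

use tablestore;
show tables in default;
select word, cnt from tablestore.default.word_count order by cnt desc limit 10;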
