- Use the notebook `config` to define the names of your preferred catalog, schema, and volume.
- Review the notebook `Introduction` for an overview of the current PoC template.
- Run notebook `00-setup` to create the catalog, schema, and volume, and to download the dataset to the volume (see the setup sketch after this list).
- Create a Delta Live Tables (DLT) pipeline using notebook `01-DLT-Transcript-Policy`; refer to the image below for an example of the resulting pipeline, and to the DLT pipeline tutorial for how to set up a DLT pipeline (see the pipeline sketch after this list).
- Run notebook `01.1-DLT-Transcript-Enriched-Persist-MV` to persist a copy of the materialized view created by the DLT pipeline in the previous step. This step is needed due to a current limitation of DLT tables (see the persistence sketch after this list).
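For reference, a minimal sketch of what the `config` and `00-setup` steps amount to. The catalog, schema, volume, and dataset URL below are placeholders, not the template's actual values; it assumes a Unity Catalog-enabled workspace, where `spark` is predefined in the notebook.

```python
# Minimal setup sketch: create the catalog, schema, and volume, then download
# the dataset into the volume. All names and the URL are placeholders.
import urllib.request

catalog, schema, volume = "main", "call_center_poc", "raw_data"

spark.sql(f"CREATE CATALOG IF NOT EXISTS {catalog}")
spark.sql(f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}")
spark.sql(f"CREATE VOLUME IF NOT EXISTS {catalog}.{schema}.{volume}")

# Download the dataset to the Unity Catalog volume path.
urllib.request.urlretrieve(
    "https://example.com/call_transcripts.json",
    f"/Volumes/{catalog}/{schema}/{volume}/call_transcripts.json",
)
```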
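And a condensed sketch of the kind of pipeline `01-DLT-Transcript-Policy` defines: ingest raw transcripts from the volume, then publish a cleaned, enriched table. The table names, columns, and expectation are illustrative only, not the template's actual schema.

```python
# Illustrative DLT pipeline definition (placeholder names and columns).
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw call transcripts loaded from the volume")
def transcript_raw():
    return spark.read.json("/Volumes/main/call_center_poc/raw_data/")

@dlt.table(comment="Transcripts with basic quality checks and an audit column")
@dlt.expect_or_drop("has_transcript", "transcript IS NOT NULL")
def transcript_enriched():
    return dlt.read("transcript_raw").withColumn(
        "ingested_at", F.current_timestamp()
    )
```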
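The `01.1` persistence step is essentially a create-table-as-select from the pipeline's materialized view into a plain Delta table that downstream tools can query directly. Source and target names below are placeholders.

```python
# Persist a plain Delta copy of the DLT materialized view
# (placeholder table names).
spark.sql("""
    CREATE OR REPLACE TABLE main.call_center_poc.transcript_enriched_persisted
    AS SELECT * FROM main.call_center_poc.transcript_enriched
""")
```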
Step 2. Perform LLM Summarization, Sentiment Analysis, and Classification with Databricks SQL AI Functions
- Run notebook `02-GenAI-Text-Classification` to perform summarization and sentiment analysis tasks using prompt engineering in batch with the Databricks DBRX foundation model (see the AI Functions sketch after this list).
- You can create a Lakeview dashboard (example below) based on the analyses performed by the AI Functions.
- Here is an example Lakeview dashboard template. To import the dashboard template into your workspace, please refer to this script.
- Run notebook `03-LLM-SQL-Agent` to ask questions about the analysis result table produced by applying the AI Functions (see the agent sketch after this list).
- The DBSQL agent is based on the LangChain agent and SQL Database toolkit.
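As a rough illustration of the batch step in `02-GenAI-Text-Classification`: the AI Functions `ai_query`, `ai_analyze_sentiment`, and `ai_classify` can be applied over the persisted table in a single SQL statement. Table, column, and label names below are placeholders, and `databricks-dbrx-instruct` assumes the pay-per-token DBRX serving endpoint is available in your workspace.

```python
# Sketch of batch summarization, sentiment analysis, and classification
# with Databricks SQL AI Functions (placeholder table/column/label names).
analysis = spark.sql("""
    SELECT
        call_id,
        ai_query(
            'databricks-dbrx-instruct',
            CONCAT('Summarize this call transcript in two sentences: ', transcript)
        ) AS summary,
        ai_analyze_sentiment(transcript) AS sentiment,
        ai_classify(transcript,
                    ARRAY('billing', 'technical issue', 'cancellation')) AS topic
    FROM main.call_center_poc.transcript_enriched_persisted
""")

# Save the results so the dashboard and the SQL agent can query them.
analysis.write.mode("overwrite").saveAsTable(
    "main.call_center_poc.transcript_analysis"
)
```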
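And a sketch of the DBSQL agent pattern used in `03-LLM-SQL-Agent`, assuming the `langchain`, `langchain-community`, and `databricks-sql-connector` packages are installed. Import paths vary across LangChain versions, and the catalog, schema, endpoint, and question are placeholders.

```python
# Sketch of a LangChain SQL agent over the AI Functions results.
# Assumes langchain, langchain-community, and databricks-sql-connector.
from langchain_community.agent_toolkits import SQLDatabaseToolkit, create_sql_agent
from langchain_community.chat_models import ChatDatabricks
from langchain_community.utilities import SQLDatabase

# Connect to the schema holding the analysis table; inside a Databricks
# notebook, host and credentials are picked up automatically.
db = SQLDatabase.from_databricks(catalog="main", schema="call_center_poc")
llm = ChatDatabricks(endpoint="databricks-dbrx-instruct", max_tokens=512)

agent = create_sql_agent(
    llm=llm,
    toolkit=SQLDatabaseToolkit(db=db, llm=llm),
    verbose=True,
)

agent.run("What share of calls have negative sentiment, broken down by topic?")
```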