transcripts_analytics_with_AI

Application: Extract Intelligence from raw call center transcripts using RAG

Architecture

[Architecture diagram]

Please follow the steps below:

Introduction and Setup Environment

  • Use the notebook config to define the name of your preferred catalog, schema, and volume
  • Then review notebook Introduction for an overview of the current PoC Template
  • Run notebook 00-setup to create a catalog, schema, volume, and download dataset to the volume
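The setup step above can be sketched as follows. This is a minimal, hypothetical illustration of what the config and 00-setup notebooks do (define names, then create the Unity Catalog objects); the catalog, schema, and volume names are placeholders, not the template's actual defaults.

```python
# Placeholder names -- set your preferred values in the `config` notebook.
CATALOG = "call_center"
SCHEMA = "transcripts"
VOLUME = "raw_data"

def setup_statements(catalog: str, schema: str, volume: str) -> list:
    """Build the Unity Catalog DDL that a setup notebook would run via spark.sql()."""
    return [
        f"CREATE CATALOG IF NOT EXISTS {catalog}",
        f"CREATE SCHEMA IF NOT EXISTS {catalog}.{schema}",
        f"CREATE VOLUME IF NOT EXISTS {catalog}.{schema}.{volume}",
    ]

# In a Databricks notebook you would execute each statement:
#   for stmt in setup_statements(CATALOG, SCHEMA, VOLUME):
#       spark.sql(stmt)
# and then download the dataset into /Volumes/<catalog>/<schema>/<volume>/
```

Running the statements in this order matters: the schema must exist inside the catalog before the volume can be created inside the schema.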

Step 1. Data Ingestion with Delta Live Tables

  • Create a Delta Live Tables pipeline using notebook 01-DLT-Transcript-Policy; the image below shows an example of the resulting pipeline. Please also refer to the DLT pipeline tutorial for how to set up a DLT pipeline. [Example pipeline image]
  • Run notebook 01.1-DLT-Transcript-Enriched-Persist-MV to create a copy of the materialized view produced by the DLT pipeline in the previous step. This step is needed due to a current limitation of DLT tables
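Besides the UI flow described above, a DLT pipeline can also be created programmatically through the Databricks Pipelines REST API (`POST /api/2.0/pipelines`). The sketch below builds a minimal request body; the field names follow the public API, but the pipeline name, notebook path, and catalog/schema values are placeholder assumptions.

```python
import json

def dlt_pipeline_payload(name, notebook_path, catalog, schema):
    """Build a request body for POST /api/2.0/pipelines.

    Hypothetical values -- adjust name, path, catalog, and schema to your setup.
    """
    return {
        "name": name,
        "catalog": catalog,          # Unity Catalog catalog to publish into
        "target": schema,            # schema where the DLT tables land
        "continuous": False,         # triggered (batch) mode, not continuous
        "libraries": [{"notebook": {"path": notebook_path}}],
    }

payload = dlt_pipeline_payload(
    "transcript-policy-pipeline",
    "/Repos/<user>/transcripts_analytics_with_AI/01-DLT-Transcript-Policy",
    "call_center", "transcripts",
)
body = json.dumps(payload)
# Send `body` to https://<workspace-url>/api/2.0/pipelines with your HTTP
# client of choice, authenticated with a personal access token.
```

Using the API keeps the pipeline definition in version control alongside the notebooks, instead of only in the workspace UI.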

Step 2. Perform LLM Summarization, Sentiment Analysis, and Classification with Databricks SQL AI Functions

  • Run notebook 02-GenAI-Text-Classification to perform summarization and sentiment analysis tasks in batch, using prompt engineering with the Databricks DBRX foundation model

  • We can create a Lakehouse dashboard (example below) based on the analyses performed by the AI functions. [Example dashboard image]
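The batch enrichment in this step relies on Databricks SQL AI Functions such as `ai_summarize`, `ai_analyze_sentiment`, and `ai_classify`. The sketch below builds one such query; the table and column names, and the classification labels, are illustrative assumptions rather than the notebook's actual values.

```python
def enrichment_query(table, text_col):
    """Build a Databricks SQL query that applies AI Functions to a text column.

    Hypothetical table/column/labels -- substitute the names from your pipeline.
    """
    return (
        f"SELECT {text_col},\n"
        f"  ai_summarize({text_col}) AS summary,\n"
        f"  ai_analyze_sentiment({text_col}) AS sentiment,\n"
        f"  ai_classify({text_col}, ARRAY('billing', 'support', 'sales')) AS topic\n"
        f"FROM {table}"
    )

# In a Databricks notebook: spark.sql(enrichment_query(...)) or run the SQL
# directly in a DBSQL query; the result can back a Lakehouse dashboard.
```

Expressing the enrichment as plain SQL is what makes the dashboard step straightforward: the AI-generated columns behave like any other columns in a query.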

Step 3. Using a DBSQL Agent to ask questions on the data