In traditional analytics, asking questions of data is a challenging task for business users because it requires technical expertise, such as SQL or Python, to unlock the insights hidden in the data. What if someone from a non-technical background wants to dive into the data and swim through it to make data-driven decisions? The rise of Large Language Models (LLMs) is beginning to bridge this gap. These AI systems understand natural language, empowering users to query data without technical expertise. Users can ask questions in plain English, such as "What were sales in the last quarter?", and receive not just a numerical fact but a detailed insight grounded in the contextual data.
At LokiBots, "Conversational Analytics" has emerged as a transformative technology that makes data readily available within the tool, empowering business users and analysts to explore data independently and focus on strategic decision-making. In this blog, we will look at how LokiBots uses conversational analytics to make data interaction more intuitive and accessible for everyone, with zero code.
Conversational Analytics is the process of extracting and analyzing organizational data using natural language queries and machine learning, allowing business users to make quick decisions. It acts as a personal data analyst, answering business questions instantly by generating actionable insights. Unlike traditional analytics, which primarily deals with structured data in predefined formats, conversational analytics also handles unstructured data, such as free-form text or speech. This capability has unlocked new opportunities for businesses across various industries.
LokiBots has incorporated generative AI into its platform to let users interact seamlessly with organizational data. The platform understands the deeper context of every question you ask and presents answers in a more detailed format.
Key features include:
· File uploads up to 200 MB.
· Questions entered in natural language.
· Support for multiple file formats and data connectors, including Excel, PDF, PowerPoint (PPT), Word documents (DOC), databases, and APIs, as well as text, audio, and video files.
· Handling of both structured and unstructured data.
· Actionable insights and recommendations, not just raw results.
A disconnect often exists between the vocabulary business users use when querying data and the terminology found in database schemas. Business users tend to use domain-specific or business-oriented language, while database schemas commonly rely on abbreviations or technical terms typical of ETL pipelines. Moreover, database schemas often lack the semantic context necessary for individuals unfamiliar with the dataset to fully grasp the information. This vocabulary gap makes it difficult to answer data queries with high precision.
Enhancing the dataset with semantic details—such as descriptive names, master data or properties, measures or values, and relationships between datasets—enriches its context. This added clarity enables data queries to be interpreted and resolved with greater accuracy and reliability.
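A minimal sketch of what such semantic enrichment could look like in practice. The column names, labels, and the `enrich_prompt` helper below are hypothetical illustrations, not LokiBots' actual schema or API:

```python
# Hypothetical semantic layer: maps terse ETL column names to
# business-friendly labels, roles, and units.
SEMANTIC_LAYER = {
    "cust_id": {"label": "Customer ID", "role": "dimension"},
    "ord_amt": {"label": "Order Amount", "role": "measure", "unit": "USD"},
    "ord_dt":  {"label": "Order Date", "role": "dimension"},
}

def enrich_prompt(question: str, layer: dict) -> str:
    """Prepend column descriptions so a query engine (or LLM) sees
    the business meaning of each technical column name."""
    lines = [
        f"- {col}: {meta['label']} ({meta['role']})"
        for col, meta in layer.items()
    ]
    return "Dataset columns:\n" + "\n".join(lines) + f"\n\nQuestion: {question}"

prompt = enrich_prompt("What were total sales last quarter?", SEMANTIC_LAYER)
```

With this context attached, a query about "sales" can be resolved to the `ord_amt` measure even though the word "sales" never appears in the schema.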
LokiBots uses different approaches to analyze structured and unstructured data. These approaches ensure the pipeline is optimized for the specific characteristics of each data type, balancing efficiency, accuracy, and interpretability.
For structured data, the pipeline extracts metadata, refines the user's prompt, and generates executable code that produces the answer. Unstructured data follows a similar pipeline, except the uploaded file and user prompt are passed directly to LLMs to generate actionable insights, instead of generating executable code.
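The split between the two pipelines can be sketched as a simple dispatch on file type. This is a hypothetical helper for illustration; LokiBots' actual routing logic is not described in detail here:

```python
import pathlib

# Extensions we assume indicate tabular, schema-bearing data.
STRUCTURED_EXTS = {".csv", ".xlsx", ".xls"}

def route(file_path: str) -> str:
    """Pick the pipeline branch based on the uploaded file's extension."""
    ext = pathlib.Path(file_path).suffix.lower()
    return "code_generation" if ext in STRUCTURED_EXTS else "direct_llm"
```

For example, `route("sales.xlsx")` would take the code-generation branch, while `route("report.pdf")` would go straight to the LLM.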
The difference in approach for structured and unstructured data in the pipeline arises from the inherent characteristics and handling requirements of each data type:
1. Structured Data:
Structured data, like Excel or CSV files, is highly organized and follows a predictable schema with rows and columns. This makes it well-suited for direct manipulation through code.
Why a Code-Generation step?
With structured data, generating and executing code allows for precise data manipulation, calculations, and transformations that are aligned with the user's query. This approach ensures efficiency and accuracy, particularly when working with large datasets or complex queries.
Process Efficiency:
By extracting metadata (e.g., column names, data types, or relationships), the pipeline can optimize the refinement of the user’s prompt and generate tailored executable code. This step enhances performance, as the structured format lends itself to deterministic operations.
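As a rough illustration of the metadata-extraction step, the sketch below reads a CSV header, infers a crude type for each column, and folds that metadata into a code-generation prompt. Everything here (the sample data, the type inference, the prompt template) is an assumed simplification, not LokiBots' implementation:

```python
import csv
import io

SAMPLE_CSV = "region,sales\nWest,1200\nEast,950\n"

def extract_metadata(csv_text: str) -> dict:
    """Read the header row and infer a crude type from the first data row."""
    rows = list(csv.reader(io.StringIO(csv_text)))
    header, first_row = rows[0], rows[1]

    def kind(value: str) -> str:
        try:
            float(value)
            return "numeric"
        except ValueError:
            return "text"

    return {col: kind(val) for col, val in zip(header, first_row)}

def build_code_prompt(question: str, metadata: dict) -> str:
    """Refine the user's question with column metadata so the generated
    code can reference real column names and types."""
    cols = ", ".join(f"{c} ({t})" for c, t in metadata.items())
    return f"Columns: {cols}. Write Python code to answer: {question}"

meta = extract_metadata(SAMPLE_CSV)
prompt = build_code_prompt("What are total sales by region?", meta)
```

The generated code can then be executed against the dataset deterministically, which is what makes this branch efficient and repeatable for large tables.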
2. Unstructured Data:
Unstructured data, such as text documents, images, or audio, lacks a predefined format or schema. Its analysis often requires interpretation rather than straightforward computation.
Why Skip Code Generation?
Unstructured data is typically processed directly by Large Language Models (LLMs) or other AI systems that excel at understanding and generating insights from raw, free-form data. Generating executable code for such data would add unnecessary complexity and might not align with the unpredictable nature of unstructured formats.
Direct LLM Usage:
Instead of generating code, the pipeline passes the uploaded file and user prompt directly to LLMs, which can interpret the content, extract meaning, and generate actionable insights without additional processing steps.
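A bare-bones sketch of this pass-through step. The `call_llm` function below is a hypothetical stand-in for a real model API call, included only to show the shape of the branch (no code-generation stage in between):

```python
def call_llm(prompt: str, attachment: bytes) -> str:
    """Hypothetical stand-in for a real LLM API call that accepts
    a prompt plus an attached document."""
    return f"[insight derived from {len(attachment)} bytes of content]"

def analyze_unstructured(file_bytes: bytes, question: str) -> str:
    # No metadata extraction or code generation: the file and the
    # user's question go straight to the model.
    prompt = f"Answer the following from the attached document: {question}"
    return call_llm(prompt, file_bytes)

answer = analyze_unstructured(b"quarterly report text...", "Summarize key risks.")
```

In a real system the stand-in would be replaced by a call to a multimodal model endpoint, but the control flow, file plus prompt in, insight out, stays this simple.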
The versatility of conversational analytics makes it relevant across a variety of industries.
Despite its many benefits, conversational analytics comes with its own set of challenges.
The future of conversational analytics is bright, with advancements in AI and NLP driving its evolution.
Conversational analytics is redefining how businesses interact with customers, employees, and stakeholders by transforming unstructured conversations into actionable insights. Whether enhancing customer experiences, improving employee engagement, or driving innovation, this technology is poised to play a pivotal role in modern analytics.
As AI and NLP continue to advance, the potential of conversational analytics to revolutionize industries will only grow, making it an indispensable tool in the contemporary business landscape.
I would like to extend my sincere gratitude to Sourin Karmakar, Senior Engineering Manager - RPA & ML, for providing the insightful demo video that greatly enriched this blog. I also appreciate the valuable assistance of Abishek R, Automation and ML Engineer, in compiling the content. Their expertise and contributions have been instrumental in bringing this post together.