

Data Analysis with AI-powered Copilot

Introduction:

In today’s data-driven world, the ability to extract meaningful insights from vast amounts of data is essential. Natural language input has emerged as a convenient way for users to interact with databases, allowing them to pose complex queries in everyday language. Consider the following example: “Compute the weekly sales total for orders created after July 1st, 1998 that have been fulfilled. Describe any observed trends and patterns.”
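For the example above, the SQL that a copilot ultimately generates and runs might look like the minimal sketch below. The orders table, its columns (order_date, shipped_date, total_amount), and the SQLite data mart are assumptions made purely for illustration.

```python
# Minimal sketch: the kind of SQL a request like the example might translate to,
# assuming a hypothetical "orders" table with order_date, shipped_date and
# total_amount columns in a SQLite data mart. Names are illustrative only.
import sqlite3
import pandas as pd

WEEKLY_SALES_SQL = """
SELECT
    strftime('%Y-%W', order_date) AS sales_week,       -- year-week bucket
    SUM(total_amount)             AS weekly_sales_total
FROM orders
WHERE order_date > '1998-07-01'                         -- ISO-formatted dates assumed
  AND shipped_date IS NOT NULL                          -- "fulfilled" orders
GROUP BY sales_week
ORDER BY sales_week;
"""

with sqlite3.connect("datamart.db") as conn:            -- hypothetical data mart file
    weekly_sales = pd.read_sql_query(WEEKLY_SALES_SQL, conn)

print(weekly_sales.head())
```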

To tackle such queries efficiently, a data analytics copilot works through a series of logical tasks:

  1. Natural Language Input: Users provide instructions in natural language, making database interaction intuitive and accessible.
  2. Task Breakdown by Copilot: The copilot breaks the instruction down into a sequence of logical tasks that it can execute with a set of predefined tools. Recognizing that the instruction must be converted into an SQL statement, it first retrieves the relevant database schema, Data Definition Language (DDL), and metadata through a dedicated tool.
  3. SQL Statement Generation: Armed with this information, the copilot passes the schema and metadata along with the user’s query to a large language model (LLM), which generates a matching SQL statement (see the first sketch after this list).
  4. Data Retrieval: The copilot then calls a tool to fetch the dataset from the data mart using the generated SQL statement. If the statement is flawed, the resulting error is reported back to the copilot, which retries or terminates the task.
  5. Dataset Parsing and Description: Upon successfully retrieving the dataset, the copilot parses it and generates a detailed description, including information on columns, data types, and values.
  6. Graphical Representation and Analysis: Next, the copilot provides the dataset description to an LLM, which generates Python code that dynamically creates a graph suited to both the dataset and the user’s query (see the second sketch after this list). The visualization makes trends and patterns easier to spot, and the LLM then analyzes the graph to describe them.
  7. Task Completion Verification: Finally, the copilot verifies that all tasks have been completed successfully. If any issues are detected, it keeps calling the necessary tools until every task is completed satisfactorily, and only then returns the results to the user.
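The first sketch below illustrates how steps 2 through 5 might be wired together. It makes several simplifying assumptions: get_schema_ddl and call_llm are hypothetical stand-ins for the copilot's schema tool and LLM client, and the data mart is assumed to be a SQLite database queried through pandas.

```python
# Simplified sketch of steps 2-5: schema lookup, LLM-based SQL generation,
# retrieval with retry, and dataset description. get_schema_ddl and call_llm
# are hypothetical stand-ins for the copilot's real tools and model client.
import sqlite3
import pandas as pd

MAX_RETRIES = 3

def get_schema_ddl(conn: sqlite3.Connection) -> str:
    """Tool: return the DDL of every table in the data mart."""
    rows = conn.execute(
        "SELECT sql FROM sqlite_master WHERE type = 'table'"
    ).fetchall()
    return "\n".join(ddl for (ddl,) in rows if ddl)

def call_llm(prompt: str) -> str:
    """Placeholder for the copilot's LLM client (e.g. a chat-completion call)."""
    raise NotImplementedError("wire up your LLM provider here")

def generate_sql(question: str, ddl: str, error: str | None = None) -> str:
    """Ask the LLM for a SQL statement; include the last error when retrying."""
    prompt = (
        "You are a SQL assistant. Given this schema:\n"
        f"{ddl}\n\n"
        f"Write one SQL query answering: {question}\n"
    )
    if error:
        prompt += f"\nThe previous attempt failed with: {error}. Fix it."
    return call_llm(prompt)

def fetch_dataset(question: str, conn: sqlite3.Connection) -> pd.DataFrame:
    """Steps 3-4: generate SQL, run it, and retry if the statement is flawed."""
    ddl, error = get_schema_ddl(conn), None
    for _ in range(MAX_RETRIES):
        sql = generate_sql(question, ddl, error)
        try:
            return pd.read_sql_query(sql, conn)
        except Exception as exc:          # flawed SQL: report the error and retry
            error = str(exc)
    raise RuntimeError(f"Could not produce valid SQL: {error}")

def describe_dataset(df: pd.DataFrame) -> str:
    """Step 5: summarize columns, data types and value ranges for the next LLM call."""
    lines = [f"{len(df)} rows, {len(df.columns)} columns"]
    for col in df.columns:
        lines.append(
            f"- {col} ({df[col].dtype}): "
            f"min={df[col].min()!r}, max={df[col].max()!r}"
        )
    return "\n".join(lines)
```

With these pieces, a call such as fetch_dataset("Compute the weekly sales total ...", conn) followed by describe_dataset(...) covers steps 3 through 5 of the workflow.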
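The second sketch covers steps 6 and 7: the kind of Python plotting code the LLM might generate for the weekly-sales example, followed by a final LLM call that asks for a description of trends. It assumes a result DataFrame like weekly_sales above and reuses the hypothetical call_llm and describe_dataset helpers; a real copilot might instead pass the chart image to a multimodal model and loop until verification succeeds.

```python
# Sketch of the kind of plotting code the LLM might generate for step 6,
# assuming a DataFrame with sales_week and weekly_sales_total columns.
import matplotlib.pyplot as plt
import pandas as pd

def plot_weekly_sales(weekly_sales: pd.DataFrame) -> str:
    """Render the weekly sales line chart and return the saved file path."""
    fig, ax = plt.subplots(figsize=(10, 4))
    ax.plot(weekly_sales["sales_week"], weekly_sales["weekly_sales_total"],
            marker="o")
    ax.set_xlabel("Week")
    ax.set_ylabel("Sales total")
    ax.set_title("Weekly sales for fulfilled orders after 1998-07-01")
    ax.tick_params(axis="x", labelrotation=90)
    fig.tight_layout()
    path = "weekly_sales.png"
    fig.savefig(path)
    plt.close(fig)
    return path

def analyze(weekly_sales: pd.DataFrame, description: str) -> str:
    """Steps 6-7: produce the chart, then ask the LLM to describe trends."""
    chart_path = plot_weekly_sales(weekly_sales)
    prompt = (
        "Here is a dataset summary and a line chart of weekly sales totals:\n"
        f"{description}\nChart saved at {chart_path}.\n"
        "Describe any observed trends and patterns."
    )
    return call_llm(prompt)   # reuses the placeholder LLM client from the sketch above
```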

Closing:

By seamlessly integrating AI into the data analysis process, tasks that traditionally required manual intervention and complex coding can now be efficiently managed with just a few simple inputs. This not only saves time but also enhances the accuracy and effectiveness of data analysis procedures.