AI-Brain – an AI-powered tool for internal use
Enhancing Workflow and Knowledge Management with AI-Brain at monday.com
Here at monday.com, within the BigBrain department (Meet the Geniuses Behind Our BI Tool, BigBrain), we’re dedicated to creating internal tools that benefit monday employees. With the recent justified hype around AI, we aimed to harness AI’s power for our employees’ benefit. Initially, our goal was to make ChatGPT accessible to our employees without risking the exposure of sensitive information. To achieve this, we utilized the Azure OpenAI service, providing us with a dedicated ChatGPT instance and ensuring all communication and data remained in-house.
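Conceptually, the chat layer is a thin proxy in front of that dedicated instance. The sketch below is a simplified illustration rather than our production code: it shows what a single call looks like with the Azure flavour of the OpenAI Python SDK, where the endpoint, key and deployment names are placeholders.

```python
# Minimal sketch: one chat call against a dedicated Azure OpenAI deployment.
# Environment-variable names and the deployment name are placeholders.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # stays inside our own Azure tenant
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def ask_chatgpt(user_message: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4-turbo",  # the Azure *deployment* name, not the public OpenAI API
        messages=[{"role": "user", "content": user_message}],
    )
    return response.choices[0].message.content
```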
Phase 1 simply looked like this:
Following that, our ambition grew to enable monday employees to ask any monday-related questions and receive answers based on our data sources. These sources include Guru cards, monday boards, monday docs, Google Docs, and even Slack channels.
We developed a dedicated microservice and database for this purpose, opting for Postgres with the pg-vector extension to store a vector representation of each document. This microservice periodically reads content links from a dedicated monday board, updates the database with the content, and stores an embedding vector for each document.
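To make the storage side concrete, here is a rough sketch of the ingestion job. The table layout, embedding deployment and helper names are assumptions for illustration, not our exact schema.

```python
# Illustrative sketch of the ingestion job; schema and names are assumptions.
import os
import psycopg2
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)
conn = psycopg2.connect(os.environ["DATABASE_URL"])

SCHEMA = """
CREATE EXTENSION IF NOT EXISTS vector;
CREATE TABLE IF NOT EXISTS documents (
    id         BIGSERIAL PRIMARY KEY,
    source_url TEXT,          -- the link taken from the dedicated monday board
    content    TEXT,          -- the document content
    embedding  VECTOR(1536)   -- dimension of the embedding model
);
"""

with conn.cursor() as cur:   # run once at deploy time
    cur.execute(SCHEMA)
conn.commit()

def embed(text: str) -> list[float]:
    # "text-embedding-ada-002" stands in for whichever embedding deployment is configured
    resp = client.embeddings.create(model="text-embedding-ada-002", input=text)
    return resp.data[0].embedding

def store_document(source_url: str, content: str) -> None:
    with conn.cursor() as cur:
        cur.execute(
            "INSERT INTO documents (source_url, content, embedding) VALUES (%s, %s, %s::vector)",
            (source_url, content, str(embed(content))),
        )
    conn.commit()
```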
We chose to split documents into chunks as a best practice for precision and to adhere to the embedding model’s token limitation. When a user asks a monday-related question, we convert their question into a vector, match it with the three most relevant documents, and send these documents one by one to our ChatGPT instance, along with the question, until we find a valid answer. Users can then continue their interaction with ChatGPT as usual or use the ask-monday mode as needed:
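Putting it together, the ask-monday flow roughly follows the sketch below, reusing the embed(), conn and ask_chatgpt() helpers from the sketches above. The naive character-based splitter and the NO_ANSWER convention are illustrative stand-ins for our actual chunking and answer-validation logic.

```python
# Sketch of the ask-monday flow; splitter and validity check are illustrative.
def split_into_chunks(text: str, max_chars: int = 4000) -> list[str]:
    # crude character-based chunking; a real splitter would be token-aware
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

def ask_monday(question: str) -> str | None:
    q_vec = str(embed(question))
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT content
            FROM documents
            ORDER BY embedding <=> %s::vector   -- cosine distance, nearest first
            LIMIT 3
            """,
            (q_vec,),
        )
        candidates = [row[0] for row in cur.fetchall()]

    for doc in candidates:
        answer = ask_chatgpt(
            "Answer the question using only the context below. "
            "If the context does not contain the answer, reply with 'NO_ANSWER'.\n\n"
            f"Context:\n{doc}\n\nQuestion: {question}"
        )
        if "NO_ANSWER" not in answer:
            return answer
    return None  # no valid answer found; fall back to plain ChatGPT mode
```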
Currently, every document is accessible to all monday.com employees. We are now in the process of adding document-specific roles, ensuring users access only the documents relevant to their permissions (e.g., the Legal team will have access to legal-only documents in addition to all-employee documents).
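One minimal way to express such document-level permissions (the allowed_roles column and role names here are hypothetical, not our actual role model) is to tag each document with the roles allowed to see it and filter inside the similarity query itself:

```python
# Hypothetical per-document role filter applied directly in the similarity query.
def retrieve_for_user(question: str, user_roles: list[str]) -> list[str]:
    q_vec = str(embed(question))
    with conn.cursor() as cur:
        cur.execute(
            """
            SELECT content
            FROM documents
            WHERE allowed_roles && %s                      -- user holds at least one allowed role
               OR allowed_roles @> ARRAY['all-employees']  -- documents visible to everyone
            ORDER BY embedding <=> %s::vector
            LIMIT 3
            """,
            (user_roles, q_vec),
        )
        return [row[0] for row in cur.fetchall()]
```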
During a recent 2-day AI-Hackathon we had in BigBrain, we further enhanced the system by allowing each user to add their custom context for ChatGPT mode (DIY Brain). Users can now integrate any supported data source with their own custom prompts, significantly personalising their AI-Brain experience.
The next phase evolved to look like this:
In parallel, we integrated all the common ChatGPT features, such as multiple chat histories, switching between different ChatGPT models (3.5, 4 Turbo, etc.), collecting user feedback, and anonymously monitoring ask-monday questions. This helps us understand our users’ information needs and add commonly searched but missing information. We also added various events to gauge user engagement and usage patterns. Here’s a partial snippet of our AI-Brain dashboard:
As word spread, more and more employees approached us with ideas and suggestions. Some of these were enabled almost effortlessly, requiring only that we expose the relevant REST API endpoints of the AI-Brain microservice (the general pattern is sketched after this list). Below are a few of the initiatives we implemented:
- Salesforce integration – to determine which salesperson a website inquiry should be routed to. Each potential-customer question left on the website was sent to AI-Brain, along with a prompt such as the following (see the classification sketch after this list):
“Act as a product consultant working for monday.com. There are 3 products that your company sells: 1. CRM, 2. Dev and 3. Work Management. Here is a description for each one: CRM - ... Please analyse the following inquiries from prospects and for each one classify the most relevant product. Please answer only with the product name. If the inquiry doesn't match any product, please reply with 0. Please reply only with the product name ('CRM'/'Dev'/'Work Management'). This is the inquiry:”
- Zendesk – For our CS management platform, we use AI-Brain to summarise correspondence between users and agents and gauge the sentiment, aiming to boost our CS agents’ performance.
- Sales – One sales manager mentioned to me that each salesperson, before approaching a new client, consults ChatGPT with several questions to learn about the company’s needs. I developed a simple shortcut, /company <company_name>, which sends several predefined questions along with the company name to ChatGPT.
- Event management – Within BigBrain, we amass a vast collection of event types triggered by a variety of services across monday.com. Our previous endeavour to catalogue these events (detailing their business implications, their fields and meanings, and their ownership) was unsuccessful, primarily due to our analysts’ time constraints. To address this, I personally conducted a modest yet successful experiment, now under review by one of our teams. Given our access to GitHub, where each event’s genesis lies within our codebase, I devised a new shortcut: /event <event_name>. This command searches our company’s repositories for the specified event name and forwards the matching code to ChatGPT, along with a request to elucidate the event’s business relevance and detail its various fields.
- GitHub code review – With our existing GitHub integration, I introduced an additional shortcut: /review <pull_request_link>. This feature sends the contents of a pull request to ChatGPT, accompanied by a prompt instructing it to conduct a code review. This integration streamlines the review process, leveraging AI to provide initial insights and feedback on code submissions.
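For the Salesforce routing above, the integration is essentially one classification call per inquiry. The sketch below reuses the ask_chatgpt() helper from earlier; the prompt is a condensed, illustrative version of the one quoted above, with the product descriptions elided just as in the original.

```python
# Illustrative classification of a website inquiry into one of three products.
CLASSIFY_PROMPT = (
    "Act as a product consultant working for monday.com. There are 3 products "
    "that your company sells: 1. CRM, 2. Dev and 3. Work Management. "
    "Please analyse the following inquiry from a prospect and classify the most "
    "relevant product. Reply only with the product name ('CRM'/'Dev'/'Work Management'), "
    "or with 0 if the inquiry doesn't match any product. This is the inquiry: "
)

def classify_inquiry(inquiry: str) -> str:
    return ask_chatgpt(CLASSIFY_PROMPT + inquiry).strip()

# classify_inquiry("We need a tool to manage our sales pipeline")
# would ideally come back as "CRM".
```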
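The shortcut-style integrations (/company, /event, /review) all follow the same pattern: build a prompt from the user’s argument and post it to the AI-Brain microservice over its REST API. The endpoint path, payload shape and prompt texts below are hypothetical placeholders, not the real AI-Brain API.

```python
# Hypothetical sketch of the shortcut pattern; not the real AI-Brain endpoint.
import requests

AI_BRAIN_CHAT_URL = "https://ai-brain.internal.example/api/chat"  # placeholder URL

SHORTCUT_PROMPTS = {
    "company": "Before a sales call with '{arg}', answer a few predefined questions "
               "about the company's industry, size and likely workflow-management needs.",
    "review":  "Please perform a code review of the following pull request contents:\n{arg}",
}

def run_shortcut(name: str, arg: str) -> str:
    prompt = SHORTCUT_PROMPTS[name].format(arg=arg)
    resp = requests.post(AI_BRAIN_CHAT_URL, json={"message": prompt}, timeout=60)
    resp.raise_for_status()
    return resp.json()["answer"]  # assumed response shape

# e.g. run_shortcut("company", "Acme Corp")
```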
The current phase now looks something like this (simplified for this article):
And this article? It was co-written with AI-Brain as well. 🙂