At monday.com, we faced a challenge that once seemed impossible – breaking apart our massive JavaScript client monolith, a task originally estimated to take 8 person-years of manual effort. With Morphex, our AI-powered migration system, we turned that scary timeline into just 6 months.
In this article, we share the story of leveraging AI in one of our most complex engineering projects: why we chose to build our own system, the practices we adopted, and what we learned along the way.
The journey to Morphex began during an “AI Month”[1] initiative at monday.com.
In a bold move, our team set the goal of shifting the entire company toward AI development. For that month, almost every engineer in the company worked either on building internal AI tools or on adding AI capabilities to our product.
One of the task forces we started aimed to take on an extremely ambitious goal. We wanted to pick a challenge that, on one hand, would stretch beyond what we thought was possible to achieve with AI and inspire our R&D organization. On the other hand, it would have an immense impact on our day-to-day. A moonshot.
And nothing fit better than splitting our giant client-side monolith and rebuilding it with a modern stack. For those not in the know, a monolith is a single codebase that accumulates an overwhelming amount of code, dependencies, and internal complexity.
monday.com’s client-side application is a vast, distributed, and extensible architecture that allows hundreds of our internal developers, together with thousands of developers (and AI agents) from the community, to enhance it every day. To support standalone widgets and components that are highly interactive and reactive to one another, it uses a centralized state system based on Redux, most of which resides in the monolith.
However, our monolith contained more than a decade’s worth of code: thousands of files holding thousands upon thousands of Redux-based actions, selectors, and reducers, along with hundreds of constants, services, utilities, and more. We had to untangle them and rewrite them from JavaScript to TypeScript while migrating to a Zustand-based state management system.
We set the ambitious goal of breaking the monolith in a mere 6 person-months.
Quickly, many questions and concerns arose: how could we map the codebase and its dependencies into work items? Would our bots work well in parallel, without conflicts? How would we create visibility into our progress?
Where would we even start?
Naturally, we chose monday.com to organize the work and manage the process. But turning a codebase of tens of millions of lines into a practical plan is no easy feat. Our first breakthrough was to map every file in the monolith into a monday.com board.
Morphex repeatedly scans and parses the entire client-side codebase into a board. Each file is mapped to an item and receives a score based on multiple data points.
Morphex progress board
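To make the scan-and-score idea concrete, here is a minimal sketch. The field names, weights, and the scoring formula are all illustrative assumptions, not Morphex’s actual heuristics:

```typescript
// Hypothetical per-file data points gathered by a codebase scan.
interface FileStats {
  path: string;
  linesOfCode: number;
  dependents: number; // how many other files import this one
  reduxSymbols: number; // actions/selectors/reducers referenced
}

// Illustrative score: heavier, more entangled files score higher.
// The weights here are made up for the example.
function migrationScore(f: FileStats): number {
  return f.linesOfCode * 0.1 + f.dependents * 5 + f.reduxSymbols * 2;
}

// Order files so the easiest extractions come first.
function prioritize(files: FileStats[]): FileStats[] {
  return [...files].sort((a, b) => migrationScore(a) - migrationScore(b));
}
```

In the real system, each scored file becomes an item on the board rather than an in-memory record.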
Then Morphex follows an iterative process.
The monday.com board plays two roles. It is a memory bank and task synchronizer for the AI’s continuous, parallel execution, and at the same time it serves as a task interface for us humans to manage and report progress, with the same items acting as the source of truth for both roles.
A monday.com dashboard based on Morphex’s board to track the progress.
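The task-synchronizer role can be sketched as a claim operation over board items. This is an in-memory illustration only; the real system works against monday.com board items, and the statuses and worker IDs below are hypothetical:

```typescript
type Status = "Pending" | "In Progress" | "Done" | "Failed";

interface BoardItem {
  id: number;
  file: string;
  status: Status;
  assignee?: string; // a bot ID or a human name
}

// A worker claims the next pending item and marks it as in progress.
// In a real parallel deployment this must be a single atomic mutation,
// so two bots can never grab the same file.
function claimNext(board: BoardItem[], worker: string): BoardItem | undefined {
  const item = board.find(i => i.status === "Pending");
  if (item) {
    item.status = "In Progress";
    item.assignee = worker;
  }
  return item;
}
```

Because humans read the same items, updating the status here doubles as progress reporting.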
We didn’t create Morphex because tools like Cursor and the Claude Code CLI weren’t good enough; we use them daily. We created it because they rely solely on raw AI and can’t carry out a very large task without extensive guidelines or a human to drive them. In our experience, when we gave the AI such complex tasks, it would get lost very quickly and start hallucinating. For example, it would suddenly decide that, “Okay, it’s tested and passing,” when in fact it wasn’t.
Breaking the task into smaller prompts, providing strict rules, or using tools like CursorRIPER yielded much better results, but even those weren’t independent enough to just “fire and forget”. In addition, to handle thousands of files, the process had to run in parallel and completely independently, without humans to prompt, approve, and resume.
Morphex embodies a hybrid approach, combining the power of AI with good ol’ deterministic code that orchestrates small prompts and traditional codemod tools.
It is built with the following principles:
Formulating a clear, deterministic migration plan with Node.js as the orchestrator rather than AI, composed of small steps in a well-defined order of execution, following the pattern of Research -> Plan -> Review.
Making each step as simple and straightforward as possible for the AI to carry out.
Validating each step the AI made before proceeding: ensuring the linter passes, tests pass, and a code review is performed, among other checks. Upon failure, Morphex retries and passes the previous failures as context for the next iteration. This was key to minimizing hallucinations and completing prompts successfully.
Incorporating codemods and static analysis: feeding dependency information into the prompt instead of having the AI analyze it itself, and performing simple extractions, such as constants, deterministically.
Deterministic flow broken into small steps, including verifications and commit steps.
Each step has multiple validations, ensuring high quality and task completion.
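The validate-and-retry principle above can be sketched as a small loop. The step and validators here are synchronous placeholders; in the real system the steps are asynchronous AI calls and the validators run linters, test suites, and AI code reviews:

```typescript
// A validator returns null when it passes, or a failure reason otherwise.
type Validator = (output: string) => string | null;

// Run a step, validate its output, and on failure retry with the
// accumulated failures passed back in as context.
function runWithRetries(
  step: (failureContext: string[]) => string,
  validators: Validator[],
  maxAttempts = 3,
): string {
  const failures: string[] = [];
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    const output = step(failures); // previous failures become context
    const errors = validators
      .map(v => v(output))
      .filter((e): e is string => e !== null);
    if (errors.length === 0) return output; // every validation passed
    failures.push(...errors); // feed failures into the next attempt
  }
  throw new Error(`Step failed after ${maxAttempts} attempts: ${failures.join("; ")}`);
}
```

Passing the failure history back into the prompt is what lets the next attempt correct itself instead of repeating the same mistake.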
Let’s take a deep dive into how Morphex carried out its tasks in collaboration with us humans, and see what the migration looks like in practice.
We start from the “before code” and examine the artifacts.
The “before code”
It generates the following result:
The “after code”
In this example, we can see how Morphex operated.
We taught Morphex to add Human Todos for review whenever it applied judgment and deviated from the original implementation, or whenever it needed to flag potential risks or areas requiring manual human review.
We practically gave the AI the tool to “activate a human”.
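The “activate a human” mechanism can be sketched as a marker comment plus a scanner that surfaces the markers for review. The marker text and function below are illustrative, not Morphex’s actual convention:

```typescript
// Illustrative marker the AI leaves wherever it applied judgment or
// deviated from the original implementation.
const MARKER = "HUMAN-TODO:";

// Collect every human-review note left in a source file.
function extractHumanTodos(source: string): string[] {
  return source
    .split("\n")
    .filter(line => line.includes(MARKER))
    .map(line => line.slice(line.indexOf(MARKER) + MARKER.length).trim());
}
```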
In addition, we see how the AI made sure the newly migrated code was kept behind a feature flag, allowing us to enable it gradually and monitor its performance in a controlled environment.
Feature flag created by Morphex
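The flag-gating pattern can be sketched as a switch between the two code paths. The flag name and lookup signature below are hypothetical, not monday.com’s actual feature-flag API:

```typescript
// Hypothetical feature-flag lookup.
type FlagLookup = (name: string) => boolean;

// Route between the migrated (Zustand-based) path and the legacy
// (Redux-based) path, falling back to legacy while the flag is off.
function getBoardState<T>(
  flags: FlagLookup,
  migrated: () => T,
  legacy: () => T,
): T {
  return flags("use_migrated_board_store") ? migrated() : legacy();
}
```

Keeping both paths alive behind the flag is what allows a gradual, monitored rollout and an instant rollback.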
Ultimately, Morphex added a comprehensive test suite for the extracted code and made sure to also run it as a comparison test side by side with the “before code”:
Tests by Morphex
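The side-by-side comparison idea can be sketched as a harness that runs both implementations on the same inputs and collects disagreements. The names and the JSON-based comparison are illustrative; real tests would use a proper deep-equality assertion:

```typescript
// Run the legacy ("before") and migrated ("after") implementations on the
// same inputs and return every input where they disagree.
function findMismatches<I, O>(
  before: (input: I) => O,
  after: (input: I) => O,
  inputs: I[],
): { input: I; before: O; after: O }[] {
  return inputs
    .map(input => ({ input, before: before(input), after: after(input) }))
    .filter(r => JSON.stringify(r.before) !== JSON.stringify(r.after));
}
```

An empty mismatch list is the signal that the migrated code is behaviorally equivalent to the original.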
Upon successful completion of the workflow, Morphex created a Pull Request and announced it in a joint channel with humans. It sent a Slack message summarizing the work and costs, broken into steps. The PR underwent a rigorous review by at least two human engineers, serving as the ultimate safeguard.
Slack message sent by Morphex
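The summary message can be sketched as a small formatter over per-step reports. The field names and message layout are hypothetical, not Morphex’s actual Slack payload:

```typescript
// Hypothetical per-step report included in the completion summary.
interface StepReport {
  name: string;
  attempts: number;
  costUsd: number;
}

// Build the human-readable summary: the PR link, each step with its
// attempt count and cost, and the total cost.
function buildSummary(prUrl: string, steps: StepReport[]): string {
  const total = steps.reduce((sum, s) => sum + s.costUsd, 0);
  const lines = steps.map(
    s => `- ${s.name}: ${s.attempts} attempt(s), $${s.costUsd.toFixed(2)}`,
  );
  return [
    "Morphex finished a migration and opened a PR for review:",
    prUrl,
    ...lines,
    `Total cost: $${total.toFixed(2)}`,
  ].join("\n");
}
```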
Morphex was born out of the need to tackle one of monday.com’s most ambitious engineering challenges: breaking free from our massive JavaScript monolith.
Existing AI tools weren’t sufficient for such scale and complexity, so we built a hybrid system that combined AI with deterministic orchestration, codemods, and rigorous validation loops. This approach gave us the confidence to safely extract high-impact, complex code while improving quality, documentation, and tests – all without incidents.
Through this journey, we learned that success with AI requires treating it not as a black box but as a managed workforce. Splitting large goals into small, verifiable steps, layering in codemods and validation, and ensuring human review at key points allowed us to overcome AI’s tendency to hallucinate or drift. Our biggest takeaway is that the right balance between automation and oversight transforms AI into a reliable engineering partner, capable of operating at a pace and scale that manual work could never match. And this isn’t all; we didn’t even talk about our Merge-a-thon, managing conflicts between simultaneous PRs, prompt structuring techniques and debugging tools, git history as context for prompts, and so much more. But there is only so much one can cover in a single article.
Ultimately, the results speak for themselves: once operational, Morphex extracted 1% of our client-side codebase in just a single day – a task once estimated at months of manual effort.
Without AI, we would probably have never set off to split the monolith, making the 8-year estimation feel more like a “never”.
Additionally, thank you Alon Segal, Amit Hanoch, Doron Brikman, Moshe Zemah, Neil Nachman, Ron Nachmany and Morphex 🙂
Tom Bogin, Software Engineering Tech Lead @ monday.com
Tech Lead in monday.com’s client foundation team. Founded Morphex and built various parts of the client infrastructure.
Oron Morad, Distinguished Software Engineer @ monday.com
Tech Lead in monday.com’s platform team. Led the initiative of the client monolith split and multiple key product initiatives in the platform.
[1] Read about our AI Month: Article