Of the available agentic frameworks, the team chose LangGraph for this POC. For more information, see LangGraph.
LangGraph provides the following benefits:
- Permits fully customizable agentic workflows tailored to specific business use cases, as opposed to boilerplate agent workflows that cannot easily be augmented. LangGraph also facilitates the use of custom prompts at each step: unlike some open-source frameworks, it does not inject its own underlying prompts into agent workflows, where they can interfere with custom prompts. This is especially important when using SLMs and open-source LLMs. Because the user writes the code for each step, additional data or performance logging, data validation steps, and other tools are easily incorporated into the required steps.
- Allows function calling to be emulated by implementing external tool calls together with conditional edges driven by LLM decisions. This enables the use of LLMs or SLMs that are not explicitly trained for function calling within workflows.
- Permits different LLMs to be used in different workflow steps and for different LLM roles. Some other frameworks allow a particular LLM to be assigned to a role, but not to an individual workflow step.
- Integrates tightly with the LangChain ecosystem, allowing access to the latest LLM features and the ability to swap LLMs as required. Further, LangChain can facilitate parsing and validation of LLM-generated content. LangChain's integration with Ollama is notable, as it allows open-source LLMs to be swapped easily without changing the core codebase or the prompts.
- Is a stateful implementation, meaning that the data transferred between workflow steps can be persisted, allowing for data backup or caching and for resumption in case of failures. As well as maintaining state history, LangGraph supports multiple tracked sessions that can run in parallel.
- Enables the use of tools such as LangSmith to visualize and trace LLM calls and costs through its UI.
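To make the conditional-edge pattern above concrete, the sketch below shows the shape of a stateful node graph in which a routing function (standing in for an LLM decision) picks the next node, emulating function calling without a function-calling-trained model. This is a minimal pure-Python illustration of the pattern, not the LangGraph API; the `MiniGraph` class, node names, and the keyword-based `decide` stub are all hypothetical.

```python
from typing import Callable, Dict, Optional

State = Dict[str, str]
END = "__end__"


class MiniGraph:
    """Toy stateful graph: nodes are functions State -> State, and a node
    may carry a router (conditional edge) that chooses the next node."""

    def __init__(self) -> None:
        self.nodes: Dict[str, Callable[[State], State]] = {}
        self.edges: Dict[str, str] = {}
        self.routers: Dict[str, Callable[[State], str]] = {}
        self.entry: Optional[str] = None

    def add_node(self, name: str, fn: Callable[[State], State]) -> None:
        self.nodes[name] = fn

    def add_edge(self, src: str, dst: str) -> None:
        self.edges[src] = dst

    def add_conditional_edge(self, src: str, router: Callable[[State], str]) -> None:
        self.routers[src] = router

    def invoke(self, state: State) -> State:
        node = self.entry
        while node != END:
            state = self.nodes[node](state)
            if node in self.routers:
                node = self.routers[node](state)  # the "LLM decision" picks the branch
            else:
                node = self.edges.get(node, END)
        return state


# Stub standing in for an LLM prompted to answer "tool" or "direct";
# a real workflow would make an LLM call here with a custom prompt.
def decide(state: State) -> State:
    state["decision"] = "tool" if "weather" in state["question"] else "direct"
    return state


def call_tool(state: State) -> State:
    state["tool_result"] = "sunny"  # stubbed external tool call (e.g., a weather API)
    return state


def answer(state: State) -> State:
    state["answer"] = state.get("tool_result", "answered directly")
    return state


g = MiniGraph()
g.add_node("decide", decide)
g.add_node("call_tool", call_tool)
g.add_node("answer", answer)
g.entry = "decide"
g.add_conditional_edge("decide", lambda s: "call_tool" if s["decision"] == "tool" else "answer")
g.add_edge("call_tool", "answer")
```

Because the router is an ordinary function over the shared state, any model that can be prompted to emit a branch label can drive it, which is why this pattern does not require an LLM explicitly trained for function calling.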
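The statefulness benefit can be sketched in the same spirit: checkpoint the shared state after each step, keyed by a session ("thread") id, so that independent sessions run in parallel and a failed run resumes from its last checkpoint. Again this is a hypothetical pure-Python illustration of the idea, not LangGraph's checkpointing API; the step functions and the in-memory `checkpoints` store are invented for the example, and a real deployment would persist checkpoints to disk or a database.

```python
from typing import Callable, Dict, List

State = Dict[str, int]


# Hypothetical three-step pipeline; each step transforms the shared state.
def step_a(s: State) -> State:
    s["a"] = 1
    return s


def step_b(s: State) -> State:
    s["b"] = s["a"] + 1
    return s


def step_c(s: State) -> State:
    s["c"] = s["b"] + 1
    return s


STEPS: List[Callable[[State], State]] = [step_a, step_b, step_c]

# In-memory checkpoint store keyed by session id; each entry records the
# next step to run and the state snapshot at that point.
checkpoints: Dict[str, Dict] = {}


def run(thread_id: str, fail_at: int = -1) -> State:
    """Run the pipeline for a session, resuming from its last checkpoint.
    fail_at simulates a crash just before the given step index."""
    ckpt = checkpoints.get(thread_id, {"step": 0, "state": {}})
    state, start = dict(ckpt["state"]), ckpt["step"]
    for i in range(start, len(STEPS)):
        if i == fail_at:
            raise RuntimeError("simulated failure")
        state = STEPS[i](state)
        checkpoints[thread_id] = {"step": i + 1, "state": dict(state)}
    return state
```

Calling `run("t1", fail_at=2)` completes and checkpoints the first two steps before failing; a subsequent `run("t1")` resumes at the third step instead of starting over, while a different thread id runs an entirely independent session.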