Learn how to build and deploy AI agents with LangGraph using watsonx.ai
Artificial intelligence no longer just responds; it also makes decisions. With frameworks like LangGraph and platforms like watsonx.ai, you can build agents that reason and act autonomously 🤯.
In this article, we will explain how to implement a ReAct (Reasoning + Action) agent locally and deploy it on IBM Cloud, using a practical example that includes a weather query tool 🌤️.
A practical guide to using your agents with LangGraph and watsonx.ai
Project architecture
- Local machine with the project: where you develop and test the agent with Python, LangGraph and its dependencies.
- ZIP package (pip-zip): bundles your code and additional tools.
- Software specification: the environment with the libraries needed to run the agent.
- watsonx.ai: the platform where you deploy the service as a REST API.
- IBM Cloud Object Storage: stores the deployment assets.
Let’s prepare the environment for our agent
We need:
- Python 3.12 installed
- Access to IBM Cloud and watsonx.ai
- Poetry for dependency management
Have you got everything? Well, first things first, clone the repository that we will use as an example. It is based on the official IBM examples.
git clone https://github.com/thomassuedbroecker/watsonx-agent-langgraph-deployment-example.git
cd ./agents/langgraph-arxiv-research
First of all, let’s understand the example project.
[Developer Workstation] → [CI/Build Process] → [Deployment]
                                                    ↓
                                    [IBM Cloud / watsonx.ai]
The main files of the agent are:
| File | Description |
| --- | --- |
| ai_service.py | Main file that starts the agent service in production. |
| agent.py | Core logic of the AI agent based on LangGraph. Defines the workflow. |
| tools.py | Tools connected to the agent (weather API). |
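To make the roles of these files concrete, here is a minimal sketch of how the tool and the graph could fit together. It is an illustration, not the repo's exact code: the function names, the Open-Meteo endpoints and the model parameters are assumptions.

# tools.py (sketch): a weather tool the agent can call.
import requests
from langchain_core.tools import tool

@tool
def get_weather(city: str) -> str:
    """Return the current weather for a city."""
    geo = requests.get(
        "https://geocoding-api.open-meteo.com/v1/search",
        params={"name": city, "count": 1},
        timeout=30,
    ).json()["results"][0]
    current = requests.get(
        "https://api.open-meteo.com/v1/forecast",
        params={
            "latitude": geo["latitude"],
            "longitude": geo["longitude"],
            "current_weather": "true",
        },
        timeout=30,
    ).json()["current_weather"]
    return f"{city}: {current['temperature']} °C, wind {current['windspeed']} km/h"

# agent.py (sketch): a ReAct workflow wired with LangGraph's prebuilt helper.
from langgraph.prebuilt import create_react_agent
from langchain_ibm import ChatWatsonx

model = ChatWatsonx(
    model_id="mistralai/mistral-large",      # matches model_id in config.toml
    url="https://us-south.ml.cloud.ibm.com",  # region is an assumption
    apikey="YOUR_APIKEY",
    space_id="YOUR_SPACE_ID",
)
agent = create_react_agent(model, tools=[get_weather])

In the real project these pieces live in separate modules; the sketch only shows how the tool, the model and create_react_agent relate to each other.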
Let’s configure the environment
python3.12 -m venv .venv
source ./.venv/bin/activate
python3 -m pip install --upgrade pip
python3 -m pip install poetry
We also recommend Anaconda or Miniconda: it makes managing virtual environments and Python packages simple and is widely used in ML.
In order for Python to find our custom modules (such as the agent and tools), we need to include the current directory in the PYTHONPATH environment variable:
export PYTHONPATH=$(pwd):${PYTHONPATH}
echo ${PYTHONPATH}
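If you want to confirm that the project modules are now importable, here is an optional sanity check (the module names are taken from the table above and may differ in your layout):

# Optional sanity check: verify Python can resolve the project modules.
import importlib.util

for module in ("agent", "tools"):
    found = importlib.util.find_spec(module) is not None
    print(f"{module}: {'found' if found else 'NOT found'}")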
Once the environment is ready, it is time for the variables. Create a config.toml file if you don't already have one and fill it with your IBM Cloud credentials:
[deployment]
watsonx_apikey = "YOUR_APIKEY"
space_id = "YOUR_SPACE_ID"
deployment_id = "YOUR_DEPLOYMENT_ID"

[deployment.custom]
model_id = "mistralai/mistral-large" # underlying model of WatsonxChat
thread_id = "thread-1" # More information: https://langchain-ai.github.io/langgraph/how-tos/persistence/
sw_runtime_spec = "runtime-24.1-py3.11"
You will find your variables here: https://dataplatform.cloud.ibm.com/developer-access
Once there, select your deployment space and copy the necessary data (API Key, Space ID, etc.).
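For reference, these values can be read from Python with the standard-library tomllib (available from Python 3.11); the repo's scripts may use a different loader, so treat this as a sketch:

# Sketch: load config.toml with the standard library (Python 3.11+).
import tomllib

with open("config.toml", "rb") as f:
    config = tomllib.load(f)

apikey = config["deployment"]["watsonx_apikey"]
space_id = config["deployment"]["space_id"]
model_id = config["deployment"]["custom"]["model_id"]
print(model_id)  # mistralai/mistral-large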
Running the agent locally
It is time to test the agent:
source ./.venv/bin/activate
poetry run python examples/execute_ai_service_locally.py
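Conceptually, this boils down to invoking the compiled LangGraph graph with a list of messages. A hedged sketch follows; the module and variable names are assumptions, and the repo's examples/execute_ai_service_locally.py is the authoritative version:

# Hypothetical local invocation of the compiled LangGraph agent.
from agent import agent  # assumed export name

result = agent.invoke(
    {"messages": [("user", "What is the current weather in Madrid?")]},
    config={"configurable": {"thread_id": "thread-1"}},  # thread_id from config.toml
)
print(result["messages"][-1].content)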
Since it's a weather agent, why don't you try it with something like…?
“What is the current weather in Madrid?”
The console should give you the weather in Madrid. Congratulations! Now we only need to deploy it to watsonx.ai.
Agent deployment in watsonx.ai
source ./.venv/bin/activate
poetry run python scripts/deploy.py
This script does the following:

- Reads the configuration (config.toml) with your credentials and deployment space.
- Packages your code in a ZIP file for upload to IBM Cloud (see the sketch after this list).
- Creates a custom software specification based on a base environment (such as runtime-24.1-py3.11).
- Deploys the agent as a REST service in watsonx.ai.
- Saves the deployment_id, needed to interact with the agent later.
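As an illustration of the packaging step alone, creating such a ZIP with the standard library could look like this (the file names come from the table above; the real deploy.py may include more assets):

# Illustration of the "package your code in a ZIP" step using only the stdlib.
import zipfile

FILES = ["ai_service.py", "agent.py", "tools.py"]  # per the table above

with zipfile.ZipFile("agent_package.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    for path in FILES:
        zf.write(path)

print("created agent_package.zip")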
✅ In short: it takes your local agent, prepares it and turns it into a cloud-accessible service.

Once deployed, you can query the agent with a JSON payload like this:
{
"messages": [
{
"content": "What is the weather in Malaga?",
"data": {
"endog": [
0
],
"exog": [
0
]
},
"role": "User"
}
]
}
You can test the existing deployment with the provided script:

source ./.venv/bin/activate
poetry run python examples/query_existing_deployment.py
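If you prefer to call the REST endpoint directly instead of using the helper script, a rough sketch with requests follows. The region, path and version parameter are assumptions based on the standard watsonx.ai deployment endpoints, so check the deployment details shown in your space:

# Rough sketch of calling the deployed AI service directly over REST.
import requests

APIKEY = "YOUR_APIKEY"
DEPLOYMENT_ID = "YOUR_DEPLOYMENT_ID"

# 1. Exchange the IBM Cloud API key for an IAM access token.
token = requests.post(
    "https://iam.cloud.ibm.com/identity/token",
    data={
        "grant_type": "urn:ibm:params:oauth:grant-type:apikey",
        "apikey": APIKEY,
    },
    timeout=30,
).json()["access_token"]

# 2. Send the payload shown above to the deployed AI service.
payload = {
    "messages": [
        {
            "content": "What is the weather in Malaga?",
            "data": {"endog": [0], "exog": [0]},
            "role": "User",
        }
    ]
}
response = requests.post(
    f"https://us-south.ml.cloud.ibm.com/ml/v4/deployments/{DEPLOYMENT_ID}/ai_service",
    params={"version": "2021-05-01"},
    headers={"Authorization": f"Bearer {token}"},
    json=payload,
    timeout=60,
)
print(response.json())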
Conclusions
If you have any doubts, we recommend the following video tutorial, where you can follow the development connected with watsonx.ai step by step.
If you want to continue exploring these types of implementations or learn more about cloud development and artificial intelligence, we invite you to check out our AI courses. 👇