<p align="center">
    <img src="assets/humanus.png" width="200" />
</p>
# humanus.cpp
Humanus (meaning "human" in Latin) is a lightweight framework inspired by [OpenManus](https://github.com/mannaandpoem/OpenManus) and [mem0](https://github.com/mem0ai/mem0), integrated with the Model Context Protocol (MCP). `humanus.cpp` enables more flexible tool choices and provides a foundation for building powerful local LLM agents.
Let's embrace local LLM agents **w/** humanus.cpp!
## Project Demo
## How to Build
```bash
git submodule update --init --recursive
cmake -B build
cmake --build build --config Release
```
## How to Run
### Configuration
Set up your own configuration first:
1. Copy the configuration files from `config/example` to `config`.
2. Replace `base_url`, `api_key`, etc. in `config/config_llm.toml`, and adjust the other settings in `config/config*.toml` as needed (see the sketch after this list).
> Note: `llama-server` in [llama.cpp](https://github.com/ggml-org/llama.cpp) also supports embedding models.
3. Fill in the `args` after `"@modelcontextprotocol/server-filesystem"` for `filesystem` to control which files may be accessed. For example:
```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{Username}/Desktop",
        "other/path/to/your/files"]
```
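For step 2, a minimal `config/config_llm.toml` could look like the sketch below. Only `base_url` and `api_key` are named above; the `[llm]` section name and the `model` key are assumptions for illustration, so match the keys to what `config/example` actually provides:

```toml
[llm]
base_url = "http://localhost:8080"   # e.g. a local llama-server endpoint (assumption)
api_key = "sk-placeholder"           # may be a dummy value for local servers
model = "qwen2.5-7b-instruct"        # hypothetical model name, replace with your own
```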
### `mcp_server`
(for tools; currently only `python_execute` is provided as an example)
Start an MCP server with the tool `python_execute` on port 8895 (or pass a port as an argument):
```bash
./build/bin/mcp_server <port>  # Unix/MacOS
```
```shell
.\build\bin\Release\mcp_server.exe <port>  # Windows
```
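Once the server is running, any MCP client can invoke the tool with a standard `tools/call` request. A rough sketch of the JSON-RPC message, assuming the tool takes a `code` argument (check the schema reported by `tools/list`):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "python_execute",
    "arguments": { "code": "print('hello from humanus')" }
  }
}
```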
### `humanus_cli`
Run with the tools `python_execute`, `filesystem`, and `playwright` (for browser use):
```bash
./build/bin/humanus_cli # Unix/MacOS
```
```shell
.\build\bin\Release\humanus_cli.exe # Windows
```
### `humanus_cli_plan` (WIP)
Run the planning flow (with only the `humanus` agent as executor):
```bash
./build/bin/humanus_cli_plan # Unix/MacOS
```
```shell
.\build\bin\Release\humanus_cli_plan.exe # Windows
```
### `humanus_server` (WIP)
Run agents in an MCP server (listening on port 8896 by default). It exposes the following tools:
- `humanus_initialize`: Pass a JSON configuration (like the one in `config/config.toml`) to initialize an agent for a session. (Only one agent is maintained per session/client.)
- `humanus_run`: Pass a `prompt` telling the agent what to do. (Only one task at a time.)
- `humanus_terminate`: Stop the current task.
- `humanus_status`: Get the current state and other information about the agent and the task. Returns:
  - `state`: Agent state.
  - `current_step`: Current step index of the agent.
  - `max_steps`: Maximum number of steps to execute without user interaction.
  - `prompt_tokens`: Prompt (input) token consumption.
  - `completion_tokens`: Completion (output) token consumption.
  - `log_buffer`: Logs in the buffer, as in `humanus_cli`. Cleared once fetched.
  - `result`: A summary of what the agent did. Non-empty once the task is finished.
```bash
./build/bin/humanus_server <port>  # Unix/MacOS
```
```shell
.\build\bin\Release\humanus_server.exe <port>  # Windows
```
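For reference, driving the agent from an MCP client then comes down to calling the tools listed above. A hedged sketch of a `humanus_run` request, sent after `humanus_initialize` has set up an agent for the session (the example prompt is hypothetical):

```json
{
  "jsonrpc": "2.0",
  "id": 2,
  "method": "tools/call",
  "params": {
    "name": "humanus_run",
    "arguments": { "prompt": "Summarize the files on my Desktop" }
  }
}
```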
Configure it in Cursor:
```json
{
"mcpServers": {
"humanus": {
"url": "http://localhost:8896/sse"
}
}
}
```
> What if we added `humanus` itself to `mcp_servers`? It might be interesting.
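A hedged sketch of what that could look like, assuming the MCP server config accepts an SSE entry with a `url` field (only `stdio` entries appear above, so these key names are guesses):

```toml
[humanus]
type = "sse"
url = "http://localhost:8896/sse"
```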
## Acknowledgement
<p align="center">
    <img src="assets/whu.png" height="180" />
    <img src="assets/myth.png" height="180" />
</p>
## Cite
```bibtex
@misc{humanus_cpp,
  author = {Zihong Zhang and Zuchao Li},
  title = {humanus.cpp: A Lightweight C++ Framework for Local LLM Agents},
  year = {2025}
}
```