<p align="center">
  <img src="assets/humanus.png" width="200"/>
</p>

# humanus.cpp

Humanus (meaning "human" in Latin) is a lightweight framework inspired by [OpenManus](https://github.com/mannaandpoem/OpenManus) and [mem0](https://github.com/mem0ai/mem0), integrated with the Model Context Protocol (MCP). `humanus.cpp` enables more flexible tool choices, and provides a foundation for building powerful local LLM agents.

Let's embrace local LLM agents **w/** humanus.cpp!
## Project Demo

## How to Build

```bash
git submodule update --init
cmake -B build
cmake --build build --config Release
```
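If the build succeeds, the executables used in the sections below should land in `build/bin/` (Unix/MacOS) or `build\bin\Release\` (Windows), matching the paths in the run commands:

```bash
ls build/bin   # should list mcp_server, humanus_cli, etc. on Unix/MacOS
```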
## How to Run

### Configuration

To set up your custom configuration, follow these steps:

1. Copy all files from `config/example` to `config`.

2. Replace `base_url`, `api_key`, etc. in `config/config_llm.toml`, and adjust the other settings in `config/config*.toml` to your needs (see the sketch after this list).

> Note: `llama-server` in [llama.cpp](https://github.com/ggml-org/llama.cpp) also supports embedding models.

3. Fill in the `args` after `"@modelcontextprotocol/server-filesystem"` for `filesystem` to control which files can be accessed. For example:
```toml
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
    "@modelcontextprotocol/server-filesystem",
    "/Users/{Username}/Desktop",
    "other/path/to/your/files"]
```
4. Ensure all requirements for the MCP servers are installed. For example, for `playwright`, run `npx playwright install` first.
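As a sketch of step 2, a `config/config_llm.toml` pointing at a local `llama-server` endpoint might look like the following. Only `base_url` and `api_key` are named in this README; the section header and `model` key are assumptions, so check the templates in `config/example` for the authoritative schema:

```toml
[llm]
base_url = "http://localhost:8080"   # e.g. a local llama-server endpoint (assumed)
api_key = "sk-dummy"                 # placeholder; local servers typically ignore it
model = "qwen2.5-7b-instruct"        # hypothetical key and value
```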
### `mcp_server`

(For tools, only `python_execute` is provided as an example for now.)

Start an MCP server with the tool `python_execute` on port 8895 (or pass a different port as an argument):
```bash
./build/bin/mcp_server <port> # Unix/MacOS
```

```shell
.\build\bin\Release\mcp_server.exe <port> # Windows
```
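For example, to start it on the default port explicitly:

```bash
./build/bin/mcp_server 8895
```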
### `humanus_cli`

Run with tools `python_execute`, `filesystem` and `playwright` (for browser use):

```bash
./build/bin/humanus_cli # Unix/MacOS
```

```shell
.\build\bin\Release\humanus_cli.exe # Windows
```
### `humanus_cli_plan` (WIP)

Run the planning flow (with only the agent `humanus` as executor):

```bash
./build/bin/humanus_cli_plan # Unix/MacOS
```

```shell
.\build\bin\Release\humanus_cli_plan.exe # Windows
```
### `humanus_server` (WIP)

Run agents in an MCP server (running on port 8896 by default):

- `humanus_initialze`: Pass a JSON configuration (like the one in `config/config.toml`) to initialize an agent for the session. (Only one agent is maintained per session/client.)
- `humanus_run`: Pass a `prompt` to tell the agent what to do. (Only one task runs at a time.)
- `humanus_terminate`: Stop the current task.
- `humanus_status`: Get the current state and other information about the agent and the task (see the illustrative payload after this list). Returns:
  - `state`: Agent state.
  - `current_step`: Current step index of the agent.
  - `max_steps`: Maximum number of steps to execute without user interaction.
  - `prompt_tokens`: Prompt (input) token consumption.
  - `completion_tokens`: Completion (output) token consumption.
  - `log_buffer`: Buffered logs, as in `humanus_cli`; cleared once fetched.
  - `result`: A summary of what the agent did. Not empty once the task has finished.
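For illustration, a `humanus_status` result could look like this; the field names come from the list above, while the values (and the exact types of `log_buffer` and `result`) are invented:

```json
{
  "state": "RUNNING",
  "current_step": 3,
  "max_steps": 30,
  "prompt_tokens": 1542,
  "completion_tokens": 287,
  "log_buffer": "[info] step 3/30: calling python_execute...",
  "result": ""
}
```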
```bash
./build/bin/humanus_server <port> # Unix/MacOS
```

```shell
.\build\bin\Release\humanus_server.exe <port> # Windows
```
Configure it in Cursor:

```json
{
  "mcpServers": {
    "humanus": {
      "url": "http://localhost:8896/sse"
    }
  }
}
```
> What if we add `humanus` itself to `mcp_servers`? It might be interesting.
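A hedged sketch of that idea, reusing the `[filesystem]`-style syntax from the configuration section (the `sse` type and `url` key are assumptions, mirroring the Cursor snippet above):

```toml
[humanus]
type = "sse"
url = "http://localhost:8896/sse"  # a running humanus_server (hypothetical wiring)
```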
## Acknowledgement

<p align="center">
  <img src="assets/whu.png" height="180"/>
  <img src="assets/myth.png" height="180"/>
</p>
## Cite

```bibtex
@misc{humanus_cpp,
    author = {Zihong Zhang and Zuchao Li},
    title = {humanus.cpp: A Lightweight C++ Framework for Local LLM Agents},
    year = {2025}
}
```