Introduction

Humanus (meaning "human" in Latin) is a lightweight framework inspired by OpenManus and integrated with the Model Context Protocol (MCP). humanus.cpp enables more flexible tool choices and provides a foundation for building powerful local LLM agents.

Let's embrace local LLM agents with humanus.cpp!

Overview

humanus.cpp/
├── 📄 config.cpp/.h               # Configuration system
├── 📄 llm.cpp/.h                  # LLM integration
├── 📄 logger.cpp/.h               # Logging system
├── 📄 main.cpp                    # Program entry point
├── 📄 prompt.cpp/.h               # Predefined prompts
├── 📄 schema.cpp/.h               # Data structure definitions
├── 📄 toml.hpp                    # TOML configuration parsing library
├── 📂 agent/                      # Agent module
│   ├── 📄 base.h                  # Base agent interface
│   ├── 📄 humanus.h               # Humanus core agent implementation
│   ├── 📄 react.h                 # ReAct agent implementation
│   └── 📄 toolcall.cpp/.h         # Tool-call agent implementation
├── 📂 flow/                       # Workflow module
│   ├── 📄 base.h                  # Base workflow interface
│   ├── 📄 flow_factory.h          # Workflow factory class
│   └── 📄 planning.cpp/.h         # Planning workflow implementation
├── 📂 mcp/                        # Model Context Protocol (MCP) implementation
├── 📂 memory/                     # Memory management module
│   ├── 📄 base.h                  # Base memory interface
│   └── 📂 mem0/                   # TODO: mem0 memory implementation
├── 📂 server/                     # Server module
│   ├── 📄 mcp_server_main.cpp     # MCP server entry point
│   └── 📄 python_execute.cpp      # Python execution environment integration
├── 📂 spdlog/                     # Third-party logging library
└── 📂 tool/                       # Tool module
    ├── 📄 base.h                  # Base tool interface
    ├── 📄 filesystem.h            # Filesystem operation tool
    ├── 📄 planning.cpp/.h         # Planning tool implementation
    ├── 📄 puppeteer.h             # Puppeteer browser automation tool
    ├── 📄 python_execute.h        # Python execution tool
    ├── 📄 terminate.h             # Termination tool
    └── 📄 tool_collection.h       # Tool collection definition

Features

How to Build

git submodule update --init --recursive

cmake -B build
cmake --build build --config Release

How to Run

Switch to your own configuration first:

  1. Replace base_url, api_key, etc. in config/config_llm.toml according to your needs (a hypothetical LLM entry is sketched after this list).
  2. Fill in the args after "@modelcontextprotocol/server-filesystem" for filesystem to control which files the agent may access. For example:
[filesystem]
type = "stdio"
command = "npx"
args = ["-y",
        "@modelcontextprotocol/server-filesystem",
        "/Users/{username}/Desktop",
        "other/path/to/your/files]

Start an MCP server with the tool python_execute on port 8818:

./build/bin/mcp_server # Unix/macOS
.\build\bin\Release\mcp_server.exe  # Windows
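
If the agent has to be pointed at this server explicitly, a remote MCP tool entry in the config would presumably look something like the sketch below. The section name, the "sse" type value, and the URL path are assumptions; adapt them to the actual config templates under config/.

[python_execute]
type = "sse"
url = "http://localhost:8818/sse"   # host and path are assumptions; port 8818 as above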

Run the agent Humanus with the tools python_execute, filesystem, and puppeteer (for browser use):

./build/bin/humanus_cli # Unix/macOS
.\build\bin\Release\humanus_cli.exe # Windows
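
The puppeteer tool is typically backed by an MCP server as well. An entry analogous to the filesystem one above might look like this; the section name and the npm package name are assumptions, so verify them against the config templates:

[puppeteer]
type = "stdio"
command = "npx"
args = ["-y", "@modelcontextprotocol/server-puppeteer"]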

Run the experimental planning flow (with only the agent Humanus as executor):

./build/bin/humanus_cli_plan # Unix/macOS
.\build\bin\Release\humanus_cli_plan.exe # Windows