Toolhouse

Overview

Toolhouse is a Backend-as-a-Service (BaaS) for building, running, and managing AI agents. Toolhouse simplifies building agents in a local environment and running them in production.

With Toolhouse, you define agents as code and deploy them as APIs with a single command. Toolhouse agents are automatically connected to the Toolhouse MCP Server, which gives agents access to RAG, memory, code execution, browser use, and the other functionality agents need to perform actions. You can add MCP Servers, and even define custom code the agent can use to perform actions not covered by public MCP Servers. Toolhouse also has built-in evals, prompt optimization, and agentic orchestration.

For more information about the framework, see the official Toolhouse documentation.


Integration via Python

Installation

pip install toolhouse openai python-dotenv

Optionally install additional SDKs (for example, pip install groq), depending on the target LLM platform.


Connection Setup

  1. Create a .env file in your project:

  2. Example Python integration (toolhouse_example.py):
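For step 1, the .env file might look like the following. The variable names are assumptions used by the example code below, not names required by either SDK:

```shell
TOOLHOUSE_API_KEY=your-toolhouse-api-key
AIMLAPI_API_KEY=your-aimlapi-api-key
```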


GUI Integration

The Toolhouse GUI (https://app.toolhouse.ai) supports:

  • API key management

  • Tool selection via Bundles

  • Agent execution & history

  • Monitoring tool calls in logs

Tool configuration is managed entirely through the GUI and is reflected in tool discovery (th.get_tools()).


Integration via TypeScript

Installation

Install the required dependencies:
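A typical install, assuming the Toolhouse TypeScript SDK package name @toolhouseai/sdk and the tsx runner for executing .ts files directly:

```shell
npm install @toolhouseai/sdk openai dotenv
# optional: TypeScript tooling for running .ts files directly
npm install -D typescript tsx
```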


Connection Setup

  1. Create a .env file in the project root:

  2. Create a TypeScript file (toolhouse.ts) with the following content:

  3. Run the script:


GUI Integration

Toolhouse provides a browser-based GUI at app.toolhouse.ai where you can:

  • Manage API keys

  • Add and organize tools via Bundles

  • Monitor execution logs

  • Trigger and test agents visually

βœ… Toolhouse integration with AIMLAPI is fully supported via baseURL override in the OpenAI-compatible SDK.


βœ… Supported AIMLAPI Models

All chat-compatible models served by AIMLAPI are supported, including:

  • Mistralai – Mistral-7B-Instruct, Mixtral-8x7B

  • Meta – Meta-LLaMA-3.1, LLaMA-3.3

  • Anthropic – Claude-3.5-Haiku

  • NVIDIA – Nemotron-70B

  • Google, xAI, Alibaba, Cohere, DeepSeek – all supported through the unified https://api.aimlapi.com/v1 endpoint.

πŸ“˜ View our full text (chat) model catalog.


βš™οΈ Supported Parameters

No AIMLAPI-specific parameter differences were found. Use standard OpenAI-compatible parameters:

  • model

  • messages

  • temperature

  • max_tokens

  • stream

  • tools (Toolhouse integration)
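A request sketch using only the standard parameters listed above; the model name is an example from the AIMLAPI catalog:

```python
# Standard OpenAI-compatible request parameters; no AIMLAPI-specific fields.
payload = {
    "model": "mistralai/Mixtral-8x7B-Instruct-v0.1",
    "messages": [{"role": "user", "content": "Hello!"}],
    "temperature": 0.7,  # sampling randomness
    "max_tokens": 512,   # cap on response length
    "stream": False,     # set True to receive incremental chunks
}
# With Toolhouse, pass the discovered tools alongside the standard parameters:
# client.chat.completions.create(**payload, tools=th.get_tools())
```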


🧠 Supported Call Features

Feature             Via Python      Via TypeScript
Synchronous calls   ✅              ✅
Asynchronous use    🟡 (manual)     ✅ (via Promises)
Tool Calling        ✅              ✅
Streaming           ✅              ✅
Threads             ❌              ❌
Local tools         ✅              ✅ (via registerLocalTool())
