Documentation Index
Fetch the complete documentation index at: https://docs.useterse.ai/llms.txt
Use this file to discover all available pages before exploring further.
Terse is currently in private beta. Email us at support@useterse.ai to request access, and let us know what use cases you’re looking to build.
Quickstart
Get a workflow live in under 5 minutes.
Context as Code
How generated helpers make workflows more stable.
How it works
AI agents are fast to build but make mistakes. Deterministic workflows are reliable but slow to set up. Terse lets you mix both in the same workflow: call tools directly when you need precision, hand off to an agent when you need judgment. Deploy serverlessly with terse deploy, and build faster with an SDK generated from your workspace.
Run terse deploy from the CLI to package and deploy your workflow. Terse handles hosting, execution, and scaling for you. Learn more about how deployment works.
Why Terse
Code first, version controlled
Workflows are TypeScript in your repo. Review them in PRs, track changes in git history, and collaborate with your team the same way you ship any other code. No more black-box automations buried in a UI builder that nobody can audit or roll back.
Fewer tokens, better results
Terse curates the tools available to the LLM based on the skills you define. Instead of dumping hundreds of MCP tools into context and hoping the model picks the right one, your agent only sees what it needs. Fewer tokens in, more reliable outputs out.
Secure by default
Integration credentials live in Terse’s secret manager with automatic OAuth token refresh. Your code never touches API keys or tokens. Each skill is scoped to specific resources (repos, channels, lists), and tool-level approval gates let you control exactly what the agent can do. Compare that to raw MCP servers where any connected tool has full access to everything.
Deterministic when you want it
Not everything needs an LLM. Call Agent.tools.slack.sendMessage() directly for predictable operations, then hand off to Agent.runAndWait() when you need model judgment. Mix both in the same workflow.
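As a rough sketch of that pattern, the snippet below stubs out the two calls named above (Agent.tools.slack.sendMessage and Agent.runAndWait); the real shapes come from Terse's generated SDK and may differ:

```typescript
// Illustrative stubs only: in a real workflow, Agent comes from the
// generated SDK and these calls go through Terse's runtime.
const Agent = {
  tools: {
    slack: {
      // Deterministic tool call: no model involved.
      async sendMessage({ channel, text }: { channel: string; text: string }): Promise<void> {
        console.log(`[deterministic] posting to ${channel}: ${text}`);
      },
    },
  },
  // Agent handoff: hands a natural-language task to the model and waits.
  async runAndWait(prompt: string): Promise<string> {
    return `agent result for: ${prompt}`;
  },
};

async function onDealStageChange(deal: { name: string; stage: string }): Promise<string> {
  // Precision: post an exact, predictable message.
  await Agent.tools.slack.sendMessage({
    channel: "#deals",
    text: `${deal.name} moved to ${deal.stage}`,
  });
  // Judgment: let the agent decide how to summarize next steps.
  return Agent.runAndWait(
    `Suggest next steps for deal "${deal.name}" now in stage "${deal.stage}"`,
  );
}
```

The point of the split is that the Slack post never varies, while the summary is exactly the kind of open-ended step worth delegating to a model.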
Full observability
Every run is tracked with a complete action log: what tools were called, what changed, and where. Failed runs surface the error. Approval-gated tools pause execution and resume when approved. No more wondering what your automation did at 3am.
Supported integrations
GitHub
Slack
Attio
Linear
Notion
Snowflake
Gmail
PostHog
Datadog
LaunchDarkly
What you can build
CRM enrichment
Enrich new deals with Apollo, score fit, and write the result back to Attio.
Deal alerts
Post Slack alerts when a deal hits a key stage.
Weekly pipeline reporting
Summarize open pipeline weekly and post the digest to Slack.
Additional Templates
Start from a working GTM workflow instead of a blank file.
Where to start
Quickstart
Get a weekly pipeline digest live in under 10 minutes.
Context as Code
Understand how generated helpers make workflows more stable.
Hosting & Deployment
Learn how terse deploy packages and runs your code serverlessly.
Compare approaches
See where Terse fits relative to Zapier, custom scripts, and AI agents.
Key concepts
| Concept | What it means |
|---|---|
| Context as Code | terse generate compiles your connected integrations into generated helpers, removing the need for runtime discovery of IDs, channels, or lists |
| Workflow | A TypeScript automation with triggers, skills, and a handler, deployed via terse deploy |
| Serverless hosting | terse deploy packages your code and hosts it for you. Workflows run in isolated containers that spin up on trigger and tear down on completion. No servers to manage. Learn more. |
| Trigger | The event (e.g. Triggers.github.onPROpened()) or schedule (e.g. Triggers.schedule.cron()) that starts a workflow |
| Skill | An integration capability available to the workflow (e.g. Skills.slack(), Skills.github()) |
| Generated helpers | Resource constants, trigger builders, skill constructors, and deterministic wrappers |
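Putting the concepts from the table together, a workflow file might look roughly like this. The defineWorkflow, Triggers, and Skills shapes below are illustrative stubs, not the real generated SDK:

```typescript
// Illustrative stubs: real trigger builders and skill constructors
// are produced by `terse generate` and may have different shapes.
type Trigger = { kind: string; expr?: string };
type Skill = { name: string };

const Triggers = {
  schedule: { cron: (expr: string): Trigger => ({ kind: "cron", expr }) },
};
const Skills = {
  slack: (): Skill => ({ name: "slack" }),
};

// A workflow bundles the three pieces from the table:
// a trigger, a set of skills, and a handler.
function defineWorkflow(w: {
  trigger: Trigger;
  skills: Skill[];
  handler: () => Promise<string>;
}) {
  return w;
}

const weeklyDigest = defineWorkflow({
  trigger: Triggers.schedule.cron("0 9 * * MON"), // every Monday at 9am
  skills: [Skills.slack()],
  async handler() {
    // e.g. summarize open pipeline and post the digest to Slack
    return "digest posted";
  },
});
```

Deployed with terse deploy, a workflow like this would run in an isolated container each time its trigger fires.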
