# Quick Start
Get up and running with oneprompt in 5 minutes.
## 1. Initialize a project
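From an empty working directory, scaffold a new project with the `op` CLI. The exact subcommand is an assumption here (only `op start` appears later in this guide), so check `op --help` if it differs:

```bash
# Assumed scaffold command; verify the subcommand with `op --help`
op init
```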
This creates the following files in your current directory:
| File | Purpose |
|---|---|
| `.env` | Configuration — API key and database URL |
| `DATABASE.md` | Schema documentation template |
| `docker-compose.yml` | Docker stack for MCP servers |
| `example.py` | Ready-to-run example script |
## 2. Configure credentials
Edit `.env` with your API key and database connection:
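As a rough sketch, the file holds the Gemini API key and the database connection string. The exact variable names below are assumptions; keep whatever keys the generated `.env` already contains:

```bash
# Variable names are assumptions; keep the keys from the generated .env
GEMINI_API_KEY=your-gemini-api-key
DATABASE_URL=postgresql://user:password@localhost:5432/mydb
```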
> **Get your API key**
> Get a free Gemini API key at Google AI Studio.
## 3. Document your schema
Edit `DATABASE.md` to describe your tables, columns, and relationships. The more detail you provide, the better the AI will write SQL queries.
```markdown
# Database Schema

## Tables

### products

| Column | Type | Description |
|--------|------|-------------|
| id | integer | Primary key |
| name | text | Product name |
| price | numeric | Unit price in USD |
| category | text | Product category |

### orders

| Column | Type | Description |
|--------|------|-------------|
| id | integer | Primary key |
| product_id | integer | FK → products.id |
| quantity | integer | Units ordered |
| total | numeric | Order total |
| created_at | timestamp | Order date |

## Relationships

- products.id → orders.product_id (one product, many orders)
```
See the Schema Documentation Guide for the full recommended format.
## 4. Start services
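Bring up the Docker stack with the `op` CLI:

```bash
op start
```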
This builds and launches 4 Docker containers:
| Service | Port | Description |
|---|---|---|
| Artifact Store | 3336 | Generated file storage (CSV, JSON, HTML) |
| PostgreSQL MCP | 3333 | SQL query execution |
| Chart MCP | 3334 | AntV chart generation |
| Python MCP | 3335 | Sandboxed Python execution |
> **First run**
> The first `op start` builds Docker images, which may take a few minutes. Subsequent starts are much faster.
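To confirm the services are up, a standard Docker Compose check works (assuming Docker Compose v2; older installs use `docker-compose ps`):

```bash
docker compose ps
```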
## 5. Run your first query
```python
import oneprompt as op

client = op.Client()  # Reads from .env automatically

# Query your database
result = client.query("What are the top 10 products by revenue?")
print(result.summary)
print(result.preview)
```
Or run the generated example:
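```bash
# example.py is generated by step 1
python example.py
```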
## 6. Generate a chart
```python
# Use the query result as input
chart = client.chart("Bar chart of top products", data_from=result)
print(chart.summary)

# Download the chart HTML and open it in your browser
for art in chart.artifacts:
    art.download("./output/")
```
## 7. Run Python analysis
```python
analysis = client.analyze("Calculate descriptive statistics", data_from=result)
print(analysis.summary)

# Read or download output artifacts
for art in analysis.artifacts:
    print(art.read_text())
    art.download("./output/")
```
## What's next?
- Configuration — Customize settings, ports, and model
- Schema Documentation — Write better schema docs for more accurate queries
- Chaining Agents — Pipe results between query, analyze, and chart
- Client Reference — Full API reference for the Python SDK