⚠️ v2 Alpha — This is the v2 branch of Prompty, currently in alpha. The API, file format, and tooling are under active development and may change. Feedback welcome via Issues.
Prompty is a markdown file format (.prompty) for LLM prompts. Write your prompt once — run it from VS Code, Python, or TypeScript.
```markdown
---
name: greeting
model:
  id: gpt-4o-mini
  provider: openai
  connection:
    kind: key
    apiKey: ${env:OPENAI_API_KEY}
template:
  format:
    kind: jinja2
  parser:
    kind: prompty
---
system:
You are a friendly assistant.

user:
Say hello to {{name}}.
```
**Python**

```shell
pip install "prompty[jinja2,openai]"
```

```python
import prompty

result = prompty.invoke("greeting.prompty", inputs={"name": "Jane"})
print(result)
```

**TypeScript**

```shell
npm install @prompty/core @prompty/openai
```

```typescript
import { invoke } from "@prompty/core";
import "@prompty/openai";

const result = await invoke("greeting.prompty", { name: "Jane" });
console.log(result);
```

**VS Code** — open the `.prompty` file and press F5.
Prompty normalizes text files to LF line endings via `.gitattributes`. Enable the repo hook once per clone so staged files are normalized before each commit and whitespace errors are blocked locally:

```shell
git config core.hooksPath .githooks
```

Before opening a PR, you can run the same core hygiene checks directly:

```shell
git diff --check
git ls-files --eol | grep 'w/crlf'
```

The v2 extension includes a connections sidebar, live preview, chat mode, and a redesigned trace viewer.
- **Scaffolding** — right-click in the explorer → New Prompty to scaffold a new prompt file.
- **Live preview** — see the rendered prompt with live markdown rendering and template interpolation as you type.
- **Connections sidebar** — manage model connections from the sidebar: add OpenAI, Microsoft Foundry, or Anthropic endpoints, set a default, and browse available models.
- **Chat mode** — thread-enabled prompts automatically open an interactive chat panel with tool-calling support.
- **Trace viewer** — every execution generates a `.tracy` trace file. Click to inspect the full pipeline — render, parse, execute, process — with timing and payloads.
```shell
pip install "prompty[all]"                # everything
pip install "prompty[jinja2,openai]"      # just OpenAI
pip install "prompty[jinja2,foundry]"     # Microsoft Foundry
pip install "prompty[jinja2,anthropic]"   # Anthropic
```

```python
import prompty

# Full pipeline: load → render → parse → execute → process
result = prompty.invoke("my-prompt.prompty", inputs={...})

# Step-by-step
agent = prompty.load("my-prompt.prompty")
messages = prompty.prepare(agent, inputs={...})
result = prompty.run(agent, messages)

# Async
result = await prompty.invoke_async("my-prompt.prompty", inputs={...})
```

See `runtime/python/prompty/README.md` for full API docs.
```shell
npm install @prompty/core @prompty/openai     # OpenAI
npm install @prompty/core @prompty/foundry    # Microsoft Foundry
npm install @prompty/core @prompty/anthropic  # Anthropic
```

```typescript
import { load, prepare, run, invoke } from "@prompty/core";
import "@prompty/openai"; // registers the provider

// Full pipeline
const result = await invoke("my-prompt.prompty", { name: "Jane" });

// Step-by-step
const agent = await load("my-prompt.prompty");
const messages = await prepare(agent, { name: "Jane" });
const stepResult = await run(agent, messages);
```

See `runtime/typescript/packages/core/README.md` for full API docs.
A .prompty file has two parts: YAML frontmatter (model config, inputs, tools) and a markdown body (the prompt with role markers and template syntax).
```markdown
---
name: my-prompt
model:
  id: gpt-4o
  provider: foundry
  connection:
    kind: key
    endpoint: ${env:AZURE_OPENAI_ENDPOINT}
    apiKey: ${env:AZURE_OPENAI_API_KEY}
  options:
    temperature: 0.7
inputs:
  - name: question
    kind: string
    default: What is the meaning of life?
tools:
  - name: get_weather
    kind: function
    description: Get the current weather
    parameters:
      - name: location
        kind: string
template:
  format:
    kind: jinja2
  parser:
    kind: prompty
---
system:
You are a helpful assistant.

user:
{{question}}
```
Lines starting with `system:`, `user:`, or `assistant:` define message boundaries.
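For intuition, the role-marker convention can be sketched in a few lines of Python. This is a simplified illustration, not Prompty's actual parser, and `split_roles` is a hypothetical helper:

```python
import re

def split_roles(body: str) -> list[dict]:
    """Split a prompt body on system:/user:/assistant: marker lines
    into chat messages. Illustrative sketch only."""
    messages = []
    role, lines = None, []
    for line in body.splitlines():
        m = re.fullmatch(r"(system|user|assistant):\s*", line)
        if m:
            if role is not None:
                messages.append({"role": role, "content": "\n".join(lines).strip()})
            role, lines = m.group(1), []
        elif role is not None:
            lines.append(line)
    if role is not None:
        messages.append({"role": role, "content": "\n".join(lines).strip()})
    return messages

body = "system:\nYou are a friendly assistant.\nuser:\nSay hello to {{name}}."
print(split_roles(body))
# [{'role': 'system', 'content': 'You are a friendly assistant.'},
#  {'role': 'user', 'content': 'Say hello to {{name}}.'}]
```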
Template syntax is Jinja2 (`{{variable}}`, `{% if %}`, `{% for %}`) or Mustache (`{{variable}}`, `{{#section}}`).
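To show what `{{variable}}` interpolation (common to both formats) does, here is a toy substitute for a real Jinja2 or Mustache engine — a sketch only, with none of a real engine's conditionals, loops, or escaping:

```python
import re

def interpolate(template: str, inputs: dict) -> str:
    """Replace {{name}} placeholders with values from `inputs`.
    Unknown placeholders are left untouched. Toy sketch only."""
    return re.sub(
        r"\{\{\s*(\w+)\s*\}\}",
        lambda m: str(inputs.get(m.group(1), m.group(0))),
        template,
    )

print(interpolate("Say hello to {{name}}.", {"name": "Jane"}))  # Say hello to Jane.
```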
| Syntax | Purpose |
|---|---|
| `${env:VAR}` | Environment variable (required) |
| `${env:VAR:default}` | With fallback value |
| `${file:path.json}` | Load file content |
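As a rough sketch of the resolution rules in the table above (not Prompty's actual implementation — `resolve_env` and the variable names are illustrative), the `${env:...}` forms could be handled like this:

```python
import os
import re

def resolve_env(value: str) -> str:
    """Expand ${env:VAR} and ${env:VAR:default} references.
    Illustrative sketch; raises if a required variable is unset."""
    def repl(m):
        var, default = m.group(1), m.group(2)
        if var in os.environ:
            return os.environ[var]
        if default is not None:
            return default
        raise KeyError(f"required environment variable {var} is not set")
    return re.sub(r"\$\{env:([A-Za-z_][A-Za-z0-9_]*)(?::([^}]*))?\}", repl, value)

os.environ["MY_KEY"] = "sk-test"  # hypothetical variable, for demonstration
print(resolve_env("${env:MY_KEY}"))                # sk-test
print(resolve_env("${env:MISSING_VAR:fallback}"))  # fallback
```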
Prompty v1 files are automatically migrated with deprecation warnings. See the Python README for details.
See SUPPORT.md for help and CODE_OF_CONDUCT.md for community guidelines.
To release a new version, see RELEASING.md.