Documentation Index
Fetch the complete documentation index at: https://docs.dottxt.ai/llms.txt
Use this file to discover all available pages before exploring further.
The dottxt Python SDK accepts a wide range of Python types as `response_format` on `generate(...)`, so for many shapes you don't need to write or generate JSON Schema at all.
`DotTxt.generate(...)` and `AsyncDotTxt.generate(...)` accept `response_format` as any of:

- a Pydantic model class
- a `TypedDict` type
- a dataclass type
- an `Enum` class
- a `typing.Literal[...]` type
- a `typing.Union[...]` type
- a `typing.Optional[...]` type
- typed containers such as `list[...]`, `dict[...]`, `tuple[...]`
- a JSON string containing JSON Schema
- a JSON object (`dict`)
The return type follows the input: a Pydantic model class returns a validated model instance, and other supported types return parsed JSON.
Pydantic model
Use a Pydantic model when you want runtime validation alongside generation. The result is a validated model instance.
```python
from typing import Literal

from pydantic import BaseModel, Field

from dottxt import DotTxt


class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)


client = DotTxt()
result = client.generate(
    model="openai/gpt-oss-20b",
    input=(
        "Summarize this incident: checkout errors are blocking purchases. "
        "Return a JSON object with keys severity and team."
    ),
    response_format=IncidentSummary,
)
print(result)
# severity='high' team='checkout'
print(result.model_dump())
# {'severity': 'high', 'team': 'checkout'}
```
See Pydantic for the full schema mapping.
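To preview the JSON Schema a model implies before sending it anywhere, Pydantic v2 can emit it directly. This is a local inspection sketch using Pydantic's own `model_json_schema()`; it shows standard Pydantic behavior, not necessarily the exact schema the SDK sends on the wire:

```python
from typing import Literal

from pydantic import BaseModel, Field


class IncidentSummary(BaseModel):
    severity: Literal["low", "medium", "high"]
    team: str = Field(max_length=32)


# Pydantic v2 renders the model as standard JSON Schema.
schema = IncidentSummary.model_json_schema()
print(schema["properties"]["team"]["maxLength"])
# 32
print(schema["required"])
# ['severity', 'team']
```

The `Literal` field becomes an `enum` constraint and `Field(max_length=32)` becomes `maxLength`, which is why constrained Pydantic fields and hand-written JSON Schema (see below) can express the same shapes.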
TypedDict
Use a `TypedDict` for a lightweight class declaration without adding Pydantic as a dependency. The result is a plain `dict`.
```python
from __future__ import annotations

from typing import Literal, TypedDict

from dottxt import DotTxt


class IncidentPayload(TypedDict):
    severity: Literal["low", "medium", "high"]
    team: str


client = DotTxt()
result = client.generate(
    model="openai/gpt-oss-20b",
    input=(
        "Summarize this incident: checkout errors are blocking purchases. "
        "Return a JSON object with keys severity and team."
    ),
    response_format=IncidentPayload,
)
print(result)
# {'severity': 'high', 'team': 'checkout'}
```
Dataclass
Standard-library `@dataclass` types are supported as well.
```python
from __future__ import annotations

from dataclasses import dataclass
from typing import Literal

from dottxt import DotTxt


@dataclass
class IncidentPayload:
    severity: Literal["low", "medium", "high"]
    team: str


client = DotTxt()
result = client.generate(
    model="openai/gpt-oss-20b",
    input=(
        "Summarize this incident: checkout errors are blocking purchases. "
        "Return a JSON object with keys severity and team."
    ),
    response_format=IncidentPayload,
)
print(result)
# {'severity': 'high', 'team': 'checkout'}
```
Enum and Literal
Pass an `Enum` class or a `typing.Literal[...]` when the entire output is a single value drawn from a fixed set.
```python
from enum import Enum
from typing import Literal

from dottxt import DotTxt


class Severity(str, Enum):
    low = "low"
    medium = "medium"
    high = "high"


client = DotTxt()
severity = client.generate(
    model="openai/gpt-oss-20b",
    input="Classify the severity of: checkout errors are blocking purchases.",
    response_format=Severity,
)
print(severity)
# 'high'

label = client.generate(
    model="openai/gpt-oss-20b",
    input="Classify the sentiment of: 'I love this product!'",
    response_format=Literal["positive", "negative", "neutral"],
)
print(label)
# 'positive'
```
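A note on the `class Severity(str, Enum)` pattern above: because `Severity` inherits from `str`, its members compare equal to their plain string values, so downstream code can treat the classification result as either an enum member or a string. This is standard-library behavior, independent of the SDK:

```python
from enum import Enum


class Severity(str, Enum):
    low = "low"
    medium = "medium"
    high = "high"


# str subclassing makes members interchangeable with their string values.
assert Severity.high == "high"
assert "high" in {s.value for s in Severity}
print(sorted(s.value for s in Severity))
# ['high', 'low', 'medium']
```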
Union and Optional
`typing.Union[...]` and `typing.Optional[...]` constrain the output to one of several types.
```python
from typing import Optional, Union

from dottxt import DotTxt

client = DotTxt()
value = client.generate(
    model="openai/gpt-oss-20b",
    input="How many incidents this week? Reply with a number, or null if unknown.",
    response_format=Optional[int],
)
print(value)

mixed = client.generate(
    model="openai/gpt-oss-20b",
    input="Reply with the user's age as an integer, or their name as a string.",
    response_format=Union[int, str],
)
print(mixed)
```
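`Optional[int]` above is just shorthand for `Union[int, None]` — Python normalizes the two to the same type at construction time — so both spellings describe the same "an integer or null" constraint and should be interchangeable as `response_format`. A quick check with the standard `typing` module:

```python
from typing import Optional, Union, get_args

# Optional[X] normalizes to Union[X, None] when the annotation is built.
assert Optional[int] == Union[int, None]
print(get_args(Optional[int]))
# (<class 'int'>, <class 'NoneType'>)
```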
Typed containers
Typed containers like `list[...]`, `dict[...]`, and `tuple[...]` work directly.
```python
from dottxt import DotTxt

client = DotTxt()
teams = client.generate(
    model="openai/gpt-oss-20b",
    input="List three engineering teams that own checkout systems.",
    response_format=list[str],
)
print(teams)
# ['checkout', 'payments', 'orders']

scores = client.generate(
    model="openai/gpt-oss-20b",
    input="Score each team's incident impact on a scale of 0-10.",
    response_format=dict[str, int],
)
print(scores)
# {'checkout': 9, 'payments': 6, 'orders': 4}
```
JSON Schema (string or dict)
You can also pass JSON Schema directly as a Python `dict` or as a JSON string. This is useful when you have a schema authored elsewhere: written by hand, generated by Quicktype or Genson, or shared from another service.
```python
from dottxt import DotTxt

schema = {
    "type": "object",
    "properties": {
        "severity": {"type": "string", "enum": ["low", "medium", "high"]},
        "team": {"type": "string", "maxLength": 32},
    },
    "required": ["severity", "team"],
    "additionalProperties": False,
}

client = DotTxt()
result = client.generate(
    model="openai/gpt-oss-20b",
    input="Summarize this incident: checkout errors are blocking purchases.",
    response_format=schema,
)
print(result)
# {'severity': 'high', 'team': 'checkout'}
```
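The JSON-string variant is the same schema serialized with `json.dumps`; per the accepted-types list above, the resulting string can be passed as `response_format` in place of the `dict`. A sketch of the serialization step only — the `generate(...)` call is otherwise identical to the example above:

```python
import json

schema = {
    "type": "object",
    "properties": {
        "severity": {"type": "string", "enum": ["low", "medium", "high"]},
        "team": {"type": "string", "maxLength": 32},
    },
    "required": ["severity", "team"],
    "additionalProperties": False,
}

# Serialize once; the string round-trips to the identical schema and would be
# passed as response_format=schema_str to client.generate(...).
schema_str = json.dumps(schema)
assert json.loads(schema_str) == schema
print(type(schema_str).__name__)
# str
```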