Get structured output from LLMs with OpenAI
OpenAI's Structured Outputs feature guarantees that model responses match a JSON Schema you supply. No more parsing failures or missing fields: the output conforms to your schema, or the request fails.
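Under the hood this is plain JSON in the request body. The sketch below shows the shape of the response_format object that the SDK helpers generate for you; the "person" schema is a made-up example, and strict mode imposes the two constraints flagged in the comments:

```typescript
// Shape of the response_format payload for a Structured Outputs request.
// The "person" schema is a hypothetical example.
const responseFormat = {
  type: "json_schema" as const,
  json_schema: {
    name: "person",
    strict: true, // opt in to constrained decoding
    schema: {
      type: "object",
      properties: {
        name: { type: "string" },
        age: { type: "number" },
      },
      required: ["name", "age"],   // strict mode: every key must be required
      additionalProperties: false, // strict mode: must be false
    },
  },
};

console.log(JSON.stringify(responseFormat.json_schema.schema.required));
```

The steps below use the SDK helpers so you never hand-write this object, but knowing the wire format helps when debugging 400 errors about invalid schemas.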
Prerequisites
- Node 20+ or Python 3.10+
- OpenAI API key with GPT-4o access
- Familiarity with JSON Schema or Zod
Step-by-Step
1. Install the SDK
Use the latest openai package. Structured Outputs shipped in SDK v4.55+.
```shell
pnpm add openai zod
```
2. Define your schema with Zod
Zod schemas convert to JSON Schema automatically. Define the exact shape you need.
```typescript
import { z } from 'zod';

const ProductReview = z.object({
  sentiment: z.enum(['positive', 'neutral', 'negative']),
  score: z.number().min(1).max(5),
  summary: z.string(),
  pros: z.array(z.string()),
  cons: z.array(z.string()),
});

type ProductReview = z.infer<typeof ProductReview>;
```
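For reference, the strict JSON Schema that a helper like zodResponseFormat derives from ProductReview looks roughly like this. This is a hand-written sketch, not the helper's exact output, but it shows the two properties strict mode adds: every key in required, and additionalProperties: false.

```typescript
// Approximate strict JSON Schema for the ProductReview Zod schema above.
const productReviewSchema = {
  type: "object",
  properties: {
    sentiment: { type: "string", enum: ["positive", "neutral", "negative"] },
    score: { type: "number", minimum: 1, maximum: 5 },
    summary: { type: "string" },
    pros: { type: "array", items: { type: "string" } },
    cons: { type: "array", items: { type: "string" } },
  },
  required: ["sentiment", "score", "summary", "pros", "cons"],
  additionalProperties: false,
};

console.log(productReviewSchema.required.length); // 5
```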
3. Use response_format with json_schema
Pass the schema to the API. GPT-4o constrains its output to match exactly.
```typescript
import OpenAI from 'openai';
import { zodResponseFormat } from 'openai/helpers/zod';

const client = new OpenAI();

const completion = await client.beta.chat.completions.parse({
  model: 'gpt-4o-2024-08-06',
  messages: [
    { role: 'user', content: 'Analyze this review: "Great product but shipping was slow"' },
  ],
  response_format: zodResponseFormat(ProductReview, 'product_review'),
});

// parsed is null when the model refuses, so guard before dereferencing.
const review = completion.choices[0].message.parsed;
console.log(review?.sentiment, review?.score);
```
4. Handle refusals
If the model cannot comply (policy violation), it sets a refusal field instead of parsed content.
```typescript
if (completion.choices[0].message.refusal) {
  console.error('Model refused:', completion.choices[0].message.refusal);
} else {
  const data = completion.choices[0].message.parsed!;
}
```
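One way to centralize this check is a small helper that either returns the parsed value or throws with the refusal text. This is a hypothetical utility, not part of the SDK; the ParsedMessage interface below is a minimal stand-in for the SDK's richer message type.

```typescript
// Minimal shape of the fields we care about; the real SDK type is richer.
interface ParsedMessage<T> {
  parsed: T | null;
  refusal: string | null;
}

// Returns the parsed payload, or throws if the model refused or parsing failed.
function unwrapParsed<T>(message: ParsedMessage<T>): T {
  if (message.refusal) {
    throw new Error(`Model refused: ${message.refusal}`);
  }
  if (message.parsed === null) {
    throw new Error('No parsed content on message');
  }
  return message.parsed;
}

// Usage with a stubbed message:
const ok = unwrapParsed({ parsed: { score: 4 }, refusal: null });
console.log(ok.score); // 4
```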
5. Use strict mode for tool calls
Structured Outputs also applies to function calling. Set strict: true on your tool definition.
```typescript
import { z } from 'zod';
import { zodToJsonSchema } from 'zod-to-json-schema';

// EventSchema is a Zod object schema, defined like ProductReview above.
const EventSchema = z.object({
  title: z.string(),
  date: z.string(),
});

const tools = [{
  type: 'function' as const,
  function: {
    name: 'create_event',
    strict: true,
    parameters: zodToJsonSchema(EventSchema),
  },
}];
```
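Serialized, a strict tool definition must satisfy the same strict-schema rules as response_format: every property listed in required, and additionalProperties: false. A hand-written sketch of what create_event could look like on the wire (the event fields here are made up; note how an "optional" field is expressed as nullable instead):

```typescript
// Hypothetical wire format of a strict tool definition.
const createEventTool = {
  type: "function" as const,
  function: {
    name: "create_event",
    strict: true,
    parameters: {
      type: "object",
      properties: {
        title: { type: "string" },
        // Strict mode has no optional keys: the model emits null instead.
        location: { type: ["string", "null"] },
      },
      required: ["title", "location"],
      additionalProperties: false,
    },
  },
};
```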
6. Stream structured output
Use the stream() helper on the beta API. Content arrives as deltas you can render incrementally.
```typescript
const stream = client.beta.chat.completions.stream({
  model: 'gpt-4o-2024-08-06',
  messages: [{ role: 'user', content: 'List 5 project ideas' }],
  response_format: zodResponseFormat(ProjectList, 'projects'),
});

for await (const chunk of stream) {
  process.stdout.write(chunk.choices[0]?.delta?.content || '');
}
```
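The content deltas concatenate into one JSON document, so a simple accumulator can buffer chunks and parse once the stream ends. This is a sketch against stubbed chunks (the DeltaChunk interface and collectStructured helper are illustrative, not SDK types), but the buffering logic is the same you would apply to real traffic:

```typescript
// Minimal stand-in for the streaming chunk shape we consume.
interface DeltaChunk {
  choices: { delta?: { content?: string } }[];
}

// Buffers streamed content deltas and parses the final JSON document.
function collectStructured<T>(chunks: Iterable<DeltaChunk>): T {
  let buffer = '';
  for (const chunk of chunks) {
    buffer += chunk.choices[0]?.delta?.content ?? '';
  }
  return JSON.parse(buffer) as T;
}

// Stubbed stream: the JSON arrives split across chunks.
const fakeChunks: DeltaChunk[] = [
  { choices: [{ delta: { content: '{"ideas":' } }] },
  { choices: [{ delta: { content: '["a","b"]}' } }] },
];
console.log(collectStructured<{ ideas: string[] }>(fakeChunks).ideas.length); // 2
```

In practice the SDK's stream object also exposes the final completion once the stream closes, so you rarely need to parse by hand; the point is that partial chunks are not individually valid JSON.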
Common Pitfalls
- Using models that predate Structured Outputs. You need gpt-4o-2024-08-06, gpt-4o-mini-2024-07-18, or newer; gpt-4o-mini does support it.
- Deeply nested schemas. The API caps schemas at 5 levels of nesting and 100 total properties; flatten when possible.
- Forgetting to handle refusals - your parsed field will be null.
- Optional fields must use .nullable() in Zod. Strict mode requires every key to be present, so plain .optional() is rejected.
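The last two pitfalls can be caught before a request ever leaves your machine. Below is a hypothetical pre-flight check, not an SDK feature, that walks a flat object schema and flags the strict-mode violations described above:

```typescript
// Minimal object-schema shape for the checks below (non-nested).
interface ObjectSchema {
  type: string;
  properties?: Record<string, unknown>;
  required?: string[];
  additionalProperties?: boolean;
}

// Flags common strict-mode violations in a flat object schema.
function strictModeProblems(schema: ObjectSchema): string[] {
  const problems: string[] = [];
  if (schema.additionalProperties !== false) {
    problems.push('additionalProperties must be false');
  }
  const required = new Set(schema.required ?? []);
  for (const key of Object.keys(schema.properties ?? {})) {
    if (!required.has(key)) {
      // Strict mode rejects optional keys; make them nullable instead.
      problems.push(`property "${key}" missing from required`);
    }
  }
  return problems;
}

// Reports two problems: missing additionalProperties and an optional key.
console.log(strictModeProblems({
  type: 'object',
  properties: { title: {}, location: {} },
  required: ['title'],
}));
```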
What's Next
- Combine with function calling for structured agent workflows.
- Add streaming for real-time UI updates.
- Build extraction pipelines that never fail schema validation.
