When OpenAI released their API to the public, I was among the first to try it. The official OpenAI Playground on their website is great, but I wanted something I could run locally, tweak to my liking, and understand from the inside out. So I built my own minimal version: OpenAI Playground.
Why Build Your Own?
The official OpenAI Playground is a polished tool, but it comes with limitations. You need to be logged in, you’re bound to their interface, and you can’t easily integrate it into your own workflow. I wanted something simpler, so I built this local tool where I could quickly fire prompts at the API and see what comes back, without any friction.
What It Does
The app is intentionally minimal. It’s a single-page interface with:
- A text input for your prompt
- A submit button
- A display area for the model’s response
That’s it. No bells and whistles. You type a prompt, hit submit, and the response from the OpenAI API appears on screen. The simplicity is the point (it’s a playground for experimentation, not a production chatbot).
How It Works
The frontend is a simple React component that captures user input and sends it to a Next.js API route at /api/generate. This API route acts as a thin backend layer that communicates with the OpenAI API using the official Node.js SDK.
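The shape of that round trip can be sketched with two small helpers. These names and the payload format are my own illustration, not code from the repo: what the frontend POSTs to /api/generate, and how the route might pull the text out of the SDK's completion response.

```javascript
// Illustrative helpers only -- names and payload shape are assumptions,
// not taken from the actual repo.

// The request the frontend sends to the /api/generate route:
function buildApiRequest(input) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({ input }),
  };
}

// How the route might extract the generated text from the SDK's
// completion response before returning it to the client:
function extractText(completion) {
  return completion.data.choices[0].text.trim();
}
```

The route itself then just awaits the SDK call and responds with the extracted text.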
The API route calls the text-davinci-003 model with sensible defaults:
const request = {
  model: 'text-davinci-003',
  prompt: input,
  temperature: 0.6,
  max_tokens: 1024,
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
};
A temperature of 0.6 strikes a nice balance between creativity and coherence, and a limit of 1024 tokens gives the model enough room for detailed responses without going overboard.
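Because these values are just defaults, a playground like this could later expose sliders for them. One way to sketch that is a tiny merge helper; the helper name and shape are my own illustration, not from the repo:

```javascript
// Hypothetical helper (not from the repo): merge user overrides onto the defaults.
const DEFAULTS = {
  model: 'text-davinci-003',
  temperature: 0.6,
  max_tokens: 1024,
  top_p: 1,
  frequency_penalty: 0,
  presence_penalty: 0,
};

function buildRequest(prompt, overrides = {}) {
  // Overrides win over defaults; the prompt is always taken from the argument.
  return { ...DEFAULTS, ...overrides, prompt };
}
```

`buildRequest('Say hi')` uses the defaults, while `buildRequest('Say hi', { temperature: 0 })` makes the output effectively deterministic.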
Running it locally takes about 30 seconds:
git clone https://github.com/rouralberto/openai-playground.git
cd openai-playground
export OPENAI_API_KEY=your_key_here
npm install
npm start
Then open http://localhost:3000 in your browser and start prompting.
You’ll need an OpenAI API key, which you can get from the OpenAI platform. The key is read from the OPENAI_API_KEY environment variable, so your credentials never touch the codebase.
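A common pattern here (my sketch, not necessarily what the repo does) is to fail fast at startup when the variable is missing, so you get a clear error instead of a cryptic 401 from the API:

```javascript
// Hypothetical startup check (not from the repo): read the key from the
// environment and throw a clear error if it isn't set.
function getApiKey(env = process.env) {
  const key = env.OPENAI_API_KEY;
  if (!key) {
    throw new Error('Set the OPENAI_API_KEY environment variable before starting.');
  }
  return key;
}
```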