A serverless API built with Vercel that integrates OpenAI's GPT models with LangSmith tracing. To run it locally you will need:
- Node.js 18+
- Vercel CLI
- OpenAI API Key
- LangSmith API Key and Project
To set up the project:

- Clone the repository
- Install dependencies:

  ```bash
  npm install
  ```
- Install the Vercel CLI globally:

  ```bash
  npm install -g vercel
  ```
- Set up environment variables by creating a `.env.local` file in the root directory:

  ```bash
  OPENAI_API_KEY=your_openai_api_key_here
  LANGCHAIN_TRACING_V2=true
  LANGCHAIN_API_KEY=your_langsmith_api_key_here
  LANGCHAIN_PROJECT=your_project_name
  ```
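
  The LangSmith SDK picks the `LANGCHAIN_*` variables up straight from the environment, so no extra wiring is needed beyond this file. If you want to fail fast on misconfiguration, a small check like the following (a hypothetical helper, not part of this repo) could be called at the top of the handler:

  ```javascript
  // Hypothetical helper: throw early if required keys are missing from the environment.
  export function assertConfig() {
    for (const name of ['OPENAI_API_KEY', 'LANGCHAIN_API_KEY']) {
      if (!process.env[name]) {
        throw new Error(`Missing required environment variable: ${name}`);
      }
    }
  }
  ```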
- Start the development server:

  ```bash
  vercel dev
  ```

  The API will be available at `http://localhost:3000`.
- Test the API endpoint:

  ```bash
  curl -X POST http://localhost:3000/api/langsmith_tracing \
    -H "Content-Type: application/json" \
    -d '{
      "model": "gpt-4o",
      "temperature": 0.2,
      "input": [
        {
          "role": "system",
          "content": [{ "type": "input_text", "text": "You are a helpful assistant." }]
        },
        {
          "role": "user",
          "content": [{ "type": "input_text", "text": "Hello, how are you?" }]
        }
      ],
      "text": { "format": { "type": "text" } },
      "store": true
    }'
  ```
Or call the endpoint from JavaScript:

```javascript
const response = await fetch('http://localhost:3000/api/langsmith_tracing', {
  method: 'POST',
  headers: {
    'Content-Type': 'application/json',
  },
  body: JSON.stringify({
    model: 'gpt-4o',
    temperature: 0.2,
    input: [
      {
        role: 'user',
        content: [{
          type: 'input_text',
          text: 'What is the capital of France?'
        }]
      }
    ],
    store: true
  })
});
```
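
For reference, the request shapes above assume a handler roughly like the sketch below: it wraps the OpenAI client with LangSmith's `wrapOpenAI` so each model call is traced to the configured project, then forwards the request body to the Responses API. This is a minimal sketch, not the repository's exact implementation, and it assumes recent versions of the `openai` and `langsmith` packages.

```javascript
// api/langsmith_tracing.js — a minimal sketch, not the exact implementation in this repo.
import OpenAI from 'openai';
import { wrapOpenAI } from 'langsmith/wrappers';

// Wrapping the client records a LangSmith trace for each OpenAI call,
// using LANGCHAIN_API_KEY / LANGCHAIN_PROJECT from the environment.
const client = wrapOpenAI(new OpenAI({ apiKey: process.env.OPENAI_API_KEY }));

export default async function handler(req, res) {
  if (req.method !== 'POST') {
    return res.status(405).json({ error: 'Method not allowed' });
  }

  try {
    // Vercel parses the JSON body for requests sent with Content-Type: application/json.
    const { model, temperature, input, text, store } = req.body;

    // Forward the request to the OpenAI Responses API.
    const response = await client.responses.create({ model, temperature, input, text, store });

    return res.status(200).json(response);
  } catch (err) {
    return res.status(500).json({ error: err.message });
  }
}
```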
When you are ready to deploy:

- Deploy to Vercel:

  ```bash
  vercel --prod
  ```

- Set the same environment variables from `.env.local` in the Vercel Dashboard.
- Update CORS settings for production so that only your own domain is allowed (see the sketch below):

  ```javascript
  res.setHeader('Access-Control-Allow-Origin', 'https://yourdomain.com');
  ```
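
If the API is called from a browser, the handler also has to answer CORS preflight requests. A minimal sketch, assuming only `POST` is supported and with `https://yourdomain.com` standing in for your real origin:

```javascript
// Minimal CORS handling for a Vercel serverless function; the origin and
// header list are assumptions to adapt for your deployment.
const ALLOWED_ORIGIN = 'https://yourdomain.com';

export default async function handler(req, res) {
  res.setHeader('Access-Control-Allow-Origin', ALLOWED_ORIGIN);
  res.setHeader('Access-Control-Allow-Methods', 'POST, OPTIONS');
  res.setHeader('Access-Control-Allow-Headers', 'Content-Type');

  // Answer the browser's preflight request without invoking the model.
  if (req.method === 'OPTIONS') {
    return res.status(204).end();
  }
  if (req.method !== 'POST') {
    return res.status(405).json({ error: 'Method not allowed' });
  }

  // ... the existing request handling goes here
}
```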
MIT License - see LICENSE file for details