Together’s API endpoints for chat, language and code, images, and embeddings are fully compatible with OpenAI’s API. If you have an application that uses one of OpenAI’s client libraries, you can easily configure it to point to Together’s API servers and start running your existing applications against our open-source models.
To start using Together with OpenAI’s client libraries, pass in your Together API key to the api_key option, and change the base_url to https://api.together.xyz/v1:
Now that your OpenAI client is configured to point to Together, you can start using one of our open-source models for your inference queries. For example, you can query one of our chat models, like Meta Llama 3:
Python
TypeScript
import os

import openai

client = openai.OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

response = client.chat.completions.create(
    model="meta-llama/Llama-3-8b-chat-hf",
    messages=[
        {"role": "system", "content": "You are a travel agent. Be descriptive and helpful."},
        {"role": "user", "content": "Tell me about San Francisco"},
    ],
)

print(response.choices[0].message.content)
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: 'https://api.together.xyz/v1',
});

const response = await client.chat.completions.create({
  model: 'meta-llama/Llama-3-8b-chat-hf',
  messages: [
    { role: 'user', content: 'What are some fun things to do in New York?' },
  ],
});

console.log(response.choices[0].message.content);
Or you can use a language model to generate a code completion:
You can also use OpenAI’s streaming capabilities to stream back your response:
Python
TypeScript
import os

import openai

system_content = "You are a travel agent. Be descriptive and helpful."
user_content = "Tell me about San Francisco"

client = openai.OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

stream = client.chat.completions.create(
    model="mistralai/Mixtral-8x7B-Instruct-v0.1",
    messages=[
        {"role": "system", "content": system_content},
        {"role": "user", "content": user_content},
    ],
    stream=True,
)

for chunk in stream:
    print(chunk.choices[0].delta.content or "", end="", flush=True)
import OpenAI from 'openai';

const client = new OpenAI({
  apiKey: process.env.TOGETHER_API_KEY,
  baseURL: 'https://api.together.xyz/v1',
});

async function run() {
  const stream = await client.chat.completions.create({
    model: 'mistralai/Mixtral-8x7B-Instruct-v0.1',
    messages: [
      { role: 'system', content: 'You are an AI assistant' },
      { role: 'user', content: 'Who won the world series in 2020?' },
    ],
    stream: true,
  });

  for await (const chunk of stream) {
    // use process.stdout.write instead of console.log to avoid newlines
    process.stdout.write(chunk.choices[0]?.delta?.content || '');
  }
}

run();