Configuring OpenAI to use Together’s API
To start using Together with OpenAI’s client libraries, pass in your Together API key to the `api_key` option, and change the `base_url` to `https://api.together.xyz/v1`:
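Here’s a minimal sketch in Python, assuming the `openai` package (v1 or later) is installed and your Together API key is stored in a `TOGETHER_API_KEY` environment variable:

```python
import os
from openai import OpenAI

# Point the official OpenAI client at Together's OpenAI-compatible endpoint.
client = OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)
```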
Querying an Inference model
Now that your OpenAI client is configured to point to Together, you can start using one of our open-source models for your inference queries. For example, you can query one of our chat models, like Meta Llama 3:
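A minimal Python sketch of a chat completion request follows; the model identifier shown is an assumption for illustration, so check Together’s model list for the exact ID you want to use:

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

# Query a chat model hosted on Together.
# NOTE: the model name below is illustrative — confirm the exact identifier
# in Together's model list before using it.
response = client.chat.completions.create(
    model="meta-llama/Llama-3-8b-chat-hf",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What are the top 3 things to do in New York?"},
    ],
)

print(response.choices[0].message.content)
```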
Streaming with OpenAI
You can also use OpenAI’s streaming capabilities to stream back your response:
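Here’s a Python sketch using the same assumed client setup and illustrative model name; passing `stream=True` makes the OpenAI client return an iterator of chunks that you can print as tokens arrive:

```python
import os
from openai import OpenAI

client = OpenAI(
    api_key=os.environ.get("TOGETHER_API_KEY"),
    base_url="https://api.together.xyz/v1",
)

# stream=True returns an iterator of incremental chunks instead of one response.
stream = client.chat.completions.create(
    model="meta-llama/Llama-3-8b-chat-hf",  # illustrative model ID
    messages=[{"role": "user", "content": "Tell me a fun fact about space."}],
    stream=True,
)

for chunk in stream:
    # Each chunk carries a delta with the next piece of generated text.
    print(chunk.choices[0].delta.content or "", end="", flush=True)
```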