Commit
docs: anthropic quickstart (langchain-ai#18440)
baskaryan authored and gkorland committed Mar 30, 2024
1 parent 1174105 commit 0a51300
Showing 2 changed files with 64 additions and 6 deletions.
35 changes: 32 additions & 3 deletions docs/docs/get_started/quickstart.mdx
@@ -65,10 +65,10 @@ We will link to relevant docs.

## LLM Chain

We'll show how to use models available via API, like OpenAI and Cohere, and local open source models, using integrations like Ollama.
We'll show how to use models available via API, like OpenAI, and local open source models, using integrations like Ollama.

<Tabs>
<TabItem value="openai" label="OpenAI (API)" default>
<TabItem value="openai" label="OpenAI" default>

First we'll need to import the LangChain x OpenAI integration package.

@@ -115,7 +115,36 @@ llm = Ollama(model="llama2")
```

</TabItem>
<TabItem value="cohere" label="Cohere (API)" default>
<TabItem value="anthropic" label="Anthropic">

First we'll need to import the LangChain x Anthropic package.

```shell
pip install langchain-anthropic
```

Accessing the API requires an API key, which you can get by creating an account in the [Anthropic Console](https://console.anthropic.com). Once we have a key we'll want to set it as an environment variable by running:

```shell
export ANTHROPIC_API_KEY="..."
```
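If you're working in a notebook rather than a shell, the same variable can be set from Python itself with the standard library; a minimal sketch (the key value is an illustrative placeholder, not a real key):

```python
import os

# Setting the variable this way affects only the current process,
# which is handy in notebooks. "sk-ant-..." is a placeholder value.
os.environ["ANTHROPIC_API_KEY"] = "sk-ant-..."

# Any library that reads ANTHROPIC_API_KEY will now see it.
print(os.environ["ANTHROPIC_API_KEY"])
```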

We can then initialize the model:

```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
```

If you'd prefer not to set an environment variable, you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the `ChatAnthropic` class:

```python
llm = ChatAnthropic(anthropic_api_key="...")
```
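An explicitly passed key takes precedence over the environment variable. As a rough illustration of that lookup pattern, here is a plain-Python sketch (not LangChain's actual internals; `resolve_api_key` is a hypothetical helper):

```python
import os

def resolve_api_key(explicit_key=None, env_var="ANTHROPIC_API_KEY"):
    """Prefer an explicitly passed key; fall back to the environment variable."""
    if explicit_key is not None:
        return explicit_key
    key = os.environ.get(env_var)
    if key is None:
        raise ValueError(f"No API key: pass one explicitly or set {env_var}")
    return key

# The explicit argument wins over the environment.
os.environ["ANTHROPIC_API_KEY"] = "from-env"
print(resolve_api_key("from-arg"))  # from-arg
print(resolve_api_key())            # from-env
```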

</TabItem>
<TabItem value="cohere" label="Cohere">

First we'll need to import the Cohere SDK package.

35 changes: 32 additions & 3 deletions docs/docs/modules/model_io/quick_start.mdx
@@ -7,7 +7,7 @@ sidebar_position: 0
The quick start will cover the basics of working with language models. It will introduce the two different types of models, LLMs and ChatModels, then cover how to use PromptTemplates to format the inputs to these models and how to use Output Parsers to work with the outputs. For a deeper conceptual guide to these topics, please see [this documentation](./concepts)
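The LLM vs ChatModel distinction can be sketched with toy stand-ins (plain Python, not the real LangChain classes): an LLM maps a string to a string, while a ChatModel maps a list of messages to a message.

```python
from dataclasses import dataclass

@dataclass
class AIMessage:
    content: str

class ToyLLM:
    # LLM interface: string in, string out.
    def invoke(self, prompt: str) -> str:
        return f"echo: {prompt}"

class ToyChatModel:
    # ChatModel interface: messages in, message out.
    def invoke(self, messages: list) -> AIMessage:
        last_content = messages[-1][1]
        return AIMessage(content=f"echo: {last_content}")

print(ToyLLM().invoke("hi"))                              # echo: hi
print(ToyChatModel().invoke([("human", "hi")]).content)   # echo: hi
```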

## Models
For this getting started guide, we will provide two options: using OpenAI (a popular model available via API) or using a local open source model.
For this getting started guide, we will provide a few options: using an API like Anthropic or OpenAI, or using a local open source model via Ollama.

import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
@@ -62,6 +62,35 @@ from langchain_community.chat_models import ChatOllama

llm = Ollama(model="llama2")
chat_model = ChatOllama()
```

</TabItem>
<TabItem value="anthropic" label="Anthropic (chat model only)">

First we'll need to import the LangChain x Anthropic package.

```shell
pip install langchain-anthropic
```

Accessing the API requires an API key, which you can get by creating an account in the [Anthropic Console](https://console.anthropic.com). Once we have a key we'll want to set it as an environment variable by running:

```shell
export ANTHROPIC_API_KEY="..."
```

We can then initialize the model:

```python
from langchain_anthropic import ChatAnthropic

chat_model = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
```

If you'd prefer not to set an environment variable, you can pass the key in directly via the `anthropic_api_key` named parameter when instantiating the `ChatAnthropic` class:

```python
chat_model = ChatAnthropic(anthropic_api_key="...")
```

</TabItem>
@@ -84,15 +113,15 @@ We can then initialize the model:
```python
from langchain_community.chat_models import ChatCohere

llm = ChatCohere()
chat_model = ChatCohere()
```

If you'd prefer not to set an environment variable, you can pass the key in directly via the `cohere_api_key` named parameter when instantiating the `ChatCohere` class:

```python
from langchain_community.chat_models import ChatCohere

llm = ChatCohere(cohere_api_key="...")
chat_model = ChatCohere(cohere_api_key="...")
```

</TabItem>
