
anthropic[minor]: claude 3 (langchain-ai#18508)
efriis authored and gkorland committed Mar 30, 2024
1 parent f8fe71c commit 91375f2
Showing 8 changed files with 241 additions and 165 deletions.
69 changes: 40 additions & 29 deletions docs/docs/integrations/chat/anthropic.ipynb
@@ -46,7 +46,7 @@
},
{
"cell_type": "code",
- "execution_count": null,
+ "execution_count": 2,
"id": "01578ae3",
"metadata": {},
"outputs": [],
@@ -66,10 +66,13 @@
"source": [
"The code provided assumes that your ANTHROPIC_API_KEY is set in your environment variables. If you would like to manually specify your API key and also choose a different model, you can use the following code:\n",
"```python\n",
- "chat = ChatAnthropic(temperature=0, anthropic_api_key=\"YOUR_API_KEY\", model_name=\"claude-instant-1.2\")\n",
+ "chat = ChatAnthropic(temperature=0, anthropic_api_key=\"YOUR_API_KEY\", model_name=\"claude-3-opus-20240229\")\n",
"\n",
"```\n",
- "Please note that the default model is \"claude-2,\" and you can check the available models at [here](https://docs.anthropic.com/claude/reference/selecting-a-model)."
+ "\n",
+ "In these demos, we will use the Claude 3 Opus model, and you can also use the launch version of the Sonnet model with `claude-3-sonnet-20240229`.\n",
+ "\n",
+ "You can check the model comparison doc [here](https://docs.anthropic.com/claude/docs/models-overview#model-comparison)."
]
},
{
@@ -87,7 +90,7 @@
{
"data": {
"text/plain": [
- "AIMessage(content=' 저는 파이썬을 좋아합니다.')"
+ "AIMessage(content='저는 파이썬을 사랑합니다.\\n\\nTranslation:\\nI love Python.')"
]
},
"execution_count": 3,
@@ -99,7 +102,7 @@
"from langchain_anthropic import ChatAnthropic\n",
"from langchain_core.prompts import ChatPromptTemplate\n",
"\n",
- "chat = ChatAnthropic(temperature=0, model_name=\"claude-2\")\n",
+ "chat = ChatAnthropic(temperature=0, model_name=\"claude-3-opus-20240229\")\n",
"\n",
"system = (\n",
" \"You are a helpful assistant that translates {input_language} to {output_language}.\"\n",
@@ -127,7 +130,7 @@
},
{
"cell_type": "code",
- "execution_count": 4,
+ "execution_count": 5,
"id": "c5fac0e9-05a4-4fc1-a3b3-e5bbb24b971b",
"metadata": {
"ExecuteTime": {
@@ -140,24 +143,24 @@
{
"data": {
"text/plain": [
- "AIMessage(content=\" Why don't bears like fast food? Because they can't catch it!\")"
+ "AIMessage(content='Sure, here\\'s a joke about a bear:\\n\\nA bear walks into a bar and says to the bartender, \"I\\'ll have a pint of beer and a.......... packet of peanuts.\"\\n\\nThe bartender asks, \"Why the big pause?\"\\n\\nThe bear replies, \"I don\\'t know, I\\'ve always had them!\"')"
]
},
- "execution_count": 4,
+ "execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
- "chat = ChatAnthropic(temperature=0, model_name=\"claude-2\")\n",
+ "chat = ChatAnthropic(temperature=0, model_name=\"claude-3-opus-20240229\")\n",
"prompt = ChatPromptTemplate.from_messages([(\"human\", \"Tell me a joke about {topic}\")])\n",
"chain = prompt | chat\n",
"await chain.ainvoke({\"topic\": \"bear\"})"
]
},
{
"cell_type": "code",
- "execution_count": 5,
+ "execution_count": 6,
"id": "025be980-e50d-4a68-93dc-c9c7b500ce34",
"metadata": {
"ExecuteTime": {
@@ -171,37 +174,45 @@
"name": "stdout",
"output_type": "stream",
"text": [
- " Here are some of the most famous tourist attractions in Japan:\n",
- "\n",
- "- Tokyo - Tokyo Tower, Tokyo Skytree, Imperial Palace, Sensoji Temple, Meiji Shrine, Shibuya Crossing\n",
- "\n",
- "- Kyoto - Kinkakuji (Golden Pavilion), Fushimi Inari Shrine, Kiyomizu-dera Temple, Arashiyama Bamboo Grove, Gion Geisha District\n",
- "\n",
- "- Osaka - Osaka Castle, Dotonbori, Universal Studios Japan, Osaka Aquarium Kaiyukan \n",
- "\n",
- "- Hiroshima - Hiroshima Peace Memorial Park and Museum, Itsukushima Shrine (Miyajima Island)\n",
- "\n",
- "- Mount Fuji - Iconic and famous mountain, popular for hiking and viewing from places like Hakone and Kawaguchiko Lake\n",
- "\n",
- "- Himeji - Himeji Castle, one of Japan's most impressive feudal castles\n",
- "\n",
- "- Nara - Todaiji Temple, Nara Park with its bowing deer, Horyuji Temple with some of world's oldest wooden structures \n",
- "\n",
- "- Nikko - Elaborate shrines and temples nestled around Nikko National Park\n",
- "\n",
- "- Sapporo - Snow"
+ "Here is a list of famous tourist attractions in Japan:\n",
+ "\n",
+ "1. Tokyo Skytree (Tokyo)\n",
+ "2. Senso-ji Temple (Tokyo)\n",
+ "3. Meiji Shrine (Tokyo)\n",
+ "4. Tokyo DisneySea (Urayasu, Chiba)\n",
+ "5. Fushimi Inari Taisha (Kyoto)\n",
+ "6. Kinkaku-ji (Golden Pavilion) (Kyoto)\n",
+ "7. Kiyomizu-dera (Kyoto)\n",
+ "8. Nijo Castle (Kyoto)\n",
+ "9. Osaka Castle (Osaka)\n",
+ "10. Dotonbori (Osaka)\n",
+ "11. Hiroshima Peace Memorial Park (Hiroshima)\n",
+ "12. Itsukushima Shrine (Miyajima Island, Hiroshima)\n",
+ "13. Himeji Castle (Himeji)\n",
+ "14. Todai-ji Temple (Nara)\n",
+ "15. Nara Park (Nara)\n",
+ "16. Mount Fuji (Shizuoka and Yamanashi Prefectures)\n",
+ "17."
]
}
],
"source": [
- "chat = ChatAnthropic(temperature=0.3, model_name=\"claude-2\")\n",
+ "chat = ChatAnthropic(temperature=0.3, model_name=\"claude-3-opus-20240229\")\n",
"prompt = ChatPromptTemplate.from_messages(\n",
" [(\"human\", \"Give me a list of famous tourist attractions in Japan\")]\n",
")\n",
"chain = prompt | chat\n",
"for chunk in chain.stream({}):\n",
" print(chunk.content, end=\"\", flush=True)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b6bb2aa2",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
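The notebook cells in this file build a chain with LCEL's pipe operator (`prompt | chat`) and then call `invoke`/`ainvoke` on it. As a rough sketch of that composition pattern only, here is a toy plain-Python stand-in; `FakePrompt`, `FakeChat`, and `Chain` are hypothetical names invented for illustration, since running the real `ChatAnthropic` requires `langchain-anthropic` and an `ANTHROPIC_API_KEY`:

```python
# Toy stand-in for the notebook's `prompt | chat` composition (LCEL-style).
# All classes here are hypothetical illustrations, not the real LangChain API.
class FakePrompt:
    def __init__(self, template: str):
        self.template = template

    def __or__(self, model):
        # `prompt | chat` returns a chain, mirroring LCEL's pipe operator
        return Chain(self, model)

    def format(self, **kwargs) -> str:
        return self.template.format(**kwargs)


class FakeChat:
    def invoke(self, text: str) -> str:
        # A real model call would hit the Anthropic API here
        return f"[model reply to: {text}]"


class Chain:
    def __init__(self, prompt, model):
        self.prompt, self.model = prompt, model

    def invoke(self, inputs: dict) -> str:
        return self.model.invoke(self.prompt.format(**inputs))


chain = FakePrompt("Tell me a joke about {topic}") | FakeChat()
print(chain.invoke({"topic": "bears"}))  # [model reply to: Tell me a joke about bears]
```

The point of the sketch is only the dataflow: the pipe builds a two-step runnable, and `invoke` threads the formatted prompt into the model.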
2 changes: 2 additions & 0 deletions docs/docs/integrations/llms/anthropic.ipynb
@@ -19,6 +19,8 @@
"\n",
"This example goes over how to use LangChain to interact with `Anthropic` models.\n",
"\n",
+ "NOTE: AnthropicLLM only supports legacy Claude 2 models. To use the newest Claude 3 models, please use [`ChatAnthropic`](/docs/integrations/chat/anthropic) instead.\n",
"\n",
"## Installation"
]
},
2 changes: 1 addition & 1 deletion docs/docs/integrations/platforms/anthropic.mdx
@@ -26,7 +26,7 @@ You can import this wrapper with the following code:

```
from langchain_anthropic import ChatAnthropic
- model = ChatAnthropic(model='claude-2.1')
+ model = ChatAnthropic(model='claude-3-opus-20240229')
```

Read more in the [ChatAnthropic documentation](/docs/integrations/chat/anthropic).
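The chat notebook updated by this commit also consumes streamed output with `for chunk in chain.stream({}): print(chunk.content, end="", flush=True)`. A generator-based toy of that consumption pattern, with no API calls (`fake_stream` is a hypothetical name, not a LangChain function):

```python
# Toy generator mimicking how chain.stream(...) yields content in chunks.
# Purely illustrative -- real chunks come incrementally from the Anthropic API.
def fake_stream(reply: str, size: int = 8):
    for i in range(0, len(reply), size):
        yield reply[i:i + size]


pieces = []
for chunk in fake_stream("Here is a list of famous tourist attractions in Japan:"):
    pieces.append(chunk)  # the notebook does print(chunk, end="", flush=True)
print("".join(pieces))
```

Concatenating the chunks reproduces the full reply, which is why the notebook prints with `end=""` rather than a newline per chunk.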
2 changes: 1 addition & 1 deletion libs/partners/anthropic/langchain_anthropic/chat_models.py
@@ -77,7 +77,7 @@ class Config:
model: str = Field(alias="model_name")
"""Model name to use."""

- max_tokens: int = Field(default=256, alias="max_tokens_to_sample")
+ max_tokens: int = Field(default=1024, alias="max_tokens_to_sample")
"""Denotes the number of tokens to predict per generation."""

temperature: Optional[float] = None
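This hunk raises the default `max_tokens` from 256 to 1024 while keeping the legacy `max_tokens_to_sample` alias. As a plain-Python toy of that "one field, two accepted names, new default" behavior (a hypothetical `ChatParams` class, not the actual Pydantic model):

```python
# Hypothetical illustration of a defaulted field reachable under two names.
# The real class uses a pydantic Field(default=1024, alias="max_tokens_to_sample").
class ChatParams:
    def __init__(self, max_tokens=None, max_tokens_to_sample=None):
        if max_tokens is not None:
            self.max_tokens = max_tokens
        elif max_tokens_to_sample is not None:
            self.max_tokens = max_tokens_to_sample  # legacy alias still works
        else:
            self.max_tokens = 1024                  # new default (was 256)


print(ChatParams().max_tokens)                          # 1024
print(ChatParams(max_tokens_to_sample=256).max_tokens)  # 256
```

The larger default suits Claude 3's longer chat completions; callers that relied on the old 256 cap can still pass it explicitly under either name.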
2 changes: 1 addition & 1 deletion libs/partners/anthropic/langchain_anthropic/llms.py
@@ -35,7 +35,7 @@ class _AnthropicCommon(BaseLanguageModel):
model: str = Field(default="claude-2", alias="model_name")
"""Model name to use."""

- max_tokens_to_sample: int = Field(default=256, alias="max_tokens")
+ max_tokens_to_sample: int = Field(default=1024, alias="max_tokens")
"""Denotes the number of tokens to predict per generation."""

temperature: Optional[float] = None
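Per the note this commit adds to the docs, `AnthropicLLM` only supports legacy Claude 2 models, while Claude 3 ids require `ChatAnthropic`. A caller could route on the model id; `needs_chat_class` below is a hypothetical helper written for this sketch, not part of LangChain:

```python
def needs_chat_class(model_name: str) -> bool:
    """Hypothetical routing helper: Claude 3 ids (e.g. claude-3-opus-20240229)
    require ChatAnthropic; legacy ids like claude-2 can use AnthropicLLM."""
    return model_name.startswith("claude-3")


print(needs_chat_class("claude-3-opus-20240229"))  # True
print(needs_chat_class("claude-2"))                # False
```

Prefix matching works here only because Anthropic's Claude 3 model ids all begin with `claude-3-`; a production check might instead consult an explicit allow-list.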
