Commit

Merge branch 'master' into eugene/extraction_use_Case
eyurtsev committed Mar 5, 2024
2 parents a8ec227 + e1924b3 commit 94b46d5
Showing 65 changed files with 17,736 additions and 1,035 deletions.
2 changes: 1 addition & 1 deletion .github/workflows/codespell.yml
@@ -32,6 +32,6 @@ jobs:
- name: Codespell
uses: codespell-project/actions-codespell@v2
with:
skip: guide_imports.json,*.ambr,./cookbook/data/imdb_top_1000.csv
skip: guide_imports.json,*.ambr,./cookbook/data/imdb_top_1000.csv,*.lock
ignore_words_list: ${{ steps.extract_ignore_words.outputs.ignore_words_list }}
exclude_file: libs/community/langchain_community/llms/yuan2.py
747 changes: 747 additions & 0 deletions cookbook/RAPTOR.ipynb

Large diffs are not rendered by default.

2 changes: 1 addition & 1 deletion cookbook/llm_checker.ipynb
@@ -50,7 +50,7 @@
"\n",
"checker_chain = LLMCheckerChain.from_llm(llm, verbose=True)\n",
"\n",
"checker_chain.run(text)"
"checker_chain.invoke(text)"
]
},
{
2 changes: 1 addition & 1 deletion cookbook/llm_math.ipynb
@@ -51,7 +51,7 @@
"llm = OpenAI(temperature=0)\n",
"llm_math = LLMMathChain.from_llm(llm, verbose=True)\n",
"\n",
"llm_math.run(\"What is 13 raised to the .3432 power?\")"
"llm_math.invoke(\"What is 13 raised to the .3432 power?\")"
]
},
{
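The two cookbook edits above are instances of the same migration: the deprecated `Chain.run` call is replaced by `Chain.invoke`. A minimal, dependency-free sketch of why the two are interchangeable for single-input chains (`FakeChain` is a hypothetical stand-in, not LangChain code):

```python
import warnings


class FakeChain:
    """Hypothetical stand-in for a single-input LangChain chain."""

    def invoke(self, text: str) -> str:
        # Modern entry point: one input in, one output out.
        return f"checked: {text}"

    def run(self, text: str) -> str:
        # Legacy entry point: delegates to invoke() and warns.
        warnings.warn("run is deprecated; use invoke", DeprecationWarning)
        return self.invoke(text)


chain = FakeChain()
assert chain.run("What is 13 raised to the .3432 power?") == chain.invoke(
    "What is 13 raised to the .3432 power?"
)
```

For single-input chains the swap is mechanical, which is why these notebook edits touch only the method name.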
1 change: 1 addition & 0 deletions docs/.yarnrc.yml
@@ -0,0 +1 @@
nodeLinker: node-modules
11 changes: 10 additions & 1 deletion docs/docs/get_started/introduction.mdx
@@ -14,7 +14,16 @@ This framework consists of several parts.
- **[LangServe](/docs/langserve)**: A library for deploying LangChain chains as a REST API.
- **[LangSmith](/docs/langsmith)**: A developer platform that lets you debug, test, evaluate, and monitor chains built on any LLM framework and seamlessly integrates with LangChain.

![Diagram outlining the hierarchical organization of the LangChain framework, displaying the interconnected parts across multiple layers.](/svg/langchain_stack.svg "LangChain Framework Overview")
import ThemedImage from '@theme/ThemedImage';

<ThemedImage
  alt="Diagram outlining the hierarchical organization of the LangChain framework, displaying the interconnected parts across multiple layers."
  sources={{
    light: '/svg/langchain_stack.svg',
    dark: '/svg/langchain_stack_dark.svg',
  }}
  title="LangChain Framework Overview"
/>

Together, these products simplify the entire application lifecycle:
- **Develop**: Write your applications in LangChain/LangChain.js. Hit the ground running using Templates for reference.
2 changes: 1 addition & 1 deletion docs/docs/get_started/quickstart.mdx
@@ -134,7 +134,7 @@ We can then initialize the model:
```python
from langchain_anthropic import ChatAnthropic

llm = ChatAnthropic(model="claude-2.1", temperature=0.2, max_tokens=1024)
llm = ChatAnthropic(model="claude-3-sonnet-20240229", temperature=0.2, max_tokens=1024)
```

If you'd prefer not to set an environment variable you can pass the key in directly via the `anthropic_api_key` named parameter when initiating the Anthropic Chat Model class:
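The quickstart text above notes that the key can also be passed directly via `anthropic_api_key` instead of an environment variable. A small, dependency-free illustration of the usual precedence rule (an explicit argument wins over `ANTHROPIC_API_KEY`); `resolve_api_key` is a hypothetical helper for illustration, not part of `langchain_anthropic`:

```python
import os
from typing import Optional


def resolve_api_key(explicit_key: Optional[str] = None) -> Optional[str]:
    """Hypothetical helper: prefer an explicitly passed key,
    then fall back to the ANTHROPIC_API_KEY environment variable."""
    if explicit_key is not None:
        return explicit_key
    return os.environ.get("ANTHROPIC_API_KEY")


os.environ["ANTHROPIC_API_KEY"] = "sk-from-env"  # fake value for illustration
assert resolve_api_key() == "sk-from-env"
assert resolve_api_key("sk-explicit") == "sk-explicit"  # explicit wins
```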
155 changes: 124 additions & 31 deletions docs/docs/integrations/chat/anthropic.ipynb

Large diffs are not rendered by default.

205 changes: 33 additions & 172 deletions docs/docs/integrations/chat/anthropic_functions.ipynb
@@ -5,9 +5,13 @@
"id": "5125a1e3",
"metadata": {},
"source": [
"# Anthropic Functions\n",
"# Anthropic Tools\n",
"\n",
"This notebook shows how to use an experimental wrapper around Anthropic that gives it the same API as OpenAI Functions."
"This notebook shows how to use an experimental wrapper around Anthropic that gives it tool calling and structured output capabilities. It follows Anthropic's guide [here](https://docs.anthropic.com/claude/docs/functions-external-tools)\n",
"\n",
"The wrapper is available from the `langchain-anthropic` package, and it also requires the optional dependency `defusedxml` for parsing XML output from the llm.\n",
"\n",
"Note: this is a beta feature that will be replaced by Anthropic's formal implementation of tool calling, but it is useful for testing and experimentation in the meantime."
]
},
{
@@ -17,225 +21,82 @@
"metadata": {},
"outputs": [],
"source": [
"from langchain_experimental.llms.anthropic_functions import AnthropicFunctions"
"%pip install -qU langchain-anthropic defusedxml\n",
"from langchain_anthropic.experimental import ChatAnthropicTools"
]
},
{
"cell_type": "markdown",
"id": "65499965",
"metadata": {},
"source": [
"## Initialize Model\n",
"\n",
"You can initialize this wrapper the same way you'd initialize ChatAnthropic"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e1d535f6",
"metadata": {},
"outputs": [],
"source": [
"model = AnthropicFunctions(model=\"claude-2\")"
]
},
{
"cell_type": "markdown",
"id": "fcc9eaf4",
"metadata": {},
"source": [
"## Passing in functions\n",
"## Tool Binding\n",
"\n",
"You can now pass in functions in a similar way"
"`ChatAnthropicTools` exposes a `bind_tools` method that allows you to pass in Pydantic models or BaseTools to the llm."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "0779c320",
"metadata": {},
"outputs": [],
"source": [
"functions = [\n",
" {\n",
" \"name\": \"get_current_weather\",\n",
" \"description\": \"Get the current weather in a given location\",\n",
" \"parameters\": {\n",
" \"type\": \"object\",\n",
" \"properties\": {\n",
" \"location\": {\n",
" \"type\": \"string\",\n",
" \"description\": \"The city and state, e.g. San Francisco, CA\",\n",
" },\n",
" \"unit\": {\"type\": \"string\", \"enum\": [\"celsius\", \"fahrenheit\"]},\n",
" },\n",
" \"required\": [\"location\"],\n",
" },\n",
" }\n",
"]"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "ad75a933",
"metadata": {},
"outputs": [],
"source": [
"from langchain_core.messages import HumanMessage"
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "fc703085",
"metadata": {},
"outputs": [],
"source": [
"response = model.invoke(\n",
" [HumanMessage(content=\"whats the weater in boston?\")], functions=functions\n",
")"
]
},
{
"cell_type": "code",
"execution_count": 7,
"id": "04d7936a",
"id": "e1d535f6",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"AIMessage(content=' ', additional_kwargs={'function_call': {'name': 'get_current_weather', 'arguments': '{\"location\": \"Boston, MA\", \"unit\": \"fahrenheit\"}'}}, example=False)"
"AIMessage(content='', additional_kwargs={'tool_calls': [{'function': {'name': 'Person', 'arguments': '{\"name\": \"Erick\", \"age\": \"27\"}'}, 'type': 'function'}]})"
]
},
"execution_count": 7,
"execution_count": 3,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"response"
]
},
{
"cell_type": "markdown",
"id": "0072fdba",
"metadata": {},
"source": [
"## Using for extraction\n",
"from langchain_core.pydantic_v1 import BaseModel\n",
"\n",
"You can now use this for extraction."
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "7af5c567",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import create_extraction_chain\n",
"\n",
"schema = {\n",
" \"properties\": {\n",
" \"name\": {\"type\": \"string\"},\n",
" \"height\": {\"type\": \"integer\"},\n",
" \"hair_color\": {\"type\": \"string\"},\n",
" },\n",
" \"required\": [\"name\", \"height\"],\n",
"}\n",
"inp = \"\"\"\n",
"Alex is 5 feet tall. Claudia is 1 feet taller Alex and jumps higher than him. Claudia is a brunette and Alex is blonde.\n",
" \"\"\""
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "bd01082a",
"metadata": {},
"outputs": [],
"source": [
"chain = create_extraction_chain(schema, model)"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b5a23e9f",
"metadata": {},
"outputs": [],
"source": [
"chain.invoke(inp)"
"class Person(BaseModel):\n",
" name: str\n",
" age: int\n",
"\n",
"\n",
"model = ChatAnthropicTools(model=\"claude-3-opus-20240229\").bind_tools(tools=[Person])\n",
"model.invoke(\"I am a 27 year old named Erick\")"
]
},
{
"cell_type": "markdown",
"id": "90ec959e",
"id": "fcc9eaf4",
"metadata": {},
"source": [
"## Using for tagging\n",
"## Structured Output\n",
"\n",
"You can now use this for tagging"
"`ChatAnthropicTools` also implements the [`with_structured_output` spec](/docs/guides/structured_output) for extracting values. Note: this may not be as stable as with models that explicitly offer tool calling."
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "03c1eb0d",
"metadata": {},
"outputs": [],
"source": [
"from langchain.chains import create_tagging_chain"
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "581c0ece",
"metadata": {},
"outputs": [],
"source": [
"schema = {\n",
" \"properties\": {\n",
" \"sentiment\": {\"type\": \"string\"},\n",
" \"aggressiveness\": {\"type\": \"integer\"},\n",
" \"language\": {\"type\": \"string\"},\n",
" }\n",
"}"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "d9a8570e",
"metadata": {},
"outputs": [],
"source": [
"chain = create_tagging_chain(schema, model)"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "cf37d679",
"execution_count": 4,
"id": "0779c320",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"{'sentiment': 'positive', 'aggressiveness': '0', 'language': 'english'}"
"Person(name='Erick', age=27)"
]
},
"execution_count": 15,
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"chain.invoke(\"this is really cool\")"
"chain = ChatAnthropicTools(model=\"claude-3-opus-20240229\").with_structured_output(\n",
" Person\n",
")\n",
"chain.invoke(\"I am a 27 year old named Erick\")"
]
}
],
@@ -255,7 +116,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.0"
"version": "3.11.4"
}
},
"nbformat": 4,
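The rewritten notebook's `with_structured_output` path turns the raw tool-call arguments into a typed `Person`; note in the `AIMessage` output that `age` arrives as the string `"27"` yet the final result is `Person(name='Erick', age=27)`. A dependency-free sketch of that coercion step, using a plain dataclass in place of Pydantic (`coerce` is a hypothetical helper, and it assumes the arguments dict exactly matches the model's fields):

```python
import json
from dataclasses import dataclass, fields


@dataclass
class Person:
    name: str
    age: int


def coerce(cls, raw_args: str):
    """Hypothetical sketch of the coercion step: parse the tool-call
    arguments JSON and cast each value to the annotated field type."""
    data = json.loads(raw_args)
    return cls(**{f.name: f.type(data[f.name]) for f in fields(cls)})


# Arguments as they appear in the tool call above: age arrives as "27".
person = coerce(Person, '{"name": "Erick", "age": "27"}')
assert person == Person(name="Erick", age=27)
```

Pydantic performs this casting (and validation) automatically; the sketch only shows why a string-valued `"age"` in the model's output still yields an `int` field.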
2 changes: 1 addition & 1 deletion docs/docs/integrations/document_loaders/telegram.ipynb
@@ -80,7 +80,7 @@
" chat_entity=\"<CHAT_URL>\", # recommended to use Entity here\n",
" api_hash=\"<API HASH >\",\n",
" api_id=\"<API_ID>\",\n",
" user_name=\"\", # needed only for caching the session.\n",
" username=\"\", # needed only for caching the session.\n",
")"
]
},
2 changes: 2 additions & 0 deletions docs/docs/integrations/llms/anthropic.ipynb
@@ -19,6 +19,8 @@
"\n",
"This example goes over how to use LangChain to interact with `Anthropic` models.\n",
"\n",
"NOTE: AnthropicLLM only supports legacy Claude 2 models. To use the newest Claude 3 models, please use [`ChatAnthropic`](/docs/integrations/chat/anthropic) instead.\n",
"\n",
"## Installation"
]
},
@@ -21,10 +21,10 @@
"source": [
"## Setup\n",
"\n",
"The integration lives in the `langchain-community` package, so we need to install that. We also need to install the `pymongo` package.\n",
"The integration lives in the `langchain-mongodb` package, so we need to install that.\n",
"\n",
"```bash\n",
"pip install -U --quiet langchain-community pymongo\n",
"pip install -U --quiet langchain-mongodb\n",
"```"
]
},
@@ -80,7 +80,7 @@
},
"outputs": [],
"source": [
"from langchain_community.chat_message_histories import MongoDBChatMessageHistory\n",
"from langchain_mongodb.chat_message_histories import MongoDBChatMessageHistory\n",
"\n",
"chat_message_history = MongoDBChatMessageHistory(\n",
" session_id=\"test_session\",\n",
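The import above moves `MongoDBChatMessageHistory` from `langchain-community` to the new `langchain-mongodb` package; its behavior is unchanged. As a rough, dependency-free sketch of the interface such a history class exposes (an in-memory hypothetical stand-in, not the real Mongo-backed class):

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class InMemoryChatMessageHistory:
    """Hypothetical in-memory stand-in for MongoDBChatMessageHistory."""

    session_id: str
    _messages: List[Tuple[str, str]] = field(default_factory=list)

    def add_user_message(self, content: str) -> None:
        self._messages.append(("human", content))

    def add_ai_message(self, content: str) -> None:
        self._messages.append(("ai", content))

    @property
    def messages(self) -> List[Tuple[str, str]]:
        # The real class would read these back from MongoDB, keyed by session_id.
        return list(self._messages)


history = InMemoryChatMessageHistory(session_id="test_session")
history.add_user_message("hi!")
history.add_ai_message("hello")
assert history.messages == [("human", "hi!"), ("ai", "hello")]
```

The Mongo-backed class persists the same per-session message list to a collection instead of a Python list, which is why only the import path changes in this diff.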
2 changes: 1 addition & 1 deletion docs/docs/integrations/platforms/anthropic.mdx
@@ -26,7 +26,7 @@ You can import this wrapper with the following code:

```
from langchain_anthropic import ChatAnthropic
model = ChatAnthropic(model='claude-2.1')
model = ChatAnthropic(model='claude-3-opus-20240229')
```

Read more in the [ChatAnthropic documentation](/docs/integrations/chat/anthropic).
1 change: 1 addition & 0 deletions docs/docs/integrations/platforms/index.mdx
@@ -16,6 +16,7 @@ These providers have standalone `langchain-{provider}` packages for improved ver
- [Astra DB](/docs/integrations/providers/astradb)
- [Exa Search](/docs/integrations/providers/exa_search)
- [Google](/docs/integrations/platforms/google)
- [Groq](/docs/integrations/providers/groq)
- [IBM](/docs/integrations/providers/ibm)
- [MistralAI](/docs/integrations/providers/mistralai)
- [Nomic](/docs/integrations/providers/nomic)
