fix links #15566

Merged (4 commits, Jan 5, 2024)
2 changes: 1 addition & 1 deletion docs/docs/guides/debugging.md
@@ -656,6 +656,6 @@ agent.run("Who directed the 2023 film Oppenheimer and what is their age? What is

## Other callbacks

-`Callbacks` are what we use to execute any functionality within a component outside the primary component logic. All of the above solutions use `Callbacks` under the hood to log intermediate steps of components. There are a number of `Callbacks` relevant for debugging that come with LangChain out of the box, like the [FileCallbackHandler](/docs/modules/callbacks/how_to/filecallbackhandler). You can also implement your own callbacks to execute custom functionality.
+`Callbacks` are what we use to execute any functionality within a component outside the primary component logic. All of the above solutions use `Callbacks` under the hood to log intermediate steps of components. There are a number of `Callbacks` relevant for debugging that come with LangChain out of the box, like the [FileCallbackHandler](/docs/modules/callbacks/filecallbackhandler). You can also implement your own callbacks to execute custom functionality.

See here for more info on [Callbacks](/docs/modules/callbacks/), how to use them, and customize them.
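To make the custom-callback point above concrete, here is a minimal sketch of what such a handler might look like. The `TimingHandler` name and the print statements are illustrative only; the hook names come from LangChain's `BaseCallbackHandler` interface.

```python
from langchain.callbacks.base import BaseCallbackHandler


class TimingHandler(BaseCallbackHandler):
    """Illustrative handler that logs when each LLM call starts and ends."""

    def on_llm_start(self, serialized, prompts, **kwargs):
        print(f"LLM starting with {len(prompts)} prompt(s)")

    def on_llm_end(self, response, **kwargs):
        print("LLM call finished")


# Pass the handler wherever a `callbacks` argument is accepted, for example:
# chain.run("your input", callbacks=[TimingHandler()])
```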
8 changes: 4 additions & 4 deletions docs/docs/guides/deployments/index.mdx
@@ -20,11 +20,11 @@ This guide aims to provide a comprehensive overview of the requirements for depl

Understanding these components is crucial when assessing serving systems. LangChain integrates with several open-source projects designed to tackle these issues, providing a robust framework for productionizing your LLM applications. Some notable frameworks include:

-- [Ray Serve](/docs/ecosystem/integrations/ray_serve)
+- [Ray Serve](/docs/integrations/providers/ray_serve)
- [BentoML](https://github.com/bentoml/BentoML)
-- [OpenLLM](/docs/ecosystem/integrations/openllm)
-- [Modal](/docs/ecosystem/integrations/modal)
-- [Jina](/docs/ecosystem/integrations/jina#deployment)
+- [OpenLLM](/docs/integrations/providers/openllm)
+- [Modal](/docs/integrations/providers/modal)
+- [Jina](/docs/integrations/providers/jina)

These links will provide further information on each ecosystem, assisting you in finding the best fit for your LLM deployment needs.

17 changes: 9 additions & 8 deletions docs/docs/guides/safety/hugging_face_prompt_injection.ipynb
@@ -28,9 +28,7 @@
"cell_type": "code",
"execution_count": null,
"id": "9bdbfdc7c949a9c1",
"metadata": {
"collapsed": false
},
"metadata": {},
"outputs": [],
"source": [
"!pip install \"optimum[onnxruntime]\""
@@ -44,8 +42,7 @@
"ExecuteTime": {
"end_time": "2023-12-18T11:41:24.738278Z",
"start_time": "2023-12-18T11:41:20.842567Z"
-},
-"collapsed": false
+}
},
"outputs": [],
"source": [
@@ -80,7 +77,9 @@
"outputs": [
{
"data": {
"text/plain": "'hugging_face_injection_identifier'"
"text/plain": [
"'hugging_face_injection_identifier'"
]
},
"execution_count": 10,
"metadata": {},
@@ -119,7 +118,9 @@
"outputs": [
{
"data": {
"text/plain": "'Name 5 cities with the biggest number of inhabitants'"
"text/plain": [
"'Name 5 cities with the biggest number of inhabitants'"
]
},
"execution_count": 11,
"metadata": {},
@@ -374,7 +375,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.1"
"version": "3.10.1"
}
},
"nbformat": 4,
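For readers skimming this diff, the notebook above covers the prompt injection identifier whose tool name and sample query appear in the outputs. A minimal sketch of that usage, assuming `HuggingFaceInjectionIdentifier` from `langchain_experimental` and the dependencies installed by the notebook's `pip install` cells:

```python
from langchain_experimental.prompt_injection_identifier import (
    HuggingFaceInjectionIdentifier,
)

# Wraps a Hugging Face classifier as a tool that rejects suspected prompt injections.
injection_identifier = HuggingFaceInjectionIdentifier()

# A benign query is returned unchanged (matching the notebook output above).
print(injection_identifier.run("Name 5 cities with the biggest number of inhabitants"))

# An injection attempt ("Ignore all previous instructions...") would raise a ValueError instead.
```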
2 changes: 1 addition & 1 deletion docs/docs/guides/safety/index.mdx
@@ -4,6 +4,6 @@ One of the key concerns with using LLMs is that they may generate harmful or une

- [Amazon Comprehend moderation chain](/docs/guides/safety/amazon_comprehend_chain): Use [Amazon Comprehend](https://aws.amazon.com/comprehend/) to detect and handle Personally Identifiable Information (PII) and toxicity.
- [Constitutional chain](/docs/guides/safety/constitutional_chain): Prompt the model with a set of principles which should guide the model behavior.
-- [Hugging Face prompt injection identification](/docs/guides/safety/huggingface_prompt_injection_identification): Detect and handle prompt injection attacks.
+- [Hugging Face prompt injection identification](/docs/guides/safety/hugging_face_prompt_injection): Detect and handle prompt injection attacks.
- [Logical Fallacy chain](/docs/guides/safety/logical_fallacy_chain): Checks the model output against logical fallacies to correct any deviation.
- [Moderation chain](/docs/guides/safety/moderation): Check if any output text is harmful and flag it.
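As a small illustration of the last item in that list, a moderation chain can be run over any text before it reaches users. This is only a sketch: it assumes an OpenAI API key is configured and that `OpenAIModerationChain` is importable from `langchain.chains`, and the input string is made up.

```python
from langchain.chains import OpenAIModerationChain

# Flags text that violates OpenAI's usage policies; error=True would raise instead of flagging.
moderation_chain = OpenAIModerationChain()
print(moderation_chain.run("This is a perfectly harmless sentence."))
```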
2 changes: 1 addition & 1 deletion docs/docs/integrations/callbacks/streamlit.md
@@ -7,7 +7,7 @@
[![Open in GitHub Codespaces](https://github.com/codespaces/badge.svg)](https://codespaces.new/langchain-ai/streamlit-agent?quickstart=1)

In this guide we will demonstrate how to use `StreamlitCallbackHandler` to display the thoughts and actions of an agent in an
-interactive Streamlit app. Try it out with the running app below using the [MRKL agent](/docs/modules/agents/how_to/mrkl/):
+interactive Streamlit app. Try it out with the running app below using the MRKL agent:

<iframe loading="lazy" src="https://langchain-mrkl.streamlit.app/?embed=true&embed_options=light_theme"
style={{ width: 100 + '%', border: 'none', marginBottom: 1 + 'rem', height: 600 }}
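A minimal sketch of the pattern this guide describes follows. It assumes an OpenAI API key and the `duckduckgo-search` package are available; the specific tool and agent choices are illustrative.

```python
import streamlit as st
from langchain.agents import AgentType, initialize_agent, load_tools
from langchain.callbacks import StreamlitCallbackHandler
from langchain.llms import OpenAI

llm = OpenAI(temperature=0, streaming=True)
tools = load_tools(["ddg-search"])
agent = initialize_agent(tools, llm, agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION)

if prompt := st.chat_input():
    st.chat_message("user").write(prompt)
    with st.chat_message("assistant"):
        # Render the agent's thoughts and tool calls into this container as they happen.
        st_callback = StreamlitCallbackHandler(st.container())
        response = agent.run(prompt, callbacks=[st_callback])
        st.write(response)
```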
4 changes: 2 additions & 2 deletions docs/docs/integrations/document_loaders/docugami.ipynb
Expand Up @@ -346,7 +346,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"We can use a [self-querying retriever](/docs/modules/data_connection/retrievers/how_to/self_query/) to improve our query accuracy, using this additional metadata:"
"We can use a [self-querying retriever](/docs/modules/data_connection/retrievers/self_query/) to improve our query accuracy, using this additional metadata:"
]
},
{
@@ -656,7 +656,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.9.16"
"version": "3.10.1"
}
},
"nbformat": 4,
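Regarding the self-querying retriever referenced in the Docugami notebook above, a minimal sketch of the pattern is shown here. The metadata field, content description, and `vectorstore` variable are illustrative placeholders rather than Docugami's actual schema, and the feature also needs the `lark` package installed.

```python
from langchain.chains.query_constructor.base import AttributeInfo
from langchain.chat_models import ChatOpenAI
from langchain.retrievers.self_query.base import SelfQueryRetriever

# Illustrative metadata schema; real field names come from your loaded documents.
metadata_field_info = [
    AttributeInfo(name="source", description="Document the chunk came from", type="string"),
]
document_content_description = "Excerpts from commercial contracts"

retriever = SelfQueryRetriever.from_llm(
    ChatOpenAI(temperature=0),
    vectorstore,  # an existing vector store built from the loaded documents (assumed)
    document_content_description,
    metadata_field_info,
)
docs = retriever.get_relevant_documents("What are the termination clauses?")
```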
4 changes: 2 additions & 2 deletions docs/docs/integrations/document_loaders/psychic.ipynb
@@ -5,7 +5,7 @@
"metadata": {},
"source": [
"# Psychic\n",
"This notebook covers how to load documents from `Psychic`. See [here](/docs/ecosystem/integrations/psychic) for more details.\n",
"This notebook covers how to load documents from `Psychic`. See [here](/docs/integrations/providers/psychic) for more details.\n",
"\n",
"## Prerequisites\n",
"1. Follow the Quick Start section in [this document](/docs/ecosystem/integrations/psychic)\n",
@@ -118,7 +118,7 @@
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.3"
"version": "3.10.1"
},
"vscode": {
"interpreter": {