langchain_ibm[patch] update docstring, dependencies, tests #18386

Merged
22 changes: 1 addition & 21 deletions libs/partners/ibm/README.md
@@ -1,6 +1,6 @@
# langchain-ibm

-This package provides the integration between LangChain and IBM Watson AI through the `ibm-watsonx-ai` SDK.
+This package provides the integration between LangChain and IBM watsonx.ai through the `ibm-watsonx-ai` SDK.

## Installation

@@ -10,10 +10,6 @@ To use the `langchain-ibm` package, follow these installation steps:
pip install langchain-ibm
```
-
-
-
-

## Usage

### Setting up
@@ -44,15 +40,10 @@ In alternative, you can set the environment variable in your terminal.
set WATSONX_APIKEY=your_ibm_api_key
```
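The `set` command above is Windows shell syntax. As a point of reference, a minimal sketch of doing the same from Python (not part of this diff; the prompt string is illustrative):

```python
import os
from getpass import getpass

# Prompt for the key at runtime so it never lands in shell history or
# source control; WatsonxLLM picks up WATSONX_APIKEY automatically.
os.environ["WATSONX_APIKEY"] = getpass("IBM watsonx.ai API key: ")
```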
-
-
-
-

### Loading the model

You might need to adjust model parameters for different models or tasks. For more details on the parameters, refer to IBM's [documentation](https://ibm.github.io/watsonx-ai-python-sdk/fm_model.html#metanames.GenTextParamsMetaNames).
-

```python
parameters = {
"decoding_method": "sample",
Expand Down Expand Up @@ -83,7 +74,6 @@ watsonx_llm = WatsonxLLM(
- You need to specify the model you want to use for inferencing through `model_id`. You can find the list of available models [here](https://ibm.github.io/watsonx-ai-python-sdk/fm_model.html#ibm_watsonx_ai.foundation_models.utils.enums.ModelTypes).
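The `parameters` block above is cut off by the diff view. A minimal sketch of the full loading pattern; the parameter values are hypothetical, and `url`/`project_id` are placeholders rather than values from this PR:

```python
from langchain_ibm import WatsonxLLM

# Hypothetical sampling parameters; adjust per model and task, and see
# IBM's GenTextParamsMetaNames documentation for the supported keys.
parameters = {
    "decoding_method": "sample",
    "max_new_tokens": 100,
    "min_new_tokens": 1,
    "temperature": 0.5,
    "top_k": 50,
    "top_p": 1,
}

# url and project_id below are placeholders, not values from this PR.
watsonx_llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="https://us-south.ml.cloud.ibm.com",
    project_id="PASTE YOUR PROJECT_ID HERE",
    params=parameters,
)
```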
-


Alternatively you can use Cloud Pak for Data credentials. For more details, refer to IBM's [documentation](https://ibm.github.io/watsonx-ai-python-sdk/setup_cpd.html).

```python
@@ -99,9 +89,6 @@ watsonx_llm = WatsonxLLM(
)
```
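The Cloud Pak for Data example is likewise truncated. A hedged sketch of what such a configuration typically looks like; every value below is a placeholder, and the `instance_id`/`version` values are assumptions, so consult IBM's CPD setup documentation for your cluster:

```python
from langchain_ibm import WatsonxLLM

watsonx_llm = WatsonxLLM(
    model_id="ibm/granite-13b-instruct-v2",
    url="PASTE YOUR CPD CLUSTER URL HERE",
    username="PASTE YOUR USERNAME HERE",
    password="PASTE YOUR PASSWORD HERE",
    instance_id="openshift",  # assumed value for an OpenShift-based CPD install
    version="4.8",            # assumed CPD version
    project_id="PASTE YOUR PROJECT_ID HERE",
)
```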
-
-
-

### Create a Chain

Create `PromptTemplate` objects which will be responsible for creating a random question.
@@ -123,10 +110,6 @@ response = llm_chain.invoke("dog")
print(response)
```
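The chain setup above is also cut off. A minimal sketch of the pattern using the LCEL pipe operator, assuming the `watsonx_llm` object from the loading sketch; the prompt text is illustrative:

```python
from langchain_core.prompts import PromptTemplate

prompt = PromptTemplate.from_template(
    "Generate a random question about {topic}: Question: "
)

# Because the prompt has exactly one input variable, invoke() accepts a
# bare string and maps it to {"topic": "dog"}.
llm_chain = prompt | watsonx_llm
response = llm_chain.invoke("dog")
print(response)
```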
-
-
-
-

### Calling the Model Directly
To obtain completions, you can call the model directly using a string prompt.

@@ -149,9 +132,6 @@ response = watsonx_llm.generate(
print(response)
```
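The `generate` call above is truncated; for reference, a sketch of both direct-call styles (prompt strings are illustrative):

```python
# Single string prompt:
response = watsonx_llm.invoke("Who is man's best friend?")
print(response)

# Several prompts at once; generate() returns an LLMResult holding one
# list of generations per input prompt.
results = watsonx_llm.generate(
    [
        "What is generative AI?",
        "What is a loan and how does it work?",
    ]
)
print(results)
```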
-
-
-

### Streaming the Model output

You can stream the model output.
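The streaming example itself is truncated in this view; a minimal sketch (prompt text illustrative):

```python
# stream() yields the completion incrementally as string chunks.
for chunk in watsonx_llm.stream(
    "Describe your favorite breed of dog and why it is your favorite."
):
    print(chunk, end="", flush=True)
```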
14 changes: 13 additions & 1 deletion libs/partners/ibm/langchain_ibm/llms.py
@@ -16,7 +16,7 @@ class WatsonxLLM(BaseLLM):
"""
IBM watsonx.ai large language models.

-To use, you should have ``ibm_watsonx_ai`` python package installed,
+To use, you should have ``langchain_ibm`` python package installed,
and the environment variable ``WATSONX_APIKEY`` set with your API key, or pass
it as a named parameter to the constructor.

@@ -103,6 +103,18 @@ def is_lc_serializable(cls) -> bool:

@property
def lc_secrets(self) -> Dict[str, str]:
"""A map of constructor argument names to secret ids.

For example:
{
"url": "WATSONX_URL",
"apikey": "WATSONX_APIKEY",
"token": "WATSONX_TOKEN",
"password": "WATSONX_PASSWORD",
"username": "WATSONX_USERNAME",
"instance_id": "WATSONX_INSTANCE_ID",
}
"""
return {
"url": "WATSONX_URL",
"apikey": "WATSONX_APIKEY",
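For context on what this mapping buys: when a serializable object is dumped with langchain-core's `load.dumps`, constructor fields named in `lc_secrets` are recorded by their secret ids rather than their plaintext values. A hedged sketch, assuming `llm` is a WatsonxLLM built with an `apikey` as in the earlier example:

```python
from langchain_core.load import dumps

# The dumped JSON stores the id "WATSONX_APIKEY" in place of the real
# key, so the credential never lands in the serialized payload.
serialized = dumps(llm)
print("WATSONX_APIKEY" in serialized)           # True: the secret id is recorded
print("PASTE YOUR API KEY HERE" in serialized)  # False: plaintext is masked
```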
37 changes: 19 additions & 18 deletions libs/partners/ibm/poetry.lock

Some generated files are not rendered by default.

26 changes: 11 additions & 15 deletions libs/partners/ibm/pyproject.toml
@@ -1,6 +1,6 @@
[tool.poetry]
name = "langchain-ibm"
version = "0.1.0"
version = "0.1.1"
description = "An integration package connecting IBM watsonx.ai and LangChain"
authors = ["IBM"]
readme = "README.md"
@@ -12,21 +12,20 @@ license = "MIT"

[tool.poetry.dependencies]
python = ">=3.10,<4.0"
langchain-core = "^0.1.26"
ibm-watsonx-ai = "^0.1.8"
langchain-core = "^0.1.27"
ibm-watsonx-ai = "^0.2.0"

[tool.poetry.group.test]
optional = true

[tool.poetry.group.test.dependencies]
pytest = "^7.3.0"
freezegun = "^1.2.2"
pytest-mock = "^3.10.0"
ibm-watsonx-ai = "^0.1.8"

Collaborator comment: we don't actually need to specify ibm-watsonx-ai in any of the group dependencies, since it's already a hard dependency of the package.

pytest-mock = "^3.10.0"
syrupy = "^4.0.2"
pytest-watcher = "^0.3.4"
pytest-asyncio = "^0.21.1"
-langchain-core = {path = "../../core", develop = true}
+langchain-core = { path = "../../core", develop = true }
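For context on the collaborator comment above, a sketch of the reasoning with an abridged, hypothetical pyproject layout: anything under `[tool.poetry.dependencies]` is a hard dependency and is installed for every group, so repeating it inside a group is redundant at best and a version-pin conflict at worst.

```toml
[tool.poetry.dependencies]
python = ">=3.10,<4.0"
ibm-watsonx-ai = "^0.2.0"    # hard dependency: always installed

[tool.poetry.group.test.dependencies]
pytest = "^7.3.0"
# ibm-watsonx-ai = "^0.1.8"  # redundant here; its stale pin could even
#                            # conflict with the hard dependency above
```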

[tool.poetry.group.codespell]
optional = true
@@ -38,7 +37,6 @@ codespell = "^2.2.0"
optional = true

[tool.poetry.group.test_integration.dependencies]
ibm-watsonx-ai = "^0.1.8"

[tool.poetry.group.lint]
optional = true
@@ -48,29 +46,27 @@ ruff = "^0.1.5"

[tool.poetry.group.typing.dependencies]
mypy = "^0.991"
-langchain-core = {path = "../../core", develop = true}
+langchain-core = { path = "../../core", develop = true }
types-requests = "^2"

[tool.poetry.group.dev]
optional = true

[tool.poetry.group.dev.dependencies]
-langchain-core = {path = "../../core", develop = true}
+langchain-core = { path = "../../core", develop = true }

[tool.ruff]
select = [
-  "E", # pycodestyle
-  "F", # pyflakes
-  "I", # isort
+    "E", # pycodestyle
+    "F", # pyflakes
+    "I", # isort
]

[tool.mypy]
disallow_untyped_defs = "True"

[tool.coverage.run]
-omit = [
-    "tests/*",
-]
+omit = ["tests/*"]

[build-system]
requires = ["poetry-core>=1.0.0"]