Comparing changes
base repository: caikit/caikit-nlp
base: v0.5.7
head repository: caikit/caikit-nlp
compare: v0.5.8
- 7 commits
- 3 files changed
- 3 contributors
Commits on Oct 8, 2024
- Enable using kwargs for selecting pad-to-max-length strategy for tokenizer in embeddings (commit 4f8a821)
  Signed-off-by: kcirred <16872435+kcirred@users.noreply.github.com>
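The commit above routes a padding-strategy choice through keyword arguments down to the tokenizer. As a rough illustration of that pattern (a self-contained sketch, not the actual caikit-nlp API — `tokenize` and `embed` here are toy stand-ins):

```python
# Hypothetical sketch of selecting a pad-to-max-length strategy via kwargs.
# "longest" pads to the longest sequence in the batch; "max_length" pads
# every sequence to a fixed length, mirroring HF-tokenizer-style options.

def tokenize(texts, padding="longest", max_length=8):
    """Toy stand-in for a tokenizer supporting two padding strategies."""
    ids = [[len(word) for word in text.split()] for text in texts]
    if padding == "max_length":            # pad everything to max_length
        target = max_length
    else:                                  # "longest": pad to batch maximum
        target = max(len(seq) for seq in ids)
    return [seq + [0] * (target - len(seq)) for seq in ids]

def embed(texts, **kwargs):
    """High-level entry point that simply forwards tokenizer kwargs."""
    return tokenize(texts, **kwargs)

batch = ["a bb ccc", "dd"]
print(embed(batch))                        # longest-in-batch padding
print(embed(batch, padding="max_length"))  # pad-to-max-length via kwarg
```

The point of the kwargs-based design is that callers can opt into fixed-length padding without the high-level function growing a new named parameter.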
Commits on Oct 15, 2024
- Added mocking to test tokenizer changes, directly pass padding strategy (commit f79d65b)
  Signed-off-by: kcirred <16872435+kcirred@users.noreply.github.com>
Commits on Oct 16, 2024
- Merge pull request #393 from kcirred/main (commit 0219d50)
  Enable using kwargs for selecting pad-to-max-length strategy for tokenizer in embeddings
Commits on Oct 17, 2024
- Update torch requirement from <2.5.0,>=2.3.1 to >=2.3.1,<2.6.0 (commit e471a2a)
  Updates the requirements on [torch](https://github.com/pytorch/pytorch) to permit the latest version.
  - [Release notes](https://github.com/pytorch/pytorch/releases)
  - [Changelog](https://github.com/pytorch/pytorch/blob/main/RELEASE.md)
  - [Commits](pytorch/pytorch@v2.3.1...v2.5.0)
  updated-dependencies: dependency-name: torch, dependency-type: direct:production
  Signed-off-by: dependabot[bot] <support@github.com>
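To make the pin change above concrete: the upper bound moved from `<2.5.0` to `<2.6.0`, so torch 2.5.x becomes installable while the 2.3.1 floor is unchanged. A minimal stdlib-only sketch of the range check (assuming plain `X.Y.Z` version strings, no pre-release tags):

```python
# Compare dotted release versions as integer tuples and check them
# against the old and new torch requirement ranges.

def ver(s):
    """Parse 'X.Y.Z' into a comparable tuple of ints."""
    return tuple(int(part) for part in s.split("."))

def in_range(v, lo="2.3.1", hi="2.6.0"):
    """True if lo <= v < hi under tuple comparison."""
    return ver(lo) <= ver(v) < ver(hi)

old_allows = in_range("2.5.0", hi="2.5.0")  # previous pin: >=2.3.1,<2.5.0
new_allows = in_range("2.5.0", hi="2.6.0")  # updated pin: >=2.3.1,<2.6.0
print(old_allows, new_allows)               # 2.5.0 rejected before, allowed now
```

Real dependency resolvers use PEP 440 specifier semantics rather than this toy tuple comparison, but the widened range behaves the same for plain releases.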
Commits on Oct 28, 2024
- [embeddings] extend kwargs to high level functions (commit 874982b)
  Signed-off-by: kcirred <16872435+kcirred@users.noreply.github.com>
Commits on Oct 30, 2024
- Merge pull request #398 from caikit/dependabot/pip/torch-gte-2.3.1-and-lt-2.6.0 (commit bfd3d4d)
  Update torch requirement from <2.5.0,>=2.3.1 to >=2.3.1,<2.6.0
Commits on Nov 7, 2024
- Merge pull request #400 from kcirred/main (commit 56b7e18)
  [embeddings] extend kwargs to functions that call _encode_with_retry
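PR #400 describes threading `**kwargs` through the functions that call `_encode_with_retry`. The shape of that pattern can be sketched as follows (illustrative names only — `run_embedding` and the retry logic here are assumptions, not the actual caikit-nlp internals):

```python
# Hypothetical sketch: a high-level function forwards **kwargs through a
# retry wrapper so callers can reach encoder/tokenizer options without
# every intermediate layer naming them explicitly.

def _encode_with_retry(encode_fn, *args, retries=2, **kwargs):
    """Call encode_fn, retrying on RuntimeError; kwargs pass straight through."""
    last_err = None
    for _ in range(retries + 1):
        try:
            return encode_fn(*args, **kwargs)
        except RuntimeError as err:
            last_err = err
    raise last_err

def run_embedding(texts, **kwargs):
    # Extending kwargs here is the change the PR title describes: options
    # such as padding="max_length" flow down untouched.
    fake_encode = lambda batch, **kw: [(text, kw) for text in batch]
    return _encode_with_retry(fake_encode, texts, **kwargs)

out = run_embedding(["hi"], padding="max_length")
print(out)  # each item carries the forwarded kwargs
```

Keeping the retry wrapper signature kwargs-transparent means new tokenizer options need no changes to the retry layer itself.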