This repository contains the Romanian version of DistilBERT.
A template for use in creating Autodistill Target Model packages.
A tutorial on how to prune the embedding layer of a language model and craft a suitable tokenizer
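Embedding pruning of the kind this tutorial covers usually means keeping only the rows for a reduced vocabulary and remapping token ids so the new tokenizer and the pruned matrix stay consistent. A minimal sketch, assuming the embedding matrix is a plain list of row vectors and `kept_token_ids` is the reduced vocabulary (both names are illustrative, not from the repository):

```python
def prune_embeddings(embedding_matrix, kept_token_ids):
    # Keep only the rows for tokens that survive the vocabulary cut,
    # and build a map from old token ids to new, contiguous ids so a
    # retargeted tokenizer can emit indices into the pruned matrix.
    kept = sorted(kept_token_ids)
    old_to_new = {old: new for new, old in enumerate(kept)}
    pruned = [embedding_matrix[old] for old in kept]
    return pruned, old_to_new

# Example: a toy 4-token embedding table reduced to 2 tokens.
matrix = [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6], [0.7, 0.8]]
pruned, remap = prune_embeddings(matrix, {0, 2})
```

In practice the same remapping is applied to the tokenizer's vocabulary file so that token strings resolve to the new row indices.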
summer internship project @ JetBrains Research
This is a fork of the distilling-step-by-step repository with the aim of creating a task-specific LLM distillation framework for healthcare.
Efficient Inference techniques implemented in PyTorch for computer vision.
Distillation examples: trying to make speaker recognition faster through different model compression techniques
Distillation of GANs with fairness constraints
Prompt engineering for developers
A PyTorch-based knowledge distillation toolkit for natural language processing
An implementation of the paper "Automated training of location-specific edge models for traffic counting"
An entrance test for a Computer Vision / NLP researcher job
A Series on Optimizing Transformer-Based Models
Alternus Vera Project
Learn how to make a smaller network match a large ensemble model, accelerating inference
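Several of the repositories above build on the classic knowledge-distillation recipe: the small student is trained to match the teacher's temperature-softened output distribution. A minimal sketch of that loss, assuming Hinton-style temperature scaling (written in pure Python for clarity; real projects would use framework tensor ops):

```python
import math

def softmax(logits, temperature=1.0):
    # Soften the logits with a temperature before normalising:
    # higher T spreads probability mass over more classes.
    exps = [math.exp(z / temperature) for z in logits]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(student_logits, teacher_logits, temperature=2.0):
    # KL divergence between the softened teacher and student
    # distributions, scaled by T^2 so gradient magnitudes stay
    # comparable as the temperature changes.
    p = softmax(teacher_logits, temperature)  # teacher (target)
    q = softmax(student_logits, temperature)  # student (prediction)
    kl = sum(pi * math.log(pi / qi) for pi, qi in zip(p, q))
    return temperature ** 2 * kl
```

The loss is zero when the student already reproduces the teacher's distribution, and it is typically mixed with the ordinary cross-entropy on hard labels.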
Optimising training, inference, and throughput of expensive ML models
【NCA】Learning Metric Space with Distillation for Large-Scale Multi-Label Text Classification
DINOv1 implementation in PyTorch
Deep Mutual Learning in PaddlePaddle