The Wayback Machine - https://web.archive.org/web/20220318174543/https://github.com/topics/pre-trained-language-models
# pre-trained-language-models
Here are 27 public repositories matching this topic.
RoBERTa pre-trained models for Chinese (RoBERTa中文预训练模型)
Updated Jul 8, 2021 · Python
Must-read papers on prompt-based tuning for pre-trained language models.
Top2Vec learns jointly embedded topic, document and word vectors.
Updated Mar 16, 2022 · Python
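Top2Vec's core idea is that documents and words share one embedding space: topics are dense clusters of document vectors, and a topic's descriptive words are the word vectors nearest the cluster centroid. A toy numpy sketch of that last step, with made-up 2-D vectors standing in for the learned embeddings:

```python
import numpy as np

# Toy shared embedding space (made-up 2-D vectors for illustration).
word_vecs = {
    "bert": np.array([0.9, 0.1]),
    "prompt": np.array([0.8, 0.2]),
    "soccer": np.array([0.1, 0.9]),
    "goal": np.array([0.2, 0.8]),
}
doc_vecs = np.array([[0.85, 0.15], [0.9, 0.2], [0.15, 0.85]])

# A "topic" vector is the centroid of a cluster of document vectors
# (here we assume the first two documents form one cluster).
topic_vec = doc_vecs[:2].mean(axis=0)

def nearest_words(topic, vecs, k=2):
    """Rank words by cosine similarity to the topic vector."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sorted(vecs, key=lambda w: cos(topic, vecs[w]), reverse=True)[:k]

top_words = nearest_words(topic_vec, word_vecs)  # NLP-ish words rank first
```

The real library learns the joint space itself (e.g. via doc2vec) and finds clusters automatically; this only illustrates how topic words fall out of vector proximity.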
An Open-Source Framework for Prompt-Learning.
Updated Mar 18, 2022 · Python
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
A novel method for tuning language models. Code and datasets for the paper "GPT Understands, Too".
Updated Nov 30, 2021 · Python
Keyphrase or keyword extraction: a Chinese keyphrase-extraction method based on pre-trained models (the Chinese-language implementation of the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
Updated May 17, 2020 · Python
A PyTorch-based model pruning toolkit for pre-trained language models
Updated Mar 4, 2022 · Python
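The simplest building block such a pruning toolkit provides is unstructured magnitude pruning: zero out the smallest-magnitude weights until a target sparsity is reached. A minimal numpy sketch of that step (toolkit-independent; the function name and example weights are illustrative):

```python
import numpy as np

def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries until `sparsity`
    fraction of the weights are zero (unstructured pruning)."""
    flat = np.abs(weights).flatten()
    k = int(sparsity * flat.size)
    if k == 0:
        return weights.copy()
    # k-th smallest magnitude becomes the pruning threshold.
    threshold = np.partition(flat, k - 1)[k - 1]
    mask = np.abs(weights) > threshold
    return weights * mask

w = np.array([[0.5, -0.01], [0.03, -0.8]])
pruned = magnitude_prune(w, 0.5)  # half the entries are zeroed
```

Real toolkits add structured variants (pruning whole heads or layers) and retraining loops, but the masking idea is the same.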
The code of our paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model"
Updated Mar 20, 2021 · Python
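SIFRank-style methods score candidate phrases by how close their embedding lies to the whole document's embedding under a pre-trained language model. A minimal sketch of that ranking step, using made-up averaged word vectors in place of the real ELMo/SIF-weighted embeddings:

```python
import numpy as np

# Toy word vectors standing in for pre-trained LM embeddings.
vecs = {
    "language": np.array([1.0, 0.0, 0.2]),
    "model": np.array([0.9, 0.1, 0.1]),
    "coffee": np.array([0.0, 1.0, 0.0]),
}

def embed(phrase):
    """Phrase embedding = mean of its word vectors (SIF weighting omitted)."""
    return np.mean([vecs[w] for w in phrase.split()], axis=0)

def rank_keyphrases(doc_words, candidates):
    """Score each candidate by cosine similarity to the document embedding."""
    doc = np.mean([vecs[w] for w in doc_words], axis=0)
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    return sorted(candidates, key=lambda p: cos(embed(p), doc), reverse=True)

doc = ["language", "model", "language"]
ranking = rank_keyphrases(doc, ["language model", "coffee"])
```

The paper's contribution is in how the embeddings are built (contextual, SIF-weighted); this only shows why on-topic phrases rank above off-topic ones.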
We tackle a company-name recognition task with small-scale, low-quality training data, applying techniques that speed up training and improve prediction performance with minimal manual effort. Our methods include lightweight pre-trained models such as ALBERT-small and ELECTRA-small trained on a financial corpus, knowledge distillation, and multi-stage learning. As a result, we raise the recall of company-name recognition from 0.73 to 0.92 while running four times faster than a BERT-BiLSTM-CRF model.
Updated Aug 10, 2020 · Python
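Knowledge distillation, one of the techniques this entry relies on, trains a small student to match a large teacher's temperature-softened output distribution. A minimal numpy sketch of the soft-label loss from Hinton et al.; the logits and temperature here are made-up illustration values:

```python
import numpy as np

def softmax(logits, T=1.0):
    """Temperature-softened softmax (T > 1 flattens the distribution)."""
    z = np.exp((logits - logits.max()) / T)
    return z / z.sum()

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL(teacher || student) on softened distributions, scaled by T^2."""
    p = softmax(teacher_logits, T)  # soft teacher targets
    q = softmax(student_logits, T)
    return float(T * T * np.sum(p * (np.log(p) - np.log(q))))

teacher = np.array([4.0, 1.0, 0.5])   # large model's logits (illustrative)
student = np.array([3.5, 1.2, 0.4])   # small model's logits (illustrative)
loss = distillation_loss(student, teacher)
```

In practice this term is mixed with the ordinary hard-label cross-entropy; the loss goes to zero exactly when the student reproduces the teacher's distribution.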
Must-read papers on improving efficiency for pre-trained language models.
Code for the ICLR 2022 paper "Differentiable Prompt Makes Pre-trained Language Models Better Few-shot Learners"
Updated Jan 24, 2022 · Python
Zero-shot Transfer Learning from English to Arabic
Updated Nov 19, 2021 · Python
Code for EMNLP 2021 main conference paper "Dynamic Knowledge Distillation for Pre-trained Language Models"
Updated Mar 10, 2022 · Python
Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization (ACL 2021)
Updated Jul 28, 2021 · Python
Calculating FLOPs of Pre-trained Models in NLP
Updated Mar 29, 2021 · Python
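A back-of-the-envelope version of what such a FLOPs calculator computes: a dense layer mapping n tokens from width d_in to width d_out costs roughly 2·n·d_in·d_out floating-point operations (one multiply and one add per weight per token). A hedged sketch for a single transformer feed-forward block, with illustrative BERT-base-like sizes:

```python
def matmul_flops(n, d_in, d_out):
    """2 * n * d_in * d_out: one multiply-add per weight per token."""
    return 2 * n * d_in * d_out

def ffn_flops(n_tokens, d_model, d_ff):
    """The two dense layers of a transformer feed-forward block
    (activation functions and bias terms ignored in this estimate)."""
    return (matmul_flops(n_tokens, d_model, d_ff)
            + matmul_flops(n_tokens, d_ff, d_model))

# BERT-base-like sizes: d_model=768, d_ff=3072, a 128-token sequence.
flops = ffn_flops(128, 768, 3072)  # ~1.2 GFLOPs for one FFN block
```

A full tool also counts attention projections, the QK^T and attention-value matmuls, and embeddings; this shows only the dominant dense-layer term.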
Zero-shot Transfer Learning from English to Arabic
Updated Apr 14, 2021 · Python
Code for CascadeBERT, Findings of EMNLP 2021
Updated Sep 27, 2021 · Python
The source code for the project FormalWriter.com
Updated Feb 14, 2022 · Cython
Usage example for the AllenNLP BiDAF pre-trained model
Updated Oct 12, 2018 · Jupyter Notebook
Question Answering Chatbot with DistilRoBERTa Sentence Embeddings, Dialogflow and Ngrok
Updated Jun 13, 2021 · Jupyter Notebook
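The retrieval core of such a chatbot: encode the user's question and a set of stored questions with a sentence-embedding model, then reply with the answer attached to the most similar stored question. A toy sketch with hand-made vectors in place of the real DistilRoBERTa encoder (the FAQ entries and embeddings are invented for illustration):

```python
import numpy as np

# Stored FAQ entries with made-up 2-D sentence embeddings; a real system
# would encode the question text with a sentence-transformer model.
faq = [
    ("What are your opening hours?", np.array([0.9, 0.1]), "9am-5pm, Mon-Fri."),
    ("Where are you located?", np.array([0.1, 0.9]), "123 Example Street."),
]

def answer(query_vec):
    """Return the answer of the FAQ entry most similar to the query."""
    def cos(a, b):
        return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
    best = max(faq, key=lambda entry: cos(query_vec, entry[1]))
    return best[2]

reply = answer(np.array([0.85, 0.2]))  # embedding of an "hours" question
```

The Dialogflow and Ngrok pieces in the repo handle intent routing and tunneling the webhook; the embedding lookup above is the part the sentence encoder enables.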
Evaluation of the ability of GPT-2 to learn human biases in implicit causality.
Updated Mar 4, 2022 · Jupyter Notebook
IndoELECTRA: Pre-Trained Language Model for Indonesian Language Understanding
Updated Aug 15, 2019 · Python
This repository contains the source code of the paper: "PTMT: Multi-Target Stance Detection with PTM-enhanced Multi-Task Learning"