The Wayback Machine - https://web.archive.org/web/20210728181955/https://github.com/topics/pre-trained-language-models
# pre-trained-language-models
Here are 18 public repositories matching this topic...
RoBERTa Chinese pre-trained models: RoBERTa for Chinese
Updated Jul 8, 2021 · Python
Top2Vec learns jointly embedded topic, document and word vectors.
Updated Jul 9, 2021 · Python
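Top2Vec's core idea can be illustrated without the library itself: in a joint embedding space, a dense cluster of document vectors yields a topic vector (their centroid), and the word vectors nearest that centroid describe the topic. A minimal NumPy sketch with hypothetical 2-D vectors (the real model works in high-dimensional doc2vec or transformer embedding spaces):

```python
import numpy as np

def topic_from_cluster(doc_vecs, word_vecs, vocab, n_words=2):
    """Given document vectors from one dense cluster, take their centroid
    as the topic vector and return the nearest word vectors as topic words."""
    topic_vec = doc_vecs.mean(axis=0)
    topic_vec = topic_vec / np.linalg.norm(topic_vec)
    # Cosine similarity between the topic vector and every word vector.
    sims = word_vecs @ topic_vec / np.linalg.norm(word_vecs, axis=1)
    top = np.argsort(-sims)[:n_words]
    return [vocab[i] for i in top], topic_vec

# Toy joint embedding space (hypothetical vectors, for illustration only).
vocab = ["sports", "finance", "music"]
word_vecs = np.array([[1.0, 0.1], [0.0, 1.0], [-1.0, 0.2]])
doc_vecs = np.array([[0.9, 0.2], [1.1, 0.0], [0.95, 0.15]])  # a "sports" cluster

words, _ = topic_from_cluster(doc_vecs, word_vecs, vocab, n_words=1)
```

Here the cluster centroid sits closest to the "sports" word vector, so that word labels the topic.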
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Must-read papers on prompt-based tuning for pre-trained language models.
A Chinese keyphrase extraction method based on pre-trained models (Chinese-language code for the paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model")
Updated May 17, 2020 · Python
The code of our paper "SIFRank: A New Baseline for Unsupervised Keyphrase Extraction Based on Pre-trained Language Model"
Updated Mar 20, 2021 · Python
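Stripped to its essence, SIFRank's ranking step scores candidate phrases by embedding similarity to the whole document. A toy sketch, assuming phrase and document vectors are already computed (the paper uses SIF-weighted ELMo embeddings; the 2-D vectors here are hypothetical):

```python
import numpy as np

def rank_keyphrases(doc_vec, phrase_vecs, phrases):
    """Rank candidate phrases by cosine similarity to the document embedding."""
    doc_vec = doc_vec / np.linalg.norm(doc_vec)
    sims = phrase_vecs @ doc_vec / np.linalg.norm(phrase_vecs, axis=1)
    order = np.argsort(-sims)
    return [(phrases[i], float(sims[i])) for i in order]

# Hypothetical embeddings standing in for SIF-weighted sentence vectors.
phrases = ["keyphrase extraction", "the weather", "language model"]
phrase_vecs = np.array([[0.9, 0.4], [0.1, -0.8], [0.3, 0.9]])
doc_vec = np.array([0.8, 0.5])

ranking = rank_keyphrases(doc_vec, phrase_vecs, phrases)
```

Phrases whose embeddings point in the same direction as the document embedding rank highest; off-topic candidates fall to the bottom.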
We tackle a company-name recognition task with small-scale, low-quality training data, then apply techniques that speed up training and improve prediction performance with minimal manual effort. Our methods combine lightweight pre-trained models such as ALBERT-small and ELECTRA-small with a financial corpus, knowledge distillation, and multi-stage learning. As a result, we raise recall on company-name recognition from 0.73 to 0.92 and run 4x faster than a BERT-BiLSTM-CRF model.
Updated Aug 10, 2020 · Python
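The knowledge-distillation component mentioned above trains a small student model to match a large teacher's softened output distribution. A minimal sketch of the standard temperature-scaled distillation loss (Hinton et al.'s formulation, not necessarily the exact loss this repository uses):

```python
import numpy as np

def softmax(z, T=1.0):
    """Numerically stable softmax with temperature T."""
    z = z / T
    e = np.exp(z - z.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

def distillation_loss(student_logits, teacher_logits, T=2.0):
    """KL divergence between temperature-softened teacher and student
    distributions, scaled by T^2 so gradients stay comparable across T."""
    p = softmax(teacher_logits, T)   # soft teacher targets
    q = softmax(student_logits, T)   # student predictions
    return float(np.sum(p * (np.log(p) - np.log(q))) * T * T)

teacher = np.array([4.0, 1.0, 0.5])   # hypothetical logits
student = np.array([3.5, 1.2, 0.3])
loss = distillation_loss(student, teacher)
```

The loss is zero when the student reproduces the teacher's logits exactly and grows as their softened distributions diverge; in practice it is mixed with the usual cross-entropy on hard labels.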
Zero-shot Transfer Learning from English to Arabic
Updated May 18, 2021 · Python
Updated Oct 15, 2020 · Python
Zero-shot Transfer Learning from English to Arabic
Updated Apr 14, 2021 · Python
Calculating FLOPs of Pre-trained Models in NLP
Updated Mar 29, 2021 · Python
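FLOP counting for pre-trained NLP models mostly reduces to counting matrix multiplications. A rough sketch for a dense layer and a self-attention block (conventions vary between tools; this one counts one multiply plus one add per multiply-accumulate and ignores softmax, bias, and layer-norm costs):

```python
def linear_flops(in_features, out_features, seq_len=1):
    """FLOPs for a dense layer applied to seq_len tokens: each output
    element costs in_features multiplies and in_features adds."""
    return 2 * in_features * out_features * seq_len

def attention_flops(seq_len, hidden):
    """Rough FLOPs for one self-attention block: the Q/K/V/output
    projections plus the two seq_len x seq_len matmuls
    (attention scores and the weighted sum over values)."""
    proj = 4 * linear_flops(hidden, hidden, seq_len)
    scores = 2 * seq_len * seq_len * hidden   # Q @ K^T
    mix = 2 * seq_len * seq_len * hidden      # attn @ V
    return proj + scores + mix
```

For short sequences the projection terms dominate; the quadratic score/mix terms take over as seq_len grows past the hidden size.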
Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization (ACL 2021)
Updated Jul 28, 2021 · Python
Usage example for the AllenNLP BiDAF pre-trained model
Updated Oct 12, 2018 · Jupyter Notebook
Updated Dec 12, 2020 · Python
IndoELECTRA: Pre-Trained Language Model for Indonesian Language Understanding
Question Answering Chatbot with DistilRoBERTa Sentence Embeddings, Dialogflow and Ngrok
Updated Jun 13, 2021 · Jupyter Notebook
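The retrieval step of such a chatbot can be sketched as nearest-neighbor search over precomputed question embeddings, with a similarity threshold as a fallback. The vectors below are hypothetical stand-ins for DistilRoBERTa sentence embeddings:

```python
import numpy as np

def answer(query_vec, faq_vecs, faq_answers, threshold=0.5):
    """Return the stored answer whose question embedding is most similar
    to the query embedding; fall back when nothing is close enough."""
    q = query_vec / np.linalg.norm(query_vec)
    sims = faq_vecs @ q / np.linalg.norm(faq_vecs, axis=1)
    best = int(np.argmax(sims))
    if sims[best] < threshold:
        return "Sorry, I don't know that one."
    return faq_answers[best]

# Hypothetical sentence embeddings for two stored FAQ questions.
faq_vecs = np.array([[0.9, 0.1], [0.1, 0.9]])
faq_answers = ["We open at 9am.", "Shipping takes 3 days."]

reply = answer(np.array([0.8, 0.2]), faq_vecs, faq_answers)
```

In a real deployment the FAQ embeddings are computed once offline, only the incoming query is encoded per request, and the threshold is tuned on held-out paraphrases.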
Updated Aug 15, 2019 · Python
Create "entropic" pre-training sequences for neural language models
Updated Apr 21, 2021 · Python