The Wayback Machine - https://web.archive.org/web/20210520061212/https://github.com/topics/pretrain
Here are 9 public repositories matching this topic...
- Large Scale Chinese Corpus for NLP
- Large-scale Pre-training Corpus for Chinese (100G)
- BERT-based models (BERT, MTB, CP) for relation extraction. (Python, updated Nov 25, 2020)
- BERT-CCPoem is a BERT-based pre-trained model specifically for Chinese classical poetry. (Python, updated Jun 11, 2020)
- An official implementation for "UniVL: A Unified Video and Language Pre-Training Model for Multimodal Understanding and Generation". (Python, updated Apr 21, 2021)
- ALBERT trained on a Mongolian text corpus. (Jupyter Notebook, updated Jan 10, 2021)
- This repository provides a code solution for task 1 of the Data Fusion Contest. (Jupyter Notebook, updated Mar 31, 2021)
- Understanding "A Lite BERT": a Transformer approach for learning self-supervised language models. (WIP) (Python, updated Mar 16, 2020)
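Most of the repositories above build on BERT-style pretraining, whose core objective is masked language modeling: hide a fraction of the input tokens and train the model to reconstruct them. As a rough, self-contained illustration of the masking step (a sketch of the general technique, not code taken from any listed repository; the function name and 15% masking rate are illustrative defaults):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", mask_prob=0.15, seed=0):
    """BERT-style masking sketch: replace ~mask_prob of the tokens with
    a mask symbol and record the original values the model must predict.
    Illustrative only; real implementations add refinements such as
    sometimes keeping or randomizing the masked token."""
    rng = random.Random(seed)  # fixed seed for a reproducible example
    masked, targets = [], {}
    for i, tok in enumerate(tokens):
        if rng.random() < mask_prob:
            targets[i] = tok          # position -> token to reconstruct
            masked.append(mask_token)
        else:
            masked.append(tok)
    return masked, targets

masked, targets = mask_tokens("the cat sat on the mat".split())
```

The returned `targets` dict plays the role of the pretraining labels: the loss is computed only at the masked positions.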