All About Chinese NER (中文NER的那些事儿)
Updated Jul 6, 2022 · Python
This repository contains the official release of the model "BanglaBERT" and the associated downstream finetuning code and datasets introduced in the paper titled "BanglaBERT: Language Model Pretraining and Benchmarks for Low-Resource Language Understanding Evaluation in Bangla", accepted in Findings of the Annual Conference of the North American Chap…
Can we use explanations to improve hate speech models? Our paper accepted at AAAI 2021 tries to explore that question.
Experiments for automated personality detection using Language Models and psycholinguistic features on various famous personality datasets including the Essays dataset (Big-Five)
classy is a simple-to-use library for building high-performance Machine Learning models in NLP.
BERTMap: A BERT-Based Ontology Alignment System
Fine Tuning BERT for Text Classification and Question Answering Using TensorFlow & PyTorch Frameworks
Example Of Fine-Tuning BERT For Named-Entity Recognition Task And Preparing For Cloud Deployment Using Flask, React, And Docker
BERT multi-class text classification (sentiment analysis), and BERT-BiLSTM-CRF sequence labeling (a delivery-address tagging task)
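A sequence-labeling model like the BERT-BiLSTM-CRF tagger above emits one BIO tag per token; turning those tags into entity spans is a small, model-independent step. A minimal sketch (the tag labels `PROV`/`CITY` are hypothetical examples, not from the repo):

```python
# Decode a BIO tag sequence into (label, start, end) entity spans,
# end-exclusive. Stray I- tags that don't continue a matching B- span
# are treated like O here; real decoders vary on this choice.

def bio_to_spans(tags):
    """Convert BIO tags into (label, start, end) spans."""
    spans = []
    start, label = None, None
    for i, tag in enumerate(tags):
        if tag.startswith("B-"):
            if start is not None:          # close the previous span
                spans.append((label, start, i))
            start, label = i, tag[2:]      # open a new span
        elif tag.startswith("I-") and label == tag[2:]:
            continue                       # extend the current span
        else:                              # "O" or a mismatched I- tag
            if start is not None:
                spans.append((label, start, i))
            start, label = None, None
    if start is not None:                  # close a span ending at EOS
        spans.append((label, start, len(tags)))
    return spans

# Example on a toy address-style tag sequence:
print(bio_to_spans(["B-PROV", "I-PROV", "O", "B-CITY"]))
# → [('PROV', 0, 2), ('CITY', 3, 4)]
```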
Super Tickets in Pre-Trained Language Models: From Model Compression to Improving Generalization (ACL 2021)
Crisis Dataset for Benchmarks Experiments
This repository contains code to reproduce the results in our paper "Transformers are Short Text Classifiers: A Study of Inductive Short Text Classifiers on Benchmarks and Real-world Datasets".
Code for the FullStack AI Live Coding Series- Part 1 (CellStrat AI Lab)
Bimodal and Unimodal Sentiment Analysis of Internet Memes (Image+Text)
Simple Text Classification [WIP]
Class for Aspect-term extraction and Aspect-based sentiment analysis with BERT and Adapters
BERT fine-tuning for POS tagging task (google's tensorflow)
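Fine-tuning BERT for token-level tasks like POS tagging requires aligning word-level labels to subword tokens, since WordPiece tokenization splits words into pieces. A common convention is to label only each word's first piece and mask the rest with `-100` so the loss ignores them; a minimal sketch under that assumption (the `##` continuation prefix is the WordPiece style):

```python
# Align word-level tags to subword tokens for BERT-style token
# classification. Continuation pieces (those starting with "##")
# get -100, the ignore-index used by typical cross-entropy setups.

def align_labels(word_tags, subword_tokens):
    """Assign each word's tag to its first subword; mask continuations."""
    aligned = []
    word_idx = -1
    for tok in subword_tokens:
        if tok.startswith("##"):
            aligned.append(-100)       # continuation piece: ignored in loss
        else:
            word_idx += 1              # first piece of the next word
            aligned.append(word_tags[word_idx])
    return aligned

# "playing works" tokenized as ["play", "##ing", "works"]:
print(align_labels(["VERB", "NOUN"], ["play", "##ing", "works"]))
# → ['VERB', -100, 'NOUN']
```

Variants exist (e.g. copying the word's tag to every piece), but masking continuations keeps evaluation at the word level.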
[NAACL 2022] This is the code repo for our paper "ACTUNE: Uncertainty-Aware Active Self-Training for Active Fine-Tuning of Pretrained Language Models".
A custom Turkish question answering system made by fine-tuning BERTurk.