all kinds of text classification models and more with deep learning
A TensorFlow Implementation of the Transformer: Attention Is All You Need
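Since several entries below implement the same paper, it may help to recall the operation at its core: scaled dot-product attention, Attention(Q, K, V) = softmax(QKᵀ/√d_k)V. The sketch below is plain NumPy for illustration only, not code from any of the listed repositories.

```python
# Minimal sketch of scaled dot-product attention (Vaswani et al., 2017).
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Q: (n_q, d_k), K: (n_k, d_k), V: (n_k, d_v) -> (n_q, d_v)."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)                  # query-key similarities
    scores -= scores.max(axis=-1, keepdims=True)     # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over keys
    return weights @ V                               # weighted sum of values
```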
A collection of important graph embedding, classification and representation learning papers with implementations.
Attention mechanism implementation for Keras.
Speech synthesis, voice conversion, self-supervised learning, music generation, automatic speech recognition, speaker verification, and language modeling
Graph Attention Networks (https://arxiv.org/abs/1710.10903)
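Several entries in this list implement that paper; for reference, the attention coefficients it defines are

$$e_{ij} = \mathrm{LeakyReLU}\!\left(\mathbf{a}^{\top}\left[\mathbf{W}\mathbf{h}_i \,\Vert\, \mathbf{W}\mathbf{h}_j\right]\right), \qquad \alpha_{ij} = \frac{\exp(e_{ij})}{\sum_{k \in \mathcal{N}_i}\exp(e_{ik})},$$

and each node's updated representation is the attention-weighted sum over its neighborhood, $\mathbf{h}_i' = \sigma\big(\sum_{j \in \mathcal{N}_i} \alpha_{ij}\mathbf{W}\mathbf{h}_j\big)$, optionally concatenated or averaged across multiple attention heads.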
Text classifier implementing "Hierarchical Attention Networks for Document Classification"
Sequence-to-sequence framework with a focus on Neural Machine Translation based on Apache MXNet
TensorFlow Implementation of "Show, Attend and Tell"
Visualizing RNNs using the attention mechanism
Show, Attend, and Tell | a PyTorch Tutorial to Image Captioning
Pytorch implementation of the Graph Attention Network model by Veličković et al. (2017, https://arxiv.org/abs/1710.10903)
Neural Machine Translation with Keras
A Structured Self-attentive Sentence Embedding
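In that paper's formulation, the sentence embedding is obtained by self-attention over the matrix H of LSTM hidden states (one row per token):

$$A = \mathrm{softmax}\!\left(W_{s2}\,\tanh\!\left(W_{s1} H^{\top}\right)\right), \qquad M = A H,$$

where the r rows of A attend to different aspects of the sentence, M is the resulting embedding matrix, and a penalty term $\lVert AA^{\top} - I\rVert_F^2$ encourages the attention rows to be diverse.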
Need help with retraining and cross-validation to see whether the ROUGE score exactly matches (or improves upon) the numbers reported in the paper.
I trained for 500k iterations (batch size 8) with pointer generation enabled and coverage loss disabled, then for another 100k iterations (batch size 8) with pointer generation enabled and coverage loss enabled.
It would be great if someone can help re-r
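For context, the coverage mechanism referred to in this thread is the one from See et al. (2017): a coverage vector accumulates the attention paid to each source position so far, and the coverage loss penalizes attending again to positions that are already covered,

$$c^{t} = \sum_{t'=0}^{t-1} a^{t'}, \qquad \mathrm{covloss}_t = \sum_i \min\!\left(a_i^{t},\, c_i^{t}\right),$$

which is added to the negative log-likelihood loss with a weight λ only during the final coverage-training phase.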
A chatbot for the finance and judicial domains (with a chit-chat component as well). Its main modules include information extraction, NLU, NLG, and a knowledge graph; the front-end demo is integrated with Django, and RESTful interfaces for the nlp and kg modules have already been packaged.
Action recognition using soft attention based deep recurrent neural networks
A Chainer implementation of the Transformer from "Attention Is All You Need" (Vaswani et al., 2017).
Implementation of "Knowing When to Look: Adaptive Attention via A Visual Sentinel for Image Captioning"
Code for PaperRobot: Incremental Draft Generation of Scientific Ideas
End-to-end speech recognition implementation based on TensorFlow (CTC, attention, and MTL training)
A TensorFlow implementation of a simple seq2seq-based dialogue system with embedding, attention, and beam_search; the dataset is the Cornell Movie Dialogs corpus.
A curated list of NLP resources focused on BERT, attention mechanism, Transformer networks, and transfer learning.
Attention mechanism for processing sequential data that considers the context for each timestamp.
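A minimal sketch of such a timestep-level attention layer in tf.keras is shown below; it is illustrative only, assuming an input of shape (batch, timesteps, features), and is not the linked repository's actual code.

```python
import tensorflow as tf

class AttentionPooling(tf.keras.layers.Layer):
    """Scores each timestep, softmax-normalizes the scores across time, and
    returns the attention-weighted sum of the sequence (a context vector)."""

    def build(self, input_shape):
        self.score = tf.keras.layers.Dense(1, activation="tanh")

    def call(self, inputs):                        # (batch, timesteps, features)
        scores = self.score(inputs)                # (batch, timesteps, 1)
        weights = tf.nn.softmax(scores, axis=1)    # attention over timesteps
        return tf.reduce_sum(inputs * weights, axis=1)  # (batch, features)
```

Used after a recurrent encoder, e.g. `AttentionPooling()(lstm_outputs)`, it collapses the sequence into a single vector weighted by each timestep's relevance.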
Code/Model release for NIPS 2017 paper "Attentional Pooling for Action Recognition"
Pointer-generator reinforced seq2seq summarization in PyTorch
Keras implementation of the graph attention networks (GAT) by Veličković et al. (2017; https://arxiv.org/abs/1710.10903)
A PyTorch implementation of "SimGNN: A Neural Network Approach to Fast Graph Similarity Computation" (WSDM 2019).
Sparse and structured neural attention mechanisms
I have been training moran on my own dataset and it is giving me good results, but I was not able to download the training set from BaiduPan due to some issues.
If anybody has been able to download the set, could they please upload it to Google Drive and share the link?
Thanks in advance.