- RAG: error in outputs = model(input_ids=input_ids, labels=input_dict["labels"]) (#7554, opened Oct 3, 2020 by shamanez)
- Incorrect tokenization with tokens added using tokenizer.add_tokens() (#7549, opened Oct 2, 2020 by Muks14x; 2 of 5 tasks)
- T5: forward and generate produce different results even for greedy decoding of a single token [label: t5] (#7541, opened Oct 2, 2020 by Iwontbecreative; 1 of 1 tasks)
- Trainer fails to correctly tackle XLNetForSequenceClassification outputs (#7539, opened Oct 2, 2020 by StepinSlience)
- [GPT-2] How many columns in LM model wte layer are positional embeddings? (#7529, opened Oct 1, 2020 by palooney)
- Overflow error: Can't convert negative value to unsigned it [RAG Model] (#7517, opened Oct 1, 2020 by sashank06; 2 of 2 tasks)
- [XLNet] attention_mask / input_mask - Why two `attention_mask` inputs? (#7512, opened Oct 1, 2020 by patrickvonplaten)
- [Transfo-XL] Impossible to pass `attention_mask` to model (#7511, opened Oct 1, 2020 by patrickvonplaten)
- [Reformer, Longformer, Roberta, GPT2, CTRL] attention_mask should be at second argument (#7510, opened Oct 1, 2020 by patrickvonplaten)