Hi there 👋 Welcome to my GitHub profile!
🌴 On vacation
Pinned
- huggingface/transformers: 🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
- huggingface/tokenizers: 💥 Fast State-of-the-Art Tokenizers optimized for Research and Production
1,748 contributions in the last year
Activity overview
Contributed to huggingface/transformers, huggingface/tokenizers, ArthurZucker/git-pr-bot, and 18 other repositories.
Contribution activity
January 2024
Created 3 commits in 1 repository
Reviewed 29 pull requests in 1 repository
huggingface/transformers
25 pull requests
-
[Phi] Extend implementation to use GQA/MQA.
This contribution was made on Jan 8
-
Bugfix / ffmpeg input device (mic) not working on Windows
This contribution was made on Jan 8
-
Fix `_merge_input_ids_with_image_features` for llava model
This contribution was made on Jan 8
-
Fix building alibi tensor when num_heads is not a power of 2
This contribution was made on Jan 8
-
[i18n-fr] Translate pipeline tutorial to French
This contribution was made on Jan 8
-
[AttentionMaskConverter] fix sdpa unmask unattended
This contribution was made on Jan 8
-
Enhancing Code Readability and Maintainability with Simplified Activation Function Selection.
This contribution was made on Jan 8
-
Support : Adding Support for LlamaForQuestionAnswering class
This contribution was made on Jan 8
-
Fix pos_mask application and update tests accordingly
This contribution was made on Jan 5
-
fix FA2 when using quantization for remaining models
This contribution was made on Jan 5
-
Don't check the device when device_map=auto
This contribution was made on Jan 5
-
Use mmap option to load_state_dict
This contribution was made on Jan 5
-
[Phi2] Add support for phi2 models
This contribution was made on Jan 5
-
don't initialize the output embeddings if we're going to tie them to input embeddings
This contribution was made on Jan 5
-
enable training mask2former and maskformer for transformers trainer
This contribution was made on Jan 4
-
Enable instantiating model with pretrained backbone weights
This contribution was made on Jan 4
-
[Core generation] Adds support for static KV cache
This contribution was made on Jan 4
-
Remove token_type_ids from model_input_names (like #24788)
This contribution was made on Jan 3
-
[Refactor Attention mask handling] Moves attention mask processing to the Attention class
This contribution was made on Jan 3
-
Enhancing Code Readability and Maintainability with Simplified Activation Function Selection.
This contribution was made on Jan 3
-
Fix initialization for missing parameters in `from_pretrained` under ZeRO-3
This contribution was made on Jan 3
-
[Awq] Add llava fused modules support
This contribution was made on Jan 3
-
README: install transformers from conda-forge channel
This contribution was made on Jan 3
-
Bump tj-actions/changed-files from 22.2 to 41 in /.github/workflows
This contribution was made on Jan 3
-
Add new meta w2v2-conformer BERT-like model
This contribution was made on Jan 3
- Some pull request reviews not shown.