Issues: microsoft/DeepSpeed
#2255 · TypeError: getattr(): attribute name must be string (bug) · opened Aug 24, 2022 by henrydylan
#2251 · [BUG] GPT2 inference example problems from deepspeed version 0.6.2 to 0.7.0 (bug) · opened Aug 23, 2022 by appleeji
#2244 · [BUG] /tmp/torch_extensions directory created without global write permission (bug) · opened Aug 20, 2022 by skiingpacman
#2243 · [BUG] DeepSpeed non-deterministic inference with HF GPT2 when replace_with_kernel_inject=True (bug) · opened Aug 19, 2022 by trianxy
#2240 · [BUG] Wrong time unit in flops profiler (bug) · opened Aug 19, 2022 by yzs981130
#2239 · ZeroQuant not compressing and making BERT slower (bug) · opened Aug 19, 2022 by K2triinK
#2234 · [REQUEST] Support profiling backward using Flops Profiler as a standalone package (enhancement) · opened Aug 18, 2022 by yzs981130
#2233 · [BUG][master branch] garbage GPTJ output for multi-gpu inference (bug, inference) · opened Aug 18, 2022 by mallorbc
#2231 · [BUG] Patch and reload tensor methods in flops_profiler (bug) · opened Aug 17, 2022 by pyf98
#2230 · [BUG] Inference predictions dont match Huggingface for GPT-J (bug, inference) · opened Aug 17, 2022 by rahul003
#2229 · [BUG] Different outputs by original model and inference engine (bug, inference) · opened Aug 17, 2022 by reymondzzzz
#2227 · [BUG] High VRAM Usage For Inference, Torch Dtype Doesn't Matter (bug, inference) · opened Aug 17, 2022 by mallorbc
#2222 · [BUG] DS Inference Bloom OOM / get_sd_loader_json() missing 1 argument (bug, inference) · opened Aug 16, 2022 by oborchers
#2209 · [REQUEST] Example of H5 dataloader based training on azure VM for multi-node (enhancement) · opened Aug 10, 2022 by vishalghor
#2203 · [BUG] Sample alexnet example for flops profiler does not work. (bug) · opened Aug 9, 2022 by LM-AuroTripathy
#2200 · InferenceEngine problem with data parrallel [BUG] (bug, inference) · opened Aug 9, 2022 by SihengLi99
#2199 · [BUG] Spend lots of time on loading model with zero_optimization stage=3 (bug) · opened Aug 9, 2022 by diruoshui
#2194 · [BUG] deepspeed-inference seems not working correctly with torch.half on Pascal GPU (bug, inference) · opened Aug 8, 2022 by wkkautas
#2187 · [BUG] RuntimeError: Error building extension 'utils' (ninja related?) (bug) · opened Aug 5, 2022 by josephrocca
#2183 · [BUG] Failed to inference Megatron gpt-3 MoE model with deepspeed.init_inference (bug, inference) · opened Aug 4, 2022 by Gabriel4256
#2168 · [BUG] AssertionError: Distributed backend is not initialized. (bug) · opened Aug 2, 2022 by chinoll