| Original author(s) | Microsoft Research |
|---|---|
| Developer(s) | Microsoft |
| Initial release | May 18, 2020 |
| Stable release | v0.14.4 / June 21, 2024 |
| Repository | github.com/microsoft/DeepSpeed |
| Written in | Python, CUDA, C++ |
| Type | Software library |
| License | Apache License 2.0 |
| Website | deepspeed.ai |
DeepSpeed is an open-source deep learning optimization library for PyTorch.[1] The library is designed to reduce computing power and memory use and to train large distributed models with better parallelism on existing computer hardware.[2][3] DeepSpeed is optimized for low-latency, high-throughput training. It includes the Zero Redundancy Optimizer (ZeRO) for training models with 1 trillion or more parameters.[4] Features include mixed-precision training; single-GPU, multi-GPU, and multi-node training; and custom model parallelism. The DeepSpeed source code is licensed under the Apache License 2.0 and is available on GitHub.[5]
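DeepSpeed training runs are driven by a JSON configuration that selects features such as mixed precision and a ZeRO stage. The dictionary below is an illustrative sketch only, not a tuned recipe from the official documentation; the specific values are assumptions.

```python
# Sketch of a DeepSpeed-style configuration (illustrative values, not a
# tuned recipe). "fp16" enables mixed-precision training and
# "zero_optimization" selects a ZeRO stage.
ds_config = {
    "train_batch_size": 32,
    "fp16": {"enabled": True},
    "zero_optimization": {"stage": 2},
}
print(ds_config["zero_optimization"]["stage"])  # → 2
```

In practice such a config is passed to the library (for example via a config file on the `deepspeed` launcher command line) rather than constructed inline.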
The team claimed to achieve up to a 6.2x throughput improvement, 2.8x faster convergence, and 4.6x less communication.[6]
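The core idea behind ZeRO can be illustrated without DeepSpeed itself: in plain data parallelism every worker keeps a full replica of the optimizer state, whereas ZeRO shards that state across workers so each holds only about 1/N of it. The function below is a minimal sketch of that partitioning idea (names and the contiguous-shard scheme are assumptions for illustration, not DeepSpeed's actual implementation).

```python
def shard_optimizer_state(num_params, world_size):
    """Split parameter indices into contiguous shards, one per worker.

    Illustrates the ZeRO stage-1 idea: each data-parallel rank owns
    (and updates) only its shard of the optimizer state, instead of
    replicating the full state on every rank.
    """
    base, extra = divmod(num_params, world_size)
    shards, start = [], 0
    for rank in range(world_size):
        size = base + (1 if rank < extra else 0)  # spread the remainder
        shards.append(range(start, start + size))
        start += size
    return shards

# With an Adam-style optimizer each parameter carries extra state
# (momentum and variance), so per-worker optimizer memory drops from
# O(P) to roughly O(P / world_size).
shards = shard_optimizer_state(num_params=10, world_size=4)
print([len(s) for s in shards])  # → [3, 3, 2, 2]
```

After each step, ranks would exchange their updated shards (e.g. via an all-gather) so every replica sees the full set of updated parameters.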
References
[edit]- ^ "Microsoft Updates Windows, Azure Tools with an Eye on The Future". PCMag UK. May 22, 2020.
- ^ Yegulalp, Serdar (February 10, 2020). "Microsoft speeds up PyTorch with DeepSpeed". InfoWorld.
- ^ "Microsoft unveils "fifth most powerful" supercomputer in the world". Neowin. 18 June 2023.
- ^ "Microsoft trains world's largest Transformer language model". February 10, 2020.
- ^ "microsoft/DeepSpeed". July 10, 2020 – via GitHub.
- ^ "DeepSpeed: Accelerating large-scale model inference and training via system optimizations and compression". Microsoft Research. 2021-05-24. Retrieved 2021-06-19.
Further reading
- Rajbhandari, Samyam; Rasley, Jeff; Ruwase, Olatunji; He, Yuxiong (2019). "ZeRO: Memory Optimization Towards Training A Trillion Parameter Models". arXiv:1910.02054 [cs.LG].
External links
- AI at Scale - Microsoft Research
- GitHub - microsoft/DeepSpeed
- ZeRO & DeepSpeed: New system optimizations enable training models with over 100 billion parameters - Microsoft Research