Post list: DLT (1)
Happy to visit my research note ^^
(Paper of Interest) Shade: Enable Fundamental Cacheability for Distributed Deep Learning Training
Redwan Ibne Seraj Khan and Ahmad Hossein Yazdani, Virginia Tech; Yuqi Fu, University of Virginia; Arnab K. Paul, BITS Pilani; Bo Ji and Xun Jian, Virginia Tech; Yue Cheng, University of Virginia; Ali R. Butt, Virginia Tech
21st USENIX Conference on File and Storage Technologies (FAST '23), February 21–23, 2023, Santa Clara, CA, USA. USENIX Association.
Abstract (preview): Deep learning training (DLT) applications... stor..
Papers / Papers of Interest
2023. 3. 21. 14:52