Clearing CUDA memory on Kaggle
Jan 9, 2021
Sometimes when running a PyTorch model on a Kaggle GPU you get the error "RuntimeError: CUDA out of memory. Tried to allocate …".
Clear the cached memory with:
import torch
torch.cuda.empty_cache()
Note that this only returns cached blocks that are no longer referenced; tensors still held by Python variables must be deleted (and garbage-collected) first.
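A minimal sketch of the full sequence, assuming `model` and `batch` stand in for whatever objects hold your GPU tensors (illustrative names, not part of any API):

```python
import gc
import torch

# Hypothetical objects occupying GPU (or CPU) memory.
model = torch.nn.Linear(4, 2)
batch = torch.randn(8, 4)

del model, batch             # drop the Python references to the tensors
gc.collect()                 # collect anything that is now unreachable
if torch.cuda.is_available():
    torch.cuda.empty_cache() # hand cached, unused blocks back to the driver
```

The `is_available()` guard makes the snippet safe to run on a CPU-only session as well.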
Check CUDA memory usage
!pip install GPUtil
from GPUtil import showUtilization as gpu_usage
gpu_usage()
Output
| ID | GPU | MEM |
------------------
| 0 | 0% | 0% |
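If you would rather avoid an extra dependency, PyTorch exposes its own per-device counters; a small sketch (values are in bytes):

```python
import torch

# PyTorch's built-in memory counters, an alternative to GPUtil.
if torch.cuda.is_available():
    allocated = torch.cuda.memory_allocated()  # bytes held by live tensors
    reserved = torch.cuda.memory_reserved()    # bytes kept by the caching allocator
    print(f"allocated: {allocated / 1e6:.1f} MB, reserved: {reserved / 1e6:.1f} MB")
else:
    print("CUDA not available")
```

`memory_reserved()` is usually larger than `memory_allocated()`, since the allocator caches freed blocks; `empty_cache()` shrinks the reserved figure, not the allocated one.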