Hi,
I built an interactive container using
SUBMISSION_TRACK=fincrime CPU_or_GPU=gpu make interact-container
Then, in the container, when I run the following script:
import torch
X = torch.tensor([1]).to("cuda")
I get the following error:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/opt/conda/envs/condaenv/lib/python3.9/site-packages/torch/cuda/__init__.py", line 211, in _lazy_init
raise AssertionError("Torch not compiled with CUDA enabled")
AssertionError: Torch not compiled with CUDA enabled
I exported the condaenv to a .yml file and built a new environment directly on my local machine, and the same error happens there. My machine does have a GPU with CUDA 11.7, and the GPU has worked previously. May I know how I can get CUDA/the GPU working in the container? Thank you!
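I suspect the torch package in the environment may be a CPU-only build. A minimal way to check that (just a sketch, assuming the error comes from the torch build rather than a driver issue) would be:

import torch

# A CPU-only PyTorch build reports no CUDA version, which would match the
# "Torch not compiled with CUDA enabled" error above.
print(torch.__version__)          # pip CPU-only wheels often carry a "+cpu" suffix here
print(torch.version.cuda)         # None for a CPU-only build, e.g. "11.7" otherwise
print(torch.cuda.is_available())  # False if the build lacks CUDA or no GPU is visible

If torch.version.cuda comes back as None, I assume the fix is getting a CUDA-enabled PyTorch build (matching CUDA 11.7) into the container environment, but I'm not sure how to do that with the provided image.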