vLLM + CUDA mismatch

Just a quick note after hitting this error while running a smoke test:

ImportError: undefined symbol: _ZN3c104cuda9SetDeviceEi
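The symbol is an Itanium-ABI mangled C++ name, and you can read it by hand. As a sketch, here is a tiny decoder that handles only simple nested names like this one (for anything real, use `c++filt`):

```python
# Minimal Itanium-ABI reader for simple nested names (_ZN...E + arg codes).
# Not a general demangler -- just enough to decode this one symbol.

def demangle_nested(sym: str) -> str:
    assert sym.startswith("_ZN"), "only handles nested names (_ZN...E)"
    i, parts = 3, []
    while sym[i] != "E":
        n = 0
        while sym[i].isdigit():          # each segment is length-prefixed
            n = n * 10 + int(sym[i])
            i += 1
        parts.append(sym[i : i + n])
        i += n
    # after the closing 'E' come parameter type codes; 'i' encodes int
    codes = {"i": "int", "v": "void", "l": "long", "b": "bool"}
    args = ", ".join(codes.get(c, c) for c in sym[i + 1 :])
    return "::".join(parts) + f"({args})"

print(demangle_nested("_ZN3c104cuda9SetDeviceEi"))
# -> c10::cuda::SetDevice(int)
```

So the loader is looking for `c10::cuda::SetDevice(int)`, an internal PyTorch C++ API, and not finding it in the installed libtorch.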

`c10` is PyTorch's C++ core library, so this means vLLM's compiled extensions were built against a different PyTorch/CUDA version than the one actually installed, and the ABI no longer lines up. Check that the vLLM, PyTorch, and CUDA versions all match.
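A quick way to check is to print the versions that have to agree. This sketch imports each package defensively so it runs even where one of them is missing:

```python
# Report the versions that must be mutually compatible:
# vLLM, PyTorch, and the CUDA version PyTorch was built against.
import importlib

def report_versions(names=("torch", "vllm")):
    out = {}
    for name in names:
        try:
            mod = importlib.import_module(name)
            out[name] = getattr(mod, "__version__", "unknown")
            if name == "torch":
                # CUDA version this PyTorch build targets (None on CPU-only builds)
                out["torch_cuda"] = mod.version.cuda
        except ImportError:
            out[name] = "not installed"
    return out

for key, val in report_versions().items():
    print(f"{key}: {val}")
```

Compare the reported `torch_cuda` against the driver/toolkit on the machine (`nvidia-smi`), and make sure the vLLM wheel was built for that same PyTorch release; if not, reinstalling vLLM against the installed PyTorch resolves the undefined-symbol error.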