-
For me (NVIDIA GeForce RTX 4090, 24 GB), ComfyUI v0.3.14 + Python v3.12.7 + PyTorch 2.6.0+cu126 is the best combination so far.
-
The PyTorch packages bundle CUDA support as fat binaries compiled for the different GPU architectures. I'm not sure whether PyTorch's Python code (or a project like ComfyUI) explicitly takes advantage of extra features on newer GPUs and compute capabilities, but you do get compiled CUDA kernels and PTX tailored to those architectures, so those should at least be optimized. No clue how much of an improvement it'd be; you'd have to benchmark and compare.
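If you want to see exactly what your installed wheel ships and what your GPU reports, a quick check like this (a minimal sketch, assuming a CUDA-enabled PyTorch install) prints the build's CUDA version, the compiled sm_xx targets, and the card's compute capability:

```python
# Inspect what the installed PyTorch wheel was built with
# (assumes a CUDA-enabled PyTorch install).
import torch

print("PyTorch:", torch.__version__)            # e.g. 2.6.0+cu126
print("CUDA build:", torch.version.cuda)        # CUDA toolkit the wheel was compiled against
print("Compiled targets:", torch.cuda.get_arch_list())  # sm_xx architectures in the fat binary

if torch.cuda.is_available():
    major, minor = torch.cuda.get_device_capability(0)
    print("GPU:", torch.cuda.get_device_name(0))
    print(f"Compute capability: sm_{major}{minor}")  # an RTX 4090 reports sm_89 (Ada Lovelace)
```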
-
I'm using the CUDA 11.8 (cu118) build, and I'm curious whether upgrading to cu126 will show better performance in ComfyUI.
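A rough way to compare the two builds is to time the same operation before and after upgrading. The sketch below is only a raw fp16 matmul timing (the matrix size and iteration count are arbitrary), not a real ComfyUI workflow:

```python
# Rough timing sketch: run on the cu118 build, upgrade to cu126, run again.
# This only measures a raw fp16 matmul, so treat it as a sanity check.
import torch

def time_matmul(n=8192, iters=50, dtype=torch.float16):
    a = torch.randn(n, n, device="cuda", dtype=dtype)
    b = torch.randn(n, n, device="cuda", dtype=dtype)
    # Warm-up so cuBLAS heuristics and kernel caches don't skew the timing
    for _ in range(5):
        a @ b
    torch.cuda.synchronize()
    start = torch.cuda.Event(enable_timing=True)
    end = torch.cuda.Event(enable_timing=True)
    start.record()
    for _ in range(iters):
        a @ b
    end.record()
    torch.cuda.synchronize()
    return start.elapsed_time(end) / iters  # milliseconds per matmul

print(f"{time_matmul():.2f} ms per 8192x8192 fp16 matmul")
```

Running your usual ComfyUI workflow with the same seed and settings on both builds is the more meaningful comparison, since the model's own kernels dominate the runtime.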