support for RTX 4090 graphics card?

#8
by ArbitrationCity - opened

Hi, I tried running the tts server (v.0.2 and v.0.1) locally using an RTX 4090 graphics card, but I am getting a PyTorch error related to the GPU architecture. I believe the RTX 4090 (Ada Lovelace) needs CUDA 11.8 or later, and a corresponding PyTorch version. Is there anything I can do to make it work?

Error: I0716 18:25:17.133145 1 libtorch.cc:2076] TRITONBACKEND_ModelInstanceInitialize: tts_decoder (GPU device 0)
[W cuda_graph_fuser.h:17] Warning: RegisterCudaFuseGraph::registerPass() is deprecated. Please use torch::jit::fuser::cuda::setEnabled(). (function registerPass)
I0716 18:25:17.817603 1 model_lifecycle.cc:693] successfully loaded 'tts_decoder' version 1
[INFO:/opt/balacoon_tts/build_server_on_docker/_deps/balacoon_neural-src/src/lib/triton_metrics_service.cc:128] 0.0.0.0:8002: metrics server
[W cuda_graph_fuser.h:17] Warning: RegisterCudaFuseGraph::registerPass() is deprecated. Please use torch::jit::fuser::cuda::setEnabled(). (function registerPass)
[W cuda_graph_fuser.h:17] Warning: RegisterCudaFuseGraph::registerPass() is deprecated. Please use torch::jit::fuser::cuda::setEnabled(). (function registerPass)
terminate called after throwing an instance of 'std::runtime_error'
what(): async response status: Internal - PyTorch execute failure: nvrtc: error: invalid value for --gpu-architecture (-arch)

nvrtc compilation failed:

#define NAN __int_as_float(0x7fffffff)
#define POS_INFINITY __int_as_float(0x7f800000)
#define NEG_INFINITY __int_as_float(0xff800000)
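The nvrtc error above usually means the bundled PyTorch/CUDA toolchain does not know the GPU's compute architecture (Ada Lovelace is `sm_89`). As a quick diagnostic, here is a hedged sketch of how one might check whether a PyTorch build supports a given GPU; the helper function and sample arch lists are illustrative, but in a live environment the inputs would come from `torch.cuda.get_arch_list()` and `torch.cuda.get_device_capability()`:

```python
def supports_capability(arch_list, capability):
    """Return True if a PyTorch build's compiled arch list covers the
    GPU's compute capability (e.g. (8, 9) for the RTX 4090 -> 'sm_89')."""
    major, minor = capability
    return f"sm_{major}{minor}" in arch_list


# Illustrative arch lists; a real check would use torch.cuda.get_arch_list().
cuda117_build = ["sm_52", "sm_60", "sm_70", "sm_75", "sm_80", "sm_86"]
cuda118_build = ["sm_52", "sm_60", "sm_70", "sm_75", "sm_80", "sm_86", "sm_89", "sm_90"]

rtx4090 = (8, 9)  # compute capability of Ada Lovelace

print(supports_capability(cuda117_build, rtx4090))  # False -> nvrtc arch error
print(supports_capability(cuda118_build, rtx4090))  # True
```

If the live check returns False, the fix is a newer PyTorch/CUDA stack rather than anything in the model code.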

balacoon org

Hi, indeed the service is based on triton:22.08 (https://docs.nvidia.com/deeplearning/triton-inference-server/release-notes/rel_22-08.html), which ships with CUDA 11.7.1. As far as I can see, the 4090 needs CUDA 11.8, so I need to rebuild my Docker image with a newer Triton. I will do it once I have a bit more time on my hands. Reach out to clement@balacoon.com or join our Slack https://join.slack.com/t/balacoon/shared_invite/zt-1syqpvq75-s7iCBJhZcQrsmrLrAU3fhw for more prompt communication.
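The incompatibility here comes down to a minimum CUDA version: the container ships 11.7.1 but the 4090 needs 11.8. A small, hedged sketch of that version gate (the function name and version strings are illustrative; a real check might parse `torch.version.cuda` or the Triton container's release notes):

```python
def cuda_at_least(version_str, minimum):
    """Return True if a CUDA version string like '11.7' meets the
    (major, minor) minimum required by the GPU architecture."""
    major, minor = (int(part) for part in version_str.split(".")[:2])
    return (major, minor) >= minimum


ADA_LOVELACE_MIN = (11, 8)  # first CUDA release with sm_89 support

print(cuda_at_least("11.7", ADA_LOVELACE_MIN))  # False -> triton:22.08 image
print(cuda_at_least("11.8", ADA_LOVELACE_MIN))  # True  -> a newer Triton base
print(cuda_at_least("12.1", ADA_LOVELACE_MIN))  # True
```

In practice this means rebasing the server's Docker image on a Triton release whose bundled CUDA passes this check.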

clementruhm changed discussion status to closed
