When the input string is long, the API call fails. What is the maximum token length this model supports when called through the Inference API?