Is there any way to run this in Colab on a T4 GPU?
#27 opened by bloomedout
I keep trying to run the given code, but it runs on RAM instead of the GPU, even after changing the runtime type and restarting.
@bloomedout
No, this is simply too large to fit in a free Colab instance. If you are using a paid instance, you should call `pipe.to('cuda')`.
If you do not want to use a paid instance, use Kaggle and it should fit.
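A minimal sketch of what that looks like, assuming a diffusers-style pipeline; the model ID below is a placeholder, not the actual repo from this thread:

```python
import torch
from diffusers import DiffusionPipeline

# Placeholder model ID -- substitute the model this discussion is about.
pipe = DiffusionPipeline.from_pretrained(
    "model-id-here",
    torch_dtype=torch.float16,  # half precision reduces VRAM usage on a T4
)
pipe = pipe.to("cuda")  # move the weights onto the GPU instead of system RAM

# Quick check that the pipeline actually landed on the GPU
print(pipe.device)  # expected: cuda:0
```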