Using a GPU for inference
#2 opened by woofadu
How do you load the model onto a GPU if there is no `device` or `device_map` parameter for the MegatronBert model type?
You can use the `CUDA_VISIBLE_DEVICES` environment variable to control which GPU the process sees. Here is an example repository that uses this approach: https://github.com/uf-hobi-informatics-lab/ClinicalTransformerNER
```shell
# set GPU
export CUDA_VISIBLE_DEVICES=0
```
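Besides restricting visible GPUs via the environment, any model loaded with `from_pretrained` is a regular PyTorch `nn.Module`, so even without a `device` or `device_map` argument you can move it with `.to(device)`. A minimal sketch of the pattern, using a stand-in module instead of an actual MegatronBert checkpoint (replace it with your `from_pretrained` call in practice):

```python
import torch
import torch.nn as nn

# Pick the GPU if one is available, otherwise fall back to CPU.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Stand-in for the loaded model; in practice this would be e.g.
# MegatronBertModel.from_pretrained("<your-checkpoint>").
model = nn.Linear(4, 2)
model.to(device)  # moves all parameters/buffers to the chosen device

# Inputs must live on the same device as the model.
x = torch.randn(1, 4).to(device)
out = model(x)
print(out.shape, out.device.type)
```

With `CUDA_VISIBLE_DEVICES=0` set, `cuda` inside the process refers to that single visible GPU, so `.to("cuda")` and the environment-variable approach compose cleanly.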