GSK-2498-suggest-a-dataset-for-model
#46
by ZeroCommand - opened
Small comments:
- If the user has already chosen a dataset, it will be overwritten.
- If the model is not present in the leaderboard, no dataset is recommended.
ZeroCommand changed pull request status to open
LGTM, thanks!
Just one small request: can we remove
run_local = gr.Checkbox(value=False, label="Run Locally with Pipeline [Slow]")
and instead add a tip under the Inference API checkbox saying something like "running locally will be slow and occupy this Space's computing resources"?
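For illustration, a minimal sketch of how that tip could be attached to the Inference API checkbox via Gradio's `info` argument (the label and wording below are assumptions, not the actual code in this PR):

```python
import gradio as gr

# Hypothetical sketch: a single "Run with Inference API" checkbox whose
# `info` text carries the warning about the slow local-pipeline fallback,
# replacing the separate "Run Locally with Pipeline [Slow]" checkbox.
use_inference_api = gr.Checkbox(
    value=True,
    label="Run with Inference API",
    info=(
        "If unchecked or no HF token is provided, the model runs locally with a "
        "pipeline, which is slow and occupies this Space's computing resources."
    ),
)
```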
Are we completely removing the option to run without the Inference API (meaning we no longer support models that don't have an Inference API)?
The suggested wording is good!
No, we can simply run the text-classification pipeline model whenever "Run with Inference API" is not selected or no HF token is provided for the Inference API.
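A minimal sketch of that fallback, assuming a hypothetical `predict` helper (the function name and signature are placeholders; only `transformers.pipeline` and `InferenceClient.text_classification` are real APIs):

```python
from transformers import pipeline
from huggingface_hub import InferenceClient

def predict(model_id: str, text: str, use_inference_api: bool, hf_token: str | None):
    # Hypothetical helper: use the Inference API when it is selected and a
    # token is available; otherwise fall back to a local pipeline, which is
    # slower and uses this Space's compute.
    if use_inference_api and hf_token:
        client = InferenceClient(token=hf_token)
        return client.text_classification(text, model=model_id)
    classifier = pipeline("text-classification", model=model_id)
    return classifier(text)
```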
inoki-giskard changed pull request status to merged