Is this space supposed to work in tandem with Lighteval? #5
by sadra-barikbin - opened
Hi there!
Thanks for this great space. Is it supposed to work with the format of the results that Lighteval pushes to the Hub? I ask because `read_evals.py::EvalResults::init_from_json_file()` expects a `config` entry in the result JSON file, which is missing there; instead, the file has a `config_general` entry.
Hi!
Not necessarily: it can work with lighteval, the harness, or any other eval script you'd want, in which case you need to adapt the space to fit your use case.
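One way to adapt the space for lighteval's output could be a small fallback when reading the config block. This is a minimal sketch, not the space's actual code: `load_eval_config` is a hypothetical helper, and the only assumption taken from the discussion is that lighteval writes `config_general` where the space expects `config`.

```python
import json

def load_eval_config(path: str) -> dict:
    """Return the config block from an eval result JSON file.

    Hypothetical helper: prefers the `config` key the space expects,
    falling back to the `config_general` key that lighteval writes.
    """
    with open(path) as f:
        data = json.load(f)
    config = data.get("config") or data.get("config_general")
    if config is None:
        raise KeyError(f"No 'config' or 'config_general' entry in {path}")
    return config
```

A change along these lines inside `init_from_json_file()` would let the same space ingest result files from either source without renaming keys upstream.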
clefourrier changed discussion status to closed