
The models have been hacked together because their base weights share a similar architecture. For now, however, the Pythia inference code only generates gibberish, and the MPT-based inference code fails with errors before it can run.

I'm currently trying to adapt the MPT-7b Storywriter 65k inference code to work with this new model merge. I'd appreciate tips if anyone tries their hand at it.
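
For reference, here is a minimal sketch of the standard MPT-style loading path I'm adapting, assuming the merged checkpoint lives at a local placeholder path (`./merged-model`) and that its config still exposes MPT's `max_seq_len` field. This is not working code for this merge, just the pattern being adapted:

```python
import torch
import transformers

# Placeholder path to the merged checkpoint -- substitute your own local path or repo id.
model_path = "./merged-model"

# MPT checkpoints ship custom modeling code, so trust_remote_code is required.
config = transformers.AutoConfig.from_pretrained(model_path, trust_remote_code=True)
config.max_seq_len = 65536  # Storywriter-style long context; lower this if you hit OOM.

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_path,
    config=config,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# MPT-7b Storywriter uses the GPT-NeoX tokenizer.
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

inputs = tokenizer("Once upon a time", return_tensors="pt")
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```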

This model is not functional as is.
