
WebLINX: Real-World Website Navigation with Multi-Turn Dialogue

Xing Han Lù*, Zdeněk Kasner*, Siva Reddy

Quickstart

from datasets import load_dataset
from huggingface_hub import snapshot_download
from transformers import pipeline

# Load validation split
valid = load_dataset("McGill-NLP/weblinx", split="validation")

# Download and load the templates
snapshot_download(
    "McGill-NLP/WebLINX", repo_type="dataset", allow_patterns="templates/*.txt", local_dir="./"
)
with open('templates/llama.txt') as f:
    template = f.read()

# Select a turn from the validation split and fill in the prompt template
turn = valid[0]
turn_text = template.format(**turn)

# Load the action model and generate a prediction for the formatted turn
action_model = pipeline(
    model="McGill-NLP/Sheared-LLaMA-1.3B-weblinx", device=0, torch_dtype='auto'
)
out = action_model(turn_text, return_full_text=False, max_new_tokens=64, truncation=True)
pred = out[0]['generated_text']

print("Ref:", turn["action"])
print("Pred:", pred)

Original Model

This model is finetuned on WebLINX starting from a Sheared-LLaMA-1.3B checkpoint previously published on the Hugging Face Hub. Refer to the original model's page on the Hub for details about the base checkpoint.
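
Both the base checkpoint and the finetuned weights are ordinary Transformers causal language models, so they can also be loaded directly rather than through the pipeline helper. The snippet below is a minimal sketch, not part of the original card; it assumes turn_text has been built as in the quickstart.

# Minimal sketch: load the finetuned checkpoint directly with AutoModelForCausalLM
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "McGill-NLP/Sheared-LLaMA-1.3B-weblinx"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")
model = model.to("cuda" if torch.cuda.is_available() else "cpu")

# turn_text is the formatted prompt built in the quickstart above
inputs = tokenizer(turn_text, return_tensors="pt", truncation=True).to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=64)
pred = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print("Pred:", pred)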

License

This model is derived from LLaMA 2, which can only be used with the LLaMA 2 Community License Agreement. By using or distributing any portion or element of this model, you agree to be bound by this agreement.
