GraphGPT
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm.
Model Details
GraphGPT is a graph-oriented Large Language Model tuned with the Graph Instruction Tuning paradigm, built on the Vicuna-7B-v1.5 model.
- Developed by: Data Intelligence Lab@HKU
- Model type: An auto-regressive language model based on the transformer architecture.
- Finetuned from model: Vicuna-7B-v1.5 model.
Model Sources
- Repository: https://github.com/HKUDS/GraphGPT
- Paper:
- Project: https://graphgpt.github.io/
Uses
This version of GraphGPT is tuned on mixed instruction data, enabling it to handle both node classification and link prediction across different graph datasets.
How to Get Started with the Model
- Command line interface: Please refer to https://github.com/HKUDS/GraphGPT to evaluate GraphGPT; a minimal loading sketch is shown after this list.
- A Gradio demo is under development.
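The sketch below shows one way the Vicuna-7B-v1.5 language-model backbone could be loaded with the Hugging Face transformers library. The checkpoint path is a placeholder, not an official repository id, and the full graph-instruction pipeline (graph encoder, projector, and evaluation scripts) is provided only in the GitHub repository; this snippet covers text-only generation as an illustration.

```python
# Minimal sketch: loading the language-model backbone with transformers.
# The checkpoint path is a placeholder; substitute the released GraphGPT weights.
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

checkpoint = "path/to/GraphGPT-checkpoint"  # placeholder, not an official repo id

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(
    checkpoint,
    torch_dtype=torch.float16,  # the Vicuna-7B-v1.5 backbone fits on a single GPU in fp16
    device_map="auto",
)

# Plain text generation only; graph-aware prompts require the repo's graph tokens and encoder.
prompt = "Given the following citation graph context, classify the target node."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

For graph tasks such as node classification and link prediction, use the evaluation scripts in the GitHub repository, which wire the graph encoder outputs into the prompt.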