Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the financial domain

For Chinese natural language processing in specific domains, we provide a Chinese DKPLM (Decomposable Knowledge-enhanced Pre-trained Language Model) for the financial domain, named pai-dkplm-financial-base-zh, based on our AAAI 2022 paper DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding.

This repository is built on the EasyNLP framework (https://github.com/alibaba/EasyNLP), developed by the Alibaba PAI team.
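Below is a minimal sketch of loading the checkpoint with Hugging Face Transformers, assuming the model is published on the Hub and follows the standard BERT/masked-LM format (DKPLM's decomposable design means no knowledge-graph inputs are required at inference time). The Hub ID used here is an assumption for illustration; substitute the actual repository path.

# Minimal usage sketch (assumptions: Hub ID and masked-LM head are illustrative).
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "alibaba-pai/pai-dkplm-financial-base-zh"  # assumed Hub ID for this card's model

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Encode a Chinese financial-domain sentence ("The company released its latest quarterly report.")
# and run a forward pass.
inputs = tokenizer("该公司发布了最新的季度财报。", return_tensors="pt")
outputs = model(**inputs)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)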

Citation

If you find this resource useful, please cite the following papers in your work.

  • For the EasyNLP framework:
@article{easynlp,
  title = {EasyNLP: A Comprehensive and Easy-to-use Toolkit for Natural Language Processing},
  author = {Wang, Chengyu and Qiu, Minghui and Zhang, Taolin and Liu, Tingting and Li, Lei and Wang, Jianing and Wang, Ming and Huang, Jun and Lin, Wei},
  url = {https://arxiv.org/abs/2205.00258},
  publisher = {arXiv},
  year = {2022}
}
  • For DKPLM:
@article{dkplm, 
  title = {DKPLM: Decomposable Knowledge-enhanced Pre-trained Language Model for Natural Language Understanding}, 
  author = {Zhang, Taolin and Wang, Chengyu and Hu, Nan and Qiu, Minghui and Tang, Chengguang and He, Xiaofeng and Huang, Jun}, 
  url = {https://arxiv.org/abs/2112.01047},
  publisher = {arXiv}, 
  year = {2021} 
} 