
Dr. Niko (70B)

This repository contains the full merged model.


Dr-Niko is a medical large language model built to assist medical professionals, researchers, and students: it can answer medical and scientific questions, summarize research papers and clinical notes, draft medical reports and documentation, and support (never replace) clinical decision-making, always with appropriate disclaimers.

Model Details

  • Model Name: Dr-Niko
  • Model Type: Medical Large Language Model (LLM)
  • Model Size: 70 billion parameters (base model: Miqu-70B)
  • Training Data: A curated dataset of high-quality medical and scientific literature.
  • Fine-Tuning Approach: Supervised fine-tuning of the base model on this dataset with LLaMA-Factory for 1.5 epochs.
  • Intended Use: The Dr-Niko model is designed to assist medical professionals, researchers, and students with a wide range of tasks, including the following (a minimal usage sketch follows this list):
    • Answering medical and scientific questions
    • Summarizing research papers and clinical notes (in a supporting role)
    • Generating medical reports and documentation (in a supporting role)
    • Providing medical advice and recommendations (with appropriate disclaimers)
    • Assisting with medical decision-making and diagnosis (in a supporting role)
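
The sketch below shows one way the model might be loaded and queried with Hugging Face transformers. It is a minimal, unofficial example: the repository id nmitchko/dr-niko-70B, the Mistral/Miqu-style [INST] prompt format, and the sampling settings are assumptions, and the BF16 weights need roughly 140 GB of GPU memory spread across devices.

    # Minimal usage sketch (not an official example). The repo id and the
    # [INST] prompt template are assumptions; verify both against the repository.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    MODEL_ID = "nmitchko/dr-niko-70B"  # assumed repository id

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype=torch.bfloat16,  # BF16 weights: roughly 140 GB of GPU memory
        device_map="auto",           # requires the accelerate package
    )

    question = (
        "Summarize the first-line management of community-acquired pneumonia "
        "in an otherwise healthy adult."
    )
    # Assumed prompt format; prefer tokenizer.apply_chat_template if the repo defines one.
    prompt = f"[INST] {question} [/INST]"

    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    outputs = model.generate(**inputs, max_new_tokens=512, do_sample=True, temperature=0.7)
    # Strip the prompt tokens and print only the generated answer.
    print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))

As with every intended use listed above, outputs should be treated as drafts for review by a qualified professional, not as medical advice.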

Model Description

This model is the next in a series of medical fine-tuning efforts, following medfalcon and medguanaco.

  • Developed by: Nick Mitchko
  • Funded by: My Bank Account
  • Shared by: My Internet
  • Model type: LLaMa-70B variant
  • Language(s) (NLP): English
  • License: NOMERGE (see the license section below)
  • Finetuned from model: Miqu-70B

Examples

Screenshots of example prompts and model responses.
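
For experimenting with similar prompts on more modest hardware, a hedged option is 4-bit quantized loading through bitsandbytes. This is a general technique for large checkpoints rather than a recommendation from the model author; a 70B model still needs on the order of 40 GB of GPU memory at 4-bit, and quality can degrade relative to BF16.

    # Hedged sketch: 4-bit quantized loading for limited GPU memory.
    # Requires the bitsandbytes and accelerate packages; the repo id is assumed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

    MODEL_ID = "nmitchko/dr-niko-70B"  # assumed repository id

    bnb_config = BitsAndBytesConfig(
        load_in_4bit=True,
        bnb_4bit_quant_type="nf4",              # NormalFloat4 weight quantization
        bnb_4bit_compute_dtype=torch.bfloat16,  # compute in BF16 on top of 4-bit weights
    )

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        quantization_config=bnb_config,
        device_map="auto",
    )
    # Generation then proceeds exactly as in the usage sketch above.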

Model Sources

  • Repository: Coming Soon
  • Paper: Maybe
  • Demo: Maybe

Citation

Information coming soon.

License - NOMERGE

    NOMERGE License
    
    Copyright (c) 2024 152334H
    
    Permission is hereby granted, free of charge, to any person obtaining a copy
    of this software and associated documentation files (the "Software"), to deal
    in the Software without restriction, including without limitation the rights
    to use, copy, modify, NOT merge, publish, distribute, sublicense, and/or sell
    copies of the Software, and to permit persons to whom the Software is
    furnished to do so, subject to the following conditions:
    
    The above copyright notice and this permission notice shall be included in all
    copies or substantial portions of the Software.
    
    All tensors ("weights") provided by the Software shall not be conjoined with
    other tensors ("merging") unless given explicit permission by the license holder.
    Utilities including but not limited to "mergekit", "MergeMonster", are forbidden
    from use in conjunction with this Software.
    
    THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
    IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
    FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
    AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
    LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
    OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
    SOFTWARE.
    