---
library_name: sklearn
license: mit
tags:
- sklearn
- skops
- tabular-classification
model_format: skops
model_file: febskxmodel_hug_1.skops
widget:
- structuredData:
    backlog_minutes:
    - 246897
    - 265856
    - 622046
    backlog_num_jobs:
    - 211
    - 298
    - 369
    max_minutes:
    - 360
    - 30
    - 2160
    nnodes:
    - 1
    - 1
    - 1
    running_minutes:
    - 1934324
    - 1934324
    - 1934214
    running_num_jobs:
    - 6830
    - 6830
    - 6829
---

# Model description

[More Information Needed]

## Intended uses & limitations

[More Information Needed]

## Training Procedure

[More Information Needed]

### Hyperparameters
<details>
<summary> Click to expand </summary>

| Hyperparameter             | Value                                                                                              |
|----------------------------|----------------------------------------------------------------------------------------------------|
| memory                     |                                                                                                    |
| steps                      | [('scale', StandardScaler()), ('hgbc', HistGradientBoostingClassifier(max_depth=9, max_iter=600))] |
| verbose                    | False                                                                                              |
| scale                      | StandardScaler()                                                                                   |
| hgbc                       | HistGradientBoostingClassifier(max_depth=9, max_iter=600)                                          |
| scale__copy                | True                                                                                               |
| scale__with_mean           | True                                                                                               |
| scale__with_std            | True                                                                                               |
| hgbc__categorical_features |                                                                                                    |
| hgbc__class_weight         |                                                                                                    |
| hgbc__early_stopping       | auto                                                                                               |
| hgbc__interaction_cst      |                                                                                                    |
| hgbc__l2_regularization    | 0.0                                                                                                |
| hgbc__learning_rate        | 0.1                                                                                                |
| hgbc__loss                 | log_loss                                                                                           |
| hgbc__max_bins             | 255                                                                                                |
| hgbc__max_depth            | 9                                                                                                  |
| hgbc__max_iter             | 600                                                                                                |
| hgbc__max_leaf_nodes       | 31                                                                                                 |
| hgbc__min_samples_leaf     | 20                                                                                                 |
| hgbc__monotonic_cst        |                                                                                                    |
| hgbc__n_iter_no_change     | 10                                                                                                 |
| hgbc__random_state         |                                                                                                    |
| hgbc__scoring              | loss                                                                                               |
| hgbc__tol                  | 1e-07                                                                                              |
| hgbc__validation_fraction  | 0.1                                                                                                |
| hgbc__verbose              | 0                                                                                                  |
| hgbc__warm_start           | False                                                                                              |

</details>
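Purely as an illustration, the sketch below shows how a pipeline matching the non-default hyperparameters listed above could be constructed. The feature columns come from the widget example in the card metadata; the variable names (`X_train`, `y_train`) and the training call are assumptions, not the card authors' actual training code.

```python
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.ensemble import HistGradientBoostingClassifier

# Same structure and non-default hyperparameters as the table above;
# every other parameter is left at its scikit-learn default.
pipeline = Pipeline(steps=[
    ("scale", StandardScaler()),
    ("hgbc", HistGradientBoostingClassifier(max_depth=9, max_iter=600)),
])

# Hypothetical fit on a feature matrix with the columns shown in the widget
# (backlog_minutes, backlog_num_jobs, max_minutes, nnodes, running_minutes,
# running_num_jobs); X_train and y_train are placeholders.
# pipeline.fit(X_train, y_train)
```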
### Model Plot
`Pipeline(steps=[('scale', StandardScaler()), ('hgbc', HistGradientBoostingClassifier(max_depth=9, max_iter=600))])`
## Evaluation Results

| Metric   | Value              |
|----------|--------------------|
| accuracy | 0.9079252003561887 |

**Classification report**

|              | precision | recall | f1-score | support |
|--------------|-----------|--------|----------|---------|
| 0            | 0.94      | 0.98   | 0.96     | 3580    |
| 1            | 0.76      | 0.55   | 0.64     | 415     |
| 2            | 0.63      | 0.51   | 0.56     | 208     |
| 3            | 0.68      | 0.47   | 0.55     | 160     |
| 4            | 0.91      | 0.94   | 0.93     | 1252    |
| accuracy     |           |        | 0.91     | 5615    |
| macro avg    | 0.78      | 0.69   | 0.73     | 5615    |
| weighted avg | 0.90      | 0.91   | 0.90     | 5615    |

# How to Get Started with the Model

[More Information Needed]

# Model Card Authors

This model card is written by the following authors:

[More Information Needed]

# Model Card Contact

You can contact the model card authors through the following channels:

[More Information Needed]

# Citation

Below you can find information related to citation.

**BibTeX:**

```
[More Information Needed]
```

# citation_bibtex

```bibtex
@inproceedings{...,year={2024}}
```

# get_started_code

```python
import skops.io as sio

# "file" is the path to the .skops file (e.g. "febskxmodel_hug_1.skops");
# "unknown_types" is the list returned by sio.get_untrusted_types(file=file).
model = sio.load(file, trusted=unknown_types)
```

# model_card_authors

Smruti Padhy

# limitations

This model is ready to be used in production.

# model_description

This is a Histogram-based Gradient Boosting Classification Tree model trained on historical HPC jobs between 1 Feb and 1 Aug 2022 (window number 1).

# eval_method

The model is evaluated on a held-out test split, using accuracy and the macro-averaged F1 score.

# confusion_matrix

![confusion_matrix](confusion_matrix.png)
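Tying the `get_started_code`, `eval_method`, and `confusion_matrix` sections together, the sketch below shows one way the reported metrics and the confusion-matrix figure could be reproduced. It is illustrative only: `X_test` and `y_test` are placeholders for the HPC job test split, which is not distributed with this card.

```python
import skops.io as sio
from sklearn.metrics import accuracy_score, classification_report, ConfusionMatrixDisplay

# Inspect the types skops does not trust by default, then load the pipeline.
file = "febskxmodel_hug_1.skops"
unknown_types = sio.get_untrusted_types(file=file)
model = sio.load(file, trusted=unknown_types)

# X_test, y_test = ...  # placeholder: load your held-out HPC job features/labels here
y_pred = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, y_pred))
print(classification_report(y_test, y_pred))

# Save a confusion-matrix figure like the one referenced above.
ConfusionMatrixDisplay.from_predictions(y_test, y_pred).figure_.savefig("confusion_matrix.png")
```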