
Model description

[More Information Needed]

Intended uses & limitations

This model is not yet ready to be used in production.

Training Procedure

Hyperparameters

The model is trained with the hyperparameters below.

| Hyperparameter            | Value                                                                                      |
|---------------------------|--------------------------------------------------------------------------------------------|
| memory                    |                                                                                            |
| steps                     | [('imputer', SimpleImputer()), ('scaler', StandardScaler()), ('model', LogisticRegression())] |
| verbose                   | False                                                                                      |
| imputer                   | SimpleImputer()                                                                            |
| scaler                    | StandardScaler()                                                                           |
| model                     | LogisticRegression()                                                                       |
| imputer__add_indicator    | False                                                                                      |
| imputer__copy             | True                                                                                       |
| imputer__fill_value       |                                                                                            |
| imputer__keep_empty_features | False                                                                                   |
| imputer__missing_values   | nan                                                                                        |
| imputer__strategy         | mean                                                                                       |
| imputer__verbose          | deprecated                                                                                 |
| scaler__copy              | True                                                                                       |
| scaler__with_mean         | True                                                                                       |
| scaler__with_std          | True                                                                                       |
| model__C                  | 1.0                                                                                        |
| model__class_weight       |                                                                                            |
| model__dual               | False                                                                                      |
| model__fit_intercept      | True                                                                                       |
| model__intercept_scaling  | 1                                                                                          |
| model__l1_ratio           |                                                                                            |
| model__max_iter           | 100                                                                                        |
| model__multi_class        | auto                                                                                       |
| model__n_jobs             |                                                                                            |
| model__penalty            | l2                                                                                         |
| model__random_state       |                                                                                            |
| model__solver             | lbfgs                                                                                      |
| model__tol                | 0.0001                                                                                     |
| model__verbose            | 0                                                                                          |
| model__warm_start         | False                                                                                      |
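The hyperparameters above all correspond to scikit-learn defaults, so the pipeline can be reconstructed with plain default constructors. This is a sketch of an equivalently configured pipeline, not the authors' training code:

```python
# Sketch: rebuild a pipeline matching the hyperparameters listed above.
# Every value in the table is the scikit-learn default, so passing no
# arguments (or the defaults explicitly, as for the model) is equivalent.
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

pipe = Pipeline(steps=[
    ("imputer", SimpleImputer(strategy="mean")),   # imputer__strategy = mean
    ("scaler", StandardScaler()),                  # scaler__with_mean/with_std = True
    ("model", LogisticRegression(
        C=1.0, penalty="l2", solver="lbfgs", max_iter=100,  # model__* values
    )),
])
```

Any of these values can also be changed after construction with `pipe.set_params(model__C=0.5)`, using the same double-underscore names as the table.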

Model Plot

The model plot is below.

Pipeline(steps=[('imputer', SimpleImputer()), ('scaler', StandardScaler()), ('model', LogisticRegression())])

Evaluation Results

[More Information Needed]

How to Get Started with the Model

[More Information Needed]
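Until usage instructions are added, this is a minimal, hypothetical sketch of fitting and querying a pipeline configured like the one in this card; the data is synthetic, not the authors' training set:

```python
# Hypothetical usage sketch. The pipeline mirrors this card's configuration;
# make_classification stands in for the (unspecified) real data.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.pipeline import Pipeline
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=5, random_state=0)
X[::10, 0] = np.nan  # inject missing values so SimpleImputer has work to do

pipe = Pipeline(steps=[
    ("imputer", SimpleImputer()),
    ("scaler", StandardScaler()),
    ("model", LogisticRegression()),
])
pipe.fit(X, y)

preds = pipe.predict(X)        # predicted class labels
probs = pipe.predict_proba(X)  # per-class probabilities
```

Because imputation and scaling are inside the pipeline, `predict` applies the same preprocessing that was fitted during training, which avoids train/inference skew.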

Model Card Authors

This model card is written by the following authors:

[More Information Needed]

Model Card Contact

You can contact the model card authors through the following channels: [More Information Needed]

Citation

Citation information can be found below.

BibTeX:

[More Information Needed]