Interpreting your model performance
- The Mean Absolute Error (MAE) metric indicates the average absolute difference between your model's predicted rating and your actual rating on a held-out set of comments.
- You want your model to have a lower MAE, indicating less error.
- Your current MAE: {data["mae"]}
- {@html data["mae_status"]}
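For reference, MAE is simply the mean of the absolute prediction errors. A minimal sketch in TypeScript (the function and variable names are illustrative, not part of this app):

```typescript
// Illustrative sketch: compute Mean Absolute Error between
// predicted and actual ratings. Names are hypothetical.
function meanAbsoluteError(predicted: number[], actual: number[]): number {
  if (predicted.length !== actual.length || predicted.length === 0) {
    throw new Error("inputs must be non-empty and the same length");
  }
  // Sum the absolute differences, then divide by the number of samples.
  const totalError = predicted.reduce(
    (sum, p, i) => sum + Math.abs(p - actual[i]),
    0
  );
  return totalError / predicted.length;
}

// Example: ratings on a held-out set of three comments
console.log(meanAbsoluteError([4, 2, 5], [3, 2, 4])); // ≈ 0.667
```

A lower return value means the model's ratings track the actual ratings more closely on average.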