Abstract

A model's outcomes depend on different types of inputs: the training data used to build the model shapes its output (a data-centric effect), while the choice of hyperparameters also affects its output and performance (a model-centric effect). When building classification models, data scientists generally treat model performance, most often accuracy, as the key metric. However, other metrics that assess performance in other ways, such as fairness, should also be considered during model development. Assessing fairness during development is often overlooked but should be part of a model's full evaluation before deployment. This research investigates the fairness–accuracy tradeoff that results from changing a single hyperparameter, the hidden-layer activation function, in a neural network for binary classification, to assess how that one change can alter the algorithm's accuracy and fairness. Neural networks were chosen for this assessment because of their wide use across domains, the limitations of current bias-measurement methods, and the underlying challenges of interpreting them. The findings suggest that assessing both accuracy and fairness during model development adds value, mitigating potential harms to users and reducing organizational risk. No particular activation function was found to be fairer than another, but notable differences in the fairness and accuracy measures could help developers deploy a model that is both highly accurate and robustly fair. Algorithm development should therefore include a grid search for hyperparameter optimization that evaluates fairness alongside performance measures such as accuracy. While the actual hyperparameter choices may depend on the business context and dataset, an optimal development process should use both fairness and model-performance metrics.
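The kind of search the abstract recommends can be sketched as follows. This is a minimal illustration, not the authors' code: it assumes scikit-learn's MLPClassifier, a synthetic dataset, a made-up binary sensitive attribute, and demographic parity difference as the fairness metric; the study's actual data, model, and fairness measures may differ.

```python
# Illustrative sketch: grid-search the hidden-layer activation function of a
# binary classifier while recording both accuracy and a simple fairness metric
# (demographic parity difference). Dataset and sensitive attribute are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

# Synthetic data with a hypothetical binary "sensitive" group indicator.
X, y = make_classification(n_samples=2000, n_features=10, random_state=0)
sensitive = (X[:, 0] > 0).astype(int)

X_tr, X_te, y_tr, y_te, s_tr, s_te = train_test_split(
    X, y, sensitive, test_size=0.3, random_state=0
)

def demographic_parity_difference(y_pred, s):
    """Absolute gap in positive-prediction rates between the two groups."""
    return abs(y_pred[s == 1].mean() - y_pred[s == 0].mean())

# Grid over the single hyperparameter studied: the hidden-layer activation.
for activation in ["relu", "tanh", "logistic", "identity"]:
    clf = MLPClassifier(hidden_layer_sizes=(16,), activation=activation,
                        max_iter=1000, random_state=0)
    clf.fit(X_tr, y_tr)
    y_pred = clf.predict(X_te)
    acc = accuracy_score(y_te, y_pred)
    dpd = demographic_parity_difference(y_pred, s_te)
    print(f"{activation:>9s}  accuracy={acc:.3f}  dp_diff={dpd:.3f}")
```

In practice, the loop would be extended to the full hyperparameter grid and to whichever fairness definitions fit the business context, with the final model chosen on both columns of the resulting table rather than accuracy alone.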
