Abstract

This paper introduces the Ulimisana Optimisation Algorithm enabled Population Based Training (PBT-UOA) framework, which fine-tunes hyperparameters with a population-based meta-heuristic algorithm while the model parameters are simultaneously optimised. Models are trained to near-convergence on the updated hyperparameters, and the parameters of the best-performing model are shared to warm-start the other models in the next hyperparameter-tuning iteration. In PBT-UOA, all models are trained on the same dataset; this framework outperformed the Bayesian Optimisation algorithm. The paper also introduces the Ulimisana Optimisation Algorithm enabled Federated Learning (FL-UOA) framework, an extension of PBT-UOA that addresses the challenges of scattered datasets and privacy that are presented by the growing number of connected end-devices. FL-UOA learns from local data on scattered end-devices without sending datasets to a central server; the training datasets on local end-devices are used to evaluate models trained on other end-devices, and the resulting performance metrics update the Social Trust Network (STN) of the FL-UOA framework. FL-UOA outperformed the classic Federated Learning framework. The STN updating technique was also tested as a regularisation term for Machine Learning (ML) unfairness, by training different models on subsets containing data for only specific sensitive groups. Results showed that, by updating hyperparameters while learning parameters on data scattered across different devices, FL-UOA takes advantage of diversified learning and reduces ML unfairness for models trained on group-specific datasets.
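The population-based training loop described above (train each model to near-convergence on its current hyperparameters, warm-start all models from the best performer, then update hyperparameters) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the objective is a toy quadratic, and the Ulimisana Optimisation Algorithm's hyperparameter update is replaced by a simple random perturbation as a stand-in; all names (`pbt_round`, `train_to_near_convergence`) are hypothetical.

```python
import random

def train_to_near_convergence(w, lr, steps=50):
    # Toy parameter optimisation: gradient descent on f(w) = (w - 3)^2.
    for _ in range(steps):
        grad = 2.0 * (w - 3.0)
        w -= lr * grad
    return w

def loss(w):
    # Validation loss of the toy objective.
    return (w - 3.0) ** 2

def pbt_round(population):
    # 1. Train every member until near-convergence on its own hyperparameters.
    trained = [(train_to_near_convergence(w, lr), lr) for w, lr in population]
    # 2. Rank members; the best member's parameters warm-start all others.
    trained.sort(key=lambda member: loss(member[0]))
    best_w = trained[0][0]
    # 3. Hyperparameter update for the next iteration. A real PBT-UOA would use
    #    the UOA meta-heuristic here; a clamped random perturbation stands in.
    return [(best_w, min(0.3, max(1e-4, lr * random.uniform(0.5, 1.5))))
            for _, lr in trained]

random.seed(0)
# Population of (parameter, learning-rate) pairs with random initialisation.
population = [(random.uniform(-5, 5), random.uniform(1e-3, 0.2)) for _ in range(4)]
for _ in range(5):
    population = pbt_round(population)

best = min(population, key=lambda member: loss(member[0]))
print(loss(best[0]))
```

Because every member restarts each round from the best parameters found so far, the population explores different hyperparameter settings without discarding the training progress already made, which is the warm-starting behaviour the abstract describes.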

