
E-Thesis

From Algorithm to Application: Enhancing Federated Learning with Adaptive Aggregation and Gradient Protection / YI HU

Swansea University Author: YI HU

  • 2025_Hu_Y.final.71338.pdf

    PDF | E-Thesis – open access

    Copyright: the author, Yi Hu, 2025. Distributed under the terms of a Creative Commons Attribution 4.0 License (CC BY 4.0)

    Download (6.5MB)

DOI (Published version): 10.23889/SUThesis.71338

Abstract

Federated Learning (FL) has emerged as a promising paradigm for decentralised machine learning, allowing multiple clients to collaboratively train models without sharing their private data. This distributed approach is particularly relevant in domains with stringent data privacy requirements, such as finance, healthcare, and edge computing. However, despite its advantages, FL faces three critical challenges: (1) efficient model aggregation, (2) protection against privacy leakage, and (3) real-world applicability in complex domains. This thesis addresses these challenges by proposing novel strategies to optimise aggregation mechanisms, enhance privacy protection, and explore FL's application in financial modelling.

First, we introduce Element-Wise Weights Aggregation for FL (EWWA-FL), a novel optimisation technique that improves global model convergence by adopting element-wise adaptive weighting. Traditional FL aggregation methods, such as FedAvg, assign a single proportion to each local model without considering the varying importance of individual model parameters. In contrast, EWWA-FL assigns unique proportions to each element within local model weights, ensuring more precise updates that account for dataset heterogeneity among clients. Experimental results demonstrate that EWWA-FL significantly improves both convergence speed and final model accuracy, outperforming FedAvg, FedOpt, and FedCAMS across multiple benchmark datasets. By incorporating an element-wise approach, EWWA-FL provides a more adaptive and fine-grained aggregation strategy that enhances FL's performance in both Independent and Identically Distributed (IID) and Non-IID settings.

Second, we propose AdaDefence, a privacy-preserving defence mechanism against gradient leakage attacks in FL. While FL eliminates the need for raw data sharing, recent attacks have demonstrated that an adversary can reconstruct private training data from shared gradients, posing a severe privacy risk. To counteract this, AdaDefence introduces a gradient stand-in approach, wherein local clients replace actual gradients with modified gradients before sending them to the server. This method prevents attackers from reconstructing private data while maintaining model utility. AdaDefence effectively defends against state-of-the-art attacks such as Deep Leakage from Gradients (DLG), Generative Regression Neural Network (GRNN), and Inverting Gradient (IG) without significantly compromising model accuracy. Our extensive empirical analysis shows that AdaDefence provides strong privacy guarantees while ensuring minimal performance degradation, making it a practical and scalable solution for real-world FL deployments.

Finally, we explore the real-world application of FL in financial modelling, particularly in Cross-Stock Trend Integration (CSTI) for enhancing stock price prediction. Traditional financial models suffer from data fragmentation, where different financial institutions and stock markets operate in silos, limiting predictive power. To overcome this, we develop an FL-based approach that enables multiple financial institutions to collaboratively train stock prediction models without exposing sensitive trading data. This approach leverages cross-stock trend integration, allowing predictive models to learn patterns from multiple stocks while preserving privacy. Our experimental results demonstrate that federated cross-stock learning improves predictive accuracy and model robustness, outperforming conventional single-stock prediction methods. By enabling secure, multi-institution collaboration, this work highlights the potential of FL in advancing financial modelling while ensuring regulatory compliance and data confidentiality.

By addressing these fundamental aspects of optimisation and protection, this thesis makes significant contributions to the field of FL. The proposed methodologies collectively enhance FL's efficiency, security, and real-world applicability. Through EWWA-FL, FL models achieve faster and more reliable convergence. By introducing AdaDefence, FL gains stronger privacy protections against gradient-based attacks. Finally, by demonstrating FL's potential in cross-stock trend integration, this thesis showcases how FL can be deployed in privacy-sensitive financial applications. These contributions pave the way for more efficient, secure, and scalable FL systems, advancing their adoption in a wide range of domains, including healthcare, autonomous systems, and financial technology.
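The abstract contrasts EWWA-FL's per-parameter proportions with FedAvg's single per-client proportion. The thesis's actual weighting rule is not reproduced on this page, so the sketch below is a purely hypothetical illustration in NumPy: it assumes element-wise proportions obtained by softmax-normalising the negated magnitude of each client's update across clients, which makes the aggregate a per-element convex combination of the client weights.

```python
import numpy as np

def fedavg(client_weights, client_sizes):
    # Classic FedAvg: one scalar proportion per client,
    # proportional to that client's dataset size.
    props = np.asarray(client_sizes, dtype=float)
    props /= props.sum()
    return sum(p * w for p, w in zip(props, client_weights))

def elementwise_aggregate(client_weights, global_weights):
    # Hypothetical element-wise weighting (illustration only, not the
    # thesis's rule): every parameter position gets its own proportion
    # per client, derived here from the inverse magnitude of each
    # client's update, softmax-normalised across clients, so elements
    # that moved erratically on one client count less there.
    updates = np.stack([w - global_weights for w in client_weights])
    scores = -np.abs(updates)                            # smaller update -> larger weight
    props = np.exp(scores) / np.exp(scores).sum(axis=0)  # normalise over clients
    return global_weights + (props * updates).sum(axis=0)
```

Because the per-element proportions sum to one across clients, the result stays inside the element-wise range spanned by the client weights, unlike a fixed scalar mixture.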

Published: Swansea University 2025
Institution: Swansea University
Degree level: Doctoral
Degree name: Ph.D
Supervisor: Xie, X
URI: https://cronfa.swan.ac.uk/Record/cronfa71338
Keywords: Deep Learning, Federated Learning, Finance
Publication date: 7 November 2025
College: Faculty of Science and Engineering
Department: School of Mathematics and Computer Science - Computer Science
Licence: https://creativecommons.org/licenses/by/4.0/
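AdaDefence's core idea, as described in the abstract, is that clients send a gradient stand-in rather than the true gradient. Its actual construction is not given on this page; the sketch below is a minimal, hypothetical illustration of the stand-in concept: a noise-blended surrogate rescaled to the true gradient's norm, so the raw gradient values (the input to reconstruction attacks such as DLG) never leave the client while the update magnitude is preserved.

```python
import numpy as np

def gradient_stand_in(grad, rng, mix=0.5):
    # Hypothetical stand-in (illustration only, not AdaDefence's rule):
    # blend the true gradient with Gaussian noise, then rescale so the
    # stand-in keeps the true gradient's norm. The exact gradient values
    # are never transmitted, frustrating pixel-level reconstruction,
    # while the step length (and, for small `mix`, most of the
    # direction) is preserved for the server-side update.
    noise = rng.standard_normal(grad.shape)
    stand_in = (1.0 - mix) * grad + mix * noise
    stand_in *= np.linalg.norm(grad) / (np.linalg.norm(stand_in) + 1e-12)
    return stand_in
```

The `mix` parameter is an assumption of this sketch, trading privacy (more noise) against fidelity of the shared update.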
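The cross-stock application trains one shared prediction model across institutions without pooling their private trading data. The thesis's CSTI model is not reproduced here; the following is an illustrative FedAvg-style round over synthetic "institution" datasets, using a simple linear least-squares model purely as a stand-in for a stock predictor.

```python
import numpy as np

def local_step(w, X, y, lr=0.01):
    # One local least-squares gradient step on an institution's
    # private features; the raw (X, y) never leave the client.
    grad = 2.0 * X.T @ (X @ w - y) / len(y)
    return w - lr * grad

def federated_round(w_global, datasets, lr=0.01):
    # One FedAvg-style round across institutions: each trains on its
    # own data, and only the model weights are shared and averaged.
    local_models = [local_step(w_global.copy(), X, y, lr) for X, y in datasets]
    return np.mean(local_models, axis=0)

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    true_w = np.array([0.5, -0.3])
    # Three hypothetical institutions, each holding its own series.
    datasets = [(X := rng.standard_normal((50, 2)), X @ true_w) for _ in range(3)]
    w = np.zeros(2)
    for _ in range(100):
        w = federated_round(w, datasets)
```

Only `w` crosses institutional boundaries each round, which is the property the abstract highlights for regulatory compliance.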