Next-generation wireless networks are poised to redefine our digital world by integrating advanced artificial intelligence (AI) capabilities, enabling intelligent support for emerging applications such as Extended Reality (XR), Holographic Telepresence, Robotics, and Autonomous Mobility. In this environment, Consumer Electronics (CE) devices, such as smartphones, smartwatches, medical implants, and IoT devices, will play a crucial role in a variety of real-world scenarios. However, due to their limited battery and processing capacities, training complex, large deep learning models on such devices is computationally intensive and can result in high energy and power consumption and even in early termination of training. It is therefore crucial to optimize the training process by using lightweight models and offloading computation to more powerful servers while preserving good learning performance. Distributed machine/deep learning algorithms offer viable alternatives in this context: distributed agents collaborate to train complex models without sharing their local data. Federated Learning (FL) is a popular distributed learning paradigm in the literature. Nevertheless, as noted above, training the full FL model on computationally constrained nodes is not feasible.
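The collaboration scheme described above, where agents train locally and share only model parameters, can be sketched as a minimal federated-averaging loop. This is an illustrative sketch only: the model, gradients, client sizes, and function names below are assumptions for the example, not taken from the editorial.

```python
# Minimal sketch of FedAvg-style aggregation, the core idea behind
# Federated Learning: clients train on private data and share only
# model parameters with the server, never the raw data.
# All names, shapes, and values here are illustrative assumptions.

def local_update(weights, local_gradient, lr=0.1):
    """One simulated local training step on a client's private data."""
    return [w - lr * g for w, g in zip(weights, local_gradient)]

def federated_average(client_weights, client_sizes):
    """Server aggregates client models, weighted by local dataset size."""
    total = sum(client_sizes)
    n_params = len(client_weights[0])
    return [
        sum(ws[i] * n / total for ws, n in zip(client_weights, client_sizes))
        for i in range(n_params)
    ]

# Two hypothetical clients start from the same global model.
global_model = [0.5, -0.2]
updates = [
    local_update(global_model, [0.1, 0.3]),   # client A's local gradients
    local_update(global_model, [-0.1, 0.1]),  # client B's local gradients
]
global_model = federated_average(updates, client_sizes=[100, 300])
```

As the editorial notes, even this scheme assumes each device can run the full local update, which motivates split variants that offload part of the model to a server.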
Guest editorial of the special section on split federated learning for resource-constrained consumer electronics: Applications and challenges
IEEE Transactions on Consumer Electronics, Vol. 71, No. 3, August 2025
Type:
Journal
Date:
2025-11-11
Department:
Systèmes de Communication
Eurecom Ref:
8499
Copyright:
© 2025 IEEE. Personal use of this material is permitted. However, permission to reprint/republish this material for advertising or promotional purposes or for creating new collective works for resale or redistribution to servers or lists, or to reuse any copyrighted component of this work in other works must be obtained from the IEEE.
See also:
PERMALINK: https://www.eurecom.fr/publication/8499