
Funded project on deep learning and networking

Fragkou Evangelia, PhD Student @ ECE.UTH


supervised by

Katsaros Dimitrios, Assoc. Prof. @ UTH




Brief introduction to the project

The aim of this page is to present information concerning the two-year (Apr. 2022 - Mar. 2024) funded project entitled "Deep Learning and its Applications in Wireless Big Data Networks Performance", awarded under the 3rd Call of PhD Fellowships (Fellowship Number: 5631) and supported by the Hellenic Foundation for Research and Innovation (HFRI). Fragkou Evangelia, a member of the DANA Lab, earned this grant in order to carry out her PhD studies under the supervision of Associate Professor Dr. Dimitrios Katsaros. Specifically, on this page we present the articles accepted in journals and conferences; they are also indexed in various bibliographic databases: Google Scholar, DBLP, ACM, Scopus.

Book Chapters

[BC1] Fragkou E., Katsaros D., "Non-Static TinyML for Ad hoc Networked Devices" in B. S. Chaudhari, S. N. Ghorpade, M. Z. R. Paskauskas (Eds.), TinyML for Edge Intelligence in IoT and LPWAN Networks, 1st edition, ISBN: 9780443222030, Elsevier, pp. 229-251, 2024. [ link ]

Articles in Journals

[J4] Fragkou E., Chini E., Papadopoulou M., Papakostas D., Katsaros D., Dustdar S., "Distributed federated deep learning in clustered internet of things wireless networks with data similarity-based client participation", IEEE Internet Computing magazine, vol. 28 (6), 2024, pp. 53-61. [ link ]

[J3] Fragkou E., Katsaros D., "A joint survey in decentralized federated learning and tinyML: A brief introduction to swarm learning", Future Internet (MDPI), vol. 16 (11), 2024. [ link ]

[J2] Fragkou E., Koultouki M., Katsaros D., "Model Reduction of Feed Forward Neural Networks for Resource-Constrained Devices", Applied Intelligence (Springer), vol. 53 (11), 2023, pp. 14102-14127. [ link ]

[J1] Fragkou E., Papakostas D., Kasidakis T., Katsaros D., "Multilayer Backbones for Internet of Battlefield Things", Future Internet (MDPI), vol. 14 (6), 2022. [ link ]

Articles in Conferences

[C4] Fragkou E., Katsaros D., "Distributed Federated Learning in Wireless Ad hoc Networks", Poster, 6th Summit on Gender Equality in Computing (GEC'24), Nicosia, Cyprus, June 13-14, 2024.
[open pdf]
[link]

[C3] Fragkou E., Katsaros D., "Transfer Deep Learning for TinyML", Poster, 5th Summit on Gender Equality in Computing (GEC'23), Athens, June 27, 2023.
[open pdf]
[link]

[C2] Fragkou E., Lygnos V., Katsaros D., "Transfer Learning for Convolutional Neural Networks in Tiny Deep Learning Environments", PCI'22, DOI: 10.1145/3575879.3575984. [ link ]

[C1] Fragkou E., Katsaros D., "Memory Reduction and Training Acceleration of Neural Networks for Tiny Machine Learning", Poster, 4th Summit on Gender Equality in Computing (GEC'22), Hybrid Event, June 16-17, 2022.
[open pdf]
[link]

Main goals

  • Neural model reduction. The aim is to sparsify a deep neural network during the training phase by cutting connections between neurons that do not contribute to training. In particular, we employ specific rule-oriented concepts developed in the realm of network science (our motivation stems from observations that the actual topology of real neural networks is scale-free or small-world) in order to sparsify the neural network while keeping the most significant links, which are responsible for better information distribution in the network. Thus, we drastically reduce both the number of trainable variables and the size of the model, which accelerates training and makes our algorithms applicable to resource-constrained (TinyML) devices.

  • Federated learning over ad hoc networks. The aim is to drastically reduce communication in a purely distributed ad hoc network by allowing only a percentage of the network's nodes (not every node) to cooperatively train a global model with federated learning techniques. We therefore focus on finding smart node-participation protocols (e.g., based on data-similarity criteria) that exclude specific nodes from the training procedure and thus reduce the amount of data that needs to be transmitted among the nodes of the network.

  • Multi-layer distributed backbones to facilitate federated learning. In the existing literature, most published studies have concentrated on single-layer networks; however, these works often fall short in representing large-scale real-world networks. This project addresses the challenge of constructing compact and robust multi-layer backbones, aiming to facilitate efficient information dissemination across network nodes. The proposed approach not only supports the federation process but also enhances the scalability of the developed algorithms.
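To illustrate the neural model reduction goal, here is a minimal sketch of pruning a dense layer by keeping only a fixed fraction of its strongest connections. This is a simplified stand-in: the project's actual algorithms use network-science rules (scale-free / small-world topology), whereas the magnitude criterion and the `sparsify_layer` name below are illustrative assumptions.

```python
import numpy as np

def sparsify_layer(weights: np.ndarray, keep_fraction: float) -> np.ndarray:
    """Zero out the weakest links of a dense layer, keeping only the
    strongest `keep_fraction` of connections (by absolute magnitude)."""
    flat = np.abs(weights).ravel()
    k = max(1, int(round(keep_fraction * flat.size)))
    # Threshold = magnitude of the k-th strongest connection.
    threshold = np.partition(flat, -k)[-k]
    mask = np.abs(weights) >= threshold
    return weights * mask

rng = np.random.default_rng(0)
W = rng.normal(size=(64, 32))                    # a dense 64x32 weight layer
W_sparse = sparsify_layer(W, keep_fraction=0.1)  # keep ~10% of the links
density = np.count_nonzero(W_sparse) / W.size
print(f"density after pruning: {density:.2f}")
```

Re-applying such a pruning step during training, rather than once after it, is what allows both the trainable-parameter count and the model size to shrink while the network is still learning.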
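For the federated learning goal, the following sketch shows one possible data-similarity-based participation rule: a client joins the training round only if its local label distribution is not too similar to that of an already-selected client. The greedy cosine-similarity criterion and the `select_clients` helper are hypothetical illustrations, not the protocol published in [J4].

```python
import numpy as np

def select_clients(label_hists: list[np.ndarray], max_similarity: float) -> list[int]:
    """Greedy data-similarity-based participation: a client is admitted only
    if the cosine similarity between its label histogram and every already
    selected client's histogram stays below `max_similarity`."""
    selected: list[int] = []
    for i, h in enumerate(label_hists):
        v = h / np.linalg.norm(h)
        too_similar = any(
            float(v @ (label_hists[j] / np.linalg.norm(label_hists[j]))) > max_similarity
            for j in selected
        )
        if not too_similar:
            selected.append(i)
    return selected

# Four clients over 3 classes: clients 0 and 1 hold near-identical data,
# so only one of them needs to transmit model updates.
hists = [np.array([90., 5., 5.]), np.array([85., 10., 5.]),
         np.array([5., 90., 5.]), np.array([5., 5., 90.])]
print(select_clients(hists, max_similarity=0.9))  # → [0, 2, 3]
```

Excluding redundant clients in this spirit is what cuts the volume of model updates exchanged over the ad hoc network.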

Contact

>> Fragkou Evangelia, email: efragkou@uth.gr

>> Katsaros Dimitrios, email: dkatsar@uth.gr