Resource Allocation in Multi-access Edge Computing (MEC) Systems: Optimization and Machine Learning Algorithms

dc.contributor.advisor: Tabassum, Hina
dc.contributor.author: Zarandi, Sheyda
dc.date.accessioned: 2021-07-06T12:54:49Z
dc.date.available: 2021-07-06T12:54:49Z
dc.date.copyright: 2021-04
dc.date.issued: 2021-07-06
dc.date.updated: 2021-07-06T12:54:49Z
dc.degree.discipline: Electrical and Computer Engineering
dc.degree.level: Master's
dc.degree.name: MASc - Master of Applied Science
dc.description.abstract: With the rapid proliferation of diverse wireless applications, next-generation wireless networks are required to meet diverse quality-of-service (QoS) requirements. Existing one-size-fits-all resource allocation algorithms will not be able to sustain such diverse QoS demands. In this context, radio access network (RAN) slicing has recently emerged as a promising approach to virtualize network resources and create multiple logical network slices on a common physical infrastructure. Each slice can then be tailored to a specific application with distinct QoS requirements, considerably reducing the cost to infrastructure providers. However, efficient virtualized network slicing is only feasible if network resources are efficiently monitored and allocated. In the first part of this thesis, leveraging tools from fractional programming and the augmented Lagrangian method, I propose an efficient algorithm to jointly optimize users' offloading decisions and the allocation of communication and computing resources in a sliced multi-cell multi-access edge computing (MEC) network in the presence of interference. The objective is to minimize the weighted sum of the delay deviation observed at each slice from its corresponding delay requirement. The considered problem enables slice prioritization, cooperation among MEC servers, and partial offloading to multiple MEC servers. On another note, due to their high computation and time complexity, traditional centralized optimization solutions are often rendered impractical and non-scalable for real-time resource allocation. Thus, the need for machine learning algorithms has become more vital than ever before.
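The thesis itself does not spell out the fractional-programming step here, but a standard way to handle ratio objectives of this kind is Dinkelbach's iteration, which replaces a ratio minimization with a sequence of parametric subproblems. The sketch below is a generic illustration on a toy one-dimensional ratio, not the thesis's actual delay/resource formulation; the functions `f` and `g` are hypothetical stand-ins.

```python
# Illustrative Dinkelbach iteration for a generic fractional program
# min_x f(x)/g(x) with g(x) > 0. The toy functions below are NOT the
# thesis's delay/resource expressions; they only show the mechanics.

def dinkelbach(f, g, candidates, tol=1e-9, max_iter=100):
    """Minimize f(x)/g(x) over a finite candidate set via Dinkelbach."""
    lam = 0.0
    x_star = candidates[0]
    for _ in range(max_iter):
        # Inner step: minimize the parametric objective f(x) - lam*g(x).
        x_star = min(candidates, key=lambda x: f(x) - lam * g(x))
        val = f(x_star) - lam * g(x_star)
        lam = f(x_star) / g(x_star)  # update the ratio parameter
        if abs(val) < tol:           # converged when the optimum is ~0
            break
    return x_star, lam

# Toy example: minimize (x^2 + 1) / (x + 2) over a grid of x >= 0.
# The true minimizer is x = sqrt(5) - 2, with ratio 2*sqrt(5) - 4.
xs = [i / 1000 for i in range(5000)]
x_opt, ratio = dinkelbach(lambda x: x * x + 1, lambda x: x + 2, xs)
```

At each round, the sign of the parametric optimum tells the algorithm whether the current ratio guess is too high or too low, and the sequence of guesses converges to the optimal ratio.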
To address this issue, in the second part of this thesis, exploiting the power of federated learning (FDL) and optimization theory, I develop a federated deep reinforcement learning framework for joint offloading decisions and resource allocation, minimizing joint delay and energy consumption in a MEC-enabled internet-of-things (IoT) network with QoS constraints. The proposed algorithm is applied to an IoT network, since IoT devices suffer significantly from limited computation and battery capacity. The proposed algorithm is distributed in nature, exploits cooperation among devices, preserves privacy, and is executable on resource-limited cellular or IoT devices.
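The federated aspect described above typically means that devices train local models and a server averages their parameters rather than collecting raw data. The sketch below shows generic federated averaging (FedAvg) on a toy weight vector; it is an assumption-laden illustration, not the thesis's exact federated DRL algorithm, where the averaged parameters would be those of each device's deep Q-network.

```python
# Minimal federated-averaging sketch: each device keeps local model
# weights, performs local updates on its own data, and a server
# averages the resulting weights. The linear "model" and gradients
# here are toy placeholders, NOT the thesis's DDQN parameters.

def local_update(weights, grad, lr=0.1):
    """One toy local gradient step performed on a device."""
    return [w - lr * g for w, g in zip(weights, grad)]

def fed_avg(device_weights):
    """Server aggregation: element-wise average of device models."""
    n = len(device_weights)
    return [sum(ws) / n for ws in zip(*device_weights)]

# Three devices start from the same global model and compute different
# local gradients (e.g., from their own offloading experience).
global_model = [0.0, 0.0]
grads = [[1.0, -1.0], [2.0, 0.0], [0.0, 2.0]]
local_models = [local_update(global_model, g) for g in grads]
global_model = fed_avg(local_models)  # one round of aggregation
```

Because only weights (not raw observations) leave each device, this pattern preserves privacy and keeps per-device computation small, which matches the resource-limited IoT setting the abstract describes.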
dc.identifier.uri: http://hdl.handle.net/10315/38502
dc.language: en
dc.rights: Author owns copyright, except where explicitly noted. Please contact the author directly with licensing requests.
dc.subject: Electrical engineering
dc.subject.keywords: Federated learning
dc.subject.keywords: Double deep Q-networks
dc.subject.keywords: Reinforcement learning
dc.subject.keywords: Fractional programming
dc.subject.keywords: Mixed-integer non-linear programming
dc.subject.keywords: Resource allocation
dc.subject.keywords: Virtually sliced networks
dc.subject.keywords: IoT networks
dc.title: Resource Allocation in Multi-access Edge Computing (MEC) Systems: Optimization and Machine Learning Algorithms
dc.type: Electronic Thesis or Dissertation

Files

Original bundle (1 of 1)
Name: Zarandi_Sheyda_2021_Masters.pdf
Size: 1.13 MB
Format: Adobe Portable Document Format

License bundle (2 of 2)
Name: license.txt
Size: 1.87 KB
Format: Plain Text

Name: YorkU_ETDlicense.txt
Size: 3.39 KB
Format: Plain Text