Toronto Metropolitan University

Latency Efficient Cache Placement Using Learning Techniques in Mobile Edge Networks

posted on 2024-03-18, 18:12 authored by Lubna Badri Mohammed
Future wireless networks present interesting research challenges: the exponential growth in mobile data traffic and the advent of new computation-intensive, real-time applications cause many-fold increases in traffic and demand low latency from the network. The need to bring data closer to users and offload traffic from the macrocell base station (MBS) has motivated placing caches at the edge of the network. Storing the most popular files in user terminal (UT) and small base station (SBS) caches inside mobile edge networks (MENs) is a promising approach to the challenges facing future data-rich wireless networks. Caching in the mobile UT allows requested content to be obtained directly from nearby UT caches through device-to-device communication. This thesis addresses several challenges in developing a cache placement solution at the edge of the network that arise from continuous changes in content popularity, user mobility, and the number of users in each network. It also considers the challenges posed by the high computation requirements of future applications, which must satisfy power and delivery-time constraints. This dissertation aims to overcome those challenges by employing intelligence and machine learning techniques (ILT) for mobile edge networks. We formulate cache placement as a latency-efficient optimization problem with four objectives for placing contents in SBS and UT caches. The multi-objective function exploits user mobility patterns to decide the contents of each SBS and UT cache, and is resolved into a weighted fusion decision over the four objectives: three related to user mobility, computed from previous datasets, and one related to content popularity. We study the impact of user mobility on increasing the cache hit rate, which in turn decreases the latency of downloading requested data content.
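The weighted fusion of the four objectives can be sketched as follows. This is a minimal illustration, not the thesis's actual formulation: the objective names, weights, and random scores are all assumptions standing in for the mobility statistics and popularity estimates described above.

```python
import numpy as np

# Illustrative setup: scores are random stand-ins for the real
# mobility- and popularity-derived objectives (all values assumed).
rng = np.random.default_rng(0)
n_contents, cache_size = 10, 3

# Three mobility-related objectives plus content popularity, each in [0, 1].
contact_prob   = rng.random(n_contents)
contact_dur    = rng.random(n_contents)
location_score = rng.random(n_contents)
popularity     = rng.random(n_contents)

# Weighted fusion into one placement score per content (weights assumed).
w = np.array([0.3, 0.2, 0.2, 0.3])
score = (w[0] * contact_prob + w[1] * contact_dur
         + w[2] * location_score + w[3] * popularity)

# Place the top-scoring contents in the cache, up to its capacity.
cached = np.argsort(score)[::-1][:cache_size]
```

Changing the weight vector shifts the balance between mobility awareness and pure popularity-based placement, which is the trade-off the multi-objective function is designed to tune.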
The results show the effect of user mobility on reducing the total energy consumed in transmitting contents to the UTs. We propose a new cache placement algorithm based on user locations, contact probability, communication range, contact duration, and content popularity, formulating the cache placement decision as a binary classification problem (to cache or not to cache). Artificial neural networks (ANN), support vector machines (SVM), and logistic regression (LR) are used to model cache placement decisions. We investigate the characteristics of the input features (attributes) and the properties of those characteristics that affect supervised machine learning approaches. The performance of the new cache placement models using supervised learning techniques is evaluated to study the sensitivity of the classification decisions to changes in system parameters. Finally, we develop a semi-supervised self-training (SSST) classification model for the cache placement problem. We assess the proposed SSST algorithm through experiments with datasets on different learning techniques, and compare the performance of different machine learning models on the same datasets. For the hit rate, we investigate the sensitivity of the classification to changes in the environment parameters to show the effectiveness of the proposed scheme.
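The self-training idea behind the SSST model can be sketched as below: a base classifier is fit on a small labeled set, its most confident predictions on unlabeled data become pseudo-labels, and the model is refit. Everything here is an assumption for illustration, including the synthetic two-feature data, the plain gradient-descent logistic regression used as the base learner, and the confidence threshold; the thesis's actual features and classifiers differ.

```python
import numpy as np

rng = np.random.default_rng(1)

def fit_logreg(X, y, lr=0.5, steps=300):
    """Plain gradient-descent logistic regression (bias column folded into X)."""
    w = np.zeros(X.shape[1])
    for _ in range(steps):
        p = 1.0 / (1.0 + np.exp(-X @ w))
        w -= lr * X.T @ (p - y) / len(y)
    return w

def predict_proba(X, w):
    return 1.0 / (1.0 + np.exp(-X @ w))

# Synthetic stand-in data: label 1 when the feature sum is large, a toy
# proxy for "high popularity and favourable mobility" contents worth caching.
X = rng.random((200, 2))
y = (X.sum(axis=1) > 1.0).astype(float)
Xb = np.hstack([X, np.ones((len(X), 1))])      # append bias column

labeled = np.arange(20)                        # only a few labeled samples
unlabeled = np.arange(20, 200)
X_l, y_l = Xb[labeled], y[labeled]

for _ in range(5):                             # self-training rounds
    w = fit_logreg(X_l, y_l)
    p = predict_proba(Xb[unlabeled], w)
    confident = (p > 0.9) | (p < 0.1)          # pseudo-label only confident points
    if not confident.any():
        break
    X_l = np.vstack([X_l, Xb[unlabeled][confident]])
    y_l = np.concatenate([y_l, (p[confident] > 0.5).astype(float)])
    unlabeled = unlabeled[~confident]

accuracy = ((predict_proba(Xb, w) > 0.5) == y).mean()
```

The confidence threshold controls how aggressively pseudo-labels are adopted: a loose threshold grows the training set quickly but risks reinforcing early mistakes, which is the central sensitivity any self-training scheme must manage.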





  • Doctor of Philosophy


  • Electrical and Computer Engineering

Granting Institution

Ryerson University

LAC Thesis Type

  • Dissertation

Thesis Advisor

Alagan Anpalagan and Mohammad Jaseemuddin


