Xtreme-NoC: Extreme Gradient Boosting Based Latency Model for Network-on-Chip Architectures
Start Date
15-4-2021 4:15 PM
End Date
15-4-2021 4:30 PM
Student's Major
Computer Information Science
Student's College
Science, Engineering and Technology
Mentor's Name
Naseef Mansoor
Mentor's Department
Computer Information Science
Mentor's College
Science, Engineering and Technology
Description
Due to the heterogeneous integration of cores, the execution of diverse applications on a many-processor chip, and varied application mapping strategies, the design of the Network-on-Chip (NoC) plays a crucial role in ensuring optimal performance of these systems. Designing an optimal NoC architecture poses a performance optimization problem with constraints on power and area. Optimal network configurations are determined by guided (e.g., genetic algorithm) or unguided (e.g., grid search) algorithms that explore the NoC design space (design space exploration, DSE). At each step of this DSE, a network configuration is simulated for performance, area, and power across a wide range of applications. These simulations require system-level models that accurately capture the network's timing behavior, energy profile, and area requirements. Depending on the accuracy of the model, the network configuration, and the application running on the system, these simulations can be extremely slow. An alternative to such simulation is to use analytical network models that apply queuing theory and treat each input channel in the NoC router as an M/M/1, M/G/1/N, or G/G/1 queue. These models provide a good estimate of network performance (latency) only under certain assumptions, i.e., a Poisson process for network traffic with exponential packet service time and an exponentially distributed packet length. These assumptions are not guaranteed for real application-based traffic patterns, so the accuracy of the analytical models is disputable. As a result, improving the slow DSE process of NoC architectures requires an accurate NoC performance model with accelerated runtime. In this work, we propose Xtreme-NoC, an extreme gradient boosting based NoC latency model that predicts the latency of NoC architectures with 98.1% accuracy. To show the efficacy of the proposed model, we compare it to other regression models and demonstrate that it is more accurate at predicting latency.
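To make the queuing-theory approach above concrete, the following is a standard M/M/1 result (a textbook formula, not drawn from this work) for the mean time a packet spends in a single channel (queueing plus service), given Poisson arrivals at rate λ and exponential service at rate μ:

W = \frac{1}{\mu - \lambda}, \qquad \rho = \frac{\lambda}{\mu} < 1

The Poisson-arrival and exponential-service assumptions baked into this formula are precisely the ones noted above as not guaranteed for real application-based traffic, which is what limits the accuracy of the analytical models.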
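As a rough illustration of the proposed direction (not the authors' actual pipeline), the sketch below trains an extreme gradient boosting regressor to map NoC configuration parameters to latency. The feature set, synthetic data, and hyperparameters are hypothetical placeholders chosen only to show the technique; the real model would be trained on simulated design-space samples.

import numpy as np
import xgboost as xgb
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)

# Hypothetical design-space samples: [mesh size, buffer depth,
# virtual channels, injection rate], each scaled to [0, 1).
X = rng.uniform(size=(1000, 4))
# Stand-in latency target: grows sharply with injection rate, plus noise.
y = 10 + 30 * X[:, 3] / (1 - 0.8 * X[:, 3]) + rng.normal(0, 0.5, 1000)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBRegressor(
    n_estimators=300,   # number of boosted trees
    max_depth=6,        # per-tree depth controls model capacity
    learning_rate=0.1,  # shrinkage applied to each tree's contribution
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print("R^2 on held-out configurations:", r2_score(y_test, pred))

Once trained, such a regressor replaces per-configuration simulation inside the DSE loop: each candidate configuration is scored in microseconds rather than via a full network simulation.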