Ph.D. Student: Leeor Peled
Subject: Semantic Locality in Memory Access Patterns and Machine Learning Approaches for Hardware Prefetching
Department: Department of Electrical Engineering
Supervisors: Professor Emeritus Uri Weiser, Professor Yoav Etsion
Most modern memory prefetchers rely on spatio-temporal locality to predict the memory addresses a program is likely to access in the future. Emerging workloads, however, make increasing use of irregular data structures and thus exhibit a lower degree of spatial locality, which makes them less amenable to spatio-temporal prefetchers.
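To make the contrast concrete, here is a minimal sketch (my own illustration, not a design from the thesis) of the classic IP-based stride prefetcher: per instruction pointer it remembers the last address and stride, and predicts the next address once the same stride repeats. The addresses, the `coverage` helper, and the instruction-pointer values are all invented for the example.

```python
import random

# Minimal sketch of an IP-based stride prefetcher: per instruction pointer,
# remember the last address and stride, and predict last + stride once the
# same stride is seen twice in a row.
class StridePrefetcher:
    def __init__(self):
        self.table = {}  # ip -> (last_addr, last_stride)

    def access(self, ip, addr):
        """Observe an access; return a predicted next address, or None."""
        last, stride = self.table.get(ip, (None, None))
        new_stride = addr - last if last is not None else None
        self.table[ip] = (addr, new_stride)
        if new_stride is not None and new_stride == stride:
            return addr + new_stride  # stride repeated: predict it continues
        return None

def coverage(prefetcher, ip, addrs):
    """Fraction of accesses whose address was predicted one step ahead."""
    hits, predicted = 0, None
    for a in addrs:
        hits += (predicted == a)
        predicted = prefetcher.access(ip, a)
    return hits / len(addrs)

# A regular array walk (constant 64-byte stride) is covered almost fully...
regular = [i * 64 for i in range(100)]
# ...while a pointer-chasing walk visits the same lines in data-dependent
# order with no fixed stride, so the stride table learns nothing useful.
random.seed(0)
irregular = random.sample(range(0, 64000, 64), 100)

print(coverage(StridePrefetcher(), 0x400a10, regular))    # close to 1.0
print(coverage(StridePrefetcher(), 0x400b20, irregular))  # close to 0.0
```

The same mechanism that covers the array walk almost completely is blind to the pointer-chasing walk, which is exactly the gap irregular workloads expose.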
In this work, we introduce the concept of Semantic Locality, which uses
inherent program semantics to characterize access relations. We show that
semantic locality can capture the relationship between data elements in a
manner agnostic to the actual data layout, and we argue that semantic
locality transcends spatio-temporal concerns.
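A small illustration of the layout-agnostic point (an assumed example, not taken from the thesis): the semantic relation "next node in the list" is identical under any memory layout, even though the address deltas a spatial prefetcher observes differ completely. The node addresses below are invented.

```python
# The semantic successor relation of a linked list is the same whether the
# nodes sit contiguously (array-like) or scattered across the heap.
class Node:
    def __init__(self, addr, value):
        self.addr = addr   # simulated memory address of this node
        self.value = value
        self.next = None

def build_list(values, addrs):
    nodes = [Node(a, v) for v, a in zip(values, addrs)]
    for cur, nxt in zip(nodes, nodes[1:]):
        cur.next = nxt
    return nodes[0]

def visit(head):
    """(value, addr) pairs in semantic (list) order."""
    out, node = [], head
    while node is not None:
        out.append((node.value, node.addr))
        node = node.next
    return out

def strides(head):
    addrs = [a for _, a in visit(head)]
    return [b - a for a, b in zip(addrs, addrs[1:])]

values = [10, 20, 30, 40]
compact   = build_list(values, [0x1000, 0x1040, 0x1080, 0x10c0])  # array-like
scattered = build_list(values, [0x9f40, 0x1200, 0x7a80, 0x3300])  # heap-like

# Same semantic order under both layouts...
assert [v for v, _ in visit(compact)] == [v for v, _ in visit(scattered)]
# ...but only the compact layout shows a stride a spatial prefetcher can use.
print(strides(compact))    # [64, 64, 64]
print(strides(scattered))  # irregular, layout-dependent deltas
```

A prefetcher that tracks the semantic relation (follow `next`) predicts both traversals equally well; one that tracks address strides only handles the first.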
We introduce several prefetchers based on the premise that analyzing program context with machine-learning techniques (such as reinforcement learning, neural networks, and context-based code analysis) can track and approximate semantic locality. These prefetchers use various methods to isolate, learn, and store access patterns, which they then use to predict future accesses based on program context.
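The flavor of that idea can be sketched as a toy online learner (my simplification; the thesis designs are far more elaborate): treat the last few address deltas as the program context, keep a reinforcement-style score per candidate next delta in each context, and prefetch the best-scoring successor. The delta pattern and all parameters below are invented for illustration.

```python
from collections import defaultdict, deque

# Toy context-based prefetcher: context = the last two address deltas;
# each context keeps a score per candidate next delta, reinforced whenever
# that delta is actually observed after that context.
class ContextPrefetcher:
    def __init__(self, history=2):
        self.ctx = deque(maxlen=history)                       # recent deltas
        self.scores = defaultdict(lambda: defaultdict(float))  # ctx -> delta -> score
        self.last_addr = None

    def access(self, addr):
        """Observe an access; return a predicted next address, or None."""
        if self.last_addr is not None:
            delta = addr - self.last_addr
            # Reward the delta we just observed in the context that
            # preceded it, so correct associations gain score over time.
            self.scores[tuple(self.ctx)][delta] += 1.0
            self.ctx.append(delta)
        self.last_addr = addr
        candidates = self.scores[tuple(self.ctx)]
        if candidates:
            best = max(candidates, key=candidates.get)
            return addr + best   # prefetch the best-scoring successor
        return None

# A repeating but non-constant delta pattern: a single-stride prefetcher
# never locks on, but the context table learns it after one cycle.
deltas = [8, -32, 128] * 30
addrs = [0x1000]
for d in deltas:
    addrs.append(addrs[-1] + d)

pf = ContextPrefetcher()
hits, predicted = 0, None
for a in addrs:
    hits += (predicted == a)
    predicted = pf.access(a)
rate = hits / len(addrs)
print(rate)  # above 0.9 once the pattern is learned
```

After one full cycle every two-delta context maps to a unique successor, so nearly every later access is predicted; the ML designs in the thesis generalize this by learning richer contexts than raw delta histories.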
Our evaluation shows that, over a baseline with simple IP-based stride prefetching, the proposed prefetchers deliver an average speedup of 20% on SPEC 2006 (and up to 2.8x) and an average speedup of 16% on SPEC 2017 (and up to 1.85x). Over a set of kernels with various memory access patterns we see speedups of up to 4.6x. This allows our context-based ML prefetchers to outperform leading spatio-temporal prefetchers as well as several specialized irregular ones. Finally, we show that context-based prefetching makes it possible for naive, pointer-based implementations of irregular algorithms to achieve performance comparable to that of spatially optimized code.