
Voyager Neural Data Prefetcher Implementation

Voyager

Voyager is a neural model for data prefetching that is trained offline and outperforms strong non-neural prefetchers for irregular access patterns (ISB and Domino) as well as other neural prefetchers (Delta-LSTM). I've re-implemented it as described in:

Zhan Shi, Akanksha Jain, Kevin Swersky, Milad Hashemi, Parthasarathy Ranganathan, and Calvin Lin. 2021. A hierarchical neural model of data prefetching. In Proceedings of the 26th ACM International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS 2021). Association for Computing Machinery, New York, NY, USA, 861–873. DOI: https://doi.org/10.1145/3445814.3446752

Usage

The code can be found at this GitHub repository.

The configs folder in the project root contains several configuration files for different model presets. The two base configurations from the paper are base.yaml and base_multi.yaml: the former corresponds to the PC-localized Voyager model, and the latter additionally enables the multi-labeling scheme.
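As a rough illustration, a configuration file in this style might look like the sketch below. The key names and values here are assumptions for illustration only, loosely based on the hyperparameters described in the paper; the actual fields are defined by the files in the configs folder.

    # Hypothetical sketch of a Voyager config -- key names are illustrative,
    # not necessarily the ones used by the repository.
    model:
      pc_embed_size: 64        # PC embedding dimension
      page_embed_size: 256     # page embedding dimension
      lstm_size: 256           # hidden size of the LSTM layers
      sequence_length: 16      # length of the access history fed to the model
      multi_label: false       # base_multi.yaml would enable the multi-labeling scheme
    training:
      batch_size: 256
      learning_rate: 0.001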

CLI Options

  • --benchmark load_trace: Path to the ML-DPC-style load trace used to train the model
  • --model-path model_path: Path to the model. For train.py and online.py, this specifies where the model will be saved. For generate.py and test.py, this specifies where the model will be loaded from
  • --config config.yaml: Model and training configuration file
  • --checkpoint-every num_steps: Checkpoint the model every num_steps
  • --print-every num_steps: Print metrics and loss every num_steps. If this is not specified, the scripts will use the default TensorFlow progress bar
  • --tb-dir log_path: Path to save the TensorBoard logs
  • --debug: Debug flag to reduce the number of steps per epoch
  • --prefetch-file prefetch_file: Path to save prefetches generated by the model. This option is only available for generate.py and online.py; example invocations follow this list
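
As a usage sketch, training followed by prefetch generation might be invoked as shown below. The trace, checkpoint, and log paths are placeholders, and the exact arguments should be checked against the repository's scripts.

    # Train a multi-label Voyager model on a load trace (paths are placeholders)
    python train.py \
        --benchmark traces/example_load_trace.txt \
        --config configs/base_multi.yaml \
        --model-path checkpoints/voyager_multi \
        --tb-dir logs/voyager_multi \
        --checkpoint-every 10000 \
        --print-every 1000

    # Generate prefetches from the trained model
    python generate.py \
        --benchmark traces/example_load_trace.txt \
        --config configs/base_multi.yaml \
        --model-path checkpoints/voyager_multi \
        --prefetch-file prefetches/voyager_multi.txt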

Results

To be posted at a later date.