MSC Benchmark Revision 2.0
The MSC Benchmark Revision 2.0 (released February 2012) is designed both to examine specific performance characteristics of proposed systems and to run applications of interest to EMSL. Many of the micro-benchmarks represent component operations of those applications. Results of the micro-benchmarks will also be used as input to a scaling model to predict and test the scalability of one or more of the applications of interest.
The benchmarks have been designed to have reasonable run times. They are to be run at various system sizes, up to the maximum size available, on a system whose technology is as close as possible to that of the intended system. All program output is expected to be returned so that EMSL staff can review it. Any specific optimizations used, whether compiler flags or source code modifications, should be recorded and provided as well.
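A run harness along the following lines can capture the information requested above: the flags actually used, any source modifications, and one output file per system size. This is a minimal sketch, not part of the benchmark suite; the application name, flags, and process counts are placeholders.

```shell
#!/bin/sh
# Sketch of a run harness that records optimization provenance and
# produces one output file per system size. APP and CFLAGS_USED are
# placeholders; substitute the real benchmark and build flags.
APP=./benchmark.x
CFLAGS_USED="-O3 -march=native"   # record whatever flags were really used

mkdir -p results

# Record the optimization provenance once per submission.
{
  echo "compiler flags: $CFLAGS_USED"
  echo "source modifications: none"
} > results/OPTIMIZATIONS.txt

# Run at several system sizes up to the maximum available.
for nprocs in 64 256 1024; do
  out="results/run_${nprocs}.out"
  # Replace 'echo' with the real launcher (e.g. mpirun) on the test system.
  echo "mpirun -np $nprocs $APP" > "$out"
done
```

Keeping the provenance file next to the outputs makes it straightforward to return everything EMSL asks for in one directory tree.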
Each benchmark includes a README file with instructions on how the benchmark is to be configured and run. Any source changes made for optimization purposes must produce valid scientific results and be of production quality. Such changes are allowed for the benchmarks submitted for an RFP; further changes to the code will NOT be allowed between award and acceptance.
All benchmark results, the original source files, any source modifications, and all output are required, along with the scripts used to compile and run the benchmarks. Submitted source shall be in a form that can readily be compiled on the system. The submission shall not contain executables, object files, core files, or other large binary data files.
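One way to satisfy the packaging requirement is to exclude build debris when creating the archive. The sketch below uses GNU tar's `--exclude` option; the directory layout and file names are illustrative only.

```shell
#!/bin/sh
# Sketch: package a submission containing source, build scripts, and
# output, but no executables, object files, or core files.
# The layout below is illustrative, not prescribed by the benchmark.
mkdir -p submission/src submission/scripts submission/output
echo 'int main(void){return 0;}'  > submission/src/main.c
echo 'cc -O2 -o bench src/main.c' > submission/scripts/build.sh
echo 'benchmark output'           > submission/output/run.log

# Simulate build debris that must NOT be shipped.
touch submission/src/main.o submission/bench submission/core

# Exclude objects, core files, and the executable from the archive.
tar --exclude='*.o' --exclude='core' --exclude='bench' \
    -czf submission.tar.gz submission
```

Listing the archive afterwards (`tar -tzf submission.tar.gz`) is a quick way to confirm no binaries slipped in.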
This requires the cachebench benchmark from http://icl.cs.utk.edu/projects/llcbench/
This requires the MPI_STREAM from http://www.cs.virginia.edu/stream/
This requires the coNCePTuaL test from http://conceptual.sourceforge.net
This requires the GA toolkit version 5.1 from http://www.emsl.pnl.gov/docs/global
This requires that the following benchmark codes be run:
- Requires NWChem 6.1 from http://www.nwchem-sw.org
- Requires WRF version 3.3.1 from http://www.mmm.ucar.edu/wrf/users/download/get_source.html. Two test cases will be used for WRF, both on the same polar stereographic projection but at different resolutions: 9 km (~4 GB of data) and 3 km (~40 GB of data). The input files are available as part of the ARSC WRF Benchmarking Suite at http://weather.arsc.edu/BenchmarkSuite/.
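The two WRF cases differ mainly in horizontal grid spacing, which in WRF is set in the `&domains` section of `namelist.input`. A minimal, illustrative fragment for the 9 km case might look like the following; the grid dimensions and time step here are assumptions, not the ARSC values, and the actual namelists ship with the benchmark inputs.

```
&domains
 time_step = 54,   ! seconds; a common rule of thumb is ~6*dx(km)
 e_we      = 400,  ! west-east grid points (illustrative)
 e_sn      = 400,  ! south-north grid points (illustrative)
 dx        = 9000, ! grid spacing in metres: the 9 km case
 dy        = 9000, ! the 3 km case would use dx = dy = 3000
/
```

Since the time step scales with grid spacing, the 3 km case costs roughly an order of magnitude more per simulated hour than the 9 km case, which is consistent with the tenfold difference in input data size noted above.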