ARCHIVED: Get started on Big Red II
On this page:
- System overview
- Accounts, access, and user policies
- Programming environment
- Run jobs
- X forwarding and interactive jobs
- Application-specific help
- Get help
Big Red II was retired from service on December 15, 2019; for more, see ARCHIVED: About Big Red II at Indiana University (retired).
Big Red II is Indiana University's main system for high-performance parallel computing. With a theoretical peak performance (Rpeak) of one thousand trillion floating-point operations per second (1 petaFLOPS) and a maximal achieved performance (Rmax) of 596.4 teraFLOPS, Big Red II is among the world's fastest research supercomputers. Owned and operated solely by IU, Big Red II is designed to accelerate discovery in a wide variety of fields, including medicine, physics, fine arts, and global climate research, and enable effective analysis of large, complex data sets (big data).
Following is a selection of IU Knowledge Base documents to help you get started using Big Red II; for additional documentation, search the Knowledge Base.
Accounts, access, and user policies
- ARCHIVED: System access
- Your responsibilities as a computer user at IU
- Policies regarding UITS research systems
- ARCHIVED: Work with data containing PHI
Programming environment
- Use Modules to manage your software environment
- HPC Applications
- ARCHIVED: Cray native mode (ESM) and Cluster Compatibility Mode (CCM)
- Compile programs on Big Red 3 at IU
- ARCHIVED: Compile Java programs on Big Red II at IU
- ARCHIVED: Run applications on Big Red II's GPU-enabled compute nodes
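On Cray systems like Big Red II, compilers and libraries are typically managed with Modules, as described in the documents above. A minimal session might look like the following sketch (the specific module names are illustrative and vary by installation):

```shell
# List the software modules available on the system
module avail

# Load a Cray programming environment (PrgEnv-* names are
# common on Cray systems but illustrative here)
module load PrgEnv-gnu

# Show what is currently loaded in your environment
module list

# Swap one module for another, e.g., to change compiler suites
module swap PrgEnv-gnu PrgEnv-cray
```

These commands only work on a system with the Modules package installed; see "Use Modules to manage your software environment" for IU-specific details.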
Run jobs
- ARCHIVED: Queue information
- ARCHIVED: Run batch jobs on Big Red II
- ARCHIVED: Run OpenMP or hybrid OpenMP/MPI jobs on Big Red II at IU
- Use PCP to bundle multiple serial jobs to run in parallel on IU research supercomputers
- Monitor memory and CPU usage for single-node batch jobs on Carbonate
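Big Red II accepted batch jobs through a TORQUE/Moab-style batch system; as a hedged sketch, a minimal job script might look like the following (the queue name, resource counts, and program path are illustrative assumptions, not the system's actual defaults):

```shell
#!/bin/bash
#PBS -l nodes=1:ppn=32        # one node, 32 cores per node (illustrative)
#PBS -l walltime=00:30:00     # 30-minute wall-clock limit
#PBS -q cpu                   # queue name is an assumption; see queue docs
#PBS -N example_job           # job name shown in qstat output

# Start in the directory the job was submitted from
cd "$PBS_O_WORKDIR"

# On Cray ESM compute nodes, applications launch through aprun
aprun -n 32 ./my_mpi_program  # program name is a placeholder
```

You would submit such a script with `qsub example_job.pbs` and check its status with `qstat -u $USER`; consult the batch job and queue documents above for the actual queues and limits.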
X forwarding and interactive jobs
- ARCHIVED: Run interactive jobs on Big Red II at IU
- Use X forwarding on a personal computer to securely run graphical applications installed on IU's research supercomputers
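Combining the two documents above, running a graphical application interactively looked roughly like this (the hostname and resource request are illustrative assumptions):

```shell
# Connect with X forwarding enabled (-X; use -Y for trusted forwarding)
ssh -X username@bigred2.uits.iu.edu

# Then request an interactive batch session, also with X forwarding
qsub -I -X -l nodes=1:ppn=32,walltime=01:00:00
```

The `-I` flag requests an interactive TORQUE session, and `-X` carries the X display through to the compute node so graphical output appears on your local machine.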
Application-specific help
- ARCHIVED: ARPACK - numerical library
- ARCHIVED: GROMACS - molecular dynamics software
- ARCHIVED: MATLAB - numerical computing environment
- ARCHIVED: NAMD - molecular dynamics software
- ARCHIVED: SAS - statistics software
- ARCHIVED: Stata - statistics software
- ARCHIVED: WRF - weather research and forecasting model
Get help
- If you have a system-specific question, contact the High Performance Systems (HPS) team.
- If you have a programming question about compilers, scientific/numerical libraries, or debuggers, contact the UITS Research Applications and Deep Learning team.
For general questions about research computing at IU, contact UITS Research Technologies.
For more options, see Research computing support at IU.