The Parallel Computing Toolbox for Matlab is available as a trial installation on the Helix systems. This toolbox lets you solve computationally and data-intensive problems using multicore processors, GPUs and computer clusters. For instructions on how to use the toolbox on the NIH Biowulf cluster, see
The toolbox is available on Helix/Biowulf until November 14, 2014 (this will probably be extended). We are considering purchasing this product if there is sufficient interest from our user community. There will be a WebEx/conference call with a 1.5-hour overview of the Parallel Computing Toolbox.
Topic: Parallel Computing with MATLAB
Date: Thursday, November 6, 2014
Time: 1:00 pm, Eastern Daylight Time (New York, GMT-04:00)
Meeting Number: 592 633 643
Meeting Password: (This meeting does not require a password.)
To Join the audio, please call 866-872-4258
Conf code # 590 420 6802
To join the online meeting (Now from mobile devices!)
1. Go to https://mathworks.webex.com/mathworks/j.php?ED=314126497&UID=0&RT=MiMxMQ%3D%3D
2. If requested, enter your name and email address.
3. If a password is required, enter the meeting password: (This meeting does not require a password.)
4. Click "Join".
ACEMD is a high-performance molecular dynamics code for biomolecular
systems designed specifically for NVIDIA GPUs. Simple and fast, ACEMD
uses commands and input files very similar to NAMD's, and produces
output files compatible with NAMD or Gromacs. Some NIH users have
reported excellent simulation performance with ACEMD.
ACEMD is a commercial product, and we currently have 2 test licenses
(for 2 GPU nodes) available on Helix. Information about running ACEMD on
Biowulf is available at
We are considering the purchase of some ACEMD licenses for the Biowulf
GPU nodes if there is sufficient interest from the Biowulf user
community. Please let us know whether you would be interested in
continuing to use ACEMD on Biowulf.
The Helix/Biowulf staff will present a series of classes in
January/February 2015. All 3 classes will be hands-on, including
the Biowulf class.
Classes are free but registration is required. Helix/Biowulf users will
have priority enrollment, and new users are highly encouraged to attend.
Introduction to Linux
Date & Time: Mon, Jan 26, 9 am - 4 pm
Location: Bldg 12A, Rm B51.
This class is intended as a starting point for individuals new to Linux
and UNIX. The class will center on basic UNIX/Linux concepts: logging
in, navigating the file system, commands for interacting with files,
running and viewing processes, checking disk space and other common
tasks. The class will also cover the use of some services specific to
the Helix systems.
Bash Shell Scripting for Helix and Biowulf
Date & Time: Tue, Jan 27, 9 am - 4 pm
Location: Bldg 12A, Rm B51.
The default shell on many Linux systems is bash. Bash shell scripting
provides a method for automating common tasks on Linux systems (such as
Helix and Biowulf), including transferring and parsing files, creating
qsub and swarm scripts, pipelining tasks, and monitoring jobs. This
class will give a hands-on tutorial on how to create and use bash shell
scripts in a Linux environment.
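To give a flavor of the class material, a bash loop like the following can generate a swarm command file, one command line per input file. This is only a sketch: the file names and the program name "myprog" are hypothetical placeholders.

```shell
#!/bin/bash
# Create a couple of example input files (hypothetical names) so the
# loop below has something to work on.
touch sample1.fasta sample2.fasta

# Generate a swarm command file with one command line per input file;
# "myprog" stands in for a real application.
for f in sample*.fasta; do
    echo "myprog -in $f -out ${f%.fasta}.out"
done > myjobs.swarm

cat myjobs.swarm
```

Each line of the resulting myjobs.swarm can then run as an independent subjob, e.g. via "swarm -f myjobs.swarm".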
NIH Biowulf Cluster: Scientific Computing
Date & Time: Thu, Feb 19, 9 am - 4 pm
Location: Bldg 12A, Rm B51
Morning: Introduction to the Biowulf Linux cluster, cluster concepts,
accounts, logging in, storage options, interactive vs. batch jobs, how
to set up and submit a simple batch job, batch queues, available software,
and job monitoring.
Afternoon: Hardware and network configuration, types of nodes, selection
of nodes using properties, system software, and parallel programs.
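As a preview of the batch-job material, a minimal batch script of the PBS era might look like the following. The program name, input/output files and resource request here are illustrative only, not a prescribed template.

```shell
#!/bin/bash
#PBS -N myjob          # job name (illustrative)
#PBS -l nodes=1        # request a single node
#PBS -j oe             # merge stdout and stderr into one output file

cd $PBS_O_WORKDIR      # run from the directory the job was submitted from
./myprog < input.dat > output.dat
```

A script like this would be submitted with something like "qsub myscript.sh" and monitored with "qstat".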
Registration for all three classes is available at
To ensure fair access to the modest number of Infiniband-connected
nodes on the Biowulf cluster, effective immediately, a 240-hour
walltime limit will be placed on jobs allocating nodes with the "ib"
property.
Jobs running for longer than 240 hours will be automatically terminated
by the batch system.
Jobs having run for more than 240 hours as of Thursday morning will be
deleted at that time; they can be restarted from checkpoint files.
Users who need to run simulations for longer than 240 hours can use
a "job chaining" strategy, which is explained on the following pages:
For GROMACS, http://biowulf.nih.gov/apps/gromacs.html#chain
For NAMD, http://biowulf.nih.gov/apps/namd.html#chain
For AMBER, http://biowulf.nih.gov/apps/amber.html#chain
For CHARMM, http://biowulf.nih.gov/apps/charmm/#daisy
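In outline, each of the chaining recipes above runs one segment of the simulation within the walltime limit, writes a checkpoint, and resubmits the script for the next segment. A minimal sketch, assuming a PBS-style batch system; the script name, "md_engine", "checkpoint.chk" and "done.flag" are hypothetical placeholders, and the package-specific pages above give working versions:

```shell
#!/bin/bash
# chain_job.sh -- sketch of a self-chaining batch script.

cd $PBS_O_WORKDIR

# Queue the next segment now, held until this job ends; a finished
# run can create done.flag to break the chain.
if [ ! -e done.flag ]; then
    qsub -W depend=afterany:$PBS_JOBID chain_job.sh
fi

# Run one segment (kept well under the 240-hour limit), resuming
# from the previous segment's checkpoint.
md_engine --restart checkpoint.chk
```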
Please send questions and comments to firstname.lastname@example.org
In particular, if you have better ways to set up chaining jobs
than described above, please let us know.
Effective immediately, the 'ls' command will no longer be aliased to
'ls --color'. This alias allows various file types to be displayed
in different colors in your terminal window. However, it also incurs
additional filesystem processing, which can result in poor 'ls'
response times, especially for directories containing large numbers
of files.
To improve filesystem performance for all users, the Helix staff
strongly recommend that users not use the "--color" option to the
'ls' command. If you set this alias in your .bashrc file, consider
removing it for better directory-listing performance.
If you have any questions, please contact the NIH Helix Systems staff.
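If you are unsure whether your account sets this alias, it can be checked and removed from a bash session like so (the exact alias line in your .bashrc may differ):

```shell
# Show the current definition of 'ls', if an alias exists
alias ls 2>/dev/null || echo "ls is not aliased"

# Remove the alias for the current shell session only
unalias ls 2>/dev/null || true

# To make the change permanent, delete (or comment out) the line that
# sets the alias, typically something like this, from your ~/.bashrc:
#   alias ls='ls --color'
```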
On Saturday, January 17, 2015, from 7:00 AM to 3:00 PM, maintenance will be performed on the chilled water system in the building 12 data center. The NIH High Performance Computer (HPC) chilled water vendor will be performing this work in coordination with the CIT Data Center Facilities and ORF building 12 facilities teams.
The impact to Helix and Biowulf systems has been designated as "Possible Service Affecting - Low" meaning there is a low possibility a service-affecting event will occur. If an issue does occur, however, there will be a six to eight hour cooling outage, which will require Helix and Biowulf to shut down according to established contingency plans.
While CIT is taking every precaution to minimize the risk, there is a small chance that a service interruption will occur during the maintenance window. If such an incident occurs, CIT will provide additional information at the time of the event.
If you have any questions, please contact the Helix Systems staff at email@example.com.
Helix Systems Staff
Join the High Performance Computing Facility Team and Support Leading Edge Biomedical Research at the National Institutes of Health!
The Biowulf staff advances research at NIH by planning, managing, and supporting the NIH High Performance Computing (HPC) Facility, including general-purpose and high-performance computers, storage, and the biomedical applications that run on them. We are currently looking for an enthusiastic individual who will work with our staff of computational biologists and system administrators to
- Provide guidance, support, training and advice in computational biology systems to NIH intramural researchers
- Ensure that computational biology applications and services are current, appropriate and sufficient to serve researchers' needs
- Communicate with researchers to answer questions, troubleshoot problems and develop appropriate computational strategies
- Install and update applications, document system changes for staff, write and maintain documentation for users
- Consult and collaborate with system administrator coworkers to determine best system configurations for applications
- Remain abreast of current and emerging computational biology technologies and tools
Required skill set:
- Genomics knowledge
- Hands-on experience with a broad spectrum of genomics applications and next-generation sequence data analysis
- Experience with biological databases
- Experience working in an HPC environment
- Ability to install, compile, and build applications
- Excellent oral and written communication skills
- Experience presenting and/or teaching
- Ability to work both independently and as part of a team
- Understanding of bottlenecks associated with genome-scale data analysis
- Experience with shell scripting
- Ability to develop reasonable tests and example input for genomic applications
- Self-motivated, with excellent organizational, troubleshooting and problem-solving skills
- Ability to learn new technologies and software quickly
Other desirable skills:
- Working knowledge of R, Matlab and MySQL
- Ability to program in at least one of the following languages: Python, C/C++, Java or Perl
Candidates must have a PhD in the biological sciences with one to two years' experience in genomic analysis. This is a contractor position.
If interested, send your resume to Steve Bailey at firstname.lastname@example.org
Due to some last-minute dropouts, there are a few spots available in
tomorrow's Biowulf class. The next Biowulf class will be in June or July.
If you are interested in attending tomorrow's class, please register at
Class: NIH Biowulf Cluster: Scientific Computing
Instructor: Steven Fellini / Susan Chacko (Helix/Biowulf staff)
Date: Thu Feb 19, 2015
Time: 9 am - 4 pm
Location: Bldg 12, Rm B51
As part of the NIH Network modernization effort, maintenance on the Helix
Systems network will cause a brief (15-30 minute) loss of network
connectivity to all Helix Systems at some point between 10 PM on Tuesday,
February 24 and 2 AM on Wednesday, February 25.
During this time all network connections to the Helix Systems will terminate,
and the Helix Systems will not be accessible.
The Helix Systems include helix.nih.gov, biowulf.nih.gov, helixweb.nih.gov,
helixdrive.nih.gov, biospec.nih.gov and sciware.nih.gov.
Please send comments and questions to email@example.com
On Tuesday, March 3, starting at 5:30 am, system maintenance lasting up to 90 minutes will be performed on the Helix Systems samba servers known as Helixdrive (helixdrive.nih.gov). The Helixdrive service allows users to mount or map the Helix/Biowulf home, data and scratch directories on their local systems.
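For example, on a Linux desktop a Helixdrive share can typically be re-mounted after the maintenance via CIFS along these lines. The share path, username and mount point below are hypothetical illustrations, not official values:

```shell
# Hypothetical example of mapping a Helixdrive share on a Linux
# desktop via CIFS; substitute your own username and mount point.
mkdir -p ~/helixdrive
sudo mount -t cifs //helixdrive.nih.gov/username ~/helixdrive \
     -o user=username,domain=NIH
```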
During this maintenance, any existing user mounts to home, data or scratch directories will be terminated. Users will be unable to initiate new mounts until the maintenance has been completed.
This maintenance will not affect any other Helix/Biowulf services, including Helix and Biowulf login sessions, Biowulf cluster jobs, Helixweb applications or email.
Please contact the Helix Systems staff at firstname.lastname@example.org if you have any questions.