
Helix Systems

12/2/2014
The Helix/Biowulf staff will present a series of classes in January/February 2015. All three classes, including the Biowulf class, will be hands-on.

Classes are free but registration is required. Helix/Biowulf users will have priority enrollment, and new users are highly encouraged to attend.

Introduction to Linux
---------------------
Date & Time: Mon, Jan 26, 9 am - 4 pm
Location: Bldg 12A, Rm B51

This class is intended as a starting point for individuals new to Linux and UNIX. The class will center on basic UNIX/Linux concepts: logging in, navigating the file system, commands for interacting with files, running and viewing processes, checking disk space, and other common tasks. The class will also cover the use of some services specific to Helix/Biowulf usage.
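As a small taste of the material, the everyday tasks the class description mentions look roughly like this at a shell prompt (all file and directory names below are placeholders, not actual course material):

```shell
# A few of the everyday tasks the class covers, as shell commands.
# All file and directory names here are placeholders.
mkdir -p demo && cd demo        # make and enter a working directory
echo "hello" > notes.txt        # create a small file
cat notes.txt                   # view its contents
ls -lh                          # list files with sizes
ps -ef | head -5                # view running processes
df -h . | tail -1               # check disk space on this filesystem
cd ..
```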

Bash Shell Scripting for Helix and Biowulf
------------------------------------------
Date & Time: Tue, Jan 27, 9 am - 4 pm
Location: Bldg 12A, Rm B51

The default shell on many Linux systems is bash. Bash shell scripting provides a method for automating common tasks on Linux systems (such as Helix and Biowulf), including transferring and parsing files, creating qsub and swarm scripts, pipelining tasks, and monitoring jobs. This class will give a hands-on tutorial on how to create and use bash shell scripts in a Linux environment.
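For a flavor of the kind of automation the class covers, here is a minimal, hypothetical sketch that generates a swarm command file from a set of input files ("myapp" and the .fasta file names are placeholders, not actual course material):

```shell
#!/bin/bash
# Hypothetical sketch: build a swarm command file, one line per input.
# "myapp" and the .fasta inputs are placeholders.
set -e
mkdir -p inputs
touch inputs/a.fasta inputs/b.fasta       # stand-in input files
: > commands.swarm                        # start with an empty command file
for f in inputs/*.fasta; do
    echo "myapp -in $f -out ${f%.fasta}.out" >> commands.swarm
done
cat commands.swarm
```

On Biowulf, a file like this would then be handed to the swarm command for submission.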

NIH Biowulf Cluster: Scientific Computing
-----------------------------------------
Date & Time: Thu, Feb 19, 9 am - 4 pm
Location: Bldg 12A, Rm B51

Morning: Introduction to the Biowulf Linux cluster, cluster concepts, accounts, logging in, storage options, interactive vs. batch jobs, how to set up and submit a simple batch job, batch queues, available software, and job monitoring.

Afternoon: Hardware and network configuration, types of nodes, selection of nodes using properties, system software, parallel programs, and programming tools.

Registration for all three classes is available at http://helixweb.nih.gov/nih/classes/index.php




12/10/2014

In order to improve fair user access to the modest number of InfiniBand-connected nodes on the Biowulf cluster, effective immediately, a 240-hour walltime limit will be placed on jobs allocating nodes with the "ib" resource. Jobs running for longer than 240 hours will be automatically terminated by the batch system.

Jobs having run for more than 240 hours as of Thursday morning will be deleted at that time; they can be restarted from checkpoint files.

Users who wish to run simulations for longer than 240 hours may wish to use a "job chaining" strategy which is explained in the following web pages:

For GROMACS: http://biowulf.nih.gov/apps/gromacs.html#chain
For NAMD:    http://biowulf.nih.gov/apps/namd.html#chain
For AMBER:   http://biowulf.nih.gov/apps/amber.html#chain
For CHARMM:  http://biowulf.nih.gov/apps/charmm/#daisy
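The application-specific recipes are at the links above. Purely as a generic illustration, a self-resubmitting segment script might look like the following; the counter file, segment limit, and placeholder MD step are assumptions, not the documented method:

```shell
#!/bin/bash
# Hypothetical, generic chaining sketch -- not the documented
# application-specific recipes. Each segment stays under the 240-hour
# limit, runs from the latest checkpoint, then resubmits this script.
MAX_SEGMENTS=5

# Track which segment this is via a counter file.
count=$(cat segment.count 2>/dev/null || echo 0)
count=$((count + 1))
echo "$count" > segment.count

# Placeholder for the real MD step (e.g. a GROMACS/NAMD restart).
echo "segment $count: restarting from checkpoint, writing a new one"

# Resubmit for the next segment; skipped when qsub is unavailable.
if [ "$count" -lt "$MAX_SEGMENTS" ] && command -v qsub >/dev/null 2>&1; then
    qsub "$0"
fi
```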

Please send questions and comments to staff@biowulf.nih.gov. In particular, if you have better ways to set up chained jobs than those described above, please let us know.




1/12/2015

On Saturday, January 17, 2015, from 7:00 AM to 3:00 PM, maintenance will be performed on the chilled water system in the Building 12 data center. The NIH High Performance Computing (HPC) chilled water vendor will perform this work in coordination with the CIT Data Center Facilities and ORF Building 12 facilities teams. The impact to Helix and Biowulf systems has been designated "Possible Service Affecting - Low," meaning there is a low possibility of a service-affecting event. If an issue does occur, however, there will be a six- to eight-hour cooling outage, which will require Helix and Biowulf to shut down according to established contingency plans. While CIT is taking every precaution to minimize the risk, there remains a small chance of a service interruption during the maintenance window; if one occurs, CIT will provide additional information at the time of the event.

If you have any questions, please contact the Helix Systems staff at staff@helix.nih.gov.

Helix Systems Staff




1/14/2015

Join the High Performance Computing Facility Team and Support Leading Edge Biomedical Research at the National Institutes of Health!

The Biowulf staff advances research at NIH by planning, managing, and supporting the NIH High Performance Computing (HPC) Facility, including general-purpose and high-performance computers, storage, and the biomedical applications that run on them. We are currently looking for an enthusiastic individual who will work with our staff of computational biologists and system administrators to

  • Provide guidance, support, training and advice in computational biology systems to NIH intramural researchers
  • Ensure that computational biology applications and services are current, appropriate and sufficient to serve researchers' needs
  • Communicate with researchers to answer questions, troubleshoot problems and develop appropriate computational strategies
  • Install and update applications, document system changes for staff, write and maintain documentation for users
  • Consult and collaborate with system administrator coworkers to determine the best system configurations for applications
  • Remain abreast of current and emerging computational biology technologies and tools


Required skill set:

  • Genomics knowledge
  • Hands-on experience with a broad spectrum of genomics applications and next-generation sequence data analysis
  • Experience with biological databases
  • Experience working in an HPC environment
  • Ability to install, compile, and build applications
  • Ability to develop reasonable tests and example input for genomic applications
  • Understanding of bottlenecks associated with genome-scale data analysis
  • Experience with shell scripting
  • Excellent oral and written communication skills; experience presenting and/or teaching
  • Ability to work both independently and as part of a team
  • Self-motivated, with excellent organizational, troubleshooting, and problem-solving skills
  • Ability to learn new technologies and software quickly

Other desirable skills: working knowledge of R, MATLAB, and MySQL; ability to program in at least one of the following languages: Python, C/C++, Java, or Perl.

Candidates must have a PhD in the biological sciences with one to two years' experience in genomic analysis. This is a contractor position.

If interested, send your resume to Steve Bailey at steve.bailey@nih.gov




2/18/2015

Due to some last-minute dropouts, there are a few spots available in tomorrow's Biowulf class. The next Biowulf class will be in June or July.

If you are interested in attending tomorrow's class, please register at

http://helixweb.nih.gov/nih/classes/index.php

Class: NIH Biowulf Cluster: Scientific Computing
Instructor: Steven Fellini / Susan Chacko (Helix/Biowulf staff)
Date: Thu Feb 19, 2015
Time: 9 am - 4 pm
Location: Bldg 12, Rm B51




2/23/2015

As part of the NIH Network modernization effort, maintenance on the Helix Systems network will cause a brief loss of network connectivity, lasting 15-30 minutes, to all Helix Systems at some point between 10 PM Tuesday, February 24 and 2 AM Wednesday, February 25.

During this time all network connections to the Helix Systems will terminate, and the Helix Systems will not be accessible.

The Helix Systems include helix.nih.gov, biowulf.nih.gov, helixweb.nih.gov, helixdrive.nih.gov, biospec.nih.gov and sciware.nih.gov.

Please send comments and questions to staff@helix.nih.gov




2/27/2015

On Tuesday, March 3, starting at 5:30 am, system maintenance lasting up to 90 minutes will be performed on the Helix Systems Samba servers known as Helixdrive (helixdrive.nih.gov). The Helixdrive service allows users to mount or map the Helix/Biowulf home, data, and scratch directories on their local systems.
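For illustration only, a Linux client might map a Helixdrive directory roughly as follows; the share path, mount options, and username here are assumptions, so consult the Helixdrive documentation for the actual values:

```shell
# Hypothetical sketch: the share name and mount options are assumptions,
# not the documented Helixdrive configuration.
USER_ID=jsmith                               # placeholder NIH username
SHARE="//helixdrive.nih.gov/${USER_ID}"      # assumed share naming
MOUNTPOINT="$HOME/helixdrive"
mkdir -p "$MOUNTPOINT"
# "echo" prints the command for inspection; remove it to actually mount.
echo sudo mount -t cifs "$SHARE" "$MOUNTPOINT" -o "user=${USER_ID},domain=NIH"
```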

During this maintenance, any existing user mounts to home, data or scratch directories will be terminated. Users will be unable to initiate new mounts until the maintenance has been completed.

This maintenance will not affect any other Helix/Biowulf services, including Helix and Biowulf login sessions, Biowulf cluster jobs, Helixweb applications or email.

Please contact the Helix Systems staff at staff@helix.nih.gov if you have any questions.




3/10/2015

At approximately noon yesterday, Monday Mar 9, the Biowulf cluster had a network problem that caused some nodes to temporarily lose connections to the fileservers. Some running jobs were affected.

If you had jobs running at noon yesterday, please check their status. If 'jobload' shows the jobs as stalled, i.e. 0% cpu, you should restart the jobs. You should also check output files to make sure that the jobs are continuing to produce output.
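As a rough illustration, a check like the following could flag jobs at 0% CPU; the jobload invocation and column layout here are assumptions, so adjust the pattern to the real output:

```shell
# Hypothetical sketch: the jobload invocation and output format are
# assumptions; adapt the awk pattern to the actual columns.
if command -v jobload >/dev/null 2>&1; then
    msg=$(jobload | awk '/ 0%/ {print "possibly stalled:", $1}')
else
    msg="jobload not available on this host"
fi
echo "$msg"
```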

Apologies for the problem. Please contact staff@biowulf.nih.gov if you have questions.

Helix/Biowulf staff.




3/11/2015

On Monday, March 16, starting at 9:00 pm, system maintenance lasting up to 3 hours will be performed on the Helix Systems Samba servers known as Helixdrive (helixdrive.nih.gov). The Helixdrive service allows users to mount or map the Helix/Biowulf home, data, and scratch directories on their local systems.

During this maintenance, any existing user mounts to home, data or scratch directories will be terminated. Users will be unable to initiate new mounts until the maintenance has been completed.

This maintenance will not affect any other Helix/Biowulf services, including Helix and Biowulf login sessions, Biowulf cluster jobs, Helixweb applications or email.

Please contact the Helix Systems staff at staff@helix.nih.gov if you have any questions.




3/11/2015

The Computer Vision System Toolbox for MATLAB is available as a trial installation on the Helix systems. This toolbox provides algorithms, functions, and apps for the design and simulation of computer vision and video processing systems. A user can perform object detection and tracking, feature matching, and other video-related tasks.

There are five trial licenses, and the toolbox is available on Helix/Biowulf until April 8, 2015. We are considering purchasing this product if there is sufficient interest from our user community.

For more information and usage instructions, see

http://www.mathworks.com/help/vision/index.html





This page last reviewed: December 14, 2010