POV-Ray on Biowulf

POV-Ray (Persistence of Vision Raytracer) is a high-quality tool for creating three-dimensional graphics. Raytraced images are publication-quality and 'photo-realistic', but they are computationally expensive: a large image can take many hours to render. POV-Ray images can also require more memory than many desktop machines have. To address these concerns, a parallelized version of POV-Ray has been installed on the Biowulf system.

There are two versions of POV-Ray available on the Biowulf cluster:

Version 3.1 is available both as a stand-alone executable (povray-3.1) and as an MPI-parallelized executable (MPI-Povray). However, many applications that generate POV-Ray input do not support version 3.1.

Version 3.6 (the most recent version) is available as a stand-alone executable (povray). There is no MPI-parallelized executable for this version, but the wrapper script povray_swarm allows an image render to be distributed across the Biowulf cluster using swarm.

POV-Ray output is limited to the .png, .tga, and .ppm image formats. A number of programs available on the Helix systems (e.g., convert, gimp, imagemagick, xnview) can convert images from one format to another.
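
For example, to convert a rendered targa file to JPEG with convert (the filenames here are illustrative):

helix% convert 1ASY.pov.tga 1ASY.jpg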

POV-Ray options and parameters are listed by typing 'povray' at the Biowulf prompt.

Sample POV-Ray input files

Sample POV-Ray v3.6 input: 1ASY.pov (1.4 MB)

Sample POV-Ray v3.6 input: 1AG9.pov (8.6 MB)


Running POV-Ray v3.6 on Biowulf using povray_swarm:

Put your POV-Ray input file (e.g. plot.pov) and any other required files (e.g. the pov-ribbons-camera.inc file from Ribbons) into /data/username/, then type povray_swarm along with any desired POV-Ray options (this can include any .ini files as well):

[user@biowulf ~]$ povray_swarm +H2170 +W1826 -I1ASY.pov -O1ASY.pov.tga +P +A \
+FT --ncpus=16
Distributing render (1826X2170) into 16 tiles (4X4) each of size (457X543)

/usr/local/etc/swarmbb --file 1ASY_9633.swarm --ncpus 16 --balance --topdir 9633
--nameNames -S /bin/sh

[main] promoting to dual core nodes

138722.biobos   swarm1n9637
138723.biobos   swarm2n9637
138724.biobos   swarm3n9637
138725.biobos   swarm4n9637

  combining swarm output

swarm done

POV-Anywhere Tile 0.3 alpha
Copyright 2004-2005 by Christoph Hormann <chris_hormann@gmx.de>

Combining 16 tiles into one image...
99 percent
Finished!
pamtotga: computing colormap...
pamtotga: Too many colors for colormapped TGA.  Doing RGB.

Total Machine Time: 223 seconds
Total Elapsed Time: 100 seconds

[user@biowulf ~]$

The image was split into 16 tiles, each tile was rendered independently, and the tiles were stitched together into the final image using POV-Anywhere. The image was then converted to .tga format.

The POV-Ray options +P and -D are added automatically by the povray_swarm script to allow distributed rendering without complications.

All swarm .out and .err files are concatenated into single files at the end of the run.

Hitting Ctrl-C triggers swarmdel to delete the swarm run before the script terminates. If povray_swarm is run in the background, or to stop the script manually, type 'swarmdel <jobid>' to kill all the individual swarm jobs.
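
For example, using one of the job numbers reported by the run above (a sketch; substitute the jobid of your own run):

[user@biowulf ~]$ swarmdel 138722.biobos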


Running POV-Ray on Biowulf using MPI-Povray:

Put your POV-Ray input file and any other required files (e.g. the pov-ribbons-camera.inc file from Ribbons) into /data/username/

Type 'pov' at the Biowulf prompt, and follow instructions.

Sample session; type your responses after each prompt:

[user@biowulf ~]$ pov

PovRay: High-Quality Ray-Traced Images
Enter the name of Povray input file: /data/user/povray/purple_helix.pov

Checking /data/user/povray/purple_helix.pov for included files
... include file pov-ribbons-camera.inc
**ERROR**: Included file pov-ribbons-camera.inc is not in 
       /usr/local/pov/include or /home/user or /home/user
Enter directory for pov-ribbons-camera.inc :/data/user/povray/

Enter filename for output: /data/user/test.out
Enter desired width of image (pixels) : 1200
Enter desired height of image (pixels) : 800
Output image format - TGA (default), PNG, PPM : 
Optional: Enter filename of a Povray initialization file : 

http://biowulf.nih.gov/povray.html has a full list of available parameters.
Any additional Povray parameters (e.g. +A0.0 ): 
Creating job file /data/user/povray.7493

Submitting to 16 nodes. Job number is 536827.biobos

Monitor your job at http://biowulf.nih.gov/cgi-bin/queuemon?536827.biobos

[user@biowulf]$

Hint: use the 'convert' program on Helix to convert targa files to any other format. The /data/ directories are mounted on Helix as well, so you can cd to your /data directory on Helix and type 'convert test.tga test.jpg'.

Running MPI-Povray on Biowulf -- the details

Create a script file containing the following (or copy runpovray from /usr/local/pov/demo):

-----------------------script file runpovray-----------------------
#!/bin/tcsh
#
#PBS -N PovRay
#PBS -m be
#PBS -k oe
date
setenv PATH /usr/local/mpich/bin:$PATH
mpirun -machinefile $PBS_NODEFILE -np $np /usr/local/bin/mpi-x-povray \
 /data/username/POV.INI -i/data/username/input.pov +L/data/username/ \
 +O/data/username/output.file +w100 +h100 -gw
--------------------------------------------------------------------------

where:
/data/username/POV.INI is the Povray parameter file (see below)
/data/username/input.pov is the Povray input file
+L/data/username is the library path. If you have your own include files, such as pov-ribbons-camera.inc, you should point this to the directory where they reside.
/data/username/output.file is the output filename
+w100 describes the width of the output image in pixels (100)
+h100 describes the height of the output image in pixels (100)
-gw is the flag for 'don't print warnings'. Seeing the warnings can be useful, so don't use this flag unless necessary. However, some Povray input files produce thousands of warnings and create huge log files, in which case the -gw flag is very useful.

The POV.INI file is not required, but is very useful to keep Povray input parameters reproducible. An example POV.INI file is:

---------------------------file POV.INI-----------------------------
; povray .INI file to remember all the flags
+J0.5    ; anti-aliasing jitter, amount 0.5
+A0.1    ; turn on anti-aliasing, threshold 0.1
+MB5     ; use bounding slabs once the scene has 5 or more objects
+FT      ; output format: uncompressed Targa
+Q9      ; rendering quality 9 (full quality)
+AM2     ; anti-aliasing sampling method 2 (adaptive super-sampling)
+V       ; verbose progress output
+R3      ; anti-aliasing depth 3
;
-------------------------------------------------------------------

Submit the Povray job to the batch queue by typing the following (you can also put this in a script; see the sketch below):


biowulf% qsub -v np=# -l nodes=# runpovray

where the #s are the number of processors (np) and nodes required. For an 8-processor job using 2 processors on each node, you would type

biowulf% qsub -v np=8 -l nodes=4 runpovray

The batch system will send you mail when the job starts and finishes.
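
As mentioned above, the qsub line itself can be placed in a small script. A minimal sketch (the script name submitpov is illustrative, not an installed tool):

-----------------------script file submitpov-----------------------
#!/bin/sh
# usage: submitpov <processors> <nodes>
# e.g.:  submitpov 8 4
qsub -v np=$1 -l nodes=$2 runpovray
--------------------------------------------------------------------------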

Displaying the image as it's built

It is possible to watch your image 'growing', as in the demo above. If you connect to Biowulf using ssh from a Unix workstation, your display variable will automatically be set to something like biowulf.nih.gov:18.0, and the X-windows graphics will be appropriately tunnelled through biowulf to your desktop machine. This is crucial, as the nodes can 'talk' only to biowulf, the login node.

In your script, add the flag -display $DISPLAY. See /usr/local/pov/demo/povfast for an example. When you submit the job, type

qsub -v np=#,DISPLAY=$DISPLAY -l nodes=# /home/username/runpovray
                  

(see /usr/local/pov/demo/fast for a sample script that does this)

Your job will then pop up a window on your desktop machine in which the image will appear as it is calculated by the nodes. You MUST close the window after the image has been generated. As long as the image window is open, your job still has the nodes allocated and they are unavailable to other users. So please use this feature only if you plan to be around until the end of the job.

Caveats

If POV-Ray doesn't find an include file that it needs, it may not exit cleanly. This means that the job will continue to spin and the nodes will still be allocated. If the povray output file is not growing in size, chances are the job is spinning uselessly. You need to kill the job (with 'qdel xxxx.biowulf'), fix the problem, and restart.

If you won't be around to watch the file growing, you can also use the '-l walltime=1:00:00' argument to qsub, which limits the job to a walltime of one hour (use whatever limit you think is appropriate).
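
For example, to give the 8-processor job from above a one-hour limit, the walltime request can be combined with the node request in a single -l option:

biowulf% qsub -v np=8 -l nodes=4,walltime=1:00:00 runpovray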

MPI-Povray demo for Unix workstations connecting via ssh


From your desktop Unix workstation, use ssh to connect to biowulf. Your display variable should automatically be set to something like biowulf.nih.gov:17.0. (This is crucial for the demo to work!) The demo will create two output files totalling 240 KB, so run it from a directory (/home/username or /data/username) that has this space available. Type

biowulf% /usr/local/pov/demo/showme

This will start up two runs of Povray: one on a single node using two processors, and one on 8 nodes using 16 processors. It will pop up two windows on your screen, one from each run, and you will be able to see how parallelization dramatically reduces the time taken for Povray image calculation. The image produced is a space-filling model of the pseudomonas exotoxin.


Memory requirements

Some POV-Ray jobs may require large amounts of memory, so the nodes and processors should be chosen carefully. See the hardware section of the Biowulf user guide for the current memory configuration of the nodes. Choose the proper nodes by using the appropriate switch on the qsub command, e.g.

qsub -v np=8 -l nodes=2:o2600 runpovray

will use 8 processors on 2 dual-core 2.6GHz 4GB nodes, or

povray_swarm +H2170 +W1826 -I1ASY.pov -O1ASY.pov.tga +P +A +FT --ncpus=16 \
 --props=nodes=1:p2800:m4096

will use 16 processors on 8 older (but much more available!) Xeon 2.8GHz 4GB nodes.


Creating Povray input files

Many programs can produce POV-Ray input files of molecular structures; here are some examples:

Use RasMol to display your molecule, then export a Povray file.

Use Povscript, a version of Molscript that can create Povray input files. It is available on Helix.

Use Ribbons to display the molecule, then export as a Povray file.

VMD can display molecules and export Povray input files.

Swiss-PDBViewer can produce Povray input files.
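
Whatever program you use, the result is a plain-text POV-Ray scene file. As a point of reference, here is a minimal hand-written POV-Ray 3.6 scene (an illustrative sketch, not output from any of the programs above):

-----------------------file minimal.pov-----------------------
// A red sphere lit by a single light source
#include "colors.inc"                    // standard colour names
camera { location <0, 2, -5> look_at <0, 0, 0> }
light_source { <10, 10, -10> color White }
sphere {
  <0, 0, 0>, 1                           // centre and radius
  pigment { color Red }
}
---------------------------------------------------------------

It can be rendered on a single processor with, e.g., 'povray +W400 +H300 -Iminimal.pov', or distributed with povray_swarm as described above.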