[Pw_forum] MPI work via LSF

Charles Chen polynmr at physics.unc.edu
Tue Apr 1 23:01:27 CEST 2008


Dear PWSCF users,

Recently, I compiled v4.0cvs on our ITS machine, an IA64 system running Linux.

When I run the examples from the package (e.g. example01, the Si crystal),
the calculation easily overloads the LSF node, and I got this warning
message from the admin:

> Your job on cypress (JobID 114850) is causing the load average to get very high again.  Whatever you are doing is giving the system problems.

Please help me tune the configuration and make our system admin a
little happier when he sees me.
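
Is something like the script below the recommended way to submit pw.x under
LSF, so that mpirun starts exactly as many processes as the slots LSF
reserves? The queue settings, core count and file names are only placeholders
for our local setup, not what I actually used:

#!/bin/bash
#BSUB -J si-example01      # job name
#BSUB -n 4                 # reserve 4 slots from LSF
#BSUB -o pw.%J.out         # stdout
#BSUB -e pw.%J.err         # stderr

# Start exactly as many MPI processes as slots reserved above,
# so the node is not oversubscribed (assuming pw.x is in the PATH).
mpirun -np 4 pw.x < si.scf.in > si.scf.out

(submitted with "bsub < jobscript")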

There is another problem, which I have already sent to the mailing list
but got no reply yet. If it has been answered somewhere else many times,
please let me know.

I am not sure whether this problem originates from MPICH or from LSF. I can
now finish the Si example, from the SCF calculation through the bands
calculation to the band post-processing. However, I still have no luck with
the GIPAW example. Since our ITS machines do not allow interactive jobs, I
submit jobs through bsub. I successfully finished si.scf.in, which is
extracted from the GIPAW example, but when I went on to run the NMR part,
it failed. The error message when I submitted the job was:

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
        from gipaw_readin : error #         1
        reading inputgipaw namelist

%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%

The MPI error message is like this:

MPI: On host cypress, Program
/netscr/polynmr/espresso4.0cvs2/GIPAW/gipaw.x, Rank 0, Process 4934 called
MPI_Abort(<communicator>, 0)

MPI: --------stack traceback-------
line: 2 Unable to parse input as legal command or C expression.
The "backtrace" command has failed because there is no running program.
MPI: Intel(R) Debugger for applications running on IA-64, Version 10.1-32, Build 20070829

MPI: -----stack traceback ends-----
MPI: MPI_COMM_WORLD rank 0 has terminated without calling MPI_Finalize()
MPI: aborting job

It looks like the -in option does not work with the GIPAW code; am I right?
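If so, is redirecting the input on standard input the right workaround? That
is what I plan to try next, reusing the same placeholder job script as above
and changing only the last line:

# Feed the GIPAW input on stdin instead of using -in, in case the
# inputgipaw namelist is only read from standard input.
mpirun -np 4 /netscr/polynmr/espresso4.0cvs2/GIPAW/gipaw.x < sinmr.in > sinmr.out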

In case this is a configure problem, I have attached my config.log.

Thanks!

Charles Chen



-------------- next part --------------
An embedded and charset-unspecified text was scrubbed...
Name: sinmr.in
Url: http://www.democritos.it/pipermail/pw_forum/attachments/20080401/1cecc16b/attachment.txt 
