[Pw_forum] parallel compiling

stargmoon stargmoon at yahoo.com
Mon Sep 18 17:37:22 CEST 2006


Thanks a lot, Dr. Kohlmeyer. I will ask our sysadmin and let you know whether I can successfully compile the parallel version.
 
 Best regards,
 
 Stargmoon

Axel Kohlmeyer <akohlmey at cmm.chem.upenn.edu> wrote:
On 9/18/06, stargmoon wrote:
> Dear Dr.Kohlmeyer,
>
>  Thanks for your reply.
>
>  I am just one user of our cluster; that is, I am not the system
> administrator, and I do not know the details of the software installation

so please _ask_ your sysadmin; this is the person 'in the know'
about the details. as i wrote before, all machines/clusters are
slightly different, so there is no way to predict what might be a problem.

> there. But MPICH is there and can be loaded via "module add". And I was
> told that we installed MPICH ourselves; it was not bundled with the machines.

so then your sysadmin _has_ to know...

> The command line I used to do "configure" is "./configure
> MPI_LIBS="-L/opt/mpi/tcp/mpich-pgi/lib -lmpich -lfmpich"
>
>  Could you please give me more hints what I should do to figure out this
> problem?

as i wrote before, please try to compile and run _another_ MPI program,
ideally one of the MPI tutorial examples, as they are rather trivial.
once you get that working, let us know what command line you needed
for the successful compile that produced a usable executable, and then
we can look into getting QE compiled.
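[archive note: a minimal MPI test in the spirit of the tutorial examples mentioned above could look like the sketch below. the mpicc/mpirun wrapper names and the "module add" setup are assumptions about the local MPICH install, not verified details of this cluster.]

```c
/* hello_mpi.c -- minimal MPI sanity check (a sketch; adapt the
 * compiler wrapper names to whatever your MPICH module provides) */
#include <stdio.h>
#include <mpi.h>

int main(int argc, char **argv)
{
    int rank, size;

    MPI_Init(&argc, &argv);                /* start the MPI environment  */
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);  /* id of this process         */
    MPI_Comm_size(MPI_COMM_WORLD, &size);  /* total number of processes  */
    printf("hello from rank %d of %d\n", rank, size);
    MPI_Finalize();
    return 0;
}
```

if, say, "mpicc hello_mpi.c -o hello_mpi" followed by "mpirun -np 2 ./hello_mpi" prints one greeting per rank, the MPI toolchain works, and those exact commands are the information to report back to the list.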

>
>  By the way, have you ever used pathscale to compile espresso? It seems to
> be possible, judging from the update information for espresso-3.1.1. We have
> pathscale, and mpich for pathscale works for VASP on our cluster.

i managed to do it a long time ago (including a few manual hacks),
but i haven't had access to a machine with pathscale for quite a while.
with the addition of the iotk library, probably a few more tweaks are
needed. i would not expect a large difference between pathscale and PGI
on AMD64 machines. most of the speed comes from the design of the
cpu itself, and numerical codes like QE are usually faster if you _lower_
the optimization (and especially avoid IPA/IPO and heuristic vectorization).
QE already takes a lot of advantage of SIMD instructions through the
use of optimized BLAS/LAPACK libraries (i.e. ACML on AMD64).
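[archive note: to make the linking point concrete, a configure invocation along these lines would point QE at MPICH and an optimized BLAS/LAPACK such as ACML. the library paths below are placeholders, not tested values from this cluster; the sysadmin knows the real ones.]

```shell
# sketch of a QE configure line for an AMD64 + MPICH + ACML setup.
# all paths are placeholders -- substitute your site's actual ones.
./configure \
  MPI_LIBS="-L/opt/mpi/tcp/mpich-pgi/lib -lmpich -lfmpich" \
  BLAS_LIBS="-L/opt/acml/pgi64/lib -lacml" \
  LAPACK_LIBS="-L/opt/acml/pgi64/lib -lacml"
```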

cheers,
    axel.

>
>  Best,
>
>  Stargmoon
>
> Axel Kohlmeyer  wrote:
>  On 9/17/06, stargmoon wrote:
> > Dear pwscf community,
> >
> > I tried to compile Espresso-3.1.1 recently on our PC cluster (AMD64).
> > However, after I run ./configure, I am told that "Parallel environment not
> > detected". I checked the config.log; since there is no problem in searching
> > for the MPI compilers (mpif90, mpif77 and mpicc), I think it must be the MPI
>
> there are two stages to the search: a) whether the executables exist,
> and b) whether they can produce working binaries.
>
> > library problem. Therefore, I tried to set "MPI_LIBS" (there is only
> > libmpich.a in there) in the ./configure command line, but it did not work
>
> i would have expected a libfmpich.a, too.
>
> > either. Could anybody please tell me what kind of MPI libraries I have to
> > point to the "configure" in order to get parallel compilation?
>
> this is impossible to tell, without knowing any details about your system.
>
> what parallel software are you using (it looks like MPICH), and did you
> install it yourself or was it bundled with the system? do mpif77 and
> mpif90 point to a suitable fortran compiler?
>
> can you compile and run any of the (trivial) MPI example programs
> that usually ship with MPI packages? if yes, please describe the
> command line you use for that. based on that information, we may
> be able to help you.
>
> especially on linux machines, there are almost always a few kinks
> to be worked out in the installation.
>
> regards,
>  axel.
>
> >
> > Thanks in advance!
> >
> > stargmoon
> >
> >
> >
>
>
>
>
>
>
>


-- 
=======================================================================
Axel Kohlmeyer   akohlmey at cmm.chem.upenn.edu   http://www.cmm.upenn.edu
  Center for Molecular Modeling   --   University of Pennsylvania
Department of Chemistry, 231 S.34th Street, Philadelphia, PA 19104-6323
tel: 1-215-898-1582,  fax: 1-215-573-6233,  office-tel: 1-215-898-5425
=======================================================================
If you make something idiot-proof, the universe creates a better idiot.
_______________________________________________
Pw_forum mailing list
Pw_forum at pwscf.org
http://www.democritos.it/mailman/listinfo/pw_forum


 		

