[Pw_forum] epsilon.x

T t t82h at hotmail.com
Tue Jul 22 17:24:38 CEST 2008




----------------------------------------
> From: t82h at hotmail.com
> To: pw_users at pwscf.org
> Subject: 
> Date: Tue, 22 Jul 2008 15:22:05 +0000
> 
> 
> Sorry, you are right, dear Axel. I am testing a simple example, silicon, before switching to a more complex structure. My scf input is:
> "
>  #! /bin/bash
> #BSUB -a openmpi
> #BSUB -q parallel  -n 2 -W 00:45 -eo %J.err -oo %J.out
> . /cineca/prod/modules/init/bash
> module purge
> module load  intel/10.1
> 
> module load  openmpi/1.2.5/intel/10.1
> module load  QuantumESPRESSO/3.2.3
> #>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> # general definitions
> SRCDIR=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/bin
> PSEUDO_DIR=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/Pseudo
> DIR_WORK=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/diel
> OUT="si_scf.out"
> cd ${DIR_WORK}
> 
>     cat > si_scf.in << EOF
>  &control
>     calculation = 'scf'
>     restart_mode='from_scratch',
>     prefix='silicon',
>     tstress = .true.
>     tprnfor = .true.
>     pseudo_dir = '$PSEUDO_DIR/',
>  /
>  &system
>     ibrav=  2, celldm(1) =10.20, nat=  2, ntyp= 1,
>     ecutwfc =18.0,
>  /
>  &electrons
>     mixing_mode = 'plain'
>     mixing_beta = 0.7
>     conv_thr =  1.0d-8
>  /
> ATOMIC_SPECIES
>  Si  28.086  Si.vbc.UPF
> ATOMIC_POSITIONS
>  Si 0.00 0.00 0.00
>  Si 0.25 0.25 0.25
> K_POINTS
>   10
>    0.1250000  0.1250000  0.1250000   1.00
>    0.1250000  0.1250000  0.3750000   3.00
>    0.1250000  0.1250000  0.6250000   3.00
>    0.1250000  0.1250000  0.8750000   3.00
>    0.1250000  0.3750000  0.3750000   3.00
>    0.1250000  0.3750000  0.6250000   6.00
>    0.1250000  0.3750000  0.8750000   6.00
>    0.1250000  0.6250000  0.6250000   3.00
>    0.3750000  0.3750000  0.3750000   1.00
>    0.3750000  0.3750000  0.6250000   3.00
> EOF
> mpirun.lsf ${SRCDIR}/pw.x -input ${DIR_WORK}/si_scf.in >> ${DIR_WORK}/${OUT}
> "
> 
> and the epsilon input is the following:
> "
> #! /bin/bash
> #BSUB -a openmpi
> #BSUB -q parallel  -n 2 -W 01:00 -eo %J.err -oo %J.out
> . /cineca/prod/modules/init/bash
> module purge
> module load  intel/10.1
> 
> module load  openmpi/1.2.5/intel/10.1
> module load  QuantumESPRESSO/3.2.3
> #>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>>
> # general definitions
> SRCDIR=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/bin
> PSEUDO_DIR=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/Pseudo
> DIR_WORK=/scratch/userinfm/cne0fm0n/Thierry/mtdftcub/pwscf/80/diel
> OUT="si_eps.out"
> cd ${DIR_WORK}
>     cat > si_eps.in << EOF
> &inputpp
>     outdir='$DIR_WORK'
>     prefix='silicon'
>     calculation='eps'
>  /
>  &energy_grid
>     smeartype='gauss'
>     smear=0.15d0
>     wmax=30.0d0
>     nw=1000
>     shift=0.0d0
>  /
> EOF
> mpirun.lsf ${SRCDIR}/epsilon.x -input ${DIR_WORK}/si_eps.in >> ${DIR_WORK}/${OUT}
> "
> Note that the scf calculation finishes correctly, with all of its outputs in the working directory, whereas the eps calculation stops with the following error message:
> "
> Cannot match namelist object name smear
> namelist read: missplaced = sign
> Cannot match namelist object name .15d0
> 
> Cannot match namelist object name smear
> namelist read: missplaced = sign
> Cannot match namelist object name .15d0
> 
> ########################################################################################################################
> # FROM IOTK LIBRARY, VERSION 1.1.0development
> # UNRECOVERABLE ERROR (ierr=1)
> # ERROR IN: iotk_scan_end (iotk_scan.spp:211)
> # CVS Revision: 1.7
> # foundl
> # ERROR IN: iotk_close_read (iotk_files.spp:589)
> # CVS Revision: 1.3
> ########################################################################################################################
> "
> Thanks again
> Thierry 
> 
