# OPAL/src issues (https://gitlab.psi.ch/OPAL/src/-/issues)

## [#101 OPAL version](https://gitlab.psi.ch/OPAL/src/-/issues/101)
**Author:** frey_m · **Updated:** 2017-07-24 · **Due:** 2017-05-02

An additional flag at runtime of OPAL would be nice that returns the current version, i.e.
```
matthias@R2-D2:~$ opal --version
```
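Such a flag is typically intercepted before any heavy initialization (MPI, IPPL) starts. A minimal sketch of the idea; the version string and function name are illustrative, not OPAL's actual code:

```cpp
#include <cstring>

// Illustrative only: the real version string would be injected by the build
// system (e.g. a CMake-configured header), not hard-coded like this.
static const char* const kOpalVersion = "OPAL 1.9.0";

// Scan the command line for "--version"; the caller prints kOpalVersion and
// exits immediately when this returns true, before MPI/IPPL start up.
bool hasVersionFlag(int argc, const char* const* argv) {
    for (int i = 1; i < argc; ++i) {
        if (std::strcmp(argv[i], "--version") == 0)
            return true;
    }
    return false;
}
```

In `main()` this check would run first, print the version, and return 0 without touching the parallel runtime.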

## [#99 When APVETO=TRUE set phase relative to arrival time](https://gitlab.psi.ch/OPAL/src/-/issues/99)
**Author:** kraus · **Updated:** 2017-07-24 · **Milestone:** OPAL 2.0.0 · **Assignee:** kraus

The phase of a cavity at time $`t`$ is given by
```math
\varphi (t) = \omega \cdot t + \varphi_{\text{LAG}} + \varphi_0.
```
When running the auto-phasing algorithm we set the phase of a cavity relative to the phase at which the cavity yields maximal energy, thus $`\varphi_0 = \varphi_{\text{max}}`$. In some cases we want or have to set APVETO=TRUE. Currently we set $`\varphi_0 = 0`$, but we should set it such that $`\varphi (t_{\text{ELEMEDGE}}) = 0`$. Here $`t_{\text{ELEMEDGE}}`$ is the time at which the reference particle enters the element (either physically, or alternatively the region in which the field of the cavity is non-zero).
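With the formula above, the proposed behaviour amounts to choosing $`\varphi_0 = -(\omega \cdot t_{\text{ELEMEDGE}} + \varphi_{\text{LAG}})`$, wrapped into $`[0, 2\pi)`$. A standalone sketch of that computation (illustrative, not OPAL's code):

```cpp
#include <cmath>

constexpr double kTwoPi = 6.283185307179586476925287;

// Illustrative only: choose phi0 so that
// phi(t) = omega * t + phiLag + phi0 vanishes at t = tElemEdge.
double phaseOffsetForZeroAtEdge(double omega, double phiLag, double tElemEdge) {
    double phi0 = -(omega * tElemEdge + phiLag);
    phi0 = std::fmod(phi0, kTwoPi);   // wrap into (-2*pi, 2*pi)
    if (phi0 < 0.0)
        phi0 += kTwoPi;               // canonical range [0, 2*pi)
    return phi0;
}
```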
## [#98 Placement of elements in 3D coordinates not possible anymore](https://gitlab.psi.ch/OPAL/src/-/issues/98)
**Author:** kraus · **Updated:** 2017-06-17 · **Milestone:** OPAL 1.9.x

Placement of elements in 3D coordinates (see attachments) used to be possible; this isn't the case anymore.
This issue has to do with the fact that I added the attribute ELEMEDGE and introduced access methods.
Attachments:
[Niowave_first_korrektur.dat](/uploads/ad152c3a3e13fa0ec231105ec4711817/Niowave_first_korrektur.dat),
[Banana_ref.in](/uploads/68db2ec88393f764cfebd520466bf2de/Banana_ref.in),
[ez_normalizedcathodepos_4.txt](/uploads/8015defbc8c4082e95296f6ff3133670/ez_normalizedcathodepos_4.txt)

## [#97 Collimator/Probe](https://gitlab.psi.ch/OPAL/src/-/issues/97)
**Author:** adelmann · **Updated:** 2018-08-14 · **Milestone:** OPAL 1.9.x · **Assignee:** adelmann

Suggestions from @zhang_h:
- Distinguish between probes and collimators and name the collimators on stdout.
- Write probe and collimator data on one line for easy post-processing.
- Add angles back per default.
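For the one-line suggestion, a sketch of what such a record writer could look like; the column set and function name are purely illustrative, not OPAL's actual loss-file format:

```cpp
#include <iomanip>
#include <sstream>
#include <string>

// Illustrative only: emit one whitespace-separated line per probe/collimator
// hit, so loss files can be post-processed with awk, numpy.loadtxt, etc.
std::string lossRecordLine(const std::string& elementName, long particleId,
                           double x, double y, double z, double time) {
    std::ostringstream out;
    out << elementName << ' ' << particleId << ' '
        << std::scientific << std::setprecision(8)
        << x << ' ' << y << ' ' << z << ' ' << time;
    return out.str();
}
```

Naming the element in the first column also addresses the first suggestion, since each record identifies which collimator or probe produced it.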
## [#96 DKS 1.1.0 for OPAL 1.6 branch](https://gitlab.psi.ch/OPAL/src/-/issues/96)
**Author:** gsell · **Updated:** 2017-06-17

DKS 1.1.0 must be used in OPAL 1.6, so that we have the same toolchain for OPAL 1.6 and master.

## [#95 OpalRingTest](https://gitlab.psi.ch/OPAL/src/-/issues/95)
**Author:** adelmann · **Updated:** 2020-04-22 · **Assignee:** ext-rogers_c

OpalRingTest with the new 2x2x2 space-charge grid gives, of course, different answers w.r.t. emittance etc.
Please check that this still makes sense. I updated the reference with the actual results.

## [#94 Error detected by function "FileStream::fillLine()"](https://gitlab.psi.ch/OPAL/src/-/issues/94)
**Author:** ganz_p · **Updated:** 2017-06-17 · **Milestone:** OPAL 1.6.0 · **Assignee:** adelmann

I ran some simulations, and at a certain point all simulations gave me the following error:
[Terminal.out](/uploads/8d537807dbf8586b2ec6f08e87a708ae/Terminal.out)
I've tried to vary the opal command (with and without `mpirun`, or `--use-dks`), but all files, even files which already ran well gave me that error.
The OPAL version I use is `OPAL/1.5.1-20170217`.
Example .in file:
[100MeV_InvQuad_1_NoColl.in](/uploads/44d81f1f63a2ffffc828556e7944cfdb/100MeV_InvQuad_1_NoColl.in)

## [#93 SAAMG-Test-1.in PARALLEL](https://gitlab.psi.ch/OPAL/src/-/issues/93)
**Author:** adelmann · **Updated:** 2017-08-09 · **Milestone:** OPAL 2.0.0 · **Assignee:** Yves Ineichen
The test is from git@gitlab.psi.ch:OPAL/regression-tests.git, on the branch obtained with `git checkout OPAL-1.6`.
The parallel run fails; the serial run is ok.
```
mpirun -np 4 opal SAAMG-Test-1.in
* Node:0, Filling RHS...
* Node:1, Filling RHS...
* Node:1, Rho for final element: 0.0000000000000000e+00
* Node:2, Filling RHS...
* Node:2, Rho for final element: 0.0000000000000000e+00
* Node:2, Local nx*ny*nz = 1575
* Node:2, Number of reserved local elements in RHS: 832
* Node:2, Number of reserved global elements in RHS: 3328
* Node:3, Filling RHS...
* Node:3, Rho for final element: 0.0000000000000000e+00
* Node:3, Local nx*ny*nz = 3375
* Node:3, Number of reserved local elements in RHS: 832
* Node:3, Number of reserved global elements in RHS: 3328
* Node:0, Rho for final element: 0.0000000000000000e+00
* Node:0, Local nx*ny*nz = 735
* Node:0, Number of reserved local elements in RHS: 832
* Node:0, Number of reserved global elements in RHS: 3328
* Node:1, Local nx*ny*nz = 1575
* Node:1, Number of reserved local elements in RHS: 832
* Node:1, Number of reserved global elements in RHS: 3328
* Node:2, Number of Local Inside Points 832
* Node:0, Number of Local Inside Points 832
* Node:3, Number of Local Inside Points 832
* Node:3, Done.
* Node:0, Done.
* Node:1, Number of Local Inside Points 832
* Node:1, Done.
* Node:2, Done.
[fast-dude:02195] *** Process received signal ***
[fast-dude:02195] Signal: Segmentation fault: 11 (11)
[fast-dude:02195] Signal code: Address not mapped (1)
[fast-dude:02195] Failing at address: 0x7fe2336ae600
[fast-dude:02195] *** End of error message ***
--------------------------------------------------------------------------
mpirun noticed that process rank 3 with PID 2195 on node fast-dude exited on signal 11 (Segmentation fault: 11).
--------------------------------------------------------------------------
```

## [#92 ENABLERUTHERFORD and DKS](https://gitlab.psi.ch/OPAL/src/-/issues/92)
**Author:** Valeria Rizzoglio · **Updated:** 2019-03-15 · **Milestone:** OPAL 2.0.0 · **Assignee:** baumgarten (christian.baumgarten@psi.ch)

I am testing the attribute **ENABLERUTHERFORD=FALSE** using the new OPAL module OPAL/1.5.2.
Analysing the particle distribution, I have noticed that phase space is different with and without DKS.
* **Run without DKS:** `mpirun -np 8 opal Degrader_1Slab_230.in`
![OPAL_1.5.2_nodks](/uploads/135423a9df3842bc730cd54969389a75/OPAL_1.5.2_nodks.png)
* **Run with DKS:** `mpirun -np 8 opal --use-dks Degrader_1Slab_230.in`
![OPAL_1.5.2_dks](/uploads/13a3731c8b3871d1e88630ab08d851cf/OPAL_1.5.2_dks.png)
It seems that when running with DKS the attribute **ENABLERUTHERFORD** has no effect, i.e. it has not been implemented in the DKS code path.
Here is the input file: [Degrader_1Slab_230.in](/uploads/de37f170435fcdda5e621019974dda1e/Degrader_1Slab_230.in)

## [#91 Documentation for attribute DESIGNENERGY of kickers missing](https://gitlab.psi.ch/OPAL/src/-/issues/91)
**Author:** kraus · **Updated:** 2017-06-17 · **Milestone:** OPAL 2.0.0

## [#90 OPAL-Cycl - COMET](https://gitlab.psi.ch/OPAL/src/-/issues/90)
**Author:** adelmann · **Updated:** 2017-06-17 · **Milestone:** OPAL 1.6.0 · **Assignee:** adelmann

I have been using a locally compiled code with version number 1.2.1 SVN. I have also run the program through module load with version number 1.4.3. The loss files are basically the same.
Attached is the input file vc.in. Two phase slits CMA1 and CMA2 work quite well. However, the loss data from the vertical collimators, for example, from the pair VC7 and VC8, often register the same particles.
[vc.in](/uploads/8630def3fe171c14cc64887dc9991232/vc.in)

## [#89 Bunch Printing Format](https://gitlab.psi.ch/OPAL/src/-/issues/89)
**Author:** frey_m · **Updated:** 2017-04-12 · **Milestone:** OPAL 1.9.x · **Assignee:** frey_m

The format for printing the charge should be changed from `std::fixed` to `std::scientific` in the function `getChargeString(double charge, unsigned int precision = 3)` (file src/Classic/Utilities/Util.h), because otherwise really small charges are printed as zero.
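A standalone illustration of the problem (this mimics, but is not, OPAL's `Util::getChargeString`): with `std::fixed` and 3 digits a picocoulomb-scale charge collapses to zero, while `std::scientific` preserves it.

```cpp
#include <iomanip>
#include <sstream>
#include <string>

// Standalone illustration of the formatting issue reported above; the real
// helper is Util::getChargeString in src/Classic/Utilities/Util.h.
std::string formatCharge(double charge, bool useScientific,
                         unsigned int precision = 3) {
    std::ostringstream out;
    out << (useScientific ? std::scientific : std::fixed)
        << std::setprecision(precision) << charge << " C";
    return out.str();
}
```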
## [#88 PSI OPAL build chain fails](https://gitlab.psi.ch/OPAL/src/-/issues/88)
**Author:** baumgarten (christian.baumgarten@psi.ch) · **Updated:** 2019-10-25 · **Assignee:** gsell

```
cmake ..
-- The C compiler identification is GNU 5.4.0
-- The CXX compiler identification is GNU 5.4.0
-- Check for working C compiler: /afs/psi.ch/sys/psi.x86_64_slp6/Programming/gcc/5.4.0/bin/gcc
-- Check for working C compiler: /afs/psi.ch/sys/psi.x86_64_slp6/Programming/gcc/5.4.0/bin/gcc -- works
-- Detecting C compiler ABI info
-- Detecting C compiler ABI info - done
-- Detecting C compile features
-- Detecting C compile features - done
```
I proceeded as described in the wiki, i.e.:
```
mkdir $HOME/opal
cd $HOME/opal
git clone git@gitlab.psi.ch:OPAL/src.git
git checkout OPAL-1.6
mkdir build
cd build
cmake ..
```
Then, unfortunately, the following error message appears:
```
-- Check for working CXX compiler: /afs/psi.ch/sys/psi.x86_64_slp6/Programming/gcc/5.4.0/bin/g++
-- Check for working CXX compiler: /afs/psi.ch/sys/psi.x86_64_slp6/Programming/gcc/5.4.0/bin/g++ -- works
-- Detecting CXX compiler ABI info
-- Detecting CXX compiler ABI info - done
-- Detecting CXX compile features
-- Detecting CXX compile features - done
-- Build type is: RelWithDebInfo
-- Host OS System: Linux-2.6.32-642.13.1.el6.x86_64
-- Hostname: pc10758
-- Unable to determine MPI from MPI driver /afs/psi.ch/sys/psi.x86_64_slp6/Compiler/openmpi/1.10.4/gcc/5.4.0/bin/mpicc
CMake Error at /afs/psi.ch/sys/psi.x86_64_slp6/Programming/cmake/3.6.3/share/cmake-3.6/Modules/FindPackageHandleStandardArgs.cmake:148 (message):
  Could NOT find MPI_C (missing: MPI_C_LIBRARIES MPI_C_INCLUDE_PATH)
Call Stack (most recent call first):
  /afs/psi.ch/sys/psi.x86_64_slp6/Programming/cmake/3.6.3/share/cmake-3.6/Modules/FindPackageHandleStandardArgs.cmake:388 (_FPHSA_FAILURE_MESSAGE)
  /afs/psi.ch/sys/psi.x86_64_slp6/Programming/cmake/3.6.3/share/cmake-3.6/Modules/FindMPI.cmake:628 (find_package_handle_standard_args)
  CMakeLists.txt:33 (find_package)

-- Configuring incomplete, errors occurred!
See also "/home/l_baumgarten/opal/devel/src/build/CMakeFiles/CMakeOutput.log".
```
Loaded modules:

```
module list
Currently Loaded Modulefiles:
  1) cmake/3.6.3     4) Tcl/8.6.4       7) boost/1.62.0     10) trilinos/12.10.1  13) gnuplot/5.0.0
  2) gcc/5.4.0       5) Tk/8.6.4        8) gsl/2.2.1        11) hdf5/1.8.18
  3) openssl/1.0.2j  6) Python/2.7.12   9) openmpi/1.10.4   12) H5hut/2.0.0rc3
```

## [#87 OPAL master does not compile with NOCPLUSPLUS11_NULLPTR=ON](https://gitlab.psi.ch/OPAL/src/-/issues/87)
**Author:** snuverink_j (jochem.snuverink@psi.ch) · **Updated:** 2017-05-01 · **Milestone:** OPAL 1.9.x

OPAL master does not compile (gcc 4.8.5) with the build option NOCPLUSPLUS11_NULLPTR=ON, giving the following error:
```
src/Classic/AbsBeamline/RFCavity.cpp:152:27: error: call of overloaded ‘unique_ptr(NULL)’ is ambiguous
     frequency_td_m(nullptr)
```
This comes from:
`RNormal_m(nullptr)`
which is defined as
`std::unique_ptr<double[]> RNormal_m;`
With NOCPLUSPLUS11_NULLPTR this is translated to RNormal_m(NULL), for which multiple constructors are possible.
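The pattern can be reproduced in isolation. Default-constructing the member is equivalent to initializing it with `nullptr` but sidesteps the overload ambiguity, so it would also be a portable fix if the build option stays; the class below is a stripped-down stand-in for RFCavity, not its real definition:

```cpp
#include <memory>

// Stripped-down stand-in for the RFCavity member in question.
struct RFCavityLike {
    std::unique_ptr<double[]> RNormal_m;

    // Default-constructing the unique_ptr yields an empty (null) pointer,
    // just like RNormal_m(nullptr), but avoids the ambiguity that NULL
    // (a plain 0) triggers on older compilers such as gcc 4.8.
    RFCavityLike() : RNormal_m() {}
};
```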
Since there is already quite a bit of C++11 in OPAL, instead of fixing this I would suggest (with caution, as I don't know the reason for the build option) removing NOCPLUSPLUS11_NULLPTR (and perhaps also the similar NOCPLUSPLUS11_FOREACH).

## [#86 OPAL-1.6 check DKS version used to compile](https://gitlab.psi.ch/OPAL/src/-/issues/86)
**Author:** Uldis Locans · **Updated:** 2017-06-17 · **Milestone:** OPAL 1.6.0

OPAL-1.6 does not check which DKS version is used, so compilation errors are possible due to a wrong version.

## [#85 Error in compiling OPAL-1.6 with -DENABLE_DKS=1](https://gitlab.psi.ch/OPAL/src/-/issues/85)
**Author:** Valeria Rizzoglio · **Updated:** 2017-06-17 · **Milestone:** OPAL 1.6.0

I have the following modules loaded:
```
Currently Loaded Modulefiles:
  1) gcc/5.4.0       4) hdf5/1.8.18     7) trilinos/12.10.1  10) OpenBLAS/0.2.19  13) opal-toolschain/1.6
  2) openmpi/1.10.4  5) H5hut/2.0.0rc3  8) root/6.08.02      11) cuda/8.0.44
  3) boost/1.62.0    6) gsl/2.2.1       9) cmake/3.6.3       12) dks/1.0.1
```
and I got the following error message:
```
/home/scratch/opal/src/src/Classic/Solvers/CollimatorPhysics.cpp: In member function ‘void CollimatorPhysics::setupCollimatorDKS(PartBunch&, Degrader*, size_t)’:
/home/scratch/opal/src/src/Classic/Solvers/CollimatorPhysics.cpp:1094:52: error: no matching function for call to ‘DKSBase::callInitRandoms(int&, int&)’
dksbase.callInitRandoms(size, Options::seed);
^
In file included from /home/scratch/opal/src/ippl/src/Utility/IpplInfo.h:59:0,
from /home/scratch/opal/src/ippl/src/Message/Message.hpp:29,
from /home/scratch/opal/src/ippl/src/Message/Message.h:618,
from /home/scratch/opal/src/ippl/src/AppTypes/Vektor.h:16,
from /home/scratch/opal/src/src/Classic/Algorithms/Vektor.h:6,
from /home/scratch/opal/src/src/Classic/Solvers/CollimatorPhysics.hh:13,
from /home/scratch/opal/src/src/Classic/Solvers/CollimatorPhysics.cpp:9:
/opt/psi/MPI/dks/1.0.1/openmpi/1.10.4/gcc/5.4.0/include/DKSBase.h:1077:7: note: candidate: int DKSBase::callInitRandoms(int)
int callInitRandoms(int size);
^
/opt/psi/MPI/dks/1.0.1/openmpi/1.10.4/gcc/5.4.0/include/DKSBase.h:1077:7: note: candidate expects 1 argument, 2 provided
[ 60%] Building CXX object src/CMakeFiles/OPALib.dir/Classic/Utilities/DivideError.cpp.o
```

## [#84 Cyclotron (COMET) does not read RFMAPFN's](https://gitlab.psi.ch/OPAL/src/-/issues/84)
**Author:** adelmann · **Updated:** 2020-12-07 · **Milestone:** OPAL 2021.1 · **Assignees:** adelmann, ext-calvo_p (pedro.calvo@ciemat.es)

```
COMET: Cyclotron, TYPE="BANDRF", CYHARMON= 2, PHIINIT= -71.649, PRINIT= pr0, RINIT= r0 , SYMMETRY= 1.0,
FMAPFN="BMap_Christian.txt",
RFPHI= {hfphi0/180*pi,hfphi0/180*pi,hfphi0/180*pi,hfphi0/180*pi,0.5*pi,0.5*pi,0.5*pi,0.5*pi},
RFFREQ= {frequency,frequency,frequency,frequency,0,0,0,0},
RFMAPFN={"ChimneyEB.h5part","PullerEB.h5part","M77EB.h5part","COMETRF_x850EBc.h5part",
"ehfieldTR.h5part","ehfieldTR2.h5part","ehfieldTR3.h5part","ehfieldTR4.h5part"},
ESCALE={0.84,0.84,0.84,0.4395,-4.5,+6.5,+4.5,-6.5},
MAXZ=15, MINZ=-15, MINR=0, MAXR= 881.1,
SUPERPOSE={false,false,false,false,true,true,true,true};
```
The full set of input files is too large to attach. All input files can be found at merlinl1.psi.ch:~adelmann/COMET/1.5.1-20170217.

## [#83 Bethe-Bloch threshold](https://gitlab.psi.ch/OPAL/src/-/issues/83)
**Author:** adelmann · **Updated:** 2021-01-30 · **Milestone:** OPAL 2021.1 · **Assignees:** adelmann, ext-calvo_p (pedro.calvo@ciemat.es)

Allow the user to specify when a particle is considered dead in the Bethe-Bloch calculation.

## [#82 IPPL extra message error](https://gitlab.psi.ch/OPAL/src/-/issues/82)
**Author:** frey_m · **Updated:** 2017-12-21 · **Milestone:** OPAL 1.9.x · **Assignee:** frey_m

OPAL crashes for > 16 cores (but works with #cores = 4) with the error message
```
Error{0}> get_iter(): no more items in Message
Error{0}> reduce: mismatched element count in vector reduction.
Warning{0}> CommMPI: Found extra message from node 11, tag 10218: msg = Message contains 2 items (0 removed). Contents:
Warning{0}>     Item 0: 1 elements, 1 bytes total, needDelete = 0
Warning{0}>     Item 1: 3 elements, 24 bytes total, needDelete = 0
```
in the case of serial x and y directions (i.e. PARFFTX=false, PARFFTY=false) and a parallel z direction (i.e. PARFFTT=true). The simulation that was run is [psiring.in](/uploads/06e3f41f765be149e96b56bd6b277485/psiring.in). The field maps can be found in the repository [AMAS-BDModels / PSI-Ring](https://gitlab.psi.ch/AMAS-BDModels/PSI-Ring/tree/master/Fieldmaps). The following modules were used for running on Merlin:
```
module use unstable
module add gcc/5.4.0
module add openmpi/1.10.4
module add hdf5/1.8.18
module add H5hut/2.0.0rc3
module add trilinos/12.10.1
module add gsl/2.2.1
module add boost/1.62.0
```
When changing to parallel x, y and serial z (i.e. PARFFTX=true, PARFFTY=true, PARFFTT=false), no error occurs.

## [#81 Segfault within Surfacephysics](https://gitlab.psi.ch/OPAL/src/-/issues/81)
**Author:** kraus · **Updated:** 2017-06-17 · **Milestone:** OPAL 2.0.0 · **Assignee:** kraus

With the input file [Degrader_70.in](/uploads/4971dc04fcdf6cbee66b92aea9f83832/Degrader_70.in) I got a segmentation fault. Suddenly an incredibly large number of additional particles were generated, then OPAL crashed. I couldn't reproduce it anymore, but something isn't correct.