Different initial distribution with same input file
When running regression tests multiple times, some particles of the initial distribution differ between runs, even though
SEED is left at its default value. The initial differences are small, but they grow with time and can yield larger-than-machine-precision errors at the final time step, which makes results harder to reproduce and regression tests harder to pass (e.g. see #658 (closed)).
Steps to reproduce
Running the regression test Distribution-Gauss-1.in several times shows that the initial bunch is not the same at each run (see image below).
However, this only happens when I run the tests on 4 cores. When running the test on 1 or 2 cores, the initial distribution is the same at every run. I therefore suspect the discrepancy is caused by differences in the timing of communication between processors.
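To illustrate the suspected mechanism, here is a minimal sketch (not the actual sampling code of this project) of how drawing particles from a single shared RNG stream becomes non-reproducible when the order in which ranks consume the stream depends on communication timing, and how per-rank seeded streams avoid this. The function names and the 4-rank setup are hypothetical:

```python
import random

def sample_bunch(seed, rank_order, draws_per_rank=3):
    """Draw particles for 4 'ranks' from ONE shared RNG stream.
    Each rank's result depends on the order in which ranks consume
    the stream -- analogous to timing-dependent communication."""
    rng = random.Random(seed)
    samples = {r: [] for r in range(4)}
    for rank in rank_order:
        for _ in range(draws_per_rank):
            samples[rank].append(rng.random())
    return samples

# Same seed, different consumption order -> different per-rank particles.
run_a = sample_bunch(42, rank_order=[0, 1, 2, 3])
run_b = sample_bunch(42, rank_order=[0, 2, 1, 3])
assert run_a[0] == run_b[0]   # rank 0 drew first in both runs
assert run_a[1] != run_b[1]   # ranks 1 and 2 swapped their draws

def sample_bunch_per_rank(seed, rank_order, draws_per_rank=3):
    """Give each rank its own deterministically derived stream:
    the consumption order no longer matters."""
    samples = {}
    for rank in rank_order:
        rng = random.Random(f"{seed}-{rank}")  # per-rank sub-seed
        samples[rank] = [rng.random() for _ in range(draws_per_rank)]
    return samples

fixed_a = sample_bunch_per_rank(42, [0, 1, 2, 3])
fixed_b = sample_bunch_per_rank(42, [0, 2, 1, 3])
assert fixed_a == fixed_b     # reproducible regardless of ordering
```

If the generator really is shared across ranks, this would explain why 1 or 2 cores reproduce (fewer or no timing-dependent orderings) while 4 cores do not.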
Comparison of the initial bunch between three runs on 4 cores: