19th EMWSD (Wakis) team meeting

6/R-018, CERN (Europe/Zurich)

Elena De La Fuente Garcia (Universidad Politecnica de Madrid (ES))
Description

Wake and Impedance Solver development:

3D time-domain electromagnetic wake solver for beam-coupling impedance computations, using the Finite Integration Technique (FIT)

Zoom Meeting ID: 62540318786
Host: Carlo Zannini

- Where are we?:

    - 8 months into the PhD. Working on the next milestones of the project: 4. Beam current injection, 5. Materials (conductivity), 6. Open Boundaries (PML) & Testing (testing especially relevant for UPM)

- Progress on the code:
    - Merging wakis and FIT: the wakis package functionality is now refactored into a `WakeSolver` class that can be passed as an object to the FIT solver. `WakeSolver` encapsulates all the functions to compute the wake potential and impedance and to read fields, with the same syntax as the wakis package. It can also be used independently of the FIT solver by simply importing and using the `WakeSolver` class on its own (a short usage sketch follows this list).
        - Discussion: Gianni agrees that we can merge wakis and FIT into a single package. (Action) Lorenzo will transfer the ownership of the PyFIT branch to ImpedanCEI and we will rename it wakis. Gianni suggests converting it into a Python package so it is easier to install.
    - Attribute cleanup: reduces memory consumption by 60% and slightly improves performance.

        - Discussion: Gianni and Lorenzo mentioned that we could avoid keeping these attributes in `self`, but agreed that having them is useful for diagnostics. Gianni suggests adding a debug flag that dumps these attributes to a dictionary when needed.

    - Built-in 1D and 2D (matplotlib-based) and 3D (pyvista/vtk-based) plotting: fast, flexible (through `**kwargs`), shown not to be memory-hungry, with the possibility of off-screen plotting to create animations (see the off-screen rendering sketch after this list). The slides showcase some of the available examples (plane wave interacting with a dielectric, Gaussian wave-packet diffusion, plane wave reflection, ...).
        - Discussion: Gianni asks if these examples will eventually become tests by adding some asserts. Elena confirms: these examples are requests from the university to quantitatively assess certain characteristics of the code (symmetry, numerical dispersion, etc.). We all agree on the importance of these tests.
    - Solving routines:
        - `emsolve()` for pure electromagnetic time-domain simulation; the source can be any user-defined function of the form `func(solver, t)`. It computes the fields (E, H, J).
        - `wakesolve()` for wakefield time-domain simulation; the source is a particle beam, and it computes the wake potential and impedance (longitudinal and transverse).
            - Discussion: Gianni asks how the fields are saved. For now we only support the HDF5 format; Lorenzo agrees the main reason is parallelization, and it is actively used elsewhere (e.g., pyheadtail). Gianni suggests being open to other formats (.parquet with pandas) for diversity. Also, for small simulations, instead of dumping to a file, one could keep the field as an attribute of the class (checking memory allocation). We should include a method in the `Field` class to dump the fields in the desired format; a sketch of the source interface and of the dump pattern follows this list.
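
For illustration of the refactoring above, a minimal design sketch (not the actual wakis API: class names, arguments and the placeholder wake computation are made up) of a `WakeSolver`-like object used standalone or passed to the field solver:

```python
# Design sketch only: illustrates "standalone or injected" use of a WakeSolver-like
# class. Names, signatures and the placeholder wake computation are NOT the wakis API.
class WakeSolverSketch:
    def __init__(self, q, sigmaz):
        self.q, self.sigmaz = q, sigmaz        # bunch charge [C], rms bunch length [m]

    def solve(self, ez_on_axis, dt):
        """Turn a saved on-axis Ez(t) record into a (placeholder) wake potential."""
        return [e * dt / self.q for e in ez_on_axis]

class FITSolverSketch:
    def __init__(self, wake=None):
        self.wake = wake                       # WakeSolver object injected into the solver

    def wakesolve(self):
        ez_on_axis = [0.0, 0.1, 0.05]          # placeholder field record
        return self.wake.solve(ez_on_axis, dt=1e-12)

wake = WakeSolverSketch(q=1e-9, sigmaz=1e-2)
print(wake.solve([0.0, 0.1], dt=1e-12))        # standalone post-processing
print(FITSolverSketch(wake=wake).wakesolve())  # same object handed to the FIT solver
```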
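The off-screen animation workflow of the built-in 3D plotting can be illustrated with plain pyvista; the grid, field values and file names below are synthetic, only the `off_screen` screenshot pattern is the point:

```python
# Sketch of off-screen pyvista rendering, one PNG frame per timestep.
import numpy as np
import pyvista as pv

grid = pv.ImageData(dimensions=(50, 50, 50))             # simple structured grid
for n in range(10):
    grid["Ez"] = np.sin(0.1 * n + 1e-3 * np.arange(grid.n_points))  # dummy field
    plotter = pv.Plotter(off_screen=True)                # render without opening a window
    plotter.add_mesh(grid.slice(normal="z"), scalars="Ez", cmap="RdBu")
    plotter.screenshot(f"frame_{n:03d}.png")             # save the frame to disk
    plotter.close()
```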
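The `func(solver, t)` source interface and the field dumping discussed above can be sketched as follows; only the signature comes from the meeting, while the solver attribute names used inside the source, the commented `emsolve` call and the HDF5 layout are illustrative assumptions:

```python
# Sketch of a user-defined source for emsolve() and a simple HDF5 field dump.
import numpy as np
import h5py

def gaussian_source(solver, t):
    """Drive one Ez cell with a Gaussian pulse (solver attribute names assumed)."""
    t0, sigma = 10 * solver.dt, 3 * solver.dt
    solver.E[5, 5, 5, 'z'] = np.exp(-0.5 * ((t - t0) / sigma) ** 2)

# solver.emsolve(Nt=1000, source=gaussian_source)        # hypothetical call

# One Ez snapshot per timestep dumped to HDF5 (pattern only, placeholder data):
ez_snapshots = [np.random.rand(10, 10, 10) for _ in range(3)]
with h5py.File("Ez.h5", "w") as f:
    for n, Ez in enumerate(ez_snapshots):
        f.create_dataset(str(n), data=Ez, compression="gzip")
```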
            
- Beam injection and Absorbing boundaries:
    - Beam injection: as in CST, using a line current; it barely affects performance (<0.1%). It produces a perturbation at the boundaries due to breaking the continuity equation, plus reflections (a sketch of the line-current injection is given after this list).
        - Discussion: Gianni wonders if it would be worth trying to enforce Gauss's law. Gianni asks Lorenzo about his PhD studies, where he diffused the perturbation to mitigate it; Lorenzo is not sure it will be useful in this case. We discuss the current injection: instead of enforcing it along the whole z axis, one could try to inject it only at the boundary. Lorenzo agrees that injecting from a PEC boundary should give zero reflections. One needs to check for beam dispersion (the beam size increasing as it travels), to be assessed with mesh/timestep refinement; the tests for the university will help with this. Carlo adds that CST had two injection types, analytic and transverse line; the second option is worth investigating.
    - Absorbing boundaries: the ABC FOEXTRAP (first-order extrapolation) is implemented on the E and H fields. Small impact on computation time (<1%). It helps reduce perturbations, especially at the z+ boundary, but is not comparable to a PML (a sketch of the extrapolation follows this list).
        - Discussion: Gianni points out that the ABC reduces the amplitude by a factor of 10 at z- and a factor of 100 at z+. Lorenzo points out that it may be worth applying it to the current J as well.
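
The line-current injection can be sketched as depositing a rigid Gaussian bunch current on the axis cells at every timestep; the grid layout and parameters below are illustrative assumptions, not the actual implementation:

```python
# Sketch: rigid Gaussian bunch injected as a line current J_z along the beam axis.
import numpy as np

c = 299_792_458.0                  # speed of light [m/s]
q, sigmaz = 1e-9, 0.01             # bunch charge [C], rms bunch length [m]
nz, dz = 200, 0.005                # longitudinal cells and cell size [m]
dt = dz / c                        # illustrative timestep tied to the cell size
z = (np.arange(nz) + 0.5) * dz     # longitudinal cell centres

def beam_line_current(t):
    """Line current J_z(z, t) of a bunch entering from z- at the speed of light."""
    s = z - (c * t - 3 * sigmaz)                       # distance to the bunch centre
    lam = q / (np.sqrt(2 * np.pi) * sigmaz) * np.exp(-0.5 * (s / sigmaz) ** 2)
    return lam * c                                     # line charge density * velocity

Jz_axis = beam_line_current(5 * dt)   # current to deposit on the axis cells at one step
```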
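And a sketch of what the FOEXTRAP absorbing boundary amounts to on a field array: after each update, the outermost cells are overwritten with their inner neighbours so outgoing waves are not reflected as by a PEC wall (array names and shapes are illustrative):

```python
# Sketch of a FOEXTRAP-style (first-order extrapolation) boundary on the z faces.
import numpy as np

Ex = np.random.rand(20, 20, 50)    # placeholder field component, shape (nx, ny, nz)

def apply_foextrap_z(F):
    """Copy the first interior plane onto the z- face and the last onto the z+ face."""
    F[:, :, 0] = F[:, :, 1]
    F[:, :, -1] = F[:, :, -2]
    return F

apply_foextrap_z(Ex)               # would be applied after each timestep, per component
```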
        
- 1st Benchmark vs CST and WarpX:
    - Simulated two cubic pillbox cavities, one below and one above cutoff. Benchmarked the Ez field at every timestep as well as the wake potential and impedance. The agreement is satisfactory below cutoff, while above cutoff the need for a PML becomes relevant to get the right amplitude and frequency of the modes (the cutoff frequency of the lowest propagating mode is recalled after this list).
        - Discussion: Carlo explains that simulations above cutoff are not often performed and are not that well understood. Sometimes it is interesting to simulate them, as for the RF fingers issue: the propagating modes can leak into the resonating structure and appear as additional modes. The fact that FIT gets a higher amplitude means we are seeing reflections from the boundary. Gianni adds that we need to perform a mesh study to try to converge to the CST results. Lorenzo adds that the small frequency discrepancy may come from CST potentially using the TE/TM FIT scheme. Gianni suggests studying the boundaries by simulating a waveguide, exciting the fundamental mode, and comparing with the analytical fields.
    - Performance vs WarpX: FIT is faster, and the ABC gives smaller reflections than WarpX's PML. With the FIT solver we have matched, and in some respects surpassed, the performance achieved with WarpX-Wakis.
        - Discussion: Lorenzo comments that it is better to benchmark against CST, since comparing with WarpX, a PIC-FDTD solver, is not fair: the speed of the simulation is strongly affected by the number of macroparticles (10^6), but this number has to be kept high to obtain a smooth charge distribution, and the beam in WarpX also has a transverse size. All agree we should benchmark against CST from now on. Regarding speed, one should compare on one of the Virtual Machines (WSL already installed).
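
For reference on the below/above-cutoff distinction, the cutoff frequency of the lowest mode of the pipe cross-section sets whether the cavity fields can propagate out; for a square aperture of side a it is the standard TE10 expression (the aperture value here is made up):

```python
# TE10 cutoff of a rectangular (here square) pipe: f_c = c / (2*a).
c = 299_792_458.0      # speed of light [m/s]
a = 0.05               # pipe aperture [m], illustrative value only
f_cutoff = c / (2 * a)
print(f"TE10 cutoff: {f_cutoff / 1e9:.2f} GHz")   # ~3.00 GHz for a = 5 cm
```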
        
- Next steps:
    - Speeding up the code using Kostas' trick with the openPMD environment variable. Try a GPU implementation with cupy, since it supports the numpy and scipy.sparse interfaces (a drop-in sketch is given after this list).
    - Implementing conductivity: the simple way first; if it does not work, use the Weiland/Berenger update equations, which are also needed for the PML implementation. They include the exponential of a matrix, but since we use diagonal matrices this is trivial (see the snippet after this list) and should not slow down the code much.
    - Tests (plane wave, Gaussian packet) to be presented by the end of April at UPM. These include symmetry studies, refraction angle, numerical dispersion, stability, energy conservation, etc.
    - Encouragement to apply to the CERN School of Computing 2024 using the training budget.
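
A rough illustration of the cupy idea from the first item above: the same sparse update runs on CPU with numpy/scipy.sparse or on GPU with cupy/cupyx by switching the array module; the operator and vector are random placeholders, not the FIT matrices:

```python
# Sketch: one sparse matrix-vector update on CPU, and the same expression on GPU.
import numpy as np
import scipy.sparse as sp

n = 1000
A_cpu = sp.random(n, n, density=1e-3, format="csr")   # placeholder update operator
x_cpu = np.random.rand(n)
x_cpu = A_cpu @ x_cpu                                  # CPU update step

try:
    import cupy as cp
    import cupyx.scipy.sparse as cpsp
    A_gpu = cpsp.csr_matrix(A_cpu)                     # copy the sparse operator to GPU
    x_gpu = cp.asarray(x_cpu)
    x_gpu = A_gpu @ x_gpu                              # same expression, now on the GPU
except ImportError:
    pass                                               # cupy / GPU not available
```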
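And a small check of the matrix-exponential remark in the conductivity item: for a diagonal matrix the exponential reduces to the element-wise exponential of the diagonal, so no general matrix exponential is ever needed:

```python
# For D = diag(d), expm(D) = diag(exp(d)); the general matrix exponential is not needed.
import numpy as np
from scipy.linalg import expm

d = np.array([0.1, -2.0, 3.5])
D = np.diag(d)
assert np.allclose(expm(D), np.diag(np.exp(d)))   # identical up to round-off
```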
 
- General conclusion: the code is advancing faster and more smoothly than expected, with very promising results in these first months of development! :D

 
