GridPP Technical Meeting
Virtual Only
Fortnightly meeting for technical topics, looking further ahead than the weekly ops meetings on Tuesdays. There is also a dedicated storage group meeting on Wednesdays. Each topic can run beyond its nominal 5-minute slot whenever necessary.
Notes to turn into minutes...
Sam is still working on the update and plans to publish it next week. There are a lot of conflicting philosophies; Sam talked to Brian B. while he was in the UK anyway. One example: OSG have been working with LIGO, who set up a single copy of their data (~10 TB at Nebraska) which is then accessed from everywhere else. They are also using CVMFS for data access. Could this work for LHC data? There are a variety of possibilities, e.g. having a local cache (perhaps an object store); a sketch of the access pattern follows below. CVMFS has been made secure, so proprietary software can be served.
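As a concrete illustration of the single-copy-plus-cache idea above, the sketch below prefers a CVMFS-mounted path and otherwise pulls the file once from the remote master copy into a local cache. All paths, the URL, and the function name are hypothetical assumptions for illustration, not details from the meeting.

```python
# Minimal sketch of the access pattern discussed above (illustrative only):
# prefer a CVMFS-mounted copy of a dataset; otherwise fetch it once from the
# single remote copy into a local cache. Paths and URL are hypothetical.
import os
import shutil
import urllib.request

CVMFS_PATH = "/cvmfs/ligo.example.org/frames/sample.gwf"   # hypothetical CVMFS repo/path
REMOTE_URL = "https://data.example.org/frames/sample.gwf"  # hypothetical single master copy
LOCAL_CACHE = "/var/tmp/frames/sample.gwf"                 # hypothetical local cache

def fetch_dataset(cvmfs_path=CVMFS_PATH, url=REMOTE_URL, cache=LOCAL_CACHE):
    """Return a readable local path for the dataset."""
    if os.path.exists(cvmfs_path):
        # CVMFS mount available: the CVMFS client's own cache handles locality.
        return cvmfs_path
    if not os.path.exists(cache):
        # Fall back to fetching once from the single remote copy.
        os.makedirs(os.path.dirname(cache), exist_ok=True)
        with urllib.request.urlopen(url) as response, open(cache, "wb") as out:
            shutil.copyfileobj(response, out)
    return cache
```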
The requirements document will come out next.
VAC, Vcycle (and LHCb)
No underlying changes. LHCb has merged VAC and the underlying structure (this is being done within Dirac), adding some extra machinery for VMs. Payload (production) jobs can now be run within VMs; this will be deployed soon.
ATLAS
No great news, but the Canadian group has developed a monitoring site: http://monitor.heprc.uvic.ca:5000/
(There was a discussion about CERN cloud procurement and plans for the next round of HN tests).
CMS
Nothing much to report.
Update from storage
Mainly the discussion with Brian B. There was also some discussion about INDIGO-DataCloud.
Security
Nothing much to report. There are two GDB working groups, one of which David Crooks is leading; there will be a pre-GDB next month about this, in which David is taking a leading role. There had been a traceability meeting which focused on namespacing etc. This work is ongoing.
GridPP Dirac
A well-placed comment line in the JDL can break the Dirac server (discovered this morning); an email has been sent to the Dirac users list. The problem is now overcome.
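As an illustration only (the notes do not describe the actual server-side fix), one defensive workaround is to strip comment-only lines from a JDL before it reaches the parser. The function name and the assumed comment markers ('#' and '//') are hypothetical.

```python
# Illustrative sketch, not the actual Dirac fix: drop comment-only lines
# from a JDL string before parsing, so a stray comment line cannot break
# the server-side parser. Assumes '#' and '//' begin comment lines.
def strip_jdl_comments(jdl_text):
    kept = []
    for line in jdl_text.splitlines():
        stripped = line.lstrip()
        if stripped.startswith("#") or stripped.startswith("//"):
            continue  # skip comment-only lines
        kept.append(line)
    return "\n".join(kept)

if __name__ == "__main__":
    jdl = '// test job\nExecutable = "/bin/echo";\nArguments = "hello";'
    print(strip_jdl_comments(jdl))
```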
HEP software foundation
Nothing since the workshop.
The CernVM workshop at RAL this week was very good, and there was much talk about how to extend CernVM to HPC and containers.