WP3 pre-kickoff meeting
WP3 Intro & Orga. Speakers: Neil Chue Hong, Dr Thomas Vuillaume (LAPP, CNRS, Univ. Savoie Mont-Blanc)
Agenda
- Icebreak: https://padlet.com/thomas_work/where-are-you-from-rv2nxr2u7mk96eog
- WP3 organisation
- Draft slides for kick-off meeting
- WP3 first 6 months
Present
- Thomas Vuillaume
- Daniel Garijo (Universidad Politécnica de Madrid)
- Kay Graf (FAU/ESCAPE)
- Nikos Pechlivanis (CERTH)
- Neil Chue Hong (UEDIN)
- Kirsty Pringle (UEDIN)
- Pablo Tamarit (CERN)
- Eva Martin del Pico (BSC)
- Stefania Amodeo (OpenAIRE)
- Serafeim Chatzopoulos (ARC)
Excused:
- Jason Maassen
- Fotis Psomopoulos
- Thanasis Vergoulis (ARC)
Please complete the following (to be completed before Friday 08/03):
- Partner
- What do you expect from WP3?
- What do you bring to WP3?
- What tasks will you work on in WP3?
- What do you plan to work on in the first 6-12 months?
- Recruitment status
- What dish do you absolutely want to try out (or recommend) in Greece?
CNRS-LAPP
- Expect from WP3: The best from each community brought together in actionable tools.
- Bring to WP3: Experience building tools around Zenodo and CodeMeta. Experience in software development for astronomy. Link to the ESCAPE cluster. Expertise in ML.
- Tasks: T3.1, T3.2, T3.3
- First 6-12 months: Getting everybody to understand each other and work together. Build a catalogue of tools and services and evaluate the potential integrations.
- Recruitment status: RSE arriving in April/May
- Dish: Fasolatha
FAU
- Expect from WP3: Services to judge software quality for all communities
- Bring to WP3: Link to the ESCAPE cluster, leading the OSSR (the thematic software repository there), background in software for high-energy physics, link to the EOSC Association Task Forces on Software Infrastructures
- Tasks: T3.1, T3.2
- First 6-12 months: Mainly the best-practice collection in WP2, then implementation in WP3
- Recruitment status: Scientist arriving
- Dish: Moussaka
UPM
- Expect from WP3: Guidelines and services that are useful for researchers to improve the FAIRness of their tools
- Bring to WP3: Metadata representation (CodeMeta link), tools for automated software metadata extraction (SOMEF)
- Tasks: T3.1, T3.2, T3.3 (co-lead)
- First 6-12 months: Best-practice collection and implementation. Automate the collection of metadata to aid the assessment tools.
- Recruitment status: RSE (ongoing)
- Dish: Happy to try what other people recommend
BSC
- Expect from WP3: Resources that help researchers (easily) make their software more FAIR.
- Bring to WP3: A tool to edit software metadata and check the FAIRness of software based on our interpretation of the FAIR principles for software.
- Tasks: T3.1, T3.2 (lead)
- First 6-12 months: Best-practices collection
- Recruitment status: Relying on existing personnel (Laura P. and Eva M.)
- Dish: All :)
UEDIN
- Expect from WP3: Collecting and integrating the community knowledge around tools and services for software quality
- Bring to WP3: Knowledge of FAIR4RS, research software metrics, and software quality. Connection to FAIR-IMPACT (leading the software metrics work).
- Tasks: T3.1 (co-lead) and T3.3
- First 6-12 months: Organising the workshop for MS3.1. Working with SKAO to determine the best way to operate the technology watch. Discussing with NLeSC how to structure input into RSQKit.
- Recruitment status: Kirsty and Neil providing initial effort; an additional person to be assigned
- Dish: All the food is delicious!
SKAO
- Expect from WP3: A tool where a researcher can get guidance about which technologies are recommended for adoption
- Bring to WP3: Software quality tools in use within the SKAO and SRCNet development collaborations
- Tasks: T3.1
- First 6-12 months: Work with UEDIN to set up the practice and tooling around the technology watch; understand its "dimensions"; organise a workshop to create the first version of the technology watch.
- Recruitment status: Software Quality Engineer, probably starting in May
- Dish: Koulouri
CERN
- Expect from WP3: Tools and processes making it easier for scientific communities to improve software FAIRness
- Bring to WP3: Personal experience in software engineering and software quality. Bringing improvements to Zenodo aligned with the project's objectives.
- Tasks: T3.2, T3.3
- First 6-12 months: Clarifying requirements, estimating tasks, and planning the work
- Recruitment status: Relying on existing personnel (at least for 2024)
- Dish: Tsoureki
CERTH
- Expect from WP3: Open guidelines for better software quality (FAIR software)
- Bring to WP3: Software metadata representation, software development practices, open-source bioinformatics
- Tasks: T3.1, T3.3
- First 6-12 months: Identify tools/services to evaluate software quality, catalogue tools/services aligned with best practices, connect to the FAIR4RS and ELIXIR SMP efforts
- Recruitment status: -
ARC
- Expect from WP3: A comprehensive list of impact measures for software entities
- Bring to WP3: Integration of impact measures for software in BIP! Scholar (a tool for crediting the work of researchers)
- Tasks: T3.1, T3.2, T3.3
- First 6-12 months: Help with the landscaping analysis of best practices for assessing software quality and impact, and how they link to the work in WP5
- Recruitment status: Relying on existing personnel
UNIMAN
- Expect from WP3: A better understanding of the usage and needs of workflow quality and FAIR metrics
- Bring to WP3: Knowledge of workflows and FAIR
- Tasks: T3.2, T3.3
- First 6-12 months: <this will come out in discussions at the KOM>
- Recruitment status: Relying on existing personnel
NLESC
- Expect from WP3: Consensus on best practices, metrics, metadata, etc. relating to software quality and FAIRness
- Bring to WP3: Experience in creating best practices (both our own and The Turing Way), tools such as howfairis and Tortellini, metadata formats/tools such as CFF, and services such as the Research Software Directory
- Tasks: T3.1, T3.2, T3.3
- First 6-12 months: Landscape analysis to get a good overview of the tools, and how they link to the work in WP2
- Recruitment status: Relying on existing personnel
- Dish: Open to suggestions ;-)
Collaborative notes
- Potential topics for discussion in the ad-hoc discussions on Monday morning (for those who would be able/willing to join)
- Pablo: We forked the Tech Radar from Zalando in my previous team at CERN: https://opensource.zalando.com/tech-radar/
- Marco: This is the ThoughtWorks tool: https://www.thoughtworks.com/en-gb/radar . At ThoughtWorks, the tool drives the publication of a six-monthly report.