EMI SA2.3 Metrics Meeting




EVO conference call:


Phone Bridge ID: 2059767

- Switzerland (CERN, Geneva)
    +41 22 76 71400
- United Kingdom (University of Manchester)
    +44 161 306 6802

Action Point: Take metric granularity on a case-by-case basis. Product Team granularity should be the norm.
Action Point: Include recommendations on how to improve tracking systems, broken down per product team, severity and detection area of the certification process. These should come into line with Savannah. Also recommend eventual cohesion between bug-tracking systems (only a recommendation; we have no power to enforce this).
Action Point: EK to look into threading and Valgrind metrics over the next few days. These may need to be split into more metrics. Metrics will be added to the internal document.
Action Point: GP will expand the coverage test description and add a list of tools used (in addition, check for Perl and Bash tests).
Action Point: Include recommendations about how to streamline or introduce metrics, especially in the case of different
Action Point: GP to ask Lorenzo and Andres for information, in light of Reviews section of DSA2.1, which includes per platform, per format (tgz/rpm/deb), per change/patch, per external granularity requirements.
Action Point: Both EK and GP will add risk and goals sections to any metrics produced before the next (first agile) meeting on Monday 12th July 2010.
Action Point: EK will rearrange the document to include: Introduction, state of the art, metrics surveys, architecture, metrics, references, background reading.
Action Point: GP will add goal and risk sections specifically to the coverage metric, and will include, where possible, metrics based on the EGEE-III NA1 report.
Action Point: GP to send link to EK: http://project-egee-iii-na1-qa.web.cern.ch/project-egee-iii-na1-qa/EGEE-III/QoS/Follow-up/Middleware/Middleware.htm
(already done, straight after meeting)
Action Point: Next meetings: 12th, 13th, 15th, 16th July 2010, at 10:30CEST (all 4 meetings are going to be agile meetings to discuss internal document).

    • 10:30 AM – 10:40 AM
      Immediate Concerns
      • Granularity of the different parts of the product lifecycle is important. Do we want per-Product-Team metric statistics collection? Do we want bug-tracking system statistics collected per PT?
        • How can the metric collection process be automated given such varied middlewares?
    • 10:40 AM – 10:50 AM
      Metrics Summary from PTs
      • C/C++ metrics: Valgrind, thread testing, CppUnit testing, code coverage
      • Java metrics: JUnit testing, code coverage (lack is more important), lines of commented code
      • General: bug-tracking stats, lines of commented code, deprecation warnings, removal of useless code over time
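As an illustration of how a simple metric like "lines of commented code" might be collected automatically, the sketch below counts comment lines by prefix. This is a minimal assumption-laden example (function name and prefix handling are invented here; real collection would need per-language parsing), not a description of any tool agreed in the meeting:

```python
def comment_line_stats(source, comment_prefix="#"):
    # Hypothetical helper: counts total lines and lines that start with a
    # comment prefix. Real tooling would parse each language properly
    # (e.g. block comments in C/C++ and Java are not handled here).
    lines = source.splitlines()
    total = len(lines)
    commented = sum(1 for line in lines
                    if line.lstrip().startswith(comment_prefix))
    ratio = commented / total if total else 0.0
    return {"total": total, "commented": commented, "ratio": ratio}

sample = "# setup\nx = 1\n# compute\ny = x + 1\n"
print(comment_line_stats(sample))  # {'total': 4, 'commented': 2, 'ratio': 0.5}
```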
    • 10:50 AM – 11:00 AM
      Bug-tracking metrics from Savannah
      • Severity levels: cosmetic, minor, normal, major, critical
      • Detection areas: None, Development, Porting, Integration, Certification, Pre-production, Production
      • Bug fix times (should be common to all systems)
      • Patches per component, patches released over time
      • Bugs per source line of code (SLOC)
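The "bugs per SLOC" figure above is a standard defect-density ratio, usually normalised to a thousand lines. A minimal sketch (the function name and example numbers are illustrative, not from the meeting):

```python
def bugs_per_ksloc(bug_count, sloc):
    # Defect density: bugs per thousand source lines of code (kSLOC).
    if sloc <= 0:
        raise ValueError("SLOC must be positive")
    return 1000.0 * bug_count / sloc

# e.g. 42 tracked bugs against a 120,000-line component:
print(round(bugs_per_ksloc(42, 120_000), 2))  # 0.35
```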
    • 11:00 AM – 11:10 AM
      Metric summary from DSA2.1
      • Documentation per release (Product teams)
        o Up to date and conformant to the DSA2.1 standard
        o Delivery of documentation on time
      • Software release cycle (SA2.4 repositories/build tools)
        o Platform support metrics
        o Availability of externals
        o Software delivery formats
        o Change and patch control
      • Component release cycle (Product teams)
        o Code metrics (stated above), per language
        o Naming and packaging conventions metrics
        o Multi-platform support, percentage-built metrics
        o Unit, coverage and functional test metrics (stated above)
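The "percentage built" metric under multi-platform support can be expressed as the share of target platforms on which a component built successfully. A minimal sketch (the platform tags below are hypothetical examples, not the project's agreed target list):

```python
def percent_built(build_results):
    # build_results: mapping of platform tag -> build succeeded (bool).
    # Returns the percentage of target platforms with a successful build.
    if not build_results:
        return 0.0
    return 100.0 * sum(build_results.values()) / len(build_results)

results = {"sl5_x86_64": True, "sl6_x86_64": True, "deb6_x86_64": False}
print(round(percent_built(results), 1))  # 66.7
```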
    • 11:10 AM – 11:15 AM
      Metric layout
      • Is proposed layout okay?
    • 11:15 AM – 11:25 AM
      Assignment of tasks
      • Using the framework in: https://twiki.cern.ch/twiki/bin/view/EMI/TSA23MetricsFramework
        • Gianni
          o 1) Functionality
          o 2) Reliability
          o 3) Usability
        • Eamonn
          o 4) Efficiency
          o 5) Maintainability
          o 6) Portability
    • 11:25 AM – 11:30 AM
      AOB - Any other business