# Metrics

A metric lets an admin configure a custom calculation over the answers to an assessment, so the results can be viewed in a meaningful, aggregated way.

# New DB Tables

# analytics_metrics

  • id : the id of this metric
  • uuid : the uuid of this metric
  • name : the name of this metric
  • description : the description of this metric
  • is_public : whether this is a public metric in the metrics library
  • data_source : which data source we are linking to for the calculation
    • For MVP, we only support question
  • data_type : defines how to link to an assessment question. It has the following valid value (MVP):
    • scalar : can only be linked to a radio button choice question of an assessment
  • aggregation : defines how to calculate the result. When the data_type is scalar, it has the following valid values:
    • average : the result will be the average weight of the answers for this question
    • sum : the result will be the sum of the weight of the answers for this question
    • count : the result will be the count of users who have answered that question
  • filters : the filters applied to this metric
    • This is a JSON-encoded field that contains 2 attributes:
    • role : an array of filtered roles. Valid roles are participant, mentor, coordinator, admin
    • status : an array of filtered user statuses. Valid statuses are active, dropped
  • default_calculation_frequency : the default calculation frequency used when the metric is linked to an experience (see calculation_frequency below)
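As a sketch of the fields above, the table's valid values and the decoded filters attribute could be modeled as follows. All Python names here (Metric, DataSource, etc.) are illustrative, not the actual implementation:

```python
from dataclasses import dataclass, field
from enum import Enum


class DataSource(Enum):
    QUESTION = "question"  # only data source supported in the MVP


class DataType(Enum):
    SCALAR = "scalar"  # links to a radio-button choice question


class Aggregation(Enum):
    AVERAGE = "average"
    SUM = "sum"
    COUNT = "count"


@dataclass
class Metric:
    """In-memory view of one analytics_metrics row (field names follow the spec)."""
    id: int
    uuid: str
    name: str
    description: str
    is_public: bool
    data_source: DataSource
    data_type: DataType
    aggregation: Aggregation
    # JSON-encoded in the DB; decoded here as a dict with "role" and "status" arrays
    filters: dict = field(default_factory=lambda: {"role": [], "status": []})
    default_calculation_frequency: str = "on demand"
```

Modeling the enums up front keeps the valid-value lists in one place as new data sources and aggregations are added after MVP.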

# analytics_metric_institutions

Relationship between a metric and an institution

  • id
  • metric_id
  • institution_id
  • requirement : has the following valid values:
    • required : without linking this metric to an assessment, an experience can not go live
    • recommended : an experience can go live without linking this metric to an assessment, but there will be a warning
    • not required : an experience can go live without linking this metric to an assessment
  • status : has the following valid values:
    • inactive : when this metric is not linked to any assessment question in any experience
    • active : when the metric is linked to an assessment question in an experience
    • archived : the metric won't be calculated anymore
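The status lifecycle above can be sketched as a pure function of the link state. The function and parameter names are illustrative:

```python
def derive_status(linked_question_count: int, archived: bool) -> str:
    """Derive the analytics_metric_institutions status described above.

    A metric stays inactive until it is linked to at least one assessment
    question in some experience; archiving stops calculation permanently.
    """
    if archived:
        return "archived"
    return "active" if linked_question_count > 0 else "inactive"
```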

# analytics_metric_links

Relationship between a metric and a data source link in an experience

  • id
  • metric_id
  • data_source_id : the related data source id
    • When data_source = question, this id is the question id
  • experience_id
  • institution_id
  • calculation_frequency : when to calculate the metric
    • on demand : only on-demand calculation is supported for MVP

# analytics_metric_records

The table to store the metric report records

  • id
  • metric_link_id
  • value : the calculated metric value
  • count : how many data points were included in the calculation
  • created : the date when the record was calculated
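Putting the pieces together, the three MVP aggregations plus the role/status filters could produce the (value, count) pair stored in analytics_metric_records roughly as below. The answer shape and function names are assumptions for illustration:

```python
# Each answer is assumed to look like {"weight": float, "role": str, "status": str}.

def apply_filters(answers, filters):
    """Keep only answers whose user matches the metric's role/status filters."""
    roles = filters.get("role") or None
    statuses = filters.get("status") or None
    return [
        a for a in answers
        if (roles is None or a["role"] in roles)
        and (statuses is None or a["status"] in statuses)
    ]


def aggregate(answers, aggregation):
    """Return (value, count) as stored in an analytics_metric_records row."""
    count = len(answers)
    if aggregation == "count":
        return count, count
    total = sum(a["weight"] for a in answers)
    if aggregation == "sum":
        return total, count
    if aggregation == "average":
        return (total / count if count else 0.0), count
    raise ValueError(f"unsupported aggregation: {aggregation}")
```

Keeping count alongside value (rather than only the final number) lets the report show how many answers backed each data point, as the example CSV below does.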

# Workflow

# Create metric

There will be a page at the institution level listing all the metrics in the institution, where an institution admin can create a new metric or use a metric from the metrics library.

# Configure metric

On the experience dashboard, there will be a "metrics" tab where admins can configure each metric (i.e. link it to a specific assessment question).

Required and recommended metrics will have a special indicator to remind the admin to do the configuration. If they are not configured properly when the admin tries to make the experience live, a warning message will pop up.
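The go-live check implied by the requirement values could be sketched as follows, returning blocking errors for required metrics and non-blocking warnings for recommended ones. The function name and metric dict shape are assumptions:

```python
def check_go_live(metrics):
    """Each metric is a dict with "name", "requirement", and "configured" keys.

    Returns (errors, warnings): errors block go-live, warnings only prompt.
    """
    errors, warnings = [], []
    for m in metrics:
        if m["configured"]:
            continue
        if m["requirement"] == "required":
            errors.append(f'{m["name"]} must be linked to an assessment question')
        elif m["requirement"] == "recommended":
            warnings.append(f'{m["name"]} is not linked to an assessment question')
        # "not required" metrics never block or warn
    return errors, warnings
```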

# Trigger metric calculation

For MVP, we will only implement on-demand calculation. There will be a button on the experience dashboard "metrics" tab to trigger the calculation.

# Export metric result report

There will be a button in the experience dashboard "metrics" tab to export the report as a CSV.

Example report CSV:

| Metric | Description | Agg Method | 01/01/2024 Value | 01/01/2024 Count | 01/10/2024 Value | 01/10/2024 Count |
| --- | --- | --- | --- | --- | --- | --- |
| WTR | Willingness to Recommend | Average | 0.7 | 60 | 0.6 | 100 |
| Skill: Communication | 1-5 scale self assessment of communication skill level, 1 being lowest, 5 highest | Average | 0.9 | 120 | 0.8 | 150 |
| Opt-in for Research | Count of students who answer Yes to allowing their data to be used for research | Sum | 140 | 160 | 170 | 200 |
| Provided Optional Feedback | Count of students who provided any answer at all to an optional text feedback question | Count | 60 | 120 | 75 | 150 |
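The export pivots metric records into one row per metric with a Value/Count column pair per calculation date. A minimal sketch, assuming illustrative record and field names:

```python
import csv
import io


def export_report(metrics, records):
    """Build the report CSV shown above.

    metrics: list of {"name", "description", "aggregation"} dicts;
    records: list of {"metric", "date", "value", "count"} dicts, where
    "date" is an MM/DD/YYYY string and "metric" matches a metric name.
    """
    dates = sorted({r["date"] for r in records})
    header = ["Metric", "Description", "Agg Method"]
    for d in dates:
        header += [f"{d} Value", f"{d} Count"]

    by_key = {(r["metric"], r["date"]): r for r in records}
    out = io.StringIO()
    writer = csv.writer(out)
    writer.writerow(header)
    for m in metrics:
        row = [m["name"], m["description"], m["aggregation"]]
        for d in dates:
            rec = by_key.get((m["name"], d))
            # Leave cells blank for dates with no calculated record
            row += [rec["value"], rec["count"]] if rec else ["", ""]
        writer.writerow(row)
    return out.getvalue()
```

Note that sorting MM/DD/YYYY strings lexically only orders dates correctly within a single year; a production export would parse the created timestamps instead.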

# Tasks

  1. Create new db tables [API]
  2. List all the metrics in the institution (institution setting page)
    1. Page for the list [UI]
    2. API for the list [API]
  3. Add a new metric (institution setting page)
    1. Add a fresh new metric from a form [UI]
    2. API to create a new metric with form data [API]
    3. Add a new metric from the metrics library
      1. List the metrics in the library [UI]
      2. API to list the metrics in the library [API]
      3. API to add a new metric with a given metric id from the library [API]
  4. Edit metric (institution setting page)
    1. Page to edit metric [UI]
    2. API to edit metric [API]
  5. List all the metrics in the institution (experience setting page)
    1. Page for the list (indicate whether each metric is configured or not) [UI]
    2. API for the list (including the configuration in the experience) [API]
  6. Configure a metric in the experience. i.e. link a metric to an assessment question (experience setting page)
    1. Page to link metric to a question (inside an experience) [UI]
    2. API to list assessment questions [API]
    3. API to link metric to a question [API]
  7. Trigger the calculation of a single metric or all metrics (experience setting page)
    1. A button to trigger the calculation [UI]
    2. API to do the calculation [API]
  8. Download the report CSV (experience setting page)
    1. A button to download the report CSV [UI]
    2. API to respond with report data [API]