About the SGMA Index
Learn the science behind our memory assessment.
Research in Context
Recent advancements in memory research, such as therapies for memory loss and mathematical tools to measure its computational processes, have unlocked exciting possibilities for understanding human memory performance. However, the tools used to assess memory performance have not kept pace, bottlenecking further advances in research and patient care.
The SGMA Index addresses this gap by offering a modern solution to memory assessment. Unlike traditional memory assessments, which require in-person clinician visits and long intervals between uses, the SGMA enables frequent assessments, making it ideal for longitudinal studies and remote clinical monitoring. At the same time, it is equally effective for quick, one-time evaluations, providing the flexibility needed for both ongoing and standalone applications.

Quantified SGMA Index for a 57-year-old.
SGMA Index
The SGMA Index, developed by MemoryLab Health, is a quantitative tool for researchers and clinicians to assess and track memory performance.
Legend
Basic
What sets our assessment apart is its ability to calculate how “active” a memory is—or how likely it is to come to mind—based on its history.
Tracking the history of a memory is made possible through our online assessment environment. By recording every instance an assessment item is shown to a user, along with the user’s response, it is possible to measure their memory of that item in a structured and consistent way.

The SGMA app is compatible with all devices.

8-minute assessment
A user’s memory can be assessed with our 8-minute online assessment, which presents paired items (e.g., image/name) for the user to memorize. Importantly, the assessment adapts in real time to the user’s responses to personalize the pace and difficulty. This design ensures the assessment is both accessible and engaging for users of varying cognitive abilities.

History of an assessment item’s activation.
Measuring activation
The assessment personalizes the pace to the user with an online algorithm that tracks how “active” an assessment item is. An item’s activation—how likely it is to come to mind—is calculated based on its history. For example, the history of the item shown here includes two encounters at different time points. The second encounter shows a slower decay in activation compared to the first, likely because the second exposure reinforced and strengthened the memory of that item.
*Read more about the algorithm in the Scientist section.
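The activation calculation described above can be sketched with the standard ACT-R base-level activation equation, on the assumption that the algorithm follows this textbook form: each past encounter contributes a power-law-decaying trace, and activation is the log of their sum. The decay value and times below are illustrative, not the production parameters.

```python
import math

def activation(encounter_times, now, decay=0.5):
    """Activation of an item at time `now`: the log of the summed,
    power-law-decayed contributions of all past encounters."""
    return math.log(sum((now - t) ** -decay
                        for t in encounter_times if t < now))

# Two encounters at different time points: after the second
# encounter, the summed trace is stronger and declines more slowly.
history = [0.0, 60.0]              # seconds at which the item was shown
later = activation(history, now=120.0)
```

With a single trace the equation reduces to `-decay * ln(elapsed)`, which is the straight-line decay (on a log-time axis) shown in the figure.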

Forgetting rates incrementally adjusted during an assessment.
Measuring forgetting
From this, the corresponding forgetting rates—represented by the slopes in the previous figure—are calculated for each assessment item. To best reflect the user’s performance, these forgetting rates are incrementally adjusted for each item throughout the assessment. Shown here are forgetting rates for nine items throughout the assessment. Items in warmer colors were more difficult than expected for the user; items in cooler colors were easier.
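The incremental adjustment can be illustrated with a deliberately simplified update rule (the actual algorithm, which also weighs response latency, is described in the Scientist section): an error suggests the item is forgotten faster than modelled, so its rate is nudged up; a correct answer nudges it down.

```python
def adjust_decay(decay, correct, step=0.02):
    """Illustrative update rule: raise an item's forgetting rate
    after an error, lower it after a correct response. The step
    size is a placeholder, not the production value."""
    return decay - step if correct else decay + step

# An item that starts at the default rate and is answered
# correctly, then missed twice, then answered correctly again.
rate = 0.3
for correct in [True, False, False, True]:
    rate = adjust_decay(rate, correct)
```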

Quantified SGMA Index for a 57-year-old.
SGMA Index as a function of forgetting
After the forgetting rates for all items the user encountered are finalized, they are combined to create the SGMA Index. This index serves as a computational measure of the typical rate at which a user forgets information, offering a mechanistic explanation for their memory performance.

Forgetting trajectories for different SGMA Index values.
Interpreting the SGMA Index
Since the SGMA Index reflects the typical rate at which a user forgets information after learning, we can predict probabilities of recall based on their Index. Illustrated here is the relationship between the SGMA Index and the predicted time at which information will be forgotten after a single learning instance. The dashed line represents the ‘forgetting threshold’, defined as a 5% probability of recall.
For context, here are typical user values:
100: Healthy 18-year-old.
75: Healthy elderly adult (65+).
50: Elderly adult with mild cognitive impairment.
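The relationship between a forgetting rate and the forgetting threshold can be sketched as follows, assuming the ACT-R convention of a logistic recall probability over activation. The threshold and noise values are illustrative assumptions, as is the mapping from a single decay rate to activation; only the 5% criterion comes from the text above.

```python
import math

def recall_probability(t, decay, threshold=-2.0, noise=0.3):
    """Predicted recall probability t seconds after a single
    learning event, via a logistic function of activation
    (threshold and noise values are illustrative)."""
    act = -decay * math.log(t)          # single-trace activation
    return 1.0 / (1.0 + math.exp((threshold - act) / noise))

def time_to_forget(decay, p=0.05, threshold=-2.0, noise=0.3):
    """Time at which recall probability drops to p, i.e. when the
    forgetting trajectory crosses the 'forgetting threshold'."""
    act = threshold + noise * math.log(p / (1.0 - p))
    return math.exp(-act / decay)
```

A lower forgetting rate (a better memory, hence a higher SGMA Index) pushes the crossing point further into the future, which is the pattern the figure shows.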
Scientist
Our assessment comprises two main components: a computational model of episodic memory and an adaptive fact-learning system.
Our computational model can precisely quantify memory processes. Specifically, it calculates how “active” the memory of an assessment item is—or how likely it is to come to mind—based on its history, which comprises the user’s response latencies and accuracy. Within this assessment environment, it is possible to measure memory performance with precise mathematical predictions. These predictions are based on the four fundamental concepts below.
Our adaptive fact-learning system uses these predictions to adjust the presentation of new items and the scheduling of previously encountered ones to match the user’s learning pace. This personalization enhances the assessment environment, making it accessible and engaging for users of varying cognitive abilities.
1
Memory as a predictive system
Our model of memory is grounded in John R. Anderson’s rational analysis, which conceptualizes memory as a predictive system. According to this framework, our memory’s sensitivity to statistical structures in the environment enables it to optimally estimate how likely we are to need a piece of information again in the future. This explains why an individual is more likely to remember what they encounter more often or more recently in their environment.
2
Memory as traces
This statistical model of memory can be described in terms of the multiple trace theory of memory. In this model, each encounter with the same item is encoded as a separate ‘trace’. The ‘memory’ of that item is then the mathematical accumulation of all component traces.
3
Forgetting as a power law
Forgetting of these traces occurs in a fairly predictable pattern, which can be described by a power law: a steep initial drop followed by a slower, gradual decline. The rate of this decline can change depending on factors including the recency, frequency, and spacing of encounters.
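The power-law shape is easy to see numerically. The sketch below, with an assumed decay exponent of 0.5, shows that a trace loses most of its strength in the first moments and then declines only gradually over the same-sized interval later on.

```python
def trace_strength(t, decay=0.5):
    """Strength of a single memory trace t seconds after encoding,
    decaying as a power law (decay exponent is illustrative)."""
    return t ** -decay

# Steep initial drop over the first nine seconds...
early = trace_strength(1.0) - trace_strength(10.0)
# ...versus a much smaller drop over a nine-second span later on.
late = trace_strength(100.0) - trace_strength(109.0)
```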
4
Forgetting as a parameter
These different forgetting rates can be individually adjusted with an additional model parameter, a decay intercept we term the Speed of Forgetting. This parameter is characteristic of an individual and reflects a combination of underlying biological processes, some of which are often compromised in dementia.
Summary
These four concepts allow us to predict both the likelihood of a user recalling information from memory and the time it will take to do so. The video below demonstrates our assessment interface and provides a behind-the-scenes look at how the activation trajectory of a cumulative memory trace is calculated in real time by our model.
Video Details
Our model generates a memory trace each time an assessment item is encountered. These traces decay over time following a power-law function. When the same item is repeated, a new trace is added, forming a cumulative memory trace. The cumulative memory trace, now consisting of two component traces, decays more slowly. Importantly, each component trace in a cumulative memory trace will decay at independent rates.
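The cumulative trace described above can be sketched by extending the activation sum to give each component trace its own decay rate. The specific times and rates are illustrative assumptions; the point is that adding a second trace raises the cumulative activation and slows its overall decline.

```python
import math

def cumulative_activation(encounters, now):
    """Activation of a cumulative memory trace: the log of the sum
    of its component traces, each decaying at its own independent
    rate. `encounters` is a list of (time, decay) pairs."""
    return math.log(sum((now - t) ** -d for t, d in encounters))

# One encounter versus two: the repeated item's cumulative trace
# is more active at the same test time.
one = cumulative_activation([(0.0, 0.5)], 30.0)
two = cumulative_activation([(0.0, 0.5), (20.0, 0.4)], 30.0)
```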
Expert
Our assessment integrates a well-established cognitive model of declarative memory function from the ACT-R framework (Anderson and Schooler, 1991) with an adaptive learning system (described in van Rijn, van Maanen, and van Woudenberg, 2009 and Sense et al., 2016).
The cognitive model employs Anderson’s “rational analysis,” a Bayesian approach that suggests memory adapts to the statistical structure of the environment to optimally estimate the likelihood of needing a memory trace (Anderson, 1990). Anderson and Schooler (1991) expanded on this concept by demonstrating how factors like recency and frequency predict the probability of re-encountering an item, thereby influencing memory performance. This approach underlines the adaptability of memory in response to environmental cues, enhancing its predictive accuracy.
Building on this, the mechanistic instantiation of this model can be described in terms of the multiple trace theory (Nadel et al., 2000), which assumes that each memory consists of individual traces encoded every time the information is encountered, with each trace decaying according to a power law. Under this account, a memory’s activation reflects the log odds of retrieving any of its component traces, so the odds of retrieving the memory grow with its activation.
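In standard ACT-R notation (assuming the model follows the textbook formulation), the activation of item $i$ and its retrieval odds can be written as:

```latex
A_i(t) = \ln \sum_{j=1}^{n} (t - t_j)^{-d_j},
\qquad
\frac{P_i}{1 - P_i} = e^{(A_i - \tau)/s}
```

where the $t_j$ are the times of past encounters, the $d_j$ are the per-trace decay rates, $\tau$ is the retrieval threshold, and $s$ scales activation noise.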
Looking for support?
Connect with our dedicated research team—we’re ready to assist you!