
At the end of February, I attended LN Mastery Day: Using Data to Demonstrate Learning Impact in Bristol, organised by The Learning Network. The day brought together learning professionals to explore a question many of us are still trying to answer: how do we show the real value of learning?
Across three workshops, the focus was on making better use of the data we already have. Topics included employee-led learning, data integration, and metric chains, with each session offering practical ways to connect learning activity to wider organisational priorities.
A theme ran throughout the day: the challenge is often not a lack of data, but knowing what to look for, how to interpret it, and how to present it in a way that is meaningful to the business.

Looking beyond formal learning
The first workshop, led by Martin Couzins, focused on employee-led learning and how organisations can better recognise and evidence its value.
This was a useful starting point because so much workplace learning happens outside formal training. It happens through conversations, experimentation, solving problems, using new tools, and responding to change in real time. Yet this type of learning can be difficult to capture if we focus only on courses, attendance, or completion data.
One of the key points from this session was the need to identify the signals that learning is already taking place. That means paying closer attention to what people are doing, where capabilities are developing, and how those changes relate to business challenges. It was a helpful reminder that valuable learning is often happening in plain sight, but it can remain invisible unless we make a deliberate effort to notice it. The session did drift into a discussion of how organisations are using AI, which is relevant, but it pulled some focus away from the overall message.
Making better use of existing data
The second workshop, led by Dr Nicola Thomas, focused on data integration.
This session addressed a common challenge in L&D: many organisations already hold useful information, but it is spread across surveys, assessment results, observations, feedback, spreadsheets, and business metrics. Each source tells part of the story, but rarely enough on its own. When different data sources are connected, they can provide a clearer picture of what is working, what is changing, and where further investment might be needed.
I found this especially relevant because it moved the conversation away from collecting more data. In many cases, the issue is not volume but interpretation: making better use of the evidence already available and using it to build a more coherent account of learning impact.
Connecting learning to organisational priorities
The final workshop, led by Tom McDowall, introduced the idea of metric chains through an interactive case study.
This session looked at how different measures can be linked together to show a connection between learning activity and wider organisational priorities. I found this useful because it gave a more concrete structure for thinking about impact. Rather than treating metrics as isolated data points, the focus was on how they relate to one another and how those links can be used to build a stronger case for learning. One of the hardest parts of demonstrating impact is showing how learning connects to strategic-level outcomes, so this session offered a practical way into that conversation.
What I took away from the day
Completion rates and learner satisfaction scores only tell part of the story. They may be easy to collect, but they do not necessarily tell us what changed, whether anything improved, or how learning contributed to broader goals. In digital learning design, it is easy to focus on what is being built, how engaging it is, or whether learners complete it. Those things do matter, but they are not enough on their own. We also need to think about what learners do differently afterwards, what evidence of that change might exist, and how we can communicate that clearly.
Final thoughts
Overall, the day offered a practical look at how learning professionals can move beyond surface-level measures and take a more joined-up approach to evidence and impact. I came away with useful ideas, but also a reminder that this work often starts with asking better questions, looking more carefully at the data already available, and being clearer about what kind of change we are actually trying to demonstrate.