
Saturday, 14 June 2025

Stop Confusing Learners: The Connection between Consistent Design and Cognitive Load

    For a long time I've been frustrated by the way the IET, particularly in its On-Site Guide (OSG), sets out information. I'll give just one example. In the catchphrase of the Irish comedian Jimmy Cricket, there's more, but one will hopefully suffice. This example is taken from 'Section 10, Guidance on initial testing of installations, Insulation Resistance'; in particular, the OSG's guidance on the pre-test checks necessary before carrying out an insulation resistance (IR) test. This is what it looks like:

[Image: extract from the On-Site Guide, Section 10, showing the pre-test checks for an insulation resistance test]

(i) Looking not at the information, but at the typographic manner in which the information is presented, we can see that the text:

(a) Has a bold heading.
(b) Is formed into a list.
(c) Is numbered with a lower case Roman numeral.
(d) Is indented from the left-side margin.
(e) And is further indented and numbered with a lower case letter (a & b) for each check.

    The information is easy to identify because it stands out on the page, and easy to follow because it's split up into easily digestible chunks of instructional information. So when an apprentice, who's trying to carry out this test with neither instruction, guidance, nor feedback, needs some assistance, this is a useful and helpful place to look. The problem is that it's incomplete. There are a number of further pre-test checks that need to be done, but these are not included in this list. The missing checks are:

  • Located on the following page.
  • Tucked away in the section headed "Test".
  • Not written in bold.
  • Not set out in a list like the previous checks.
  • Not numbered.
  • And not indented.

This is what it looks like.

[Image: extract from the On-Site Guide's 'Test' section, where the remaining pre-test checks appear as plain running text]

    And as well as not standing out typographically, this information is located in a section that gives guidance on how to carry out a test on a single-phase circuit. If you were testing a three-phase circuit, you probably wouldn't think to look here. The information an apprentice, or indeed an electrician, might need to prepare a circuit for an IR test is therefore fragmented and typographically inconsistent. The OSG tacitly acknowledges the need for guidance, then seems almost to go out of its way to deliver that guidance badly. I've written elsewhere about the way technical/manual work is regarded. This lack of attention to design detail hints that, at some deeply hidden level of consciousness, the OSG writer thinks the information isn't worthy of respect.

consistency enables people to efficiently transfer knowledge to new contexts, learn new things, and focus attention on the relevant parts of a task (Lidwell)

The Principle of Consistency

      The fundamental principle of interaction design that is broken here is that of consistency. Consistency allows us to identify patterns and give them meaning, which in turn enables us to make sense of our experiences, to predict what might come next, and to decide what choices to make in order to pursue our goals. William Lidwell, in his book Universal Principles of Design, explains that consistency ‘enables people to efficiently transfer knowledge to new contexts, learn new things, and focus attention on the relevant parts of a task’. More specifically, the design fault in the On-Site Guide is a lack of internal consistency, which, as Lidwell points out, undermines rather than cultivates trust. It communicates to the user that the system has not been carefully designed and is more likely to have been “cobbled together”. As a consequence, users lose faith in the book and, because they can never be sure they've got all the information they need, in themselves.

    The value of internal consistency for the educator is that it doesn’t just group the aesthetic and the functional together in one consistent whole; it makes the aesthetic a feature of the functional and helps teach through design. And that's exactly what you need when you’re looking to the OSG for guidance and support: you want to find your spot and gather your information from that spot. Instead, what happens is that, at worst, the user misses vital information and doesn’t carry out the test correctly; at best, if they're fortunate enough to find both pieces of information, their attention is split between the two. And that leads to unforced errors, circuits not tested properly, and assessments failed.

Designing to Focus the Learner's Attention

      In vocational training, learners often need to integrate information from various sources, such as diagrams, text, and instructions, to master practical skills. I encountered this precise problem with the specification for an electrical installation that was inherited from previous educators. Information for each of the circuits was scattered throughout the document, making it almost impossible for an apprentice to find everything required, because they couldn't for the life of them tell where any given piece of information might be hiding. Finding a single detail meant reading, re-reading, and reading the whole spec again, every single time. The learner's working memory became overloaded trying to mentally connect the disparate information instead of focusing on understanding the skill itself. Consequently, and not surprisingly, information was missed and tasks were left undone. Again, the problem lay not with the reader, whose attention was split across too many points, but with the design of the specification. It was as if the learner had been tasked with collecting water in a sieve.

    The solution, of course, was simply to redesign the specification so that all the information for each individual circuit was presented in a typographically consistent way and collated in one specific place. In other words, the specification was designed to focus the reader's attention rather than split it.
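To make that concrete, here is a purely illustrative sketch of a collated entry (the circuit and all of its details are invented for this example, not taken from the actual specification):

  Circuit 3 – Kitchen ring final
  • Cable: 2.5 mm² thermoplastic twin-and-earth
  • Protective device: 32 A Type B RCBO
  • Points served: 8 socket outlets
  • Tests required: continuity (r1, rn, r2), insulation resistance, polarity

Everything the learner needs for that circuit sits under one heading, in one place, in one consistent format; there is nothing left to hunt for.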

The Split-Attention Effect

      The split-attention effect, identified by John Sweller as part of his Cognitive Load Theory, describes a situation where a learner must divide their attention between multiple sources of information that are presented separately in space or time. This separation requires learners to mentally integrate the disparate pieces of information, which consumes valuable working memory resources. Essentially, when information is not physically or temporally integrated, learners experience an increased extraneous cognitive load. This load is unproductive for learning because it's directed towards mentally connecting the fragmented information rather than understanding the content itself. As a result, less working memory capacity is available for processing and encoding the actual material, leading to less effective learning.

Physically integrating disparate sources of information so that they no longer have to be mentally integrated reduces extraneous cognitive load and facilitates learning. (Sweller)

    The problem with the spec for the electrical installation mentioned earlier was that information was spread across multiple pages: some of it was here, some of it was there, and other bits were somewhere else entirely. The learner had to constantly switch their attention between these places and hold information in working memory to make the required connections. This split attention hindered the learning process compared with the redesigned version, where the information is integrated in one place. By physically and temporally integrating this information, we reduced extraneous cognitive load to optimize learning and performance. In effect, the educator bore the complexity of the design, allowing the user to experience its simplicity. (I’ve written elsewhere about Tesler’s Law in education.)

    The split-attention effect highlights the importance of instructional design that minimizes the need for learners to mentally integrate separate but related sources of information. By being mindful of the split-attention effect, instructional designers in vocational education can create more effective learning materials that reduce extraneous cognitive load and allow learners to focus on acquiring the necessary skills and knowledge.


------------------------


Sources

* Doughton, M. (Ed.) (2022) On-Site Guide BS 7671:2018+A2:2022, The Institution of Engineering and Technology, London
* Lidwell, W. (2003) Universal Principles of Design: 100 Ways to Enhance Usability, Influence Perception, Increase Appeal, Make Better Design Decisions and Teach through Design, Rockport Publishers, Massachusetts
* Sweller, J. (2016, February 10) Story of a Research Program. In S. Tobias, J. D. Fletcher, & D. C. Berliner (Series Eds.), Acquired Wisdom Series. Education Review, 23. http://dx.doi.org/10.14507/er.v23.2025

Saturday, 12 April 2025

The Unseen Hand of Complexity: How Tesler’s Law Shapes Effective Assessment Design

It all started with a question. Maybe it was meant to be an insult:

"Are you sure you aren’t making this more complex for yourself than you need to?"

I wasn't sure how to answer. Today, I would reply emphatically, 

“Yes! That’s exactly what I’m doing!”

Tesler's Law

    Last week, I discovered Tesler’s Law. From that moment, all the problems I’d been facing on an assessment design project simply melted away. In this post, I want to explore how Tesler’s Law of Conserved Complexity might apply to assessment design in education. Although I work in vocational education, the information will, I hope, be useful to anyone involved in assessment design.

    The pursuit of effective assessment is a constant evolution. It’s no wonder the word assessment begins with the small letter ‘a’! Bad Lacanian jokes aside, every educator, everywhere, strives to create assessment methods that accurately gauge student understanding, provide meaningful feedback, and ultimately drive learning forward. But all too often, we find ourselves wrestling with overly convoluted marking schemes, overly simplistic scoring systems, and labyrinthine processes that lead nowhere. Could there really be a guiding principle lurking beneath these design challenges? I didn’t think so. And then, out of the blue, there it was, right in front of me.

Tesler’s Law states:

"Every application has an inherent amount of complexity that cannot be eliminated or hidden."

Whilst originally conceived in the field of human-computer interaction, Tesler’s Law, also known as the Law of Conservation of Complexity, offers a surprisingly insightful and practical lens through which to analyse assessment design.

Come and see the complexity inherent in the system


    The complexity is always there. You can’t wish it away or design it away. Instead, it must be dealt with in one of two ways: either by the application itself or by the user. As educators, our "application" is the assessment itself, whether it be the test paper or the practical task to be completed competently. Tesler’s Law applies to any method we use to evaluate what students have learned, or are supposed to have learned. The "user" is both the student completing the assessment and the assessor tasked with evaluating it. That latter point, I think, was the most striking revelation of last week’s “discovery”. I had designed a process that looked incredibly simple, but all I’d done was shift the complexity onto the poor bloody infantry of assessors. It had seemed like a binary choice: simplicity or complexity. Once I started to think about the problem in terms of Tesler’s Law, I was able to shift the complexity elsewhere.

The Trap of Overly Simplistic Assessments: Student Burden vs. Limited Insight for Educators


    One might initially think that the ideal assessment is the simplest one possible. A quick multiple-choice quiz, for instance, seems straightforward. However, Tesler’s Law suggests that by pushing all the complexity onto the student, we might well be missing crucial insights. A multiple-choice question requires the student to:

  • navigate potentially nuanced options;
  • discern subtle differences;
  • guess strategically, or
  • all of the above.

    The complexity of understanding the underlying concept and applying it correctly rests entirely on their shoulders. And whilst multiple-choice assessments are quick and easy to mark and score, they often provide only a superficial picture of student learning. We know what students got right or wrong, but not necessarily why. They might have known the answer, or they might have guessed. They might even have guessed their way to a pass. Under the circumstances of these known unknowns, the complexity of diagnosing misconceptions and providing targeted feedback is significantly increased for the educator.
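A hypothetical question (invented here purely for illustration) makes the point:

  What is the minimum acceptable insulation resistance for a 230 V circuit?
  (a) 0.5 MΩ   (b) 1.0 MΩ   (c) 2.0 MΩ   (d) 5.0 MΩ

A student who picks (b) may know the requirement cold, or may simply have plumped for a plausible-looking middle value. The mark sheet records a correct answer either way; the reasoning behind it stays invisible.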

The Power of Well-Designed Complexity:

    Conversely, consider an assessment with a detailed rubric. Here, the educator takes on more of the complexity in the design phase.

Educator Investment: Crafting a clear rubric with specific criteria, levels of achievement, and guiding questions requires significant upfront effort. This is the educator absorbing the inherent complexity.

Reduced Student Ambiguity: A well-defined rubric clarifies expectations, reduces ambiguity, and guides students in demonstrating their understanding in a structured way. The cognitive load for the student shifts from deciphering the assessment's hidden requirements to actually demonstrating what they know.

Richer Data for Educators: The detailed rubric allows for a more nuanced evaluation of student work, providing richer data on their strengths and areas for growth. The complexity of analysis is managed by the structured framework.
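As a purely illustrative sketch (the criterion and descriptors here are invented, not drawn from any real marking scheme), a single row of such a rubric might look like this:

  Criterion: Safe isolation of the circuit
  • Pass: carries out every step of the procedure in the correct order, proving dead with an approved voltage indicator
  • Merit: as Pass, and can explain why each step is necessary
  • Distinction: as Merit, and identifies and reports any faults introduced into the test rig

The upfront effort of writing those descriptors is the educator absorbing the complexity; the student and the assessor then work from the same explicit expectations.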

Finding the Balance:

    The key takeaway from applying Tesler’s Law to assessment design isn’t about making everything complex or overly simplistic. It’s about consciously deciding where to put the inherent complexity. Effective assessment design strategically distributes this complexity to maximize learning and provide meaningful insights. We should always know where the complexity lies.

Here are some practical lessons that I’ve learned from my current design project:

Clarity is Key: I’ve invested a lot of time in this over the last few weeks. Crafting clear and concise instructions, rubrics, and expectations shifts the complexity away from the student. Instead of leaving the student to decipher the task, the educator should provide clear, upfront, and standardized guidance. Think inductions and briefings.

Structure for Success: Provide scaffolding and frameworks for complex tasks. Breaking down large tasks into smaller, manageable steps reduces the cognitive load on students. But this doesn’t just apply to the student; it also applies to the assessment process, such as the way the assessment items are written in the first place.

Targeted Feedback Mechanisms: Design assessments that allow for specific and actionable feedback. This helps educators address the underlying complexities of student understanding. It’s worth considering ways to automate a certain amount of the feedback, freeing educators to provide more detailed and nuanced context.

Consider the Learning Objectives: The complexity of the assessment should align with the complexity of the learning objectives. Higher-order thinking skills often require more complex assessment methods. Ensuring that the syllabus aligns with the assessment is a jolly good place to start.

Iterative Design: Don’t expect perfection. Assessment design is rarely right on the first try. Be willing to iterate and refine your methods based on student performance and your own reflections. It might mean a period of double-marking, but shouldering that weight early heads off more complex problems later. It also acknowledges and addresses the inherent complexity of measuring learning: students are endlessly inventive in the ways they get things wrong, so the assessment structure and processes have no option but to be dynamic.

Beyond Ease of Grading:

    Ultimately, Tesler’s Law reminds us that striving for the easiest assessment to grade might inadvertently place a greater burden on our students and limit the depth of our understanding of their learning. By consciously embracing and strategically managing the inherent complexity of assessment design, we can create fairer, more informative, and ultimately more effective methods for evaluating and fostering student growth. Let’s move beyond the illusion of simplicity and embrace the inherent complexity.