Training Quality over Quantity
Redefining Training Under Standard 1.1
Article 2 – Newbery Consulting – 2025 Standards Article Series
How did we get here?
To understand the current requirements of Standard 1.1 and why it matters, we need to look back at how the sector arrived at this point. For almost a decade, training delivery in vocational education has been shaped by a single question: how many hours of training are enough?
This focus on “amount of training” emerged in the 2015 standards, when the regulator began enforcing the requirement that RTOs provide a “sufficient amount of training”. The intent of this change in policy at the time was certainly warranted. It sought to ensure students were receiving sufficient training and to combat the reputational damage being done to the sector by ridiculously short-duration courses that were not leading to the transfer of skills. The sector quickly adapted, but the result was a compliance culture built around defending hours on paper rather than designing quality training.
The issue quickly became one of documentation. RTOs were expected to produce detailed evidence showing how their delivery met minimum hour requirements. Course programs became the main vehicle for demonstrating this, with the regulator scrutinising whether the total hours aligned with expectations. While this did push providers to think more carefully about scheduling and volume, it also dragged the sector into a numbers game. It became a quantitative exercise rather than resulting in quality improvements.
Many RTOs found themselves trapped in this quantitative model. Considerable effort went into spreadsheets, tables, and justifications, while the more fundamental question about training quality was lost. What started out as an initiative to push for longer training courses quickly made it clear that the mechanism in the standards was focused on the wrong thing. The 2015 standard was attempting to treat the symptom rather than the disease. The disease was an absence of quality training design. It also created confusion, as different auditors applied different benchmarks and providers struggled to keep up with the ever-changing interpretations of the national regulator.
Enter the 2025 Outcome Standards and Standard 1.1 as a deliberate step away from that quantitative mindset. Instead of counting hours, the standard asks whether training is engaging, structured, paced, resourced, and supported by effective training techniques. It shifts the conversation from quantity to quality. This does not mean hours are irrelevant. Students still need “sufficient time for instruction”, but the focus is no longer on defending numbers. The focus is on whether the design and delivery of training genuinely enable students to acquire and practice the skills and knowledge required.
In my conversations with clients, I describe this as a shift from a quantitative focus on the number of hours to a qualitative focus on the quality of the training being delivered. Personally, after 30 years as a practitioner in VET, I would observe that this is easily the most significant shift in the regulation of “training” we have seen. You can go back through the standards we have had over that time, from the introduction of the Australian Recognition Framework in 1998 through the subsequent sets of standards in between (QETO 1998, AQTF 2001, 2005, 2007, 2010, NVR 2011, SNR 2015), and the expectation of training has always been limited to needing sufficient learning materials. The 2015 requirement for a “sufficient amount of training” was the previous high-water mark. Now, we have leaped forward with a dramatic positive change: a focus on the quality of the training design. I love it!
In short, we got here because the old model was not working. The sector spent years trying to measure compliance through hours, only to discover that numbers on a page do not guarantee quality. Standard 1.1 represents the course correction: moving from a quantitative compliance framework to a qualitative standard of training that works in practice. So, at the beginning of this new era, it looks positive.
Why Course Structure Matters
One of the most fundamental responsibilities of any training organisation is to provide students with a structured and coherent learning experience. This goes beyond simply enrolling students into a qualification and assigning units of competency. Course structure is about designing a pathway that makes sense, that builds knowledge and skills in a logical sequence, and that gives students the right balance of instruction, practice, and assessment. When this structure is missing, training is often fragmented, rushed, or inconsistent, and the result is poor retention, low satisfaction rates, and skills outcomes that are not transferable to the workplace. All too often when I am reviewing a client’s training, I find there is no designed training structure or activities. Instead, the RTO relies on the trainer just delivering the content straight out of a commercial learner guide and PowerPoint. When I ask the RTO, “But how are they learning the skills? What training activities are they completing?”, I get crickets. If this scenario describes you, this article is critical for you right now.
The new Outcome Standards for RTOs 2025 are a turning point for vocational education and training. Standard 1.1 requires training providers to deliver courses that are properly structured and paced to support the student’s ability to progress. It requires training to be sequenced with assessment and supported by suitable training activities and resources. These requirements are expressed as outcomes the RTO must be able to demonstrate. This is at the heart of what makes a vocational program credible and compliant. Yet in my work across the sector, I see this requirement misunderstood or, more often, ignored. Too many RTOs view the course structure as a compliance document for audit purposes, not as the central design tool for delivery. Most frequently, a training organisation only has what I call a “units and dates document”. That is the limit of the delivery structure: a table of units of competency with an allocated date for when each will be delivered. Is there a detailed course program or timetable? Usually not. Are there at least detailed session plans that explain what training is being delivered? Very unlikely.
It is worth pausing on why this matters so much. Vocational education is about preparing people for real work. Students are not just expected to memorise information but to perform practical tasks safely and competently. That requires practice and feedback, and it requires each new skill to be introduced at the right point in the journey. A student cannot be expected to troubleshoot faults on an engine, for example, if they have not first been trained to use workshop tools. They cannot be asked to lead a workplace project if they have not first been shown the fundamentals of communication and planning. These are obvious truths, but they are often forgotten when training is thrown together without a clear program.
When I started my career in VET as an Army instructor, the importance of structured training was drilled into me. I learnt to plan lessons with clear objectives, to select instructional methods suited to the skill being taught, and to ensure that every step was connected to the next within a structured course program. Those foundations still shape the way I look at training today. What we need now is to see those same principles applied more widely in the sector. We need courses designed with a deliberate structure that guides students from simple to complex, and that incorporates scaffolding so each stage of learning builds on the last. In reality, only a small minority of RTOs achieve this consistently, but it is exactly the type of design approach that Standard 1.1 expects.
The regulator has been fairly consistent over the past 5-6 years on the need for greater detail on the course structure and the planned learning and assessment. Anyone who has undergone a performance assessment in recent years will be familiar with the evidence request in Training and Assessment Strategies for the details on the “topics delivered in each session and the time frames for each session”, “assessment task and due date of assessment” and “timetables for scheduled classes”. This specific wording emerged in ASQA’s standard evidence request in about 2017, after they had realised that they were losing the battle on improving “the amount of training” that training organisations were delivering. Personally, I liked this new approach because it did provide an impetus for clients to better define their course delivery arrangements. Where we are today, with the new and much better elaborated requirement, is really just a continuation and extension of this regulatory expectation.
The truth is that course structure matters not just for compliance but for credibility. Employers who partner with an RTO want to know that their staff will be trained in a systematic way, not in a piecemeal fashion. Funding authorities want to see evidence that public money is being spent on programs that are properly designed. Students themselves want to feel that their time and fees are being invested in a program that has direction and purpose. A well-structured course communicates all of this. In this article, I want to explore what a quality course structure looks like in practice. We will look at sequencing, pacing, training techniques, and the role of resources. We will consider what the regulator expects and where RTOs often fall short. Most importantly, we will look at the practical steps that training organisations can take to design programs that genuinely prepare students for the workplace. This is not about adding more paperwork. It is about getting back to the fundamentals of training development: analysing, designing, delivering, and supporting students to learn skills that matter. Good training starts with good structure.
Sequencing and Progression
If course structure is the foundation of training, then sequencing is the frame that holds everything together. It determines the order in which skills and knowledge are introduced, practiced, and assessed. Done well, sequencing gives students a clear sense of direction and builds their competence step by step. Done poorly, it leaves students confused, underprepared, and often overwhelmed.
The principle behind sequencing is straightforward. Students should start with the basics before moving on to more complex tasks. This is sometimes referred to as a “simple to complex” approach. It means that underpinning knowledge is taught first, core practical skills are established early, and only then do we introduce higher-level problem solving or complex tasks. It sounds obvious, yet many course programs fail to reflect it. I regularly see training and assessment strategies where the sequence of units is lifted directly from the qualification packaging rules, as if the order listed in training.gov.au somehow represents a logical pathway. It does not. Those rules simply prescribe what must be included, not how it should be taught. I see CRICOS courses at diploma level routinely being delivered using a rolling enrolment model where students can literally enter and start the course with some of the most complex units without the benefit of earlier foundational training. It might be good for the business model but in some qualifications, as far as quality training delivery is concerned, it is a joke.
Simple to complex is by far the most common and the most easily understood sequencing approach, but it is not the only one. Training designers can also draw on:
- Spiral sequencing, where learners revisit the same skills or topics multiple times, each time at greater depth or complexity.
- Work-process or task-based sequencing, where the structure mirrors the actual flow of workplace tasks or job roles. I love this.
- Mastery learning sequences, where content is broken into small units, and students progress only once each step has been fully achieved.
- Backward-designed sequencing, where the program is built by starting with the required outcomes and assessment evidence, then working backwards to plan the learning path.
- Project- or problem-based sequencing, where authentic workplace projects or problems drive the order and integration of skills and knowledge.
These alternatives demonstrate that sequencing is not a one-size-fits-all exercise. The right sequence depends on the qualification, the industry context, and the learner cohort. Apprenticeships delivered primarily in the workplace may follow a different sequence to classroom-based courses. A program designed for existing workers with prior knowledge may sequence differently to one designed for school leavers. The principle of logical progression holds true in every case, but how it is applied requires professional judgement. This is where the training designer’s subject matter expertise is critical. Sequencing should not be outsourced to administrators or left to the assumptions of commercial resource publishers. It must be owned by trainers who understand the industry and the skills being developed.
Good sequencing also helps trainers manage the classroom. A cohort that starts with simpler tasks builds confidence and cohesion. Students see progress, which motivates them to keep going. Trainers can gradually introduce more challenging activities as the group’s capability grows. This momentum is lost if the sequence is random or poorly considered. Students who are thrown into complex tasks too early will struggle and lose confidence. Those who are held back with content that is too simple for too long will disengage. Getting the balance right is one of the core arts of training design.
ASQA has reinforced the importance of sequencing for years. Audit reports consistently identify the absence of a documented course program as a critical gap. A proper course program does not just list units, it shows when they are delivered, what training activities are planned, how they connect with the next unit, and how students progress from beginning to end. It demonstrates that the RTO has considered the learning journey, not just the compliance requirements. Logical sequencing is evidence of a systematic approach to training delivery. Without it, the RTO cannot demonstrate that “training is well-structured and enables VET students to attain skills and knowledge consistent with the training product”.
One of the more subtle points about sequencing is that it is not only about the order of units. It is also about the order of activities within each unit. A well-designed session plan follows the same principle: explain, demonstrate, practice, and then assess. This is where the concept of scaffolding becomes critical. Scaffolding refers to the way trainers break complex skills into manageable steps, guiding students through each stage with structured support until they can perform independently. Demonstration performance methods are, in fact, a form of scaffolding. The trainer demonstrates the task, explains the process, supports the student as they practice, and gradually reduces assistance as competence grows (Explain, Demonstrate and Imitate is an alternative method). The logic is the same at both the micro and macro level. Students learn best when training builds in layers, with each step preparing them for the next. Whether within a single lesson, across a unit, or over the span of an entire qualification, effective scaffolding ensures that students are not left to leap from theory to assessment unaided, but instead progress through a structured sequence that builds their confidence and competence.
Finally, sequencing connects directly to assessment. The assessment conditions of each unit describe what must be demonstrated, but unless the training has been sequenced to prepare students for those tasks, assessment will be invalid. If students are expected to demonstrate complex tasks without having first acquired and practiced the underlying skills, then the assessment is not a fair test of competency. This is why the course program must always show training being completed before assessment begins. It is a cascading structure: knowledge and skills are introduced, practiced, reinforced, and only then assessed.
When I review course documentation, sequencing is often the area that reveals the most about an organisation’s approach. RTOs that take it seriously show careful thought, industry consultation, and a clear logic in their program. RTOs that treat it as an afterthought expose themselves to non-compliance and, more importantly, set their students up for failure.
Pacing – Supporting Student Progress
If sequencing is about the order of training, pacing is about the speed. It answers the question: how much time do students need to properly acquire, practice, and consolidate each skill before they move on? The standards now explicitly require that training is paced to support the student’s progress. That phrase is deceptively simple, but it goes to the heart of quality delivery.
Pacing begins with a realistic view of the student cohort. A class of school leavers will not move at the same rate as a group of experienced workers. Even within one group, students progress at different speeds depending on their prior knowledge and learning ability. Good pacing recognises these differences and ensures that the average student has enough time to keep up without being rushed, while also keeping the course efficient and engaging for those who learn more quickly. The challenge for trainers and course designers is to strike this balance. Training cannot be so compressed that students feel overwhelmed and unprepared for assessment. Equally, it cannot be so extended that students lose momentum, disengage, or face unnecessary costs. Pacing is therefore about finding the point where training is both supportive and efficient.
One of the best ways to think about pacing is in terms of practice. Students need time not only to hear about a skill but to practice it under supervision. If the schedule only allows for theory delivery and a single run-through before assessment, then the training is not truly paced to support skill development. Practice is where learning is consolidated, and practice takes time. This is especially true when specialist equipment is involved. A group of students learning welding cannot all practice at once if there are only two welding bays. The trainer has to rotate students through, meaning more time is needed to ensure each student gets adequate exposure. Ignoring this reality is one of the fastest paths to non-compliance.
Feedback is another key factor. Students need time to receive meaningful feedback on their practice, reflect on it, and try again. Rushing through this cycle breaks the learning process. The standards require feedback because it is inseparable from pacing. Without time for feedback, pacing is too tight.
Pacing is also influenced by the delivery mode. Online courses, for example, often assume students can self-pace. While this can work for knowledge-based content, it can easily become problematic when applied to practical skills. Without structured opportunities for practice and feedback, online pacing can leave students stranded. On the other hand, classroom-based training gives the trainer more control over pacing, but also requires careful scheduling to ensure sessions are not too rushed or too drawn out. In practice, getting pacing right means being explicit. It is not enough to say “students will have time to practice.” Course documentation should show how much time is allocated, what activities will take place in that time, and how trainers will monitor student progress. A proper course program will demonstrate that training activities occur before assessment, that students are given repeated opportunities to practice, and that feedback is built into the timetable. This level of clarity is how the RTO demonstrates Standard 1.1(2)(c) “training is structured and paced to support VET students to progress, providing sufficient time for instruction, practice, feedback and assessment”.
The danger of poor pacing is twofold. If training is rushed, students may pass assessments superficially but lack the depth of competence needed in the workplace. If training is drawn out without purpose, students may drop out or become disengaged. Getting pacing right is harder than it looks. It requires trainers and training designers to be realistic about the time students need, and RTOs to resist the temptation of short courses that promise speed over substance. But when pacing is done well, the difference is obvious. Students are more confident, assessments are more valid, and graduates are more capable.
Instruction, Practice, and Feedback
Standard 1.1 also insists that students are given sufficient time for instruction, practice, and feedback. These three elements form the backbone of effective training. If any one of these components is missing, students are left underprepared for assessment.
Instruction is the starting point. It is where the trainer introduces new knowledge or skills, explains concepts, and sets the foundation for learning. In practical training, instruction often follows a demonstration and performance method of instruction (explained earlier), where the trainer shows how something is done, talks through the process, and then gets the student to practice. In more theoretical contexts, instruction may involve presentation, discussion, or problem-solving activities. Regardless of method, instruction must be clear, accurate, and pitched at the student’s level.
Students must be given time to practice. This is where they translate knowledge into skill. Practice should be active and hands-on, not simply watching or listening. The quality of practice is closely tied to the availability of resources. If ten students are sharing one piece of equipment, then only one student is practicing at any given time while the rest are waiting. Unless adequate time has been scheduled, this kind of arrangement leaves some students without the chance to meaningfully develop their skills. It is why proper course planning has to consider trainer to student ratios, facilities, and equipment availability.
Feedback is the bridge between practice and competence. It is where the trainer observes performance and gives targeted guidance. This is not assessment, where the trainer judges competence. Instead, feedback is formative. It points out what the student did well, what needs improvement, and how they can improve it. Feedback works best when it is specific and immediate. A trainer who says “good job” provides little value. A trainer who says “your grip on the tool is too tight. Loosen your grip a little so you have more control” gives the student something they can use in their next attempt. The ability to give constructive feedback is the difference between a trainer who has confidence and industry experience and one who does not.
Feedback can also encourage self-reflection. One common approach is to ask the student to evaluate their own performance before the trainer adds their observations. This reinforces learning, builds confidence, and helps students take ownership of their progress. From a compliance perspective, the requirement for instruction, practice, and feedback pushes RTOs to move beyond generic timetables. The regulator will expect to see evidence that these elements are planned and delivered. A timetable that lists “training” without specifying what actually happens is not enough. Auditors often ask the RTO to explain how they ensure every student has time to practice and receive feedback. If the RTO cannot demonstrate this, the RTO risks being found non-compliant.
The most effective way to safeguard against this is through detailed session plans or training plans. These documents map out the sequence of activities within each unit of competency, showing where instruction occurs, how practice is organised, and how feedback is provided. They also demonstrate alignment between training and assessment. One critical principle here is that no student should ever be required to perform a skill in assessment that they have not already had the chance to practice during training. This point cannot be overstated. Expecting students to demonstrate competence without prior practice is both unfair and non-compliant.
The interplay between instruction, practice, and feedback also shapes the learning culture. When students know they will be given time to try, make mistakes, and receive constructive feedback, they are more willing to engage. Conversely, when they feel rushed or unsupported, anxiety increases, engagement drops, and outcomes suffer. Instruction, practice, and feedback may sound obvious, but their absence is one of the most common gaps in non-compliant RTOs. Too often, training programs are reduced to lectures and assessments, with little space for guided practice or meaningful feedback. Standard 1.1 is clear: an RTO must show that its training allocates sufficient time for instruction, practice, and feedback, and that its training techniques, activities, and resources actively engage students and strengthen their understanding.
Training Techniques and Activities
Standard 1.1 explicitly requires that RTOs apply training techniques and activities that engage students and support the development of knowledge and skills. This may sound straightforward, but it addresses one of the weakest areas in current practice. Many trainers today have not been taught how to deliver instruction effectively. The sector has often focused more on assessment compliance than on the quality of training. The result is that training sessions can lack structure, variety, and practical technique. Standard 1.1 aims to correct this imbalance.
One of the most effective and well-established approaches is the demonstration and performance method of instruction. I have touched on this a little already, but it is worth coming back to. I was introduced to the demonstration and performance method of instruction in the Army. It is (or was) a mandatory requirement to learn and demonstrate competency in this method in order to be promoted to Corporal (Junior Leader). As a recruit instructor, it became my tool of the trade. Almost every lesson was built around a variation of the demonstration and performance method of instruction. This model of instruction remains highly relevant to vocational education today. The idea is simple but powerful: the trainer demonstrates the skill, explains what they are doing, and then gives students the chance to imitate and practice. Two common variations are:
- DEP: Demonstrate, Explain, Practice. The trainer first demonstrates the task, then explains the key steps and principles, and finally provides opportunities for students to practice. Great for less complex skills where you expect that the student will pick up the skills quickly.
- EDI: Explain, Demonstrate, Imitate. The trainer explains the skill, demonstrates it in action, and then guides students to imitate the task themselves. Best for more complex skills where greater control is required or complex equipment is being used.
These methods are particularly effective because they combine visual, auditory, and kinaesthetic learning. Students see, hear, and do. More importantly, they learn under guidance, with immediate opportunities for feedback.
Training techniques should not be limited to demonstration alone. Standard 1.1 expects a mix of methods that engage students. This might include:
- Group discussions to explore knowledge requirements.
- Case studies to apply principles in simulated contexts.
- Role plays to develop communication and interpersonal skills.
- Practical simulations using work scenarios, equipment or workplace documents.
- Online modules for knowledge acquisition, supported by face-to-face practice.
The key is variety and alignment with the training product. Techniques should suit the skills being taught. Cognitive skills may lend themselves to analysis and discussion, while manual skills demand hands-on practice. Well-chosen training activities also play a role in engagement. A session that alternates between demonstration, group activity, and individual practice keeps students active and alert. By contrast, long stretches of passive listening rarely produce meaningful learning. Engagement is not about entertainment, but about participation and getting students to think, act, and reflect.
Another dimension is contextualisation. Training activities should reflect real workplace requirements. Too often, students practice tasks in isolation without the supporting documents, procedures, or conditions they would encounter in the workplace. This means incorporating simulated workplace documents, organisational policies, and realistic scenarios into training activities. Without these, practice becomes detached from reality, and students are ill-prepared for assessment or employment.
The takeaway from this section is this: training techniques and activities cannot be left to chance. They must be intentionally designed, documented, and implemented. A compliant training program shows not only the topics covered but also the techniques used to deliver them. It provides guidance on how the students will learn the skill and what activities they will do. It should also explain how they will practice and receive feedback. When training techniques are properly applied, the difference in student outcomes is significant. Students engage more deeply, retain skills more effectively, and enter assessment with genuine confidence. For trainers, it also makes delivery more rewarding. Instead of struggling to hold attention, they see students actively participating and progressing.
Conclusion and Practical Guidance
There are other aspects of Standard 1.1 that I have not covered here. These include the obvious requirement that the training is consistent with the requirements of the training product, and the requirements relating to work placement (a topic for another article).
As I observed earlier, Standard 1.1 of the Outcome Standards represents a turning point for the sector. It shifts the focus from hours on a timetable to the quality of training design and delivery. For almost a decade, the conversation has been dominated by how many hours of training, how many supervised activities, how to justify allocations. The result was often compliance-driven paperwork rather than meaningful course design. The new standard changes that dynamic completely. RTOs need to think qualitatively about whether their training is engaging, structured, paced, well-resourced, and supported by effective training techniques.
For some providers, this will require a cultural shift. It is no longer enough to provide a generic schedule and rely on the initiative of the trainer. The regulator will look for evidence that training is intentionally designed to help students build skills step by step. The language of Standard 1.1 (engaging, structured, paced, sufficient time, feedback, techniques, instruction, practice, resources) all points to the need for quality training design.
The challenge for RTOs is to translate these broad requirements into practice. That begins with course documentation. A compliant program does more than list units and hours. It shows:
- A logical sequence of training that moves from simple to complex and ensures training occurs before assessment.
- Clear pacing arrangements that give students enough time to acquire, practice, and consolidate skills without dragging the course unnecessarily.
- Explicit detail on instruction, practice, and feedback, showing how each element is built into sessions.
- A variety of training techniques and activities that engage students and match the skills being taught.
Just as important is the professional capability of trainers. Standard 1.1 assumes that trainers are not simply assessors but educators who can deliver engaging sessions. This means investing in professional development. Many trainers have never been taught demonstration and performance methods of instruction or how to structure scaffolding into training. Building this capability is essential for compliance and for quality.
From a practical standpoint, RTOs can take several steps now:
- Review your training documentation. Does your course program show clear sequencing, pacing, and activities?
- Check your resourcing. Do students have access to the same equipment and documents in training that they will need for assessment?
- Review your training techniques. Are sessions built around demonstration, practice, and feedback, or do they rely too heavily on passive delivery?
- Review your trainers’ skills. Do your trainers need development in instructional techniques, scaffolding, practice coordination, and feedback methods? Trainers need practical tools, not just compliance briefings.
- Test the student experience. Walk through a typical course from the student’s perspective. Is the training engaging, structured, and paced?
Ultimately, Standard 1.1 is not about making training harder for RTOs. It is about lifting training quality across the sector to where it should have been all along. Students deserve training that prepares them properly for assessment and the workplace. Employers deserve graduates who can perform the skills required. RTOs deserve the credibility that comes from delivering courses that result in skilled and confident workers.
I do honestly think this is a game changer. That’s why I have raised this issue as the second article in what will be a long series of articles to support the sector to adjust to the new standards. You need to be prepared for the hard questions about the structure and the quality of your training. This is the time to get on it if you haven’t already.
Good training,
Joe Newbery
Published: 24th September 2025
Copyright © Newbery Consulting 2025. All rights reserved.