Knowledge-Centred Content to a Learner-Centred Process.
Amidst my intrigue that a regulator would use a ‘big-stick’ approach to produce high-quality outcomes in the training sector, I am comforted that we now have a Minister representing our industry who desires to implement an ‘educative’ approach. Educate the education industry!
However, we must not relax: this is not a licence to do as we please. Instead, we will be expected to demonstrate that we have a process in place that will produce high-quality training outcomes.
Amongst the ‘noise’ with which ASQA has surrounded its regulatory work through audits and the constant introduction of ‘new rules’, one Standard has always stood out from the others: Clause 1.8. Like many in the industry, I have been critical of the use of Clause 1.8, either as a ‘last-resort’ option for finding a non-compliance at audit, or simply as support for the claim that few RTOs undertake assessment according to the Standards. Certainly, the statistics that ASQA presented in its 2018 road-show suggested that most RTOs do not know how to assess students:
RTOs that passed Clause 1.8 at audit: 28.5% (less than one-third)
RTOs that passed Clause 1.8 after audit rectification: 46.7% (less than half of the RTOs audited)
In the past I have claimed that these statistics suggest a problem with one or more of the following:
1. RTOs’ understanding of Clause 1.8;
2. The Regulator’s interpretation of Clause 1.8; or
3. The legislative Act itself – being a ‘bad Act’ in relation to Clause 1.8.
Upon reflection, and after reviewing the ASQA publication “Guide—Developing assessment tools, updated 1 April 2015”, I now suggest that poor assessment practice is a major problem within many RTOs, and that ASQA should direct its future focus to Clause 1.8, within a framework covering both assessment development and implementation and the actual delivery of courses. It should, of course, use an ‘educative approach’ to monitor and guide RTOs.
The “Guide—Developing assessment tools, updated 1 April 2015” is a reasonably sound document that skirts around the idea that assessment tools should be developed through a ‘process’. However, it focuses too narrowly on assessment tools themselves and presents a ‘micro-management’ approach to their development, without acknowledging that assessment should be embedded in the complete delivery system. It emphasises that industry representatives should be consulted to ‘critique’ the assessment tool for clarity, content accuracy, relevance, and appropriateness of language for the learners. It has always amused me that an ‘industry representative’ with no experience in training can determine whether an assessment tool is suitable for a particular student group they have never met and of whom they have little knowledge.
I asked a group of industry representatives whether they would turn to the VET system to train their staff. The response was, “Hell no! We want instruction that is designed around our organisation’s needs, considers our staff’s current knowledge, factors in their learning styles, and develops content that our staff will find useful and motivational.”
They insisted they would hire an ‘Instructional Design’ person or team to develop their training programmes rather than engage a VET provider.
I have noticed in job advertisements that Instructional Designers are mainly sought by non-VET-sector organisations, and they seem to be paid two or three times the rate of a trainer.
I have also noticed that most RTOs either purchase training and assessment tools, which they possibly ‘contextualise’, or develop them in-house with a content-centred focus rather than a learner-centred process. I accept that some larger RTOs and TAFEs may have the luxury of employing Instructional Designers to develop their courses, although in 18 years of employment at a large TAFE, I did not witness it in that organisation.
I Googled ‘Instructional Design’ to discover what I needed to know to implement an Instructional Design process into the VET industry.
My first surprise was that the references to, and descriptions of, the early researchers in Instructional Design were of people whom I had studied in my teaching degree back in the 1970s and 1980s. In that course, we didn’t necessarily refer to them as Instructional Design researchers. With so much focus on ‘compliance’ and ‘ticking boxes’ in the VET sector, I had forgotten the importance of implementing a process to design great courses for VET students.
Instructional Design models such as ADDIE (Analyse, Design, Develop, Implement, Evaluate) and the Systematic Design of Instruction, also known as the Dick and Carey model, are worth revisiting.
However, how can these models be integrated into the VET system when we know little about the prior knowledge and experience of our students until they enrol, and have little insight into the employment for which they are being trained? The Standards demand that we gather information about prospective students prior to, or at the time of, enrolment. Yet most RTOs do not undertake this with great diligence, and mainly focus on LLN and special needs. Do we give our new students an assessment of their ‘learning styles’, such as the very simple Honey and Mumford instrument?
Would it be possible to spend the first few training sessions in VET focused on gaining an understanding of students? Would it be possible to develop the course and assessment tools around the student learning process, as the course progresses?
Courses always need some prior planning with a framework that identifies outcomes or objectives. Could the Training and Assessment Strategy be a modus operandi framework rather than a static blueprint?
Would ASQA allow us to be totally ‘learner-focused’? Can the Principles of Assessment and Rules of Evidence be moulded to individual differences in students and learning styles?
I don’t currently have all the answers to designing the perfect course in VET; however, I hope to describe my endeavour to overlay an Instructional Design model in future blogs. When I studied engineering in the late 1970s, the engineering faculty declared that there was too much information available for engineers to acquire, so they adopted a strategy of training engineers to seek the information they needed at any specific time. It seems that teaching people ‘how to learn’, rather than conveying knowledge, is the educative framework required in our current era. Today’s world is even more complex than 20 years ago, but has far greater instant access to information. Can we re-invent the VET sector under the directive of the new Minister for Education?