I was a performance analyst working with my first instructional design mentor. I had just joined Harkcon and was working for a military aviation training center in Mobile, Alabama. My mentor was an amazing man who had spent the majority of his life either in training programs (as a student, a pilot, or an instructor pilot) or designing them from the ground up. He was the Lead ISD, or Site Lead, for Harkcon.*
I was near the end of my first assignment. While all of the instructor pilots (called IPs) ended up benefiting from this material, we were making courses for Instructors Under Training (IUTs), pilots working to become instructor pilots themselves. I had been assigned to take the work my mentor had done teaching IUTs how to operate a helicopter simulator and “do the same thing” for a completely different set of IUTs, who needed to learn how to use their own multi-million-dollar simulator.
“Do the same thing” should have been impossible. Right? After all, how could I “do the same thing” – especially when this helicopter…
…was so very different from this one?
Actually, my mentor had made this first assignment an easy win (for which I am eternally grateful – I was less than a year out of grad school at this point). Because he had done a Job Task Analysis (JTA) on the MH-65 instructor pilots using their simulator, he had identified six key tasks instructor pilots had to perform in order to conduct a course event while facilitating a simulated environment. In other words, by carefully observing instructor pilots as they used the simulator to teach new pilots how to fly an MH-65C, he found there were six key things that had to happen for that instruction to be as effective as it could be.
In case you’re interested, the tasks were:
- Conduct a Student Briefing
- Determine Operational Status of the Trainer
- Perform Trainer Boarding Procedures
- Prepare the Trainer for the Training Session
- Conduct a Flight Training Session from the Instructor Operating Station (IOS)
- Terminate a Training Session
The reason getting these tasks defined in this manner was such a big deal is two-fold. First, instructor pilots had widely varying levels of ability when it came to using the simulator to facilitate a realistic, simulated environment while teaching their students. Operating the simulator while teaching is no easy feat and requires a completely different skill set than flying. Second, these pilots were the best the Coast Guard had to offer, but all they had to go by was a wordy, lengthy, next-to-useless “Instructor Utilization Handbook” (IUH) – something that should really just be called a technical manual – and a great deal of waning tribal knowledge. Something had to change.
Because my mentor had focused on performance tasks, and not on the equipment, I was able to “do the same thing” he did for the MH-65 instructor pilots. When I did my own JTA, following and observing MH-60J instructor pilots conduct course events, I was able to independently confirm these high-level tasks were the same for MH-60 pilots as they were for MH-65 pilots. This saved a great deal of time and greatly increased my chances for success on this project. I would not have been able to do the other work I’m about to describe if this level of transfer between simulated platforms had not occurred.
Now, as for how those tasks were completed, that’s something else entirely. Different versions of software, completely different simulators (not to mention the different helicopters they were made to simulate), and different safety requirements made the “hows” between these aircraft – at least at first glance – different to the point of befuddlement. I loved figuring this out. I loved learning how to operate the IOS to insert storms and aircraft and aircraft carriers into the simulated environment. I grew elated when I discovered safety interlocks none of the instructors knew about. I positively swooned when I figured out how to conduct Traffic Collision Avoidance System (TCAS) scenarios and how best to present them in the instruction.
I loved working with my Subject Matter Experts (SMEs) to wrestle with all of this stuff, as well. After one session, a humble hero and instructor pilot let me actually “fly” the MH-60J simulator. I’m a guy who grew up playing every flight sim game I could get my hands on – from Gunship to A-10 Tank Killer, and oh so many more before and after. So I cannot stress how amazing a moment it was when the instructor pilot gave me this privilege. He even used my cell phone camera to capture the moment on video.
I have to mention the deliverables for this project and how I built them. This being one of my first assignments, I didn’t really have much in the way of equipment. I had a standard Coast Guard Workstation III with standard software. This meant I was using MS Paint to edit photos and make graphics. In fact, all of my instruction was made with Microsoft products. An Interactive Courseware Specialist, and my first e-Learning mentor, let me borrow his camera to take pictures, but all of those pictures still needed to be worked and inserted into the instruction. It was a challenging and fun experience going back to pure basics – meaning no Adobe software, no authoring tools, just Word and PowerPoint to make good ol’ fashioned Instructor-Led Training (ILT).
But, back to my story. Like I said, I was near the end of the assignment. My course for training IUTs was very nearly complete, and I realized I had made some assumptions about how I would assess student performance. I had been planning to capture the tasks and performance objectives I had identified for MH-60J simulator operation in the same format and assessment tools my mentor/site lead had built.
However, I realized I didn’t have a clue how students were assessed in this community. I was trying to figure out the best way to assess my students’ performance (pretty important if you’re a performance analyst), and one of the major goals of any good JTA is to avoid making assumptions like the ones I had made. So I began asking questions.
Those questions led to more questions. Eventually I ended up talking to all the pilot branches at this training center about how they assess student performance. Some branches still used pencil and paper. Some branches used Microsoft Word. Some used Adobe Acrobat. All tried to capture student performance in courses using these lengthy forms called Daily Progress Reports (DPRs). I realized that there could be some major improvement in how these branches assessed student performance, especially since none of the forms actually reported to a database.
Additionally, while talking to a couple of IPs in the MH-60 branch office, I got to be a fly on the wall and hear the branch Standardization Officer say, “I spend 80 percent of my time dealing with these stupid forms!” When he had calmed down, I asked him exactly what the problem was. “These forms change every time someone with a different computer opens them. And people are always screwing around with the tables and adjusting them. And sometimes the forms just magically rearrange themselves, and I have to go back in and fix them.”
That was a performance improvement opportunity if ever I heard one.
So, I brought my own laptop in. I happened to have some experience creating interactive forms for my students back when I was an English teacher (I’ll post more on that later). So I opened up LiveCycle Designer and got to work. I showed my mentor how I was going to assess student performance in my course, and how I thought these new forms could actually be used to help all the branches. He really liked my idea and asked me to write something up. The next day I brought in a two-page paper detailing a three-phase plan for implementing these new forms. My mentor and I took this paper to our Performance Technology Branch (PTB) chief (and one of the best bosses I’ve ever had). The branch chief saw the value in the proposal and told us to move forward with it.
I decided to make a video letting new MH-60J IUTs (and any IPs who deigned to listen) know how the new IUT course would work and how the new forms worked. I figured a video would have the best chance of sticking around to fight waning tribal knowledge and let people know what was going on with these forms. That video, capturing all the work I’ve talked about to this point, appears below. Note I made all the videos, graphics, photos, and whatnot that are directly related to the instruction. Except for a few images that were already in the IUH and one I used from the branch shared drive (that really cool “rainbow” picture of a rescue in Elizabeth City), I created the content you see here. Of course I didn’t make the music (credits listed in the video).
If you watched the video, you saw the first form I created near the end. When I showed these new forms to the client, thanks to the video and some other work, there was already a lot of buy-in for them. That buy-in was crucial, because I worked with a lot of great people in the MH-60 branch to draft over 30 versions of the first forms used for assessing student performance – the check flight forms.
Eventually these forms were used to assess all performance in the USCG aviation community. From very large forms with hundreds of fields that were used to assess student performance in courses, to small forms that assessors used to validate performance for all the Coast Guard stations in the country, these forms became the new standard in how students, pilots, and Coast Guard members were assessed.
Obviously I didn’t do all of this by myself. I worked with some amazingly talented people to make more and more of the forms. While I trained my coworkers and others (including an instructor pilot or two) on how to create and use the forms, many of the people I trained ended up teaching me about the forms. Many, many improvements were made collaboratively.
The first course that was completed was the MH-65D Transition Course. My mentor/site lead asked me to create a video introducing pilots to the new course and the new forms, which we were by this point referring to as enhanced Daily Progress Reports (eDPRs). You can view it below.
Since my work on the forms, they have gone through several versions, and they now look very different from the forms we created years ago. However, the processes we used to create and maintain the methods of assessing student performance remain the same: instructional designers and performance analysts at PTB work collaboratively with subject matter experts to build student assessments.
My only regret from this project is that I was never able to help bring about its third phase: making the forms report to a database – or what I would now call a dashboard – so that real-time reports on fleet-wide performance could be generated.
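To give a feel for what that third phase could have looked like, here is a miniature sketch in Python. Everything in it is hypothetical – the XML field names, the export format, and the SQLite backend are my illustrative assumptions, not the actual eDPR design – but it shows the basic idea: form exports flow into a database, which a dashboard could then query for fleet-wide reporting.

```python
import sqlite3
import xml.etree.ElementTree as ET

# Hypothetical eDPR exports. PDF forms can typically export their field
# data as XML; the field names (student, task, grade) are made up here.
SAMPLE_EXPORTS = [
    "<dpr><student>Pilot A</student>"
    "<task>Conduct a Student Briefing</task><grade>4</grade></dpr>",
    "<dpr><student>Pilot B</student>"
    "<task>Conduct a Student Briefing</task><grade>3</grade></dpr>",
]

def load_exports(conn, exports):
    """Parse each XML export and store its fields as a database row."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS dpr (student TEXT, task TEXT, grade INTEGER)"
    )
    for xml_text in exports:
        root = ET.fromstring(xml_text)
        conn.execute(
            "INSERT INTO dpr VALUES (?, ?, ?)",
            (root.findtext("student"), root.findtext("task"),
             int(root.findtext("grade"))),
        )
    conn.commit()

def average_grade(conn, task):
    """The kind of aggregate a dashboard would surface in real time."""
    row = conn.execute(
        "SELECT AVG(grade) FROM dpr WHERE task = ?", (task,)
    ).fetchone()
    return row[0]

conn = sqlite3.connect(":memory:")
load_exports(conn, SAMPLE_EXPORTS)
print(average_grade(conn, "Conduct a Student Briefing"))  # 3.5
```

Once the data lives in one place like this, the fleet-wide questions the branches could never easily answer from scattered Word and Acrobat files become single queries.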
One of my favorite parts of working on this project was when the MH-60 Branch Chief came in to the Harkcon PTB office and said, “I used to spend more than 20 minutes grading each student’s DPR. Now I spend right around three. So I have 17 minutes now to come in and thank you guys every time I fill out a DPR.” Another was when the MH-60 Standardization Officer once told me and the Standardization Officer who was going to replace him, “Look, you’re going to wonder why I was complaining about my job so much, now that these guys have made it so much easier for you!”
As for my first JTA and the work with my mentor helping IPs facilitate training in simulated environments: he and I gave a presentation about it, which can be viewed below.
Featured Image Source: Mark Van Scyoc of Shutterstock.com.