
Wild Blue Yonder on the Air - Ep. 6 - Dr. Megan Hennessey & COL Celestino Perez on "Problem-based Learning"


Opinions, conclusions, and recommendations expressed or implied within are solely those of the author(s) and do not necessarily represent the views of the Air University, the United States Air Force, the Department of Defense, or any other US government agency.

Dr. Margaret Sankey: Welcome to Wild Blue Yonder on the air, the Air University podcast. Joining us today is Dr. Megan Hennessey, who's the director of the Teaching and Learning Center at Air University. And we also have Colonel Celestino Perez, who is the chair of the Carlisle Scholars Program at the US Army War College. They're here today to talk to us about problem-based learning, a pedagogical tool that is tailor-made for PME. So welcome, and could you tell me more about problem-based learning?

Dr. Megan Hennessey: Yes, thank you so much, Margaret, for having us. We're delighted to be here. Tino is my partner in crime in all things problem-based learning and scholarship of teaching and learning, so it's always fun to get to chat with him as well. So problem-based learning is essentially a student-centered instructional approach. And you can use it at all levels, from K-12, up through higher ed, executive education, professional training, professional development. And the point is to get the students, or the learners, focused on research, and using a research-based approach to address a problem. And the answer to the problem is not as important as the skills you learn along the way in addressing that problem. So you need to start off, first of all, with an actual problem. We would expect that of problem-based learning. But the characteristics of the problem are important, so you don't want something that is necessarily bounded or easy to solve right off the bat. You want to think about the structuredness of the problem, so something that is ill-structured. An example of that might be how to design a car, versus a well-structured problem, which would be how to start a car. You can see the difference there. There are very clear-cut steps on how to start the car. You put your key in the ignition. Hopefully, you put your foot on the brake, and then you turn the ignition, and the car starts.

But designing a car, you could go in many different directions with that. And the purpose of using an ill-structured problem is because you're so focused on developing the skills of the students, you want them to think creatively and really not limit their process of approaching whatever that problem may be. So first step, engage with that ill-structured problem, and then through that engagement, you begin to investigate. This is where the inquiry happens. The students start to ask questions about the problem, about the environment, about consequences of things that they're discussing, and then you move towards generating some kind of solution. In the DOD, we look at that often as a sort of solution product. Lots of temptation here to think about a PowerPoint briefing or something similar that we're so used to, but I think Tino will talk maybe a little bit about how we've been more creative with how we approach solution products. But it's important to not get too tied in as a learner and as an instructor on what that product is because, again, it's about the development of the skills more than just grading the final product.

But I think the most important step of problem-based learning is the end, where you debrief, and you reflect on your experience. And this is often the step that people skip because they do get so caught up in, "Well, what's the answer to the problem? Did I answer it correctly?" And that's not as important as thinking about how you approached the problem and what you learned in the process. So PBL has been around for a long time. It started in the 1970s in medicine, and I'm surprised it took so long actually. But that's where it kicked off as sort of a pedagogical tool and a way to marry performance-based assessment, so actually doing what you would be expected to do after graduation, on the job, etcetera, with authentic assessment, which is taking it one step further and making sure that you're testing students' ability to not only do what they'll be expected to do after graduation, but to do it in an environment very similar to the one they would be expected to do it in after graduation. So the environment and the action become very important there.

Sankey: As you said, this has a long track record with professional fields like medicine, law, Master of Business Administration. What do these programs have in common with professional military education?

Hennessey: Yes, so as part of the authenticity of PBL, I think where PME benefits from using this as a pedagogy is the pressure of making decisions under real-life conditions and, as much as possible, doing so publicly, being expected to defend those decisions with evidence from your research, so that was the investigation and inquiry phase. And the goal again is skill mastery, so you can work on all kinds of skills: communication skills, professional presence, military bearing, you name it. But really the point is to be able to perform under pressure in a very transparent way.

Sankey: Those are all skills we definitely want. Thinking about my experience in the PME classroom, I know that one of the big struggles that a lot of students had was going from reading strategy and understanding what the words on paper said, and then being the senior leaders who actually had to go out and translate that into doing strategy in situations that they hadn't studied, that weren't clear-cut. And so, is this an important tool in trying to inculcate that skill?

COL Celestino Perez: In the PME world, a lot of people make a distinction, a sharp distinction, between education on the one hand and training on the other. And I see where they're coming from, and in many cases, I agree with that distinction, it makes sense. But in many cases, it doesn't make sense; it depends on the perspective. So if we think that education encompasses skills like researching: How do we aggregate the research that we do together? How do we apply concepts that we learn in the classroom to actual problem-solving? Well, these are skills, and in order to get better at them, we need to train them. We need to do them repeatedly, get feedback, and then try to improve on subsequent iterations. The Army War College, and many places that teach about strategy, talk about strategy in terms of the alignment of ends and ways and means. And if that's true, if that definition captures something of what a strategy is, then that means that strategy entails alignment, aligning things.

So students, at least at the senior level, like at the Air War College and the Army War College, need to be able to discern, "Okay, what are the appropriate ends, the objectives, that we should pursue?" Given all the means that we have available, diplomats, military, economic power, etcetera, how do we arrange them in a way to achieve those ends? And I think it's a lot of work. So in this respect, strategy is no different from mathematics, from music, from engineering, from medicine, especially surgery; we have to act, and we have to perform in order to do better. I think it's crucial for military educators to think about what they do in terms of both training and education.

Sankey: When I read your article, problem-based learning also resonated with me as being very similar to war gaming, which of course professional military education traditionally is very rooted in, with sand tables and wargames. How does this overlap? And how is it different?

Perez: I'm a huge advocate of war gaming, and I think that war gaming is indeed a variant of problem-based learning. It's a way to do it. I think, though, that war gaming is best suited towards exercising the ability of students to make decisions on the spot, to understand the relationships between variables that authorities have already decided are important, and so they put them into the war game. And so I'm an advocate, we should do them. They test the student's ability to execute and to think in terms of execution. There's another aspect, though, to military work, high-stakes problem-solving, strategy, which entails planning, the homework that we do before we insert ourselves in interventions around the world with diplomats and others in order to solve problems. And so in order to do that, we need to accustom students to assembling the materials that they need in order to understand the environments in which they're going to intervene. And in a war game, it's a scenario; in other words, it's canned. It's already been prepared. There are certain objectives that the war gamers want the students, or the planners, or whomever is going through the war game, to get to.

But in planning, what we accustom students to discern are the many variables, the complexity that exists out there among the many variables. Because it's likely the case that what's gonna jump up and actually cause friction, cause the principal obstacles to arise in the real world, they're not gonna be planned. No one would have thought of them, and they will come as a surprise. And there's a group called the Oxford Research Group and the Remote Warfare Program out of the United Kingdom, where they did a lessons-learned study, and they were interviewing one of the generals who was involved in partnered operations in Africa. And he said, "Look, the environment in which I am operating, in which my troopers have been operating, is so complex that you couldn't design a war game, a scenario sufficiently complex, to capture the things that I've been wrestling with," and I take that seriously. And so I think war games are important, but so too is problem-based learning that's focused on planning.

Hennessey: I think, Tino, too, one of our findings really revolved around causal logic, and I know, Tino, you have a connection with Séverine Autesserre, who has the finding that students really fail to think through theories of action when it comes to intervention and crisis situations, which of course our national security professionals will be expected to handle. And I think that's the other part that maybe is easy to forget with war gaming, and problem-based learning can cover that delta. Especially if you focus on the debriefing and reflecting phase, you have to think through that causal story and defend it, and hopefully you have an instructor in PME who will force you to do so, and students aren't used to doing that. And it can be highly personal, highly emotional, and it's definitely a learning experience.

Perez: No, absolutely, Megan. Absolutely. One of the things that we talk about in PME is critical thinking. This critical thinking is, to me, somewhat vague. And to me, critical thinking encompasses two specific things for military practitioners, diplomats, development workers, and other national security professionals. And one of them is normative reasoning, which is arguing about the ends we should be pursuing. Are they better or worse, good or bad, just, unjust, evil, not evil, ethical, unethical, etcetera? But the other one is the one that Megan is getting at, this idea of causal reasoning, causal literacy. Which is, how does the world work? Why did this problem that we as military professionals or as national security professionals confront even come to be? How is it changing over time? And then what are the various ways in which we might intervene, and the various ways in which those interventions might result in a wide range of outcomes? Students aren't accustomed to being pushed that hard to give the causal reasoning, and it shows up in actual field settings, in operational settings, and so we really want to use PBL, problem-based learning, to get to this. So thanks for raising that, Megan, absolutely.

Sankey: That feeds in really beautifully with my next question, which is not just in PME, but in my experience of graduate education overall, the big leap, especially from undergraduate to graduate, is grappling with ambiguity. There isn't one clear answer. Things don't turn out the way that you expect them to. All of the people in a scenario are gonna do their own thing no matter what you want. Scenarios, just as you said, don't suddenly occur just to annoy us; they're the product of a very complex, frequently wicked problem with no simple solution. And I think you've outlined why it's especially crucial for national security professionals to become much more comfortable with problems that they maybe can't solve. Is this a tool for getting them more at ease with what would be kind of a failure, maybe, in the back of their minds?

Perez: Absolutely. So yeah, Megan spoke earlier about there being no right answer in a PBL exercise or scenario. And this is absolutely right. So the first part of injecting or inculcating the idea of PBL into the academy is to get educators comfortable with the idea that, "Hey, not every single hour needs to have a well-defined end state." The first hour, then the second hour, I'm gonna do this, and then I'm gonna do this. If you give them a problem like, what do we do about Afghanistan? What do we do about Iran? What do we do about Russia? There is no right answer that the educators are gonna be able to put down in a rubric and say, this is what they need to do in order to get a good grade. So educators need to be comfortable with the complexity. Now, what does this mean for the practitioner, for the student, for the educator? The complexity we deal with can be... let me touch upon it in terms of environment. So we often talk about framing the environment or understanding the strategic environment in which we're gonna operate. The problem is that there is no one environment, it's plural, so let's take a look at some of these. Why are these problems complex? Well, if we're dealing with say, Syria or Afghanistan, or the South China Sea, with respect to China and great power competition, there's the actual adversary, the political system and the military system, say inside China.

There's also the region, there's regional complexity. So what is going on geographically and in terms of other regional actors in the broader region that is affected by the South China Sea and China's actions, or with respect to, say, Taiwan, who else matters? We also have to look at our own organizations, our military organizations: what would it take for the joint force to intervene successfully in this instance? There's also the environment inside Washington, DC, so within the Pentagon, within the State Department, within the White House and the National Security Council. There's obviously a dynamic there that the combatant commanders and the Joint Staff, and the Army staff, the Air Force staff, need to deal with as they come to grips with what to do about this. There's also the broader public. And so there's a civil-military relations environment that we have to deal with, and then we have alliances; we are always operating with friends and partners overseas, so those relationships are simply another environment. Well, in any geographically oriented problem that we're dealing with, all of those come into play, and they could all bite us in places where we don't want to be bitten, and hurt us. And they all need to be thought about as we think through our actions as military professionals, as national security professionals.

Sankey: The core of your fascinating article, and I hope we can include a link to it when we finally get this posted, was the pilot program that you ran at Carlisle with a seminar of Army War College students. Could you please tell me about that setup and how that worked?

Hennessey: Yes, it was really exciting. So Tino and I were part of the Curriculum Committee for the mandatory core course that kicks off the resident education program year, it's called Introduction to Strategic Studies. And Tino had some great ideas about PBL and how to sort of transform that five-day course into an experience in authentic assessment. And I said, well, let's take it one step further and turn it into a scholarship of teaching and learning pilot and actually put the framework of educational research around it, so that we can better share and understand the results of what we find and contribute to the literature, essentially, because this has never been done before at this level, senior service college level.

Essentially, what I contributed was, what are we looking for from a learning standpoint? What does the process of student learning look like when you do a PBL intervention? And how does it change over time? The resident education program is 10 months long, so we had the privilege of seeing these students over those months, how they may have changed or adapted as a result of this exercise at the beginning of the year, so there was some comparative pre-post analysis happening there. Do they learn better because of this exercise? So what are some artifacts of performance that we can look at, and the metrics by which we may judge them, to see if that experience, again, contributed to their learning? And then the third piece was, how does this reflect on the core curriculum itself? Are we teaching the right things? How do we know what we should be teaching, whether that's skills or substance or content or both, and how are we going to loop this back into what we do in PME as professionals? How will we improve our programming, our teaching, our relationships with students, our research, etcetera?

So it was really exciting as an educational scholar, the ripples were endless of what you could explore, but at the end of the day, we came up with two research questions that informed the study. So the first was, how do students translate knowledge of strategy into performance? Which Tino has already talked about in our discussion today, the importance of that. And then the second was, how does the completion of the core curriculum influence students' performance in problem-solving exercises? So that was really the programmatic assessment piece: we saw what they did at the beginning of the year, let's set up something as well as a post-year exercise, a summative assessment, if you will, to see if they got any better and how they pull in what they've learned over the past 10 months. From there it turned into a mixed methods study that was highly ethnographic. So we were very lucky to have some observers during the PBL phases of the exercise and were able to code those observations thematically. We also had artifacts, again of the students' performance, so the solution products that they shared with us, and then rubrics. So in terms of the rubrics, we used rubrics that were already in place at the institution, so the students were familiar with them.

Sample criteria were oral communication skills and strategic thinking. So there was a framework in place that we already built from, and that helped to make it a little bit of an easier sell for those who participated in the pilot for us, so there was a social function there that helped. But especially, we had two seminars, so we had a control and a test. Our test seminar at the beginning of the year had 16 students, two women, three international military officers; we did that purposely to try and make it as equitable as possible, but it also highly limited the number of total seminars that qualified, so our inclusion criteria cut it down to four out of the 24 seminars at the time that would have qualified. And then we approached their teaching teams and recruited the two from there.

But the test seminar started at the beginning of the year with the standard course, the introduction to strategic studies course, with one amendment, which was the final day of the course was not spent in guided discussion or lecture as the other seminars were across the school, but instead they did a problem-solving exercise, a PBL intervention. And the problem for that was develop a regional strategy to reinforce regional stability and prevent proliferation of weapons of mass destruction during the 1991-2000 time frame. So this was building off of the content of that course that they had already covered to date, the course essentially is a case study of the 1990 Persian Gulf War. So they had a couple of days under their belt, and now they were expected to do something, to translate that knowledge into strategic performance.

So on the last day, they worked together to not just solve the problem, but first self-organize, which was fascinating. We gave them the option to split into their own groups, and there was no real creativity with that process; they just split right down the middle of the room. We mentioned ethnographic observations. One of the key findings was that dominant personalities emerged very quickly, and in both the fall iteration of the exercise and in the spring, women and international military officer students were in many cases relegated to clerical roles, so expected to be the ones to take notes, not expected to contribute verbally or really productively to the conversation, not called on directly to share their personal experience or, in the cases of the international military officers, the regional experiences that they may have had to enrich the problem discussion. So that was fascinating. And at the end of the day, they briefed their product, their solution to this problem, to their teaching teams, and had a chance to reflect. There was a lot of discomfort; they did not expect to have to do something like that so early into the school year. And I think they learned a lot from it based on that reflection.

But we did not see that show up very much in the spring iteration of the exercise, and I think Tino will probably talk a little bit more about that part, but essentially we did it again in the spring with that same seminar as well as one more. So the control seminar had not had the opportunity to do the exercise in the fall; they were new to it, and they performed almost exactly the same as the test seminar. So that tells us a couple of things about familiarity with PBL. It did not seem to be an advantage to one group over the other that they had done it before. It is a pilot study, so I think we were all surprised by that finding, but I'd like to dig into that a little bit more. And essentially, the spring exercise was also amped up a little bit in terms of authenticity, because we invited observers from fairly high-ranking senior leader positions within the school to observe but also evaluate. So there was an added layer of pressure there that was purposeful, to amp up the affective domain, so to speak, to use educational lingo, but also to give them some experience in what it's like to brief something that you created to a senior audience and to be able to do it and defend it well, speaking well, presenting yourself well. That was essentially the broad setup, the framework of the study. I'm happy to go into more on the findings if you wish, but that was how we set it up.

Sankey: Usually right about this point is when I get excited about a concept and think, "Wow, I want to do this in my seminar," but then the rational brain kicks in and says, these are two professionals who have a background in pedagogy, who have very carefully set this up. So instead of rushing in to give students a problem, what background structure should curriculum designers be planning and thinking about if this sort of instructional strategy appeals to them?

Hennessey: Yeah, this is very much backwards planning, and so the problem is the focal point, and so some questions to get you started, first would be, who is going to define the problem? Will you give the students a broad topic or an operating environment and ask them to define the problem? In which case that adds another layer of teamwork, of team dynamics, of environmental scanning, knowledge of the environment, the history, the context. So it depends on what you want to test there. Or are you strapped for time? Would you just like to give them the problem, which was the situation that we were in, we were very bounded by time requirements, so we just gave them the problem. And then the second, how will students be organized? So will you choose the route that we did, which allows them to self-organize, split into their own groups, solve the problem, or address the problem individually if they wish, that is a key component that will affect much of what else you do in the classroom for the PBL exercise. What resources will the students have access to? Are they allowed to do their own research? Are they expected to address the problem based on everything that they already know?

So one of our findings was that the students had limitless resources for this study, and they hardly used any; they relied so heavily on their own existing heuristics and frameworks that it certainly skewed the outcome of the process and the discussions they had. There was no willingness to conduct research, and in some cases it was extra ironic because the seminar was seated in the physical library. So they had their resources right there, but they chose not to use them. So that's another thing to consider, the access that you'll be giving your students to resources. And then finally, how will students share their findings and recommendations? Do you have a preference in what their solution product might be? If you're teaching an inter/multi-disciplinary course or you're emphasizing professional skills such as briefing and presentation, maybe you want to stick with something more traditional and ask them to do a briefing, or perhaps some more creativity might be pulled in. Tino has had a lot of success with mind mapping and rich picturing, where that in and of itself is the solution product, and so you might consider that.

And to what extent will their work be shared outside of the classroom? So we saw a major difference in student behavior between the fall and the spring studies, because the spring study, again, brought in those senior leaders as an audience, it ramped up the anxiety, although I think that was a positive thing because it gave them the experience of what would be expected of them after graduation, but it does add another layer to consideration for planning. So those are sort of the questions that I would recommend people think about if they're thinking of getting started in this space.

Perez: Yeah, what I'd like to add too is, I'm not an expert in pedagogy. I've been teaching for a long time, and I'm very thoughtful about it, but I was so happy to meet Megan. I would be excited about the approaches I use, but she was able to share that, hey, there's actually a science to this stuff, there's a field of knowledge, which she is an expert in, that relates to what I was trying to do strictly as just a practicing teacher and educator, and she goes, "Oh." It gives you confidence, right? It gave me confidence that what I was doing, that there was some soundness there in terms of a research basis. And then she linked me in with this scholarship of teaching and learning community among... institutions of higher learning, and it was exciting to see there lots of people that are thinking about PBL, but also other concepts that relate to how we teach, and who are really mindful about that.

But the reason I bring this up is because I think that teachers, or people who are new to PME teaching, when they hear this go, "Oh you know, that sounds pretty difficult. You have to be a pro at this in order to execute it," and I think Megan would agree with me, if she doesn't, so... But I think a first-year instructor could do this. Let's talk about PBL and the train-up for new faculty, let's discuss what kind of problems we can use, let's talk about what the hurdles are and what the opportunities are. We should feel comfortable in setting up this exercise. Yeah, you don't have to be an expert PhD in educational methodology to do this well, but it sure helps to have people like Megan with her expertise saying, "Hey, this makes sense, and here are good ways to do it."

Hennessey: Thank you, Tino.

Sankey: Another really striking point that you make in the article is the absolute willingness of the seminar students to see this as an academic exercise, and that had some downsides, in that they were also willing to default to, well, PowerPoint makes it so. We said this is what was gonna happen, and that's what happened. There was some pushback. They were not happy when they realized that the fiats they'd laid down about how allies would behave, or what the reaction would be, were not what they prescribed. So how might you structure the pre-brief or interventions during the exercise that might vector that a little bit?

Perez: So that comes from this PowerPoint nirvana, I think that is the phrase. It comes from Jeffrey Meiser; he wrote an article for Parameters a few years ago. He saw this firsthand among military professionals, that they put something on the whiteboard to have lines of effort leading towards an end state, and if we just put it there for diplomacy and information and military and economic, then we're gonna get the good stuff on the other side, and this just doesn't work out, and we still use this kind of way of thinking. The average class here is three hours long, so in three hours, a lot of material is covered. Students do homework, they do reading, they come to class, and they discuss what they read the night before. And let's say a class has, say, 30 lessons or something, three hours times thirty. That's a lot of time. If you never put students in the position where they have to comprehensively apply those concepts to a problem, not just once in a capstone sort of way, but over and over again...

I don't think retention is there; I don't think the ability to retain those concepts is there, and they're certainly not gonna know how to use those concepts to get a better understanding of the problems that they're trying to confront. So the first, biggest point I would make is that we need to start talking more about something called abductive reasoning, from Charles Peirce; he introduced this idea. But it differs from inductive reasoning, where we have several cases, we try to create stylized facts that are common in all the cases and create a theory about what's going on, and then later we try to test that theory by deriving certain hypotheses and applying them to various cases to see if the theory holds up, and we kind of go in a circle. To me, in the description I just offered, inductive and deductive reasoning are scientific pursuits where the thing that matters the most is the theory; that's what we really care about as scientists.

But in the real world, practitioners care to solve a problem; that's the number one priority. So what we're trying to do in abductive reasoning, then, is fill the students' heads with established theories that pertain to the environments they have to confront, they might have to intervene in, so that they can use the concepts within those theories to unpack the environment and to decide upon various types of alternative intervention. If what I said really matters, if that's compelling, the implications for education are huge, which means that if we're teaching something for three hours, and then another three hours, another three hours, at the end of the week students should have to be put in a position where they have to take all those concepts from the week, aggregate them in some way so they understand the relationships between them, and then apply them to a problem and wrestle with it together. And then the teacher is able to give feedback and correct. Unless that happens, you could teach three hours a day in traditional seminar discussions and you're not gonna get to high-performing practitioners.

The next point, and this is a shorter point, is when we talk about problem-based learning or, say, studying cases, we often talk about canned scenarios. I'm a fan of just opening up the newspaper online and saying, "Okay, well, what's going on? It looks like Azerbaijan and Armenia are having a war. Let's apply the concepts we just learned about war and coercion to this case." So now the students have to go out and research what's going on in this conflict; you have to help them decide what's better, what's worse, the quality of the research. Bring it together, aggregate their research, that's a depiction of the environment, and then apply the concepts to it. So use real-world cases as often as possible, that's what I advocate for. And then the other thing is that those who are evaluating need to keep high standards. So Megan talked about, at the end of the year, we did this with two different seminars, each one broken in half, so four observations. And we had an ambassador, we had the commandant, we had a provost giving feedback, among others.

Only the ambassador was really critical of a couple of silly things that were coming out, of the fanciful ideas about intervention, but all of them gave them good marks on the written evaluations. So wait a minute, if it's bad here, we should be bringing that out; otherwise, what's gonna happen overseas when this thing, the strategic performance, is really put into play, and it's gonna blow up in our face? So abductive reasoning, use real-world scenarios, and then the evaluators holding the students accountable for what they're doing. I think that's the best way to get at some of those issues, to make sure the students are thinking comprehensively about the problem.

Sankey: And to return to something that you brought up earlier that really was striking in your results: this wasn't just about the course content. Watching students default to a couple of strong personalities dominating the conversation, the side-lining of some minority populations and the international officers, that really tells you a lot, in a safe environment, about people's leadership and management skills. Is that something maybe the seminar instructors could leverage in their 360 evaluations, or somewhere else where that observation is valuable?

Perez: This finding stands apart from the other ones because it has to do with how we treat each other; it has to do with this topic of diversity, inclusion, and equity. It's a reinforcement of this idea that high-stakes problem solving, or strategy, is an interdisciplinary human endeavor. So you might want to separate leadership from defense management and acquisitions and things from national security policy and strategy, and then military silos, but that's not the way the world is. And how we treat each other affects our performance, and the fact is that those women in each of the four groups at the end of the year were relegated to clerical positions. In one case it was so bad, one woman quit. There were dominant personalities who at the end of the year weren't self-aware enough to moderate it, to acknowledge, "Wow, I'm not giving everybody a voice here," despite going through education in leadership and... These things affect each other, and the performance of those groups was dampened by this behavior. It brings to mind Jason Lyall, a political scientist; he published a book within the past year or two, called Divided Armies, where he shows that armies, militaries that are more inclusive and equitable with their service members, have higher-performing results on the battlefield, that they tend to win more.

So this isn't just an ethics issue, it's also one of performance. It's all related. The findings here are surprising. Megan and I have been briefing this both in the War College and outside. I wish it would have... I wish it would shock people more, hit them more, because it relates to the curriculum, it relates to our causal claims, it relates to what strategy is, to how we teach, and then, as we say here, diversity, equity, and inclusion issues.

Sankey: The article was based on, as you've said, a pilot study. If one were to scale this up to have the whole War College do this, what sort of best practices or things would you need to be mindful of if you were expanding this to all the seminars?

Hennessey: I think that the questions that I outlined earlier still apply, so things like who's gonna define the problem, how will your groups be split, if you'll have groups at all? But what I might also suggest is some things to watch out for: common weaknesses in student arguments during these sorts of PBL exercises and assessments. The number one weakness from students is the lack of counter-argumentation. They have not anticipated opposing viewpoints, and they are surprised when you ask them. Some other weaknesses: students have only selected evidence that supports their own claims; that's something that we saw quite a bit in the spring exercise. Students demonstrate a greater conviction to personal beliefs than evidence. We saw this not so much in the final briefings, but in the actual problem-solving activities, the discussions they were having in their groups. Very, very strong ties to anecdotes, personal experiences that in some cases got highly emotional.

In one case, a student in one of the seminars flipped a table and stomped out of the room because his experience was so anchored in his mind as something that should affect the discussion that, when he was pushed on it by a peer, he was unable to handle it in the moment and needed to step out. So some other things: students rely on over-generalization from a single source of evidence; that's a fairly common fallacy you can see throughout all of higher ed, I think, that you find your source and then you stick to it. We saw that quite a bit. And then the final one: students make unsupported assertions. Again, not unique to PBL, not unique to PME, but something that you would want to take into account when designing your exercise, especially on a full-scale level.

Sankey: I'm still excited and I would still want to do this in class. Can you give me kind of a cheat code for what would be the first thing to do if I wanted to get started? What's the first constructive step to take if this is something you want to include in the upcoming PME school year?

Perez: Let me reverse that for you. See if this helps. A way to think about it would be, what are the things that I want to see as an educator in my seminar at the end of the year? And then, how does that vision affect what I do in the first week, the second week, the third week of the school year? One of the things I'd like to see is, when I ask students to look at a problem, do they automatically, without being told, go seek out books, articles, other persons in the building or who are accessible? International fellows from a region that they're studying, or other experts on the campus who might be able to shed light? Do they consult the literature and persons with expertise? You want them to do this automatically. Do I want them to apply the concepts that I've been spending three hours a day discussing and inculcating? Do I want them to apply those things in this problem set? Well, if that's so, then how do I teach differently? And maybe, how do I give mini assessments along the way as I approach this end of the year? I also want to see students who specify tight causal claims, who give me a mechanistic sort of account. Though it's probabilistic, nothing is 100% in our world, they should have a good, duly diligent account, a responsible account, of why they think an intervention might play out in a good way, and they should be able to give those.

So then, what do you do at the beginning of the year in order to start cultivating that practice? We haven't talked about ethics that much, a little bit, but I want students to appreciate that indeed war is a contest or a battle of wills, but also that the nature of war, if it's anything to me, is loss of life. It's maiming, it's displacement and statelessness, it's disease, it's famine, it's family separation. It's all these horrible things. So, do students understand the relationship between the military profession, the application of lethal power, and those civilians who live, work, and play in those places where we operate?

And I want to see students at the end of the year really wrestle with the ethical dimension of the profession of arms in ways that they're not gonna do on day one, for sure. And then at the end of the year, I want to see students who are able to work together in teams, who are self-aware, who know to moderate their behavior, and when someone's not speaking up, to pull them into the conversation, make them feel like members of the team. Then finally, I want students who do not display hubris, so that when pushed on the work that they've done, they know that there are weaknesses in it, and they accept being called out on those things; there's a humility, a strategic humility there about their performance that we all need. If those are the things I want to see at the end of the year, then how do I structure my seminar at the beginning of the year to get to them? I know I didn't answer your question, but that should shape a good conversation among those people who listen to the podcast.

Sankey: That's very helpful and very constructive. I'm always trying to do that as well, and to support an instructor who's brand new and thinking, "Oh no, how do I get through next week?" They're not thinking, "What do I want at the end of the year?" It's, "How am I gonna survive until Friday?" So these are really important tools that get at the heart of what it means to be a national security professional, and so I was excited by your research. I love having Megan as a colleague here in Montgomery, and it's been a pleasure to meet you. For both of our guests, is there any wrap-up that I didn't think of or that you'd like to add?

Hennessey: Yes, I'll make a shameless plug. If this does interest you, and for the listeners specifically interested in the research side and in sharing your findings, please get in touch with us and join our growing community of practice for the scholarship of teaching and learning in military education specifically. We have a conference coming up, April 13th and 14th of 2022. It's our third iteration, focusing on questions just like the ones that we talked about in this conversation. So thank you, Margaret, for giving us the opportunity, and I hope that this continues to build like-minded scholar practitioners among our peers.

Perez: How can I follow that? That's terrific. Thanks for the opportunity to speak with my colleague, Megan. Again, we miss her here at the War College. I miss her terribly. So Air University is lucky to have her, but thanks for the opportunity to talk about these issues, Margaret. Appreciate it.

Sankey: Thank you for your time and expertise, and we'll try to include links to your article and your community of practice and all the other things that interested folks would need.

 

Dr. Megan J. Hennessey

Megan J. Hennessey, PhD, is the director of the Teaching and Learning Center and an associate professor at Air University. She holds a doctorate in higher education from George Mason University and has held multiple faculty and instructional systems designer positions in professional military education and other government settings.  To request a copy of the study, please email her: megan.hennessey@au.af.edu.

COL Celestino Perez, Jr.

Celestino Perez, Jr. is an active-duty colonel in the U.S. Army and an associate professor at the U.S. Army War College.  He holds a Ph.D. in political science (political theory) from Indiana University at Bloomington.  He has taught at the U.S. Military Academy in the Department of Social Sciences and at the U.S. Army Command & General Staff College. In the Fall of 2021, he will retire from active duty and assume the Chair of Executive and Strategic Leadership at the U.S. Army War College.

 
