Listen to the full episode here:
Amir Bormand: On this episode of the show, I have with me David Marchick. He is the Dean of the Kogod School of Business at American University. We're going to be talking about preparing to work in an AI-infused world, and David is going to walk us through what's happening within his world with AI. A lot in education is changing, and, well, obviously you can avoid it, which a lot of schools are trying to do, or you can embrace it, and David is going to walk us through what that means for him and the university. We're going to learn a little bit about how teachers are actually starting to incorporate AI within their teaching and work with it rather than against it. David, thanks for joining me. I'm excited to see how much we can cover here. Absolutely. All right. So I guess we're going to be talking about the educational system and AI in this case. Obviously, at American University you're the dean of the school of business. Before we dive into where you have been heading, tell us a little bit about how these GenAI tools have traditionally been seen by schools. Obviously they've come on the scene in the last couple of years, so tell us what the traditional view of these technologies has been.
David Marchick: Well, AI has been around forever, as you've covered on your podcast. I've listened to a number of episodes. AI, machine learning, it's been around for a long time. But it's really only been in the public consciousness for the year and a half since ChatGPT went wild. And the initial reaction in academia at all levels, high school, college, grad schools, et cetera, was: how do we stop this so students don't cheat? What rules can we put in place to block it? And how do we prohibit it? And many schools, high schools, universities, grad schools, law schools are still taking that approach. I have deans from all over the country calling me, and college presidents, saying our policy is no AI. Okay. The data shows that pretty much everybody aged fifteen to twenty is using AI. So you can't stop it. It's kind of like telling students not to listen to music. What we've done is embrace it, under the view that anybody in Gen Z who goes into the workplace, whether they're twenty-two and getting their first job out of college or they're thirty and have just received an MBA or a law degree or a joint degree, their employers are going to expect them to use it. And it actually clicked for us when we had a speaker named Brett Wilson speak at our school about two years ago. He's the CEO of a company called Swift Ventures, a venture capital firm in San Francisco that invests in AI. And a sophomore raised his hand and said, am I going to be replaced by AI? And the room kind of fell silent. And Brett said, you won't be replaced by AI, but you could be replaced by someone who knows AI if you don't. And at that moment, I said, we have to infuse AI into everything we do. And our faculty responded very, very quickly, in an impressive way.
Bormand: Fantastic. I mean, obviously we do hear about it. I think I mentioned to you previously, my daughter's school, she's still in lower school, so not at the university level, but they restrict it heavily. And yet we know almost every kid somehow figures out a way of using ChatGPT. It is inevitable. I guess when you had this epiphany and shifted gears toward bringing AI in, you said the faculty was super supportive. What does that mean from an educational framework? And the reason I ask is that we have hundreds of years of an educational system built upon the student doing all the heavy lifting, regardless of how tedious and monotonous it might be; we do it all. That shift, where you're asking the faculty to bring in AI tools that they previously had not worked with or incorporated within their curriculum, what kind of shift is that? How big of a shift is that?
Marchick: It's a huge, transformational shift. In our lifetimes, well, I'm older than you, but the personal computer was invented. I liken it to a math teacher, when calculators became ubiquitous and cheap, saying to a student, don't use a calculator because you're not going to learn math. You're not going to learn addition. You're not going to learn multiplication. A student needs to learn the basics. They need to learn math. But a calculator now is a ubiquitous tool. You know, spelling. Okay, when I was in grade school, I had spelling tests and spelling words all the way up through high school, studying for the SAT. I'm probably a much worse speller today than I was when I was seventeen because the computer fixes my spelling. I type it, I just run spell check, and it fixes it. So those are small changes in our behavior and the way we learn and the way that we teach. AI is like a giant asteroid hitting earth and changing education in a very, very transformational way. If universities can't move with speed, we will be disintermediated. The basic function of learning, knowledge accumulation, access to learning anything is changing. Even coding, which is a very sophisticated part of what we teach, is changing, because Google and Anthropic and firms all over the country are reporting that forty, sixty, eighty percent of the code that's being written is actually being written by AI as opposed to programmers. So the whole way universities teach computer science is about to be revolutionized.
Bormand: If you look at it traditionally, we've asked students to come up with original work. There are plagiarism tools, ethics tools, all these different tools to keep people from borrowing others' ideas. With a tool that's generative in nature, that barrier is very much gone. I mean, you can pretty much ask an AI to write in any voice or tone and make adjustments to your heart's content. How does the educational system balance that? Because obviously you still want something from the student, you still want them to learn, but there is a question of how much of that will be original. That might be more art than science, but how do you view it? How does the educational system support that?
Marchick: So it's a great question. I would say we're still figuring it out. Nobody really knows the answer. One of the things we've tried to do at our business school, which has created a culture of experimentation, is to say, hey, we're making this up as we go along. Faculty for decades, for millennia, have basically said, I'm the expert. I studied this. I got a PhD. I'm the expert on this. And my job is to share my knowledge with you. With the infusion of AI, we're all making it up as we go along. So it's interesting. We're learning and we're adjusting and we're making mistakes all the time, which is fine. And we're telling students, hey, we're making this up as we go along, and we're learning and adjusting and making mistakes all the time. The first thing we do in orientation and in our kind of Business 101 class, the freshman first-semester, first-year class, is teach students to question AI. Question the accuracy, question the output, and question the process that AI goes through.
“AI is imperfect, and we want students to know how to judge the imperfection and problems in AI. So teaching that kind of skepticism then allows them to learn the possibilities of AI.”

David Marchick
Dean, Kogod School of Business
Marchick: And then, you know, you can't cut and paste a paper from AI. You'll get an F. But let's say, I mean, when I was in high school, I was not a great writer, and I had a hard time conceptualizing papers and a hard time organizing topic sentences and, you know, the basic structure of an essay. AI would have helped me a lot, because it's a great tool to help you structure an essay: here are the five points you want to make in a structured argument. And then it's up to me to make those arguments and fill in the essay. Then there's the question of proofreading. Everything I do, I have AI proofread for grammar, mistakes, sentence structure. So is that in bounds or is that out of bounds? My view is that's in bounds, because when you're working, you're going to do that every day. Every email you send your boss, every memo you write, you're going to have AI check it and proofread it when you're in the workplace. So why not learn that in school? So what we've tried to do is teach the fundamentals, have AI applications infused on top of that so you can be better, more efficient, more productive, and then require full disclosure of how you used it, so that we know. We don't want students to cheat. We don't want them to use AI as a substitute. We know when a student takes AI output and sends that in, because it starts with, "I'm profoundly honored to write this paper." And you just know, okay, ChatGPT wrote that. So we're figuring it out.
Bormand: Let me ask you this as a follow-up, and this might not be a question anyone has an answer to, but at the university level you guys are ahead of the game. You're teaching kids who come in day one that this is an accepted tool. They've probably gone through some part of high school since ChatGPT became a thing and been told not to use it. They're coming in, sitting down, and it's like, okay, use it. That's got to be a big shift for them, because, leaving aside that they might have been using this tool all along, they have to relearn how to actually apply it. They have to learn how to come up with a use case, come up with a better way of prompting, all these different things. And if we were to step back for a second, a little bit outside the context of the discussion, and wonder how long the mainstream educational system is going to take to understand or want to incorporate tools like this, that seems like a very long-tail discussion. I'm not even sure how long that's going to take, but the longer it goes, the bigger the gap will be between how this tool develops and how people use it. It's like telling someone, hey, there's a spell check, do not use it, delete the spell check, back to the calculator example. But to that point, when will we start seeing this trickle down? Is this a years thing, a decades thing? And I know this is a little bit bigger in scope than what we were talking about.
Marchick: I think it's a years thing. We started on this journey about two years ago and have moved very, very fast. One publication that gave us an award for the best AI program in the country said that we've embarked on the most consequential transformation in business education anywhere. But we're seeing it adopted now at a lot of universities. There are some of the most elite universities that aren't adopting it at all. They have strong monopoly positions and 100 applications for every spot, and they feel like they don't need to move. And then there are some large public institutions that are just too, I would say, bureaucratic to move. In the last six months we've seen fairly rapid adoption and movement in business schools and in engineering schools. I would say the arts and sciences, literature, history, are much slower, and I think that's an attribute of the faculty there. So different parts of universities are moving at different speeds. In general, business schools and engineering schools are farther ahead.
“We're very focused on giving our students a leg up and having them be more competitive in the market when they graduate. But I feel like we have to run to stay ahead of the pack because everybody else is going to catch up.”

David Marchick
Dean, Kogod School of Business
Marchick: The other thing that you asked about, I think, is a really important ethical issue for universities, because AI will exacerbate the divide between the haves and the have-nots. Right now, if you come to my business school, you not only get AI infused from the day you start orientation to the day you graduate, but we have a partnership with Perplexity, so every student in the business school now gets the highest-end Perplexity enterprise tool, which would cost students hundreds and hundreds of dollars a year. They get it as part of a deal we negotiated; we're paying for it. We are testing products with Perplexity in classes: products that create tutors, that create mini-LLMs for classes, that allow students to collaborate. That gives our students a leg up. But there's already a chasm in academia between the haves and the have-nots, between those that can afford it and those that can't, and I do think that one of the negatives of AI is that it'll exacerbate the chasm that exists.
Bormand: I guess just to talk about that: you guys are ahead of the game, you're pushing the envelope, and it is interesting that there are other schools deciding to take a different approach. Obviously, every program can chart its own course on this path. You guys are experimenting with applications. You mentioned having access to GenAI tutors. I love use cases because they make things concrete, so if somebody's listening to this and wondering what that tutor will look like, walk us through how it actually works, whether it's working now or how you envision it working, because I think that's a big step for some people to see.
Marchick: OK, so let's take graduate business school. In graduate business education, whether an MBA or one of our many one-year master's programs, there are two types of students that seek those degrees. Number one is students that are in a profession where they think additional training and additional credentialing will accelerate their rise, accelerate their employment, accelerate their compensation. So you could be a finance student as an undergrad and then get a master's in finance, and you're going to make more money and have better knowledge. Just yesterday I was with a student who was a lit major and is now studying finance, and she said she was really struggling in the heavily quantitative finance classes. They're very heavy in math, very heavy in data; they're quant STEM classes. And I said, work with your professor to create a tutor using Perplexity, so the professor can upload all of his or her materials, assignments, previous tests, and answers to those tests. Then the LLM that's created for that class can serve as a tutor for that student. The student can say, give me twenty questions I need to prepare for this test, or, I don't understand this concept, can you be a tutor and help me work through it? AI is fantastic at that. There, it could actually be an equalizer. We're lucky: I was a lawyer, and I was in business, and I can afford tutors for my kids. If they had a hard time in bio or whatever, I could throw money and tutors at it. AI can do that. So there AI is an equalizer and actually creates more equity. The tool's incredible, and I think overall it will help learning rather than diminish it. And then there are some ethical questions around learning, like writing. Take writing as an example. I told you I was not a great writer; it probably wasn't until I was a junior in college that I gained some confidence in writing. AI can take the bottom-decile writer and make them a median writer. It's not going to create a John Steinbeck out of you. So the question is, are you cheating the bottom-decile student out of learning how to write, or are you equalizing his or her skill set and bringing them up to some acceptable mean? I don't know. That's something we'll be debating for some time.
Bormand: I want to ask you something; I mean, fantastic points. One question within that: on the podcast, I ask engineering leaders, product leaders, and other leaders within software companies whether we're going to see a shift in the type of people who are going to be employed in various positions. What do I mean? Well, if you're a software engineer and you don't have to write as much code, can you shift into another role and leverage some of that understanding? Or if you're in design and before you couldn't write code, or didn't want to, but now you have tools that can, you have an advantage in design; you're shifting yourself a little further right into the technical sphere, or product leadership, or whatnot. In the case of somebody who might not understand accounting or finance as well, there are now these AIs that can potentially give you skills, or the ability to apply different skills, in those areas. As you see students graduate through the program, and I know it's still early days, do you envision people having different options than the traditional path? Instead of, I got this accounting degree, I've got to go into that field, it becomes, I have an accounting degree, but with these AI tools I can actually do different things with that knowledge. Is that a potential outcome as we see these tools become more common within education?
Marchick: 100 percent. Let's take the marketing field. OK, so one of the things that we've done is we have brought in private sector leaders to train our faculty, under the theory that our faculty need to be retrained to use AI for their discipline so they can teach students. So our finance faculty need to learn how to use AI to teach students how to use AI to underwrite an investment, to do fundamental research, to pick a stock. Our marketing faculty need to teach our students how to use AI to do consumer research, to write presentations, to do image creation, to create ideas, to create campaigns. Our accounting faculty can teach a student to write an S1 to take a company public within a matter of days.
“When I was at Carlyle, it took us nine months and millions of dollars of legal fees and bankers' fees to write an S1. You could do that in a few days with AI now.”

David Marchick
Dean, Kogod School of Business
Marchick: I think that AI can help broaden and deepen skill sets. So let's take a marketing professional. Right now, marketing professionals are very siloed. You're generating images or art, you're doing research, or you're in charge of email campaigns or databases or analysis. I think AI is going to make a marketing professional able to do it all and to be much more efficient. Just going back to an example: one of the folks who trained our faculty is a fellow named Ira Rubenstein. He's senior VP of marketing for PBS, the Public Broadcasting Service, and he used to be head of marketing for Sony and Marvel. He's one of the best marketing leaders in the country, really; he's part of the Academy of Motion Pictures. In his role, he creates marketing campaigns for Ken Burns, who's the most significant content creator for PBS. And he came in and gave an example where Ken Burns did a movie on the American buffalo. Ira went to an AI image creation tool and said, give me a poster in the Ken Burns style of a buffalo with Native American themes. And in 30 seconds, it created an image. He printed it out, handed it to his creative person, and said, here, do something based on this. OK, that probably cut out $10,000 worth of work, hundreds of hours of debate, lots of creation. And I mean, you've been in marketing campaigns, you've hired people through your HR; creative design of marketing campaigns is painful when you debate the art, the image, I don't like this color. And AI just cuts down on the time and increases efficiency in extraordinary ways. So I think it's going to revolutionize many, many fields, including marketing. And it broadens the skill set that one's able to apply. And I think it'll make those individuals' jobs even more interesting because they'll be broader rather than narrow. And everybody likes to do lots of stuff, including stuff they're not qualified to do.
Bormand: That's fantastic. I love that example; it really brings the core topic we've been chatting about home. David, we haven't even scratched the surface; I'm sure we could talk about this for hours and still have plenty to go. But if somebody does want to learn more about what you're doing, the incorporation of AI within education, some of the things you've mentioned, what is a good way of learning more?
Marchick: Okay, you can go to the traditional source of knowledge, which is to do a Google search on Kogod and AI, and it'll lead you to various web pages. Or you could go to ChatGPT or Perplexity or Anthropic or Gemini and say, you know, give me a 750-word article on what Kogod is doing to incorporate AI into its curriculum, what classes it offers, what disciplines one can use AI for, and what some use cases are. And it'll produce all of that. It'll show that we have more than 40 classes and that we've infused AI into every different subject. It'll show that we've infused AI at the undergraduate and graduate level, in core courses and in electives. And it'll show that our faculty is incredibly creative and adaptive and learning to teach in a new way to help our students.
Bormand: Love the answer. David, thanks for taking the time. Thanks for sharing with us.
Marchick: Thank you so much for the opportunity.
Bormand: Absolutely. All right, that's it for this episode. I'll be back again with a different guest and a different topic. Until then: when David and I spoke about this topic, what they're doing within education, and the advanced work they actually have in motion, I thought there was a lot of good in talking about it. I'm sure a lot of the audience has kids in various stages of education, and I'm sure this has come up, so I thought this would be a great topic. Please share this with somebody else. I think David's doing some fantastic work, and I'd love to see what other people are doing and thinking about this and how you're coping with it, even at home. Also like, subscribe, comment, let me know how the show's going for you. Until next time, thank you and goodbye.