36 | LEHIGH ALUMNI BULLETIN

IN HIS office in Lehigh's Alumni Memorial Building, Provost Nathan Urban displays this canvas: a man sporting a gray crew cut, dark business suit and dangling blue tie, sitting atop a mythical creature that's a cross between a dragon and a hawk. The two soar over green treetops, distant mountains visible in the background. The bold brushstrokes that look like thick paint are reminiscent of Van Gogh, but no painter created this image.

The canvas was made by DALL-E 2, a deep learning model developed by OpenAI, an American artificial intelligence company, to generate digital images from text descriptions called "prompts." It was a gift from Chris Kauzmann, interim director of Lehigh Ventures Lab, who asked the generative artificial intelligence (AI) platform to create an image of "a university provost riding a mountain hawk in the style of Vincent Van Gogh."

The picture serves as a visual aid for Urban, demonstrating both the capabilities of generative AI and its drawbacks. DALL-E 2 created a fictional mountain hawk that Urban says he never would have imagined, but its depiction of a provost is "not surprisingly, a white man in a suit and tie" rather than someone of another gender or race. "I think that provides a concrete example of some of the concerns about bias intrinsic to generative AI," he says.

Still, Urban doesn't believe the Lehigh community can ignore the technology or pretend that it won't have an impact on students as they carve their futures. He predicts the tools will improve, grow so widely used that they become ubiquitous and set a new baseline for competence, and present challenges by introducing bias into the work they generate.

"I think we need to recognize and actively engage as a campus, especially with our students, in the ways in which these tools will shape the future," he told faculty and staff at a symposium on teaching and learning at Lehigh earlier this year.
Promising yet problematic, generative AI can create new content such as text, images, audio and video, but it is limited to the data sets it has access to. It lacks the ability to come up with novel ideas or to recognize abstract concepts like irony, something that currently only humans can do. It can refine ideas, aid in creating new ones, save time on repetitive tasks such as writing emails and test students' knowledge by prompting them to think critically about the responses it generates.

As generative AI becomes more widespread, its potential to disrupt higher education is causing universities such as Lehigh to consider how the technology fits into the classroom, to what degree it should be used by faculty and students, and how to harness its benefits for students as they prepare to enter a workforce where generative AI will likely be commonplace.

A neuroscientist, Urban has always been interested in how the brain does what it does, especially in domains where computers don't do it very well. "I would say, up until very recently, writing and creation of images have been tasks where humans were completely untouchable," Urban says. The gap between what humans and computers could produce was so large it was almost laughable, he says, but that gap is quickly closing with recent developments in generative AI.

"What does that mean for education?" Urban asks. "How does that change what we should be expecting of students, and how does that change what students should be expecting from the university in terms of the kind of skills and knowledge they should be gaining during their university career?"

WHAT IS GENERATIVE AI?

One of the most well-known generative AI platforms is ChatGPT, a language model-based chatbot released by OpenAI on Nov. 30, 2022. It enables users

[Photo caption: Provost Nathan Urban holds a canvas created by DALL-E 2 that he displays in his office. Photo: CHRISTA NEU]