Time to Get Hacking

By Tom Porter

“We made an app to encourage climate action among the Bowdoin community,” said Eva McKone ’26. After spending the day immersed in the world of generative artificial intelligence (AI), McKone and two fellow students had created a product.

“The app works by quantifying how much carbon you can save by, for example, riding your bike to school, or eating plant-based food, or taking shorter showers,” explained the biology major. “We hope giving people a number they can tangibly hold on to will help motivate them around these sustainable habits.”
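In code, the kind of tally McKone describes might look like the following minimal sketch. The habit names and per-unit emission factors here are illustrative placeholders, not the team's actual data:

```python
# Approximate kg of CO2 saved per unit of each habit.
# These factors are assumed for illustration only.
CARBON_FACTORS = {
    "bike_commute_km": 0.2,     # vs. driving, per km
    "plant_based_meal": 1.5,    # vs. a meat-based meal
    "shorter_shower_min": 0.1,  # per minute of hot water skipped
}

def carbon_saved(activities: dict[str, float]) -> float:
    """Return total kg of CO2 saved for a log of activities."""
    return sum(
        CARBON_FACTORS.get(name, 0.0) * amount
        for name, amount in activities.items()
    )

week = {"bike_commute_km": 10, "plant_based_meal": 4, "shorter_shower_min": 20}
print(f"{carbon_saved(week):.1f} kg CO2 saved this week")
```

The point of the design is exactly what McKone says: reducing a week of habits to a single number users can "tangibly hold on to."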

McKone and her team members, Sophia Naumov ’28 and Oscar Martinez Mendoza ’28, were among the approximately sixty students of all experience levels—mostly from Bowdoin, but also from Bates and Colby—who attended the Generative AI Hackathon on Tuesday, October 14.

The event was part of the College’s Hastings Initiative for AI and Humanities, which was launched earlier this year.

“The goal for you, as students, is to explore,” said Visiting Lecturer in Computer Science Christopher Martin, addressing the attendees as he kicked off the event, which he helped to organize.

“Students are using our new LibreChat system [a recently launched open-source AI chat platform] to access any large language model [LLM] they want,” he explained, “to do whatever they want and explore what those models are, and are not, capable of.” There’s no set goal in mind, he added, other than gaining familiarity with generative AI tools, i.e., models that generate new content after being trained on vast datasets.

AI Overview Plus a Philosophy Class
A range of experts were on hand, offering workshops in different aspects of the field. Among them was Adrienne Kinney, a postdoctoral associate in AI and Humanity, who led a session titled AI 101. “We provided an overview of how LLMs work and how their ‘predict-the-next-word’ foundation can be fine-tuned for specific applications. To illustrate these ideas, we compared the behavior of a tutoring-focused AI agent with that of a general-purpose model,” she said. “We aimed to illustrate the ways language models can support and expand the development of ongoing projects.”
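The “predict-the-next-word” foundation Kinney describes can be illustrated with a toy model that simply counts word pairs in a tiny corpus. Real LLMs learn these probabilities with neural networks over enormous vocabularies, so this is only a cartoon of the idea:

```python
from collections import Counter, defaultdict

corpus = "the model predicts the next word and the next word after that".split()

# Count how often each word follows each other word (a bigram table).
following: defaultdict[str, Counter] = defaultdict(Counter)
for current, nxt in zip(corpus, corpus[1:]):
    following[current][nxt] += 1

def predict_next(word: str) -> str:
    """Return the word most frequently seen after `word` in the corpus."""
    return following[word].most_common(1)[0][0]

print(predict_next("the"))  # prints "next"
```

Fine-tuning, in this cartoon, would amount to continuing to update the counts on text from a specific domain (say, tutoring dialogues) so the model's predictions shift toward that application.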

Kinney’s postdoc colleague Collin Lucken’s workshop offered a look into the philosophical underpinnings of generative AI. “What I wanted for attendees of the workshop was a brief introduction to some longstanding philosophical problems relevant in our current AI moment.” For instance, he explained, attendees were prompted to consider how they would judge for themselves whether any given AI system is conscious or sentient in the way human beings are. “Hopefully, attendees left the workshop with renewed curiosity about the nature of mind.”

Both Kinney and Lucken were recently hired by the College as part of the Hastings Initiative, along with two recent Bowdoin graduates who were taken on as postbaccalaureate fellows.

How to Tame Your Python: “Accountability Is Key”
A number of Bowdoin alumni who work in the field of software development and artificial intelligence also attended the event to share their perspectives on how AI is shaping their world.

Jackson Wilkinson ’05 works for venture capital firm F-Prime, where he brings his years of software design and development experience to bear in advising early-stage companies, mainly in the healthcare tech sector. “They’re all thinking about the best way to incorporate AI into their software development process,” said the music and philosophy major (who started writing code at the age of two!). His advice to them is to “frankly, consider allocating a lot less to junior-level engineers than you have in the past,” given the recent, dramatic improvement in the ability of generative AI models to write working code in common languages like Python.

“A year ago, LLMs were producing intern-level results,” he said. “Now, when used well, which is an important qualifier, these tools can perform more like someone with two or three years of experience.” This means companies today rely far less on software engineers to spend their time writing code by hand.

A more important skill set now, added Wilkinson, is knowing how to review the code these tools produce and how to ensure automated tests are comprehensive enough to catch errors.
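As a hypothetical illustration of that review-and-test workflow (not Wilkinson's actual example), suppose a coding assistant produced the small discount function below; the reviewer's job is to write the edge-case tests that confirm it behaves as claimed:

```python
# Hypothetical AI-generated function under review.
def apply_discount(price: float, percent: float) -> float:
    """Return `price` reduced by `percent` (0-100), rounded to cents."""
    return round(price * (1 - percent / 100), 2)

# The reviewer's tests: plausible-looking generated code most often
# fails at the edges, so those are the cases worth pinning down.
def test_apply_discount():
    assert apply_discount(100.0, 20) == 80.0   # typical case
    assert apply_discount(100.0, 0) == 100.0   # no discount at all
    assert apply_discount(100.0, 100) == 0.0   # full discount
    assert apply_discount(0.0, 50) == 0.0      # nothing to discount

test_apply_discount()
print("all checks passed")
```

Here the tests pass, but the same suite would immediately flag a generated version that, say, subtracted `percent` directly instead of converting it to a fraction.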

At the end of the day, someone has to be accountable for the code produced by these tools, he said, and a missed bug can have significant consequences.

The human accountability aspect was also underlined by software engineers Tucker Gordon ’17 and Mason Daugherty ’25.

While much code is now written by tools, said Gordon, being able to read and understand the different coding languages is still important. “Thanks to AI tools, I am now able to create so much more, so much more quickly, than I was a year ago,” said Gordon, who works for Archive, a software startup powering branded resale programs. But with increased output comes increased responsibility, he stressed, “and accountability is key.”

Daugherty works for LangChain, an open-source framework that helps clients create AI-powered applications. “We ship a tool that many Fortune 100 companies use and depend upon for critical applications. So, if something breaks in the code that I write, that could potentially mean lots of money lost—shareholders don't like that.”

All of this requires an unprecedented level of scrutiny and oversight from software engineers. It also adds up to a world where today’s undergraduates might be left scratching their heads when preparing for the job market they will soon face. In this fast-changing environment, where problem-solving is key, said Wilkinson, the education offered at colleges like Bowdoin will put graduates in a strong position. “If you’re somebody who has learned how to use technology effectively in a liberal arts context, from the perspective of various intersections of knowledge, this sets you up to be increasingly in demand in many organizations.”


Ethics and Accessibility
Librarian Beth Hoppe and Digital Content and Accessibility Consultant Juli Haugen spoke with students about the role of generative AI tools in research and education.

“In addition to the wide range of concerns about using generative AI as a tool for research and writing,” said Hoppe, “traditional citation styles and policies don’t always capture the intricacies of how generative AI may have been used in the course of a project—such as for transcription, data analysis, outlining, organization, or other process-oriented functions.” The important question to ask, she added, is “How do we then make sure we are representing our work in a responsible and ethical way?”

They urged students, as well as faculty and staff, to check out the AI ethics course they have put together in Canvas, which is intended to provide a foundation for critical inquiry into the ethics of generative AI use.

[Photo: whiteboard from the 2025 hackathon in Mills Hall]

Takeaways
One important lesson, said math and economics major Sophia Naumov ’28, is making sure you know how to use generative AI tools and how to ask them the right questions.

“Probably my biggest takeaway from the Hackathon, though, is that no matter how much AI takes over our world, you still need people behind the technology who are checking it, because you can’t 100 percent rely on what it’s telling you.”

Oscar Martinez Mendoza ’28, who’s considering majoring in physics, math, or computer science, said he was seriously impressed by the power of generative AI tools and their ability to perform incredibly complex tasks. But, he added, they still have a long way to go in some areas. “We were amazed by the difficulty we encountered in trying to accomplish something relatively simple.” As they designed their app, the students tried to create an image showing a finger pointing downward—but this proved beyond the machine’s capabilities. “After five attempts, we gave up.”

Bowdoin’s Generative AI Hackathon was sponsored by the Hastings Initiative for AI and Humanities, Bowdoin IT, and the Office of Career Exploration and Development (CXD).