Fred Salsbury

Graduate Director
Department of Physics

Courses: Mechanics, Intermediate Lab II, General Physics I, Topics in Cancer Biology, Research Topics in Structural & Computational Biophysics


What do you teach and how have you been thinking about artificial intelligence in the context of those courses?

Depending on the semester and year, I teach Phy 262, Mechanics, co-teach Phy 266, Intermediate Lab II, teach Phy 113, General Physics I, which is mostly introductory mechanics, co-teach a week of MCB 723, Topics in Cancer Biology, and organize talks in SCB 710, Research Topics in Structural & Computational Biophysics.

What excites me about AI in general is the potential to hand off more mundane and tedious work and focus on the more significant and creative work. However, the border between what is mundane and what is significant shifts between first-year physics and actual physics research, so I worry that students might jump too far, too fast without careful coaching and miss key foundations, or learn things wrong. This is especially problematic because right now (Feb. 22, 2024), generative AI makes significant mistakes in physical and mathematical reasoning. This is not a surprise, and generally ChatGPT+ can be talked through the mistakes; the mistakes are usually ones students would make. But this raises the question: how proficient in science do you have to be to use generative AI to help you in science? I am trying to get some sense of this in Phy 262 this semester, and in general I am cautiously optimistic that careful coaching can occur in my smaller courses.


I have started exploring its use in Phy 262, Mechanics, which is required for biophysics and physics majors and an elective for engineering majors. First, I’ve given them explicit permission to use generative AI to brainstorm ideas for their WakerSpace Project, which started at the beginning of the semester; I will ask them about their usage at the end of the semester. Second, this course introduces them to computation in physics, and right before Spring Break they have a Friday where they are asked to computationally model the simple pendulum and examine its harmonic and anharmonic behavior. By then they don’t need any code examples, just the questions. In addition, they will have an assignment to use ChatGPT to do the same modeling and assess: could they get it to do the modeling correctly; how hard was it; could they have used ChatGPT when they were more novice in computation or mechanics, as they were at the beginning of the semester; was it easier or harder than doing it themselves; and what do they think about its utility in modeling physical systems? Then, the Friday after Spring Break, we will get together to discuss in small groups, and they will write up their reports.
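For concreteness, here is one way the pendulum comparison in that assignment might be set up. This is my own illustrative sketch, not the actual assignment code: the integrator choice (a hand-rolled fourth-order Runge-Kutta step), the parameter values, and the function names `simulate` and `period` are all assumptions made for the example.

```python
import math

def simulate(theta0, anharmonic=True, g=9.81, L=1.0, dt=1e-3, t_max=10.0):
    """Integrate a pendulum released from rest at angle theta0 (radians).

    anharmonic=True uses the full equation  theta'' = -(g/L) sin(theta);
    anharmonic=False uses the small-angle (harmonic) form  theta'' = -(g/L) theta.
    Returns a list of (time, theta) samples.
    """
    def accel(theta):
        return -(g / L) * (math.sin(theta) if anharmonic else theta)

    theta, omega, t = theta0, 0.0, 0.0
    history = [(t, theta)]
    while t < t_max:
        # Classic RK4 step for the coupled system (theta, omega).
        k1t, k1w = omega, accel(theta)
        k2t, k2w = omega + 0.5 * dt * k1w, accel(theta + 0.5 * dt * k1t)
        k3t, k3w = omega + 0.5 * dt * k2w, accel(theta + 0.5 * dt * k2t)
        k4t, k4w = omega + dt * k3w, accel(theta + dt * k3t)
        theta += dt * (k1t + 2 * k2t + 2 * k3t + k4t) / 6
        omega += dt * (k1w + 2 * k2w + 2 * k3w + k4w) / 6
        t += dt
        history.append((t, theta))
    return history

def period(history):
    """Estimate the period from the first two downward zero crossings."""
    crossings = []
    for (t0, th0), (t1, th1) in zip(history, history[1:]):
        if th0 > 0 >= th1:  # theta passes through zero heading downward
            # Linear interpolation for the crossing time.
            crossings.append(t0 + (t1 - t0) * th0 / (th0 - th1))
    return crossings[1] - crossings[0]
```

At a small amplitude (say 0.1 rad), both models reproduce the textbook period 2π√(L/g) ≈ 2.01 s; at a large amplitude such as 2.0 rad, the anharmonic period grows to roughly 2.67 s while the harmonic model stays at 2.01 s, which is the amplitude dependence the students are asked to observe.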

Also, in the team-taught MCB 723, Topics in Cancer Biology, I will have a lecture where I will be talking, in part, on Generative AI in Drug Discovery.

Post-spring-break update and takeaway on the Phy 262 assignment:

Here you’ll find the problem they were asked to replicate and all of the redacted reports. The students were warned that the redacted assignments would be shared broadly. The best four are labeled A, B, C, and D; the rest are numbered.

One thing I noticed during class is an extreme gender difference, granted that this class is small, with only 4 women and 8 men. With that caveat, the women overall seemed far more engaged in asking questions of ChatGPT, conversing with it by asking for clarifications, and being genuinely curious about it. They seemed to be enjoying themselves. Most of the women also explored other uses in class, such as asking for practice problems for a test.

The consensus from the reports seems to be: use with caution. The physics will often be not quite right, and the code not exactly what was asked for at first, but ChatGPT would be useful for debugging, for getting unstuck, or for explaining what particular pieces of code do. The students would also like to learn how to ask better questions of ChatGPT.


My hunch was that they would mostly figure out how to get it to work, though it would take some effort and expertise, and it has been interesting to see the results. I picked the Fridays before and after Spring Break because, given that these are biophysics, engineering, and physics majors, some of them would be very excited and want to spend a lot of time experimenting.

Lessons Learned

Right now, the only advice I have is to be honest with the students that these are experiments, both because AI is a fast-moving field and because we need to figure out how to use AI to support learning rather than replace learning. Hopefully, I will have more advice after this semester.

Disciplinary Insights

So far my focus outside of classes has been with helping graduate students navigate the use of generative AI:

1) I helped Dr. Gmeiner, in the MCB 723, Topics in Cancer Biology class, show that ChatGPT-3 was not capable of producing correctly written or cited work for a graduate-level review article. It produced fake citations (which not all the students checked!), made frequent slight errors, and wrote at the level of introductory undergraduates. This was mostly Dr. Gmeiner’s work last year and will be repeated, although I will also lecture on Generative AI in Drug Discovery, as mentioned above.

2) My own physics PhD students have been eager to explore generative AI, not surprisingly, given that we do research in computational biophysics and have been using machine learning in data analysis for years, and we continue to work together to see how generative AI can be used. So far we have seen that, if careful attention is paid to keeping the science correct, ChatGPT can do a good job of improving their writing for publication; if not done carefully, however, it can produce irrelevant or nice-sounding nonsense. Not surprisingly, my senior graduate student is much more capable of assessing ChatGPT’s output than more junior students are. We also hope, in the near future, to explore the use of AlphaFold in our biophysics and drug discovery research.