Teaching Statement

My teaching has focused on mathematics courses for undergraduate students in engineering and computer science. While on the faculty at the U.S. Air Force Academy, I taught several semesters of Calculus I/II/III, Differential Equations, Engineering Mathematics (an introductory linear algebra and PDEs survey course), and Discrete Mathematics. I also served as the course director for Calculus III, Engineering Mathematics, and Discrete Mathematics, coordinating the syllabus and exams for these courses and leading the team of instructors and professors teaching all sections in those semesters. I was awarded Outstanding New Instructor, Outstanding Course Director, and Outstanding Academy Educator while at the U.S. Air Force Academy. While on the faculty there and as a research associate at CU Boulder, I have also had several opportunities to mentor undergraduate and graduate students in research.

Classroom Time

Face-to-face instruction time in the classroom is a finite resource and should be used wisely. It is an important opportunity to provide understanding or clarifications that are difficult to convey efficiently in written course materials. I focus on three main goals: building intuition around core concepts, helping students build plans for approaching problems, and helping students troubleshoot those plans when they fail to reach the desired solution. These goals are in line with the learning outcomes and program objectives of CU Boulder’s Computer Science BS program [oCS]. Specifically, graduates are expected to leverage foundational knowledge to design and implement solutions to problems.

Research suggests that although a clear presentation of material in a lecture can increase a student’s confidence in their understanding, that confidence does not necessarily translate into greater understanding, at least as measured by exams [CMRF16]. While I find a clear lecture helpful in establishing common language and some understanding of core concepts, I use in-class activities to build understanding and give students the opportunity to locate the gaps or errors in their understanding. Historically I have relied on MATLAB and Mathematica for these in-class exercises, as well as printable worksheets. As my research has shifted to open source software, I have been investigating tools like Jupyter notebooks [eal19] to replace proprietary software for in-class exercises. These notebooks, especially when hosted in a public location such as GitHub prior to the lesson, provide students with a modifiable asset that frames the topics for the lesson while allowing them to take notes and experiment with the material.
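As an illustration, an in-class notebook cell for a first-semester calculus lesson might look like the following sketch. The specific function, step size, and plotting choices are illustrative assumptions rather than material from a particular lesson.

```python
# Hypothetical in-class exercise cell: compare a forward-difference
# approximation of the derivative of sin(x) with the exact derivative cos(x).
# Students are asked to vary the step size h and observe how the error changes.
import numpy as np
import matplotlib.pyplot as plt

h = 0.1  # step size for the approximation; try smaller values
x = np.linspace(0, 2 * np.pi, 200)

approx = (np.sin(x + h) - np.sin(x)) / h  # forward-difference approximation
exact = np.cos(x)                         # exact derivative of sin(x)

plt.plot(x, approx, label=f"forward difference (h = {h})")
plt.plot(x, exact, "--", label="exact derivative")
plt.xlabel("x")
plt.legend()
plt.title("What happens to the gap as h shrinks?")
plt.show()
```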

Research suggests that collaborative learning strategies such as Think-Pair-Share [Nin19] facilitate learning. After introducing the lesson content in a more traditional lecture style, I provide opportunities for students to work on problems or experiment with the concepts, collaborating with their neighbors and raising common sticking points or areas of emphasis to a discussion with the whole class.

Assessments

I see curiosity and tangible applicability as important to student motivation. To that end, I prefer project-based learning, with collaborative projects that illustrate the practical applicability of core course concepts. I use smaller homework sets or quizzes as formative assessments that give students quick feedback on their understanding and progress, intermediate project deliverables as slower feedback, and final project deliverables as the summative assessment. For some courses, in-person examinations are the most appropriate summative assessment, but I avoid putting students into a high-pressure, “all or nothing” situation in which these exams represent a disproportionate portion of their grade.

When designing assessments, I have three goals in mind. First and foremost, assessments should give students a way to measure their progress toward course goals. The concept of learning scaffolding does not map exactly to a university setting [Sta15], but I try to build up student knowledge and capabilities throughout the semester, with the goal of developing independent problem-solving skills in the course domain by the end of the semester. While students and instructors often interpret the role of feedback differently [BODohertyS11], it remains an important part of helping students build their skills. To this end, I employ smaller formative assessments such as quizzes and intermediate project deliverables to facilitate feedback and help students understand their progress and adapt as needed.

Students face many demands on their time outside of the classroom, which means that out-of-class assessments such as homework and projects must compete with their other obligations. My second goal is to design assessments that make the best use of the available student time. Small in-class assessments, such as quizzes, provide an opportunity for immediate feedback, and in that setting I do not have to compete for student time. From a pragmatic point of view, out-of-class assessments need to feel sufficiently meaningful that students allocate some of their limited time to completing them. I prefer smaller homework sets that clearly link to future in-class assessments, along with larger collaborative projects. I have received largely positive feedback on these considerations in FCQs, though that feedback has come from mathematics courses.

Finally, and most importantly, I try to build assessments that are fair across the wide range of student backgrounds and challenges. Factors such as unconscious bias, systemic marginalization, and differing levels of assessment anxiety can produce drastically different outcomes for two students with comparable understanding of a topic. I provide a range of assessments so that students have multiple, different opportunities to demonstrate their understanding of key course content.

Mentoring

Mentoring is a critical component of supporting graduate student research. When a student’s research involves contributions to open source software projects, those contributions provide a natural framework for mentoring. The requirement that every contribution to a healthy open source codebase include testing and documentation helps frame discussions about what, specifically, the student is trying to accomplish and where they are encountering difficulty. Planning out a new contribution or series of contributions provides a concrete framework for discussing research goals. Similarly, the code review process provides a good structure for giving feedback on the student’s specific short-term goals. The format of open source contributions tends to focus mentoring on smaller, specific opportunities, but it can also generate conversations that are broader or more cross-cutting.

Members of research groups tend to adopt different schedules, leading naturally to asynchronous communication. GitLab’s guide to asynchronous work [Git25] offers suggestions that are applicable to mentorship in this context. Many of its recommendations apply directly, such as writing complete messages that include all of the relevant context for a question and all of the resources required to reproduce the current problem. This practice makes in-person or real-time interactions more productive, and the process of clearly writing out a question with its background can help students discover the solution themselves or shift the conversation from basic fact-finding to comparing different options or perspectives.

I tend to move these mentoring discussions to public channels, such as Zulip (similar to Slack) or GitHub/GitLab issues, when appropriate. This allows more people to participate in the discussion, leading to better advice and assistance. It also underscores a core value I try to share with students: learning in public. I like to emphasize that all of us always have more to learn. Academia operates best when we honestly and openly admit gaps in our knowledge and collaboratively pursue new knowledge.

I also mentor students online outside of my research. My largest effort in this vein is volunteering as a mentor and community leader at freeCodeCamp, a free and open source coding education platform designed to give anyone in the world access to the resources needed to learn the skills required to start a career in web development.

References

[BODohertyS11]

Chris Beaumont, Michelle O’Doherty, and Lee Shannon. Reconceptualising assessment feedback: a key to improving student learning? Studies in Higher Education, 36(6):671–687, 2011.

[CMRF16]

Shana K Carpenter, Laura Mickes, Shuhebur Rahman, and Chad Fernandez. The effect of instructor fluency on students’ perceptions of instructors, confidence in learning, and actual learning. Journal of Experimental Psychology: Applied, 22(2):161, 2016.

[eal19]

Lorena A. Barba et al. Teaching and Learning with Jupyter. 2019. URL: https://jupyter4edu.github.io/jupyter-edu-book/.

[Git25]

GitLab. GitLab's guide to all-remote. 2025. URL: https://handbook.gitlab.com/handbook/company/culture/all-remote/guide.

[Nin19]

Yarisda Ningsih. The use of cooperative learning models think pair share in mathematics learning. In Journal of Physics: Conference Series, volume 1387, 012144. IOP Publishing, 2019.

[Sta15]

Clare Stanier. Scaffolding in a higher education context. In ICERI2015 Proceedings, 7781–7790. IATED, 2015.