• YU Learning Assessment

  • Our Commitment

    Welcome to the Yeshiva University Learning Assessment website. At Yeshiva University we are committed to student learning assessment to ensure that our colleges and schools, programs/majors, and courses are successfully fulfilling their educational missions, goals, and objectives. The purpose of this website is to provide faculty and staff with assessment-related tools and resources that help guide the development and implementation of effective learning assessment plans.

    Learning Assessment Spotlight:

    Department of Mathematical Sciences, Yeshiva University

    by

    Dr. Thomas Otway, Professor and Department Chair

    Assessment of the computer science curriculum was facilitated by the way in which the curriculum evolved. Many years ago, Michael Breban, with the help of Arnold Lebow and other department faculty, put together a computer science program at Yeshiva College that conforms to Association for Computing Machinery (ACM) standards. In an ACM-approved curriculum, courses are designed to implement learning goals that practicing computer scientists consider important for graduates of computer science programs. When Van Kelly joined the department in 2010, he enhanced that curriculum with courses reflecting recent trends and concerns in industrial computing. The curriculum map that we prepared for the formal assessment turned out to be a natural way to organize the goals already built into the curriculum. Careful tracking of student performance is also standard procedure for our computer science faculty. What is new is that such data are now concentrated in one source available to the entire department, rather than scattered across directories on individual faculty members' computers, and are assessed using metrics agreed upon by the computer science faculty. The national societies of many fields release standards for specialized education in those fields, and we found these standards to be quite useful as a general guide for assessing student learning in computer science.

    Concretely, the curriculum map for the computer science program was constructed as follows. First, we wrote down the topics of the required courses and asked ourselves how each topic contributes to the education of a computer scientist. Then we compared those results to the general goals for computer science programs in the ACM guidelines that we follow. In this way, the curriculum map was "reverse-engineered" from the required course offerings. Finally, we reviewed the ACM guidelines (including those released very recently) to decide whether all of them were represented in the curriculum. The curriculum map is, except for that last step, the inverse of the process that originally created the current computer science major at Yeshiva College from the ACM guidelines in the 1990s.
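
    The last step amounts to a simple coverage check of the curriculum against the guideline goals. A minimal sketch in Python may make the idea concrete; the course names and goal labels below are invented placeholders, not our actual map or the ACM goals:

        # A sketch of the curriculum-map coverage check described above.
        # All course names and goal labels are illustrative assumptions.
        curriculum_map = {
            "Introduction to Programming": {"software development", "algorithms"},
            "Data Structures":             {"algorithms", "software development"},
            "Computer Organization":       {"systems fundamentals"},
            "Operating Systems":           {"systems fundamentals", "concurrency"},
            "Theory of Computation":       {"algorithms", "theory"},
        }

        # Guideline goals the curriculum is checked against (the last step).
        guideline_goals = {"software development", "algorithms", "theory",
                           "systems fundamentals", "concurrency",
                           "information assurance"}

        covered = set().union(*curriculum_map.values())
        missing = guideline_goals - covered
        print("Covered goals:", sorted(covered))
        print("Goals not yet represented:", sorted(missing))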

    Assessment of the curriculum in mathematics is more complicated, for several reasons: there are three tracks (the actuarial science track, the computer track, and the general track); the program is offered on two campuses, partly with shared faculty and with slight differences in curriculum; and three degrees are offered, the B.A., the M.A., and the Ph.D. We found, for example, that splitting the assessment of advanced courses into the three tracks produced differences among the subgroups that were statistically insignificant; for that reason, achievement of learning goals for each course in the mathematics program at each campus is assessed for the whole class rather than for the students in each separate track. (Assessing courses by track has the additional disadvantage that students tend not to formally declare their track until they apply for the degree.) Although we pool data among the tracks, we do not pool data across the campuses. For historical reasons, and because of differences in faculty specialization, the rubrics adopted at the two campuses are similar but not identical. For example, computer science is integrated into the electives for the major at the Beren Campus, whereas it is a separate track of the major at the Wilf Campus. There are also options for the Advanced Calculus requirement at Beren, and those options needed to be taken into account when determining the rubrics for that campus. We also found that assessment practices for the department's small doctoral program differ qualitatively from assessment of the undergraduate programs. The main difference is that at the doctoral level, quantifiable progress in student learning may not be evident for several semesters, due to the nature of advanced research in mathematics.
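
    For readers curious how such a pooling decision can be checked, the following sketch runs a one-way analysis of variance on rubric scores from the three tracks and pools when the differences are insignificant. The scores and the 0.05 threshold are assumptions for illustration only, not departmental data:

        # A sketch of the track-pooling decision described above. The rubric
        # scores below are invented placeholders, not actual assessment data.
        from scipy import stats

        actuarial = [3.2, 3.8, 3.5, 3.1, 3.6]   # hypothetical rubric scores
        computer  = [3.4, 3.3, 3.7, 3.5, 3.2]
        general   = [3.5, 3.1, 3.6, 3.4, 3.3]

        # One-way ANOVA: are between-track differences larger than the
        # within-track variation would suggest?
        f_stat, p_value = stats.f_oneway(actuarial, computer, general)

        if p_value > 0.05:
            # Differences are statistically insignificant: assess the whole
            # class together rather than each track separately.
            pooled = actuarial + computer + general
            print(f"p = {p_value:.2f}; pooling all tracks (n = {len(pooled)})")
        else:
            print(f"p = {p_value:.2f}; assessing tracks separately")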

    The adoption of rubrics for the mathematics program required a certain amount of self-discipline. Some faculty members doubt that the standard, traditional mathematics curriculum is the best possible one for majors. The adoption of such a curriculum in a department like ours, with a very small faculty, a large number of majors, and a truly huge number of students in our many service courses, is as much a response to the expectations of other departments, professional programs, and employers as it is an expression of our shared belief about what constitutes the best preparation in mathematics. But we realized that assessment activities have to be directed at the program we have rather than at the program many of us would like to have under better conditions. Revisions to the program, which are instituted continually to meet changing conditions but are subject to the usual external constraints, are reflected in the rubrics only once they have been fully incorporated into the program requirements. This policy allowed us to focus on whether current learning objectives are being achieved.

    Physics Department, Stern College for Women

    by

    Dr. Anatoly I. Frenkel, Department Chair

    During the Spring 2014 semester, we designed and applied rubrics to assess eight program-level Student Learning Objectives (SLOs) for our four core programs: General Education courses, Major in Physics, Major in Physical Sciences, and Major in Pre-Engineering.

    For example, in the General Education courses, one objective is for students to know the fundamental laws of physics (in their most general formulations) and understand their physical implications. In addition, we want students to know how to adapt these general formulations to concrete applications. As another example, in the Major in Physics program, a key objective is for students to be able to choose relevant theories and research for solving a specific physics problem. Other student learning objectives relate to knowledge of fundamental physics laws and concepts and their implications, numerical insight in solving problems, and analytical techniques in laboratory settings.

    To determine whether students are attaining program-level objectives in their physics courses, we collected and analyzed various sources of performance data from tests, lab reports, and student presentations using faculty-designed rubrics. More specifically, for each program-level SLO, a Department faculty member, often in consultation with other faculty, designed a rubric to analyze student work in light of the program-level objective. The final version of each rubric was approved by the entire Department. The departmental assistant, Rakhi Podder, performed statistical analyses of all data collected by faculty in their classes. At the faculty meeting on March 26, 2014, these data were discussed and used to identify problems that students are having with different program components, such as insufficient mathematical background for some advanced courses, and also to reveal students' particular strengths (e.g., graphical representation of concepts).
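
    As a rough sketch of the kind of per-SLO summary such an analysis might produce (the SLO names, scores, rubric scale, and 70% threshold below are all illustrative assumptions, not the Department's actual data or rubrics):

        # Hypothetical rubric scores grouped by SLO; a 4-point rubric scale
        # and a 70% attainment threshold are assumed for illustration.
        slo_scores = {
            "Knows fundamental physics laws":      [4, 3, 4, 2, 3, 4],
            "Adapts general laws to applications": [3, 2, 2, 3, 2, 3],
            "Chooses relevant theories":           [4, 4, 3, 4, 3, 4],
        }
        MAX_SCORE, THRESHOLD = 4, 0.70

        for slo, scores in slo_scores.items():
            attainment = sum(scores) / (len(scores) * MAX_SCORE)
            status = "needs attention" if attainment < THRESHOLD else "on target"
            print(f"{slo}: {attainment:.0%} ({status})")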

    At the end of the Spring 2014 semester, the analysis will be complete, and the Department's faculty will meet again to discuss possible changes to the programs and, if needed, to the SLOs. A new set of SLOs will be tested during the Fall 2014 semester.

  • Contact Us

    Please contact us if there is any aspect of this website or student learning assessment that you would like to discuss.

    Rachel J. Ebner, PhD
    Director of Student Learning Assessment
    Belfer Hall 1300A
    212.960.5400, ext. 6950
    rachel.ebner@yu.edu

Yeshiva University
500 West 185th Street
New York, NY 10033
212.960.5400
