
YU Learning Assessment
Office of the Provost
Our Commitment
Welcome to the Yeshiva University Learning Assessment website. At Yeshiva University we are committed to student learning assessment to ensure that our colleges and schools, programs and majors, and courses are successfully fulfilling their educational missions, goals, and objectives. This website provides faculty and staff with assessment-related tools and resources to help guide the development and implementation of effective learning assessment plans.
Assessment Toolkit Resources
- January 2025 -- Defining success on assessments
- December 2024 -- AI rubric-generating tools
- November 2024 -- The important role of rubrics in closing the assessment loop
- October 2024 -- Direct versus indirect assessment evidence
- September 2024 -- Program assessment key reminders
- August 2024 -- Assessment planning matrix
- July 2024 -- Setting benchmarks and performance targets
- June 2024 -- Authentic assessment resources
- May 2024 -- Examples of assessment-driven changes
- April 2024 -- Assessment and accreditation
- March 2024 -- Using caution when using online tracking tools to measure student engagement
- February 2024 -- Rethinking student learning goals and assessments in light of AI
- January 2024 -- Resources for effectively analyzing and presenting assessment data
- December 2023 -- Using rubrics for self-assessment
- November 2023 -- Tips for reducing students' cognitive loads during highly anxious times
- October 2023 -- Assessment during times of crisis
- September 2023 -- Revisiting the importance of formative assessment
- August 2023 -- Using ChatGPT to advance assessment practices
- July 2023 -- Issues with grading
- June 2023 -- Deterring students from cheating with AI
- May 2023 -- Suggestions for "closing the assessment loop"
- March 2023 -- Creating equitable assessment practices
- February 2023 -- Distinguishing between students' assignment grades and performance on student learning assessments
- January 2023 -- Why use rubrics to grade assessments?
- December 2022 -- The benefits of self-reflection exercises for promoting students' metacognition
- November 2022 -- Tips for encouraging students to use feedback
- October 2022 -- Assessing your assessment process
- September 2022 -- Mapping course objectives onto program objectives
- August 2022 -- Getting to know your students' needs
- July 2022 -- Tips for effective assessment planning
- June 2022 -- Reflecting on and revising assessment methods
- May 2022 -- Authentic assessments
- April 2022 -- Closing the assessment loop!
- March 2022 -- Tips for promoting mastery motivation over performance motivation
- February 2022 -- Educating students about plagiarism
- January 2022 -- Reminder about distinguishing between program assessment outcomes and assignment and course grades on student-learning assessment reports
- December 2021 -- Promoting self-regulated learning through feedback strategies
- August 2021 -- Incorporating online assessments into your in-person courses
- July 2021 -- The importance of considering Maslow's hierarchy of needs when teaching and assessing students online
- June 2021 -- The value of student feedback for course or program improvement
- May 2021 -- Tips for designing effective assessments
- April 2021 -- The importance of distinguishing between performance assessment outcomes and assignment grades
- March 2021 -- Interactive formative assessment tools for online learning
- February 2021 -- Creating a collaborative assessment process
- January 2021 -- Tips for speeding up grading
- December 2020 -- Designing online course assessments to promote students' engagement and metacognitive thinking skills
- November 2020 -- Strategies for reducing cheating on online assessments
- October 2020 -- The importance of periodically reviewing, updating, and sharing program curriculum maps
- September 2020 -- Creating an effective assessment plan
- August 2020 -- The importance of self-assessment for online learning
- July 2020 -- Tips for fostering students' self-regulated learning in asynchronous online learning environments
- June 2020 -- The importance of reflecting on online learning assessments
- May 2020 -- Using an online multifaceted assessment approach
- April 2020 -- Online discussion rubrics
- March 2020 -- Online learning assessments
- February 2020 -- Creating uniformity in assessment terminology
- January 2020 - The importance of revising and refining rubrics
- December 2019 - Performance assessments
- November 2019 - Tips for generating and posing strategic questions in the classroom
- October 2019 - Midsemester reflections on student learning and instruction
- September 2019 - Creating an effective assessment plan
- May 2019 - Suggestions for “closing the assessment loop!”
- April 2019 - Turning generic rubrics into task-specific rubrics
- March 2019 - Using online discussions for informal assessment
- February 2019 - Direct versus indirect assessment evidence
- January 2019 - Using final exam results to close the assessment loop!
- December 2018 - Revisiting your assessments
- November 2018 - Rubric essentials
- October 2018 - Assessing your program-level assessment process
- September 2018 - Creating meaningful assessments
- August 2018 - The importance of aligning student learning objectives, instructional methods, and methods of assessment
- July 2018 - The importance of implementing multiple and varied assessment methodologies
- June 2018 - Suggestions for “closing the assessment loop!”
- May 2018 - The importance of formative assessments for informing student learning and instruction
- April 2018 - Using self-assessment to enhance learning and performance
- March 2018 - Reflecting Upon Midterm Exam Results for Program Improvement
- December 2017 - Revisiting your assessments
- November 2017 - Tips for assessing student learning in online learning environments
- October 2017 - Tips for creating an effective rubric
- September 2017 - Mapping course objectives onto program objectives
- August 2017 - Involving students in the assessment process
- July 2017 - Changing the way we talk about assessment
- June 2017 - The importance of aligning student learning objectives, learning opportunities, and methods of assessment
- May 2017 - Suggestions for “closing the assessment loop!”
- April 2017 - Reflecting upon midterm exam results for program improvement
- March 2017 - The benefits of peer feedback for improving student learning
- February 2017 - Reflecting on, revising, and refining program-level assessment processes
- January 2017 - Fostering students' motivation to master learning material by providing timely and concrete assessment feedback
- December 2016 - Why continually documenting student learning assessment is so important
- November 2016 - Midterm reflections
- October 2016 - Using E-portfolios to document a student’s learning journey
- September 2016 - Creating an effective assessment plan
- August 2016 - Distinguishing between grading and assessment
- July 2016 - The power of self-assessment
- June 2016 - Curriculum mapping
- May 2016 - The essentials about rubrics
- April 2016 - Distinguishing between learning goals and objectives
- March 2016 - What does it mean to "close the loop"?
- February 2016 - Distinguishing between formative and summative assessment
- November 2015 - Distinguishing between indirect and direct assessment evidence
1. Clearly define the program's/major's mission
2. Identify student learning outcome goals that directly align with the program's/major's mission
3. Define learning goals by stating objectives
4. Map out which program/major courses and learning experiences will enable students to achieve program/major goals (curriculum mapping)
5. Devise a program/major assessment plan and timeline
6. Identify which goals you are going to assess and when
7. Develop comprehensive methods for both directly and indirectly assessing students' attainment of those goals (NOTE: no one assessment can evidence learning; multiple pieces of evidence are needed)
8. Develop corresponding scoring rubrics to ensure consistency and accuracy in scoring assessments (NOTE: rubrics are not the assessment, but a tool for scoring assessments)
9. Implement the assessment plan and continuously monitor its effectiveness, making changes or improvements when necessary
10. Analyze assessment results and communicate/report findings
11. Use assessment results to inform and improve the program's/major's effectiveness in meeting learning goals and objectives
12. Document steps 1-11
- How to Write Missions from the University of Connecticut's assessment website
- How to Write Learning Goals from the University of Connecticut's assessment website
- How to Write Learning Objectives from the University of Connecticut's assessment website
- Bloom's Taxonomy of Learning Domains and Bloom's Taxonomy Action Verbs
- AAC&U VALUE Rubrics: The Association of American Colleges and Universities (AAC&U) has developed a set of peer reviewed Valid Assessment of Learning in Undergraduate Education (VALUE) rubrics including: critical thinking, written communication, oral communication, quantitative literacy, information literacy and more.
- Rubistar: A web-based interactive tool for designing rubrics
- Presentation on the Design & Use of Scoring Rubrics
- Using rubrics for program assessment from Loyola Marymount University's assessment website
- Rubric Tutorial from University of South Florida
- Assessing Student Learning and Institutional Effectiveness: Understanding Middle States Expectations (2005). Middle States Commission on Higher Education. A brochure prepared by MSCHE detailing its expectations for meeting Standard 7: Institutional Assessment and Standard 14: Assessment of Student Learning
- Assessment Primer (University of Connecticut). Provides an overview of assessment related concepts including information on how to write effective program-level mission statements, goals, objectives, curriculum maps, and assessment plans
- OAPA Handbook (Program-Based Review and Assessment, University of Massachusetts at Amherst). An online handbook on program-based assessment. It includes chapters on why to assess, defining goals and objectives, designing assessment programs, assessment strategies and methods, analyzing, reporting and using results
- Student Learning Assessment: Options and Resources (2nd Edition, 2007). Middle States Commission on Higher Education. A publication by MSCHE presenting assessment-related options and resources.
- Weiner, Wendy (2009). Establishing a culture of assessment: Fifteen elements of assessment success—how many does your campus have? American Association of University Professors. Retrieved January 22, 2020
- Anderson, L. W., & Krathwohl, D. R. (Eds.). (2001). A taxonomy for learning, teaching, and assessing: A revision of Bloom's taxonomy of educational objectives (complete ed.). New York: Longman.
- Assessment: FAQ. Stanford University Office of Institutional Research and Decision Support. Retrieved October 31, 2013.
- Assessment Primer. University of Connecticut. Retrieved November 5, 2013.
- Characteristics of excellence in higher education: Requirements of affiliation and standards for accreditation (online version, 2009). Middle States Commission on Higher Education. Retrieved November 4, 2013.
- Glossary. Lehman College Office of Assessment and Planning. Retrieved October 31, 2013.
- Linn, R. L., & Miller, M. D. (2005). Measurement and assessment in teaching (9th ed.). Upper Saddle River, NJ: Pearson.
- Mager, R. F. (1997). Preparing instructional objectives: A critical tool in the development of effective instruction. Atlanta, GA: Center for Effective Performance.
- OAPA handbook: Program-based review and assessment. UMass Amherst. Retrieved October 31, 2013.
The purpose of the AAC is to promote and support YU’s learning assessment efforts by:
- Fostering a positive assessment culture throughout the University
- Supporting and facilitating University-wide assessment activities such as (1) disseminating assessment information across YU colleges/schools, including identifying best models and practices, and (2) collecting, documenting, and sharing assessment information for program/major improvement
The Committee meets at least once each semester. Its members are as follows:
- Dr. Rachel Ebner (Chair), Director of Assessment; Clinical Assistant Professor of Psychology
- Dr. Selma Botman, Provost and Senior Vice President for Academic Affairs
- Dr. Timothy Stevens, Deputy Provost; MSCHE ALO, NYSED CEO-Designee & NC-SARA Primary Contact
- Dr. Sean McKitric, Director of Assessment and Quality Improvement, Katz School of Science and Health
- Dr. Yuxiang Liu, Director of Institutional Research
- Dr. Rebecca Cypess, The Mordecai D. Katz and Dr. Monique C. Katz Dean, Undergraduate Faculty of Arts and Sciences
- Dr. Avi Giloni, Associate Dean, Sy Syms School of Business
- Dr. William Stenhouse, Associate Dean for Academic Affairs, Yeshiva College
- Dr. Daniel Rynhold, Dean of the Bernard Revel Graduate School of Jewish Studies
- Dr. Sandy Moore, Director of University Libraries
- Dr. John Vivolo, Executive Director of Academic Operations and Teaching and Learning, Katz School of Science and Health
Program Mission Statements, Goals, & Objectives
- M.S. Biotechnology Management and Entrepreneurship
- M.S. Cybersecurity
- M.S. Digital Marketing and Media
- M.S. Enterprise Risk Management
- M.S. Artificial Intelligence
- M.S. Data Analytics
- M.S. Physician Assistant Studies
- A.S. Management
- A.S. Liberal Arts Education
- M.A. & Ph.D. Math Program
- M.S. Physics
- M.A. Quantitative Economics
- M.S. Speech Language Pathology
- B.S. Nursing
- Doctorate in Occupational Therapy
FAQ
What is assessment?
Assessment is "the systematic and ongoing process of gathering, analyzing, and using information from multiple sources to draw inferences about the characteristics of students, programs, or an institution for the purpose of making informed decisions to improve the learning process" (Linn & Miller, 2005). The principle that assessment is a systematic and continuous process, not an end product, is central to this definition.
Assessment is...
- a cyclical process not an end goal
- planned and systematic not random and variable
- ongoing and cumulative not one point in time
- multifaceted not singular
- informative not a judgment
- objective not subjective
- transparent not unclear or hidden
- pragmatic not useless
- faculty designed and implemented not imposed from the top down
What are the different types of assessment?
- Classroom assessment: assessing an individual student’s learning experience in a course
- Program assessment: assessing a group of students’ learning experience in relation to a program, departmental major or unit of study
- Institutional assessment: assessing campuswide factors
Why assess?
“If you don’t know where you are going, the best-made maps won’t help you get there” (Mager, 1997, p. vi).
- Assessment promotes self-reflection, which is essential for effective teaching and learning (Assessment: FAQ, Stanford University). It helps you to reflect on:
- What goals you are trying to accomplish
- How well you are meeting those goals
- How you can improve
- Accreditation: Middle States Commission on Higher Education Standard 14: Assessment of Student Learning: "Assessment of student learning demonstrates that, at graduation, or other appropriate points, the institution's students have knowledge, skills, and competencies consistent with institutional and appropriate higher education goals" (MSCHE, 2009).
What is the difference between direct and indirect assessment evidence?
Assessment involves collecting evidence of student learning and attainment of intended learning outcomes. To develop a more complete understanding of the extent of student learning, multiple pieces of evidence are needed. Evidence of student learning can be direct or indirect. To obtain the best indication of student learning, a combination of direct and indirect measures should be used.
- Direct assessment: evidence based on directly examining and measuring students’ performance (e.g., exams, projects, papers, portfolio assignments, oral presentations, fieldwork observations)
- Indirect assessment: evidence based on reports of perceived student learning (e.g., surveys and interviews with students, employers, faculty)
What does it mean to "close the assessment loop"?
It means to take action by using assessment results to make program-level improvements or decisions. This might include:
- Revising your program-level outcomes
- Changing curricula by adding or removing courses or program experiences, requiring prerequisite courses, changing instructional methods or assignments within courses
- Creating or modifying assessments
- Creating or modifying rubrics
- Using assessment results to support current program practices or to make other program policies or decisions
Why is transparency important in assessment?
Transparency showcases evidence of student learning from program experiences. It also enables you to reflect on program practices and their effectiveness in meeting student outcomes.
Please contact us if there is any aspect of this website or student learning assessment that you would like to discuss.
Rachel J. Ebner, PhD
Director of Student Learning Assessment
Belfer Hall 1300A; 215 Lex. Room 606
212.960.5400, ext. 6138
rachel.ebner@yu.edu