Las Positas College
Student Learning Outcomes (SLO) Handbook
Table of Contents
- Student Learning Outcomes Handbook
- Las Positas College Mission, Vision, Values Statement
- Introduction to Student Learning Outcomes & Assessment
- Course-level Student Learning Outcomes
- Program-level Student Learning Outcomes
- Institutional-level Student Learning Outcomes (formerly Core Competencies)
- Assessment Guidelines
- Types of Assessments
- Closing the loop: The importance of dialogue in the assessment process
- Definitions
- eLumen Step-by-Step Instructions (Coming soon)
- Appendix A: Excerpts from the ACCJC Standards for Accreditation (as revised June 2014)
- Appendix B: SLO Assessment Results Form
- Appendix C: The Genie in the Bottle: Disaggregation of Student Learning Outcomes Data
- Appendix D: Responses to Questions from the ACCJC Accreditation Standards Symposium April 23-24, 2015
Las Positas College Mission, Vision, Values Statement
Mission Statement
Las Positas College is an inclusive learning-centered institution providing educational opportunities and support for completion of students’ transfer, degree, basic skills, career-technical, and retraining goals.
Inclusive - welcoming of a diverse group of students including but not limited to DSPS, EOPS, CalWORKs, International, Multicultural, Distance Education, and Lifelong Learners, as well as students from various economic backgrounds; all with varying skill levels and learning styles.
Learning-Centered - refers to courses, programs, disciplines, modes of delivery, learning communities, accounting for varying skill levels, creative and critical thinking, and having necessary and specialized facilities.
Educational opportunities - include but are not limited to classroom and Distance Education (DE) instruction, athletics, field trips, guest speakers, student government, cultural opportunities, clubs, labs, internships, tutorial service, workshops, library research, and mentoring.
Support - includes tutorial center, reading and writing center, counseling, office hours, Integrated Learning Center, Admissions and Records, advisory boards, Health Center, financial aid, Blackboard, technology, enrollment management, assessment, tutorial services, Library, Computer Center, Student Services, and Administrative Services; all provided by a dedicated group of administrators, faculty, and classified professionals.
Vision Statement
Las Positas College strives to be California’s premier Community College, setting the standard through opportunities for developing knowledge, skills, values, and abilities that foster engaged and contributing members of society.
Values Statement
Las Positas College thrives as a collaborative teaching and learning community committed to integrity and excellence by:
- Encouraging and celebrating lifelong learning
- Responding to the needs of the ever-changing workplace
- Demonstrating civic, social and environmental responsibility
- Promoting ethical behavior, tolerance and mutual respect in a diverse community
- Fostering a climate of discovery, creativity and personal development
- Holding firm to the belief that each of us makes an astonishing difference.
Introduction to Student Learning Outcomes & Assessment
The central mission of Las Positas College is its commitment to student learning. To further that mission, the college recognizes the importance of evaluating progress toward that goal. Every sector of the college engages in informal and formal evaluation procedures. This document focuses on the formal assessment of student learning at the course, program, and college levels. Faculty across the college engage in continual assessment of student learning, and those assessment results are then discussed within each discipline and across the college.
Student Learning Outcomes (SLOs) are the observable or measurable results students achieve after completing a course or program. SLOs are regularly assessed at the course level (CSLO), program level (PSLO), and institutional level (ISLO). Analysis of course-level, program-level, and institutional-level SLOs is completed in regular cycles to provide ongoing feedback and to implement ideas for improvement. Assessment results are entered into eLumen by full-time and part-time faculty. Throughout the process, faculty dialogue is central to SLO work. Collegial dialogue about assessment results takes place at discipline meetings, division meetings, SLO committee meetings, and staff development workshops. Evidence of collegial dialogue is found in meeting minutes and in program review documents. The assessment of student learning is a significant part of the program review and resource allocation processes at Las Positas College.
Establishing and assessing student learning outcomes (SLOs) has pedagogical value. Use of SLOs supports a learner-centered approach to teaching, focusing education on helping students achieve well-defined outcomes. Communicating SLOs to students allows them to focus their time and attention on what needs to be learned in a course. Assessment of SLOs provides students with information about their strengths and weaknesses. Lastly, assessment of SLOs informs faculty about areas in which to improve program effectiveness.
The Accreditation Standards ask faculty and staff to articulate SLOs for each course and each degree/certificate that the school offers (see Appendix A). This is, in large part, a response to the U.S. Department of Education’s call for colleges and universities to engage in a process of continual self-examination and reflection with the goal of improvement. Faculty and staff must also define outcomes for the library, learning support services, and student support services [called Student Area Outcomes (SAOs)]. Then, assessment activities must be designed that provide students with an opportunity to demonstrate what they have learned.
The use of assessment results is meant to stimulate discussion and direct activities that can improve instructional delivery and support systems on campus. Results will not be used as the basis of evaluation or disciplinary action for individual faculty members. However, as part of the professional responsibilities of faculty, instructors are expected to participate in the SLO process.
The Student Learning Outcomes Committee is responsible for organizing and facilitating our SLO efforts. The SLO Coordinator works directly with departments to assist in developing their outcomes, determining the means of assessment, compiling the results of that assessment, analyzing those results, and making changes to their program or unit if necessary in order to improve student learning. Please visit the SLO website for updated information on all aspects of SLO development and assessment: http://laspositascollege.edu/SLO/index.php.
The charge of the SLO Committee: The SLO committee seeks to elicit broad perspectives and advice regarding learning goals for all Las Positas students, faculty, administrators, and staff. This group provides an advisory linkage to the Academic Senate on matters pertaining to the College’s immediate and long-range plans to integrate student learning outcomes and assessment at the course, program, and institutional levels. With the advice and consent of the Academic Senate, this group reviews institutional student learning outcomes for LPC students and develops strategies and timelines for incorporating and coordinating these competencies into continuous, informative, and useful assessment that is diverse in method, captures achievement over time, and focuses equally on learning outcomes and the experiences that direct students to those outcomes. The Student Learning Outcomes Committee works with the Curriculum Committee, establishing policies and procedures concerning the institutionalization of SLOs, which will be brought to the Senate for review and approval. The SLO committee also collaborates with the Staff Development Committee to provide support and materials needed for the development of SLOs and assessment. This group also works with the Program Review Committee to coordinate, collect, and archive assessment activities in all sectors and organizes the campus dialogue process concerning student learning outcomes and assessment.
Course-level Student Learning Outcomes
SLOs are the observable or measurable results subsequent to a learning experience. They may involve knowledge (cognitive), skills (behavioral), or attitudes (affective) that provide evidence that learning has occurred. SLOs encompass students’ ability to synthesize discrete skills using higher-level thinking skills and produce something that applies what they have learned; this is exemplified by gathering smaller objectives and applying analysis, evaluation, and synthesis in more sophisticated ways.
Guidelines for course-level SLOs:
- Each course should have a limited number of SLOs that encompass the major areas of learning expected of students by the end of the course (2-6 outcomes per course as a general guide but standards specific to your program may require more than 6).
- SLOs should be written for students. Use language that your students can understand rather than technical language.
- Course-level SLOs should be consistent across multiple sections of the same course. The assessments used by faculty do not need to be the same across sections.
- SLOs must be communicated to students on all course syllabi and must match the official course outline of record (COR).
- Course-level SLOs for a particular course are analyzed during the 3-year assessment cycle tied to our program review cycle.
- When a course with multiple sections is going to be assessed, we recommend that multiple sections be assessed. This will allow faculty to make stronger conclusions concerning assessment results. This is especially important when investigating differences between day, evening, and online sections.
Figure 2: Assessment cycle for Course-level SLOs
Must SLOs be consistent across all sections?
Maintaining consistent SLOs across all sections helps faculty to analyze the results of SLO assessments and look for trends in student learning across time and across multiple sections. This also assures that all students will know what to expect when completing a course successfully. In addition, the ACCJC requirement is that each course has a single set of SLOs that is common to all sections of the course, no matter who teaches the section (see Appendix A).
Course-level Outcomes versus Course Objectives
There has been a lot of confusion, both locally and at the state level, about what differentiates SLOs from objectives. We attempt to resolve this confusion in this section. Course SLOs and course objectives are intricately linked to one another. Course SLOs describe the broadest goals for the course: ones that require higher-level thinking abilities; require students to synthesize many discrete skills or areas of content; ask them to then produce something (papers, projects, portfolios, demonstrations, performances, art works, exams, etc.) that applies what they have learned; and require faculty to evaluate or assess the product to measure a student’s achievement or mastery of the outcomes. The assessment of SLOs is useful in helping professors know where their teaching and learning activities have and have not been successful. SLOs also let students know what they can expect to attain as a result of completing the course.
Course objectives are on a smaller scale, describing the small, discrete skills or “nuts and bolts” that require basic thinking skills. Think of objectives as the building blocks used to produce whatever is assessed to demonstrate mastery of an outcome. Objectives can be practiced and assessed individually, but they are usually only a portion of an overall project or application. Objectives guide how professors plan the class lessons or activities that will lead to the desired outcomes as stated in the SLOs.
Table 2: Some examples of wording differences between course objectives and their related SLOs
| Course Objective | Related Student Learning Outcome (SLO) |
|---|---|
|  | Upon successful completion of this course, students will be able to critique psychological research studies. |
|  | Upon successful completion of this course, students will be able to analyze the homeostatic mechanisms maintaining the human body. |
Program-level Student Learning Outcomes
Program-level Student Learning Outcomes (PSLOs) are defined as the knowledge, skills, abilities, or attitudes that students have at the completion of a degree or certificate. Faculty within a discipline should meet to discuss the expected learning outcomes for students who complete a particular series of courses, such as those required for a certificate or a degree. PSLOs should be the BIG things you want students to get out of a degree or certificate, and they should be developed and reinforced across multiple courses throughout the program. Discussions might also involve colleagues in other programs regarding prerequisites and transfer courses, and community stakeholders regarding job expectations. Program-level outcomes and their assessment play an important role in institutional improvement. This level of outcomes assessment has the greatest potential to improve student pathways and overall achievement.
It is recommended that each program have 3-6 PSLOs. Discipline faculty members might need a more comprehensive list based on the requirements of external stakeholders (employers, state requirements, etc.). PSLOs can be assessed in many ways, but for most programs they are assessed only through linked course-level SLOs. First, you might assess PSLOs in a capstone project or capstone course that many students complete when earning a certificate or degree. Second, you could assess the development of a set of skills as students advance through different courses in your program (ENG 1A -> ENG 4 or 7). Third, programs could use assessment results from standardized tests developed internally or by outside organizations. You could compare longitudinal data across a three-year period to see if there have been any changes in student learning. Additionally, you might compare results between different groups of students (e.g., online and face-to-face).
Program-level outcomes should
- describe what students are able to do after completing a degree or certificate;
- be limited in number (3-6 outcomes);
- be clear so that students and colleagues can understand them;
- be observable skills (career-specific or transferable), knowledge, attitudes, and/or values;
- be relevant to meet the needs of students, employers, and transfer institutions;
- be rigorous yet realistic outcomes achievable by students.
Why do we assess PSLOs? We assess PSLOs
- to make sure students are prepared for further study in the program
- to encourage and document faculty dialogue about student learning and achievement
- to help faculty with program improvement
- to communicate and clarify our expectations to students
Analysis of PSLO data is reported during the annual Program Review process. Analysis requires that all faculty understand, contribute to, and discuss the impact of PSLO results. All PSLOs for a certificate or degree should be assessed every 3 years (aligning with the 3-year program review process).
Table 3: Example Program-level Student Learning Outcomes for degrees and certificates.
- Upon completion of an AA or AAT degree in anthropology, students will be able to analyze the ethical responsibilities and concerns in the conducting of anthropological research.
- Upon completion of an AA degree in music, students will demonstrate a working knowledge of musical analysis and harmonic theory applicable to their area of specialization.
- Upon completion of an AAT degree in psychology, students will be able to apply theories, concepts, and findings in psychology for self-understanding, self-improvement, and lifelong learning.
- Upon completion of an AAT degree in psychology, students will be able to use basic research methods in psychology, including research design, hypothesis testing, and data interpretation.
- Upon completion of an AS degree in mathematics, students will use mathematical reasoning to solve problems and a generalized problem-solving process to work word problems.
- Upon completion of the AA or AAT degree in theater arts, students will be able to evaluate the work performed by theatre practitioners, with special attention to the skills involved in acting, directing, and designing.
- Upon completion of the AA or AAT degree in theater arts, students will be able to integrate acting skills and techniques in the preparation and performance of dramatic literature.
- Upon completion of the certificate in automotive mechanics, students will be able to diagnose, repair, and replace electrical and electronic systems and components.
- Upon completion of the certificate in automotive mechanics, students will be able to diagnose, repair, and replace brake systems and components.
- Upon completion of the certificate in early childhood development, students will implement a wide array of developmentally appropriate approaches, instructional strategies, and tools to connect with children and families.
- Upon completion of the certificate in medical assisting, students will be able to perform clinical office responsibilities such as vital signs, exam room preparation, patient data collection, simple dressing changes, lab tests, phlebotomy, and EKGs.
- Upon completion of the certificate in computer networking technology, students will be able to analyze simple business or technical problems relevant to programming and prepare solutions to them.
Institutional-level Student Learning Outcomes (formerly Core Competencies)
Las Positas College’s primary mission is to foster learning and student success. Students will develop cognitive, behavioral, and affective skills in five areas (communication, critical thinking, creativity and aesthetics, respect and responsibility, and technology) when completing the GE pathway as part of an AA, AS, AAT, or AST degree.
Each of the five Institutional Student Learning Outcomes (ISLOs) is evaluated annually by the Integrated Planning and Effectiveness Committee (IPEC). The IPEC reviews the ISLO data and identifies areas for improvement. IPEC communicates that information to the SLO committee and to the campus community at a Town Hall meeting. Suggestions from the campus community and the SLO committee are used to develop an action plan. For example, if achievement on the Communication ISLO declines across a 3-year period, the SLO committee might disaggregate the results by program or course to better understand the cause of the decline.
Mapping Course-level SLOs to Program-level SLOs and ISLOs
SLOs should be clearly mapped and aligned throughout a course sequence and among courses, programs, and the institutional student learning outcomes to achieve the most efficient and effective assessment.
Figure 3: Course SLOs feed into program and institutional SLOs.
Alignment is the process of analyzing how explicit criteria line up with or build upon one another within a particular learning pathway. When dealing with outcomes and assessment, it is important to determine that course outcomes align or match up with program outcomes and that ISLOs align with the college mission and vision. In student services, alignment includes things like aligning financial aid deadlines with instructional calendars.
Assessment Guidelines
From “Defining (and Re-assessing) Assessment: A Second Try” (T. A. Angelo, 1995, AAHE Bulletin no. 48, p. 7): “Assessment is an ongoing process aimed at understanding and improving student learning.
Assessment involves
- making expectations explicit and public;
- setting appropriate criteria and high standards for learning quality;
- systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards; and
- using the resulting information to document, explain, and improve performance.”
The SLO committee encourages faculty to make use of their existing assignments to assess student learning. This is called embedded assessment. Embedded assessment occurs within the regular class or curricular activity. Class assignments linked to student learning outcomes serve as grading and assessment instruments (e.g., common test questions, CATs, projects, or writing assignments). Specific questions can be embedded on exams in classes across courses, departments, programs, or the institution. Embedded assessment can provide formative or summative information for pedagogical improvement and student learning needs.
Managing the assessment cycle
A critical part of assessment is reflecting on assessment results to improve student learning through collegial dialogue. It is part of the continuous cycle of collecting assessment results, evaluating them, using the evaluations to identify actions that will improve student learning, implementing those actions, and then cycling back to collecting assessment results. At Las Positas College the dialogue about student learning and assessment takes place within disciplines and/or departments, within the SLO Committee, and within the Institutional Effectiveness Committee. This dialogue is captured in Program Review documents and in annual reports made by the SLO and IE Committees. Appendix B contains a form to help discipline faculty capture that dialogue and show program improvement.
Why we don’t use grades as an assessment of Student Learning
Letter grades are a summation of many activities students have completed during a course, and they can include components that are not about student learning. In addition, we want to be able to examine student learning of major course outcomes independently of one another and without those extraneous components.
Figure 4: Example of how discipline faculty might set-up an assessment plan for a single course.
Types of Assessments
There are many approaches to assessing student learning. You might want to assess students’ emotions or attitudes about an assignment by giving them a self-assessment. You might want to compare scores on a standardized assessment early and late in the semester. The following sections describe the various factors to consider when designing assessments.
Formative and Summative Assessment
Formative assessment. Formative assessment is a diagnostic tool implemented during the instructional process that generates useful feedback for student development and improvement. The purpose is to provide an opportunity to perform and receive guidance (such as in-class assignments, quizzes, discussions, lab activities, etc.) that will improve or shape a final performance. This stands in contrast to summative assessment, where the final result is a verdict and the participant may never receive feedback for improvement, such as on a standardized test, licensing exam, or final exam.
This assessment is most important in its role as a diagnostic tool which allows you to
- identify areas of deficiency;
- prescribe alternative learning strategies; and
- motivate the student to a deeper learning experience.
Summative assessment. A summative assessment is a final determination of knowledge, skills, and abilities. This could be exemplified by exit or licensing exams, senior recitals, capstone projects or any final evaluation which is not created to provide feedback for improvement, but is used for final judgments.
Indirect and Direct Assessments of Learning
Direct Assessments. Direct assessments provide evidence of student knowledge, skills, or attitudes for the specific domain in question and actually measure student learning, not perceptions of learning or secondary evidence of learning, such as a degree or certificate. For instance, a math test directly measures a student’s proficiency in math. In contrast, an employer’s report about student abilities in math or a report on the number of math degrees awarded would be indirect data.
Indirect Assessments. Indirect assessments indirectly measure student performance. For instance, certificate or degree completion data provide indirect evidence of student learning but do not directly indicate what a student actually learned. Indirect assessments often use surveys or self-assessments of the learning process or the learning environment.
Quantitative and Qualitative Data
Qualitative data. Qualitative data are descriptive information, such as narratives or portfolios. These data are often collected using open-ended questions, feedback surveys, or summary reports, and may be difficult to compare, reproduce, and generalize. Qualitative data provide depth but can be time- and labor-intensive to collect. Nonetheless, qualitative data often pinpoint areas for interventions and potential solutions that are not evident in quantitative data.
Quantitative data. Quantitative data are numerical or statistical values. These data use numbers such as scores, rates, or percentages to express quantities of a variable. Qualitative data, such as opinions, can be displayed as numerical data by using Likert-scaled responses, which assign a numerical value to each response (e.g., 4 = strongly agree to 1 = strongly disagree). These data are easy to store and manage, providing a breadth of information. Quantitative data can be generalized and reproduced, but instruments must be carefully constructed to be valid.
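To make the Likert coding above concrete, the following is a minimal sketch in Python; the response labels and example data are hypothetical and are not drawn from any LPC survey.

```python
# Minimal sketch: coding Likert responses numerically and summarizing them.
# Response labels and example data below are hypothetical.
LIKERT_SCALE = {
    "strongly agree": 4,
    "agree": 3,
    "disagree": 2,
    "strongly disagree": 1,
}

responses = ["strongly agree", "agree", "agree", "disagree", "strongly agree"]

scores = [LIKERT_SCALE[r] for r in responses]
mean_score = sum(scores) / len(scores)

print(f"n = {len(scores)}, mean rating = {mean_score:.2f}")
# A mean near 4 suggests broad agreement; a mean near 1 suggests broad disagreement.
```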
Classroom assessment techniques. Classroom assessment techniques (CATs) are “simple tools for collecting data on student learning in order to improve it” (Angelo & Cross, 1993, p. 26). CATs are short, flexible, classroom techniques that provide rapid, informative feedback to improve classroom dynamics by monitoring learning, from the student’s perspective, throughout the semester. Data from CATs are evaluated and used to facilitate continuous modifications and improvement in the classroom. For an overview of the different CATs, go to the Field-tested Learning Assessment Guide (FLAG): http://www.flaguide.org/cat/cat.php.
Table 4: A list of different types of assessment that are direct or indirect measures of student learning.
| Method | Description | Direct or Indirect |
|---|---|---|
| Capstone Project or Course | A capstone project or course that integrates knowledge, concepts, and skills students are to have acquired during the course of their study. Capstones provide a means to assess student achievement within a program. | Direct |
| Clinical Evaluation | An evaluation of students’ performance in a clinical setting. The clinical performance is scored using a rubric. | Direct |
| Competition (Juried) | An evaluation of students’ performance or work based on the scoring or judging of external reviewers. | Direct |
| Demonstration/Presentation | An evaluation of students on a demonstration or presentation to the class or other audience. The demonstration or presentation is scored using a rubric. | Direct |
| Document Review | A review of course or unit documents for the purpose of determining if information is available and clear. | Indirect |
| Entrance/Exit Interviews | An assessment based on interviews conducted with students when they enter college and when they leave, either through graduation or early departure. These interviews can be designed to measure program-specific SLOs or to gather feedback on student services SAOs. | Direct/Indirect |
| Exam - Exit | A comprehensive exit exam given near the end of the student's academic career (usually during the final semester prior to graduation). The exam is generally given to determine a student’s acquisition and application of a particular type or form of knowledge or skill, as well as the ability to integrate knowledge from various disciplines. The exam can be written, oral, or a combination. | Direct |
| Exam or Quiz - In Course | An exam or quiz that is administered by individual professors in their classes. It may be the entirety of the exam or embedded questions within an exam. | Direct |
| Exam - Standardized/Licensure | A test that is developed outside the institution for use by a wide group of students using national, regional, or professional norms. | Direct |
| Exhibit | An evaluation of students’ work in a public exhibit. The exhibit is scored using a rubric. | Direct |
| Field Work | An evaluation of students on the demonstration of skills during field work. The skills demonstration is scored using a rubric. | Direct |
| Focus Group | A series of structured discussions with students who are asked a series of open-ended questions designed to collect data about beliefs, attitudes, and experiences. | Indirect |
| Frequency/Count | An assessment based on the number or frequency of things, such as usage of particular services. | Direct/Indirect |
| Group Project | An evaluation of students’ work on an assigned group project. The work is scored using a rubric. | Direct |
| Institutional Data | A review of program and student data collected at the institutional level. Data may include program enrollment, retention, or student GPA. | Direct/Indirect |
| Internship | An evaluation of students’ job performance during an internship or volunteer placement. The job performance is scored using a rubric. | Direct |
| Journal Review | An evaluation based on students’ written journals. Entries can be used to determine students’ overall engagement with the course material and to assess their understanding of course content. | Direct |
| Lab Practicum | An evaluation of students’ work during a lab practicum. The work is scored using a rubric. | Direct |
| Lab Report | An evaluation of students’ work on a lab report. The work is scored using a rubric. | Direct |
| Observation/Interview Report | An evaluation of students’ work on an observation or interview report. The work is scored using a rubric. | Direct |
| Outreach | An assessment of the successes, benefits, or quality of outreach activities. | Direct/Indirect |
| Participation | An evaluation of students on their course participation. Participation is scored using a rubric. | Direct |
| Performance | An evaluation of students during a musical, theatre, athletic, communications, or other performance. The performance is scored using a rubric. | Direct |
| Portfolio | An evaluation of students’ work collected in a portfolio and evaluated using a common rubric. Portfolios may contain research papers, reports, tests, exams, case studies, videos, personal essays, journals, self-evaluations, or exercises. | Direct |
| Pre/Post Testing | An exam administered at the beginning and at the end of a course or program to determine the progress of student learning. | Direct |
| Reflective Essay | Reflective essays used to determine students’ opinions and perceptions. | Indirect |
| Survey - Alumni | An assessment based on the surveying of program alumni. Alumni surveys can provide information about program satisfaction, preparation (transfer or workforce), employment status, and skills for success. Surveys can ask alumni to identify what should be changed, altered, maintained, improved, or expanded. | Indirect |
| Survey - Student | An assessment based on the surveying of students designed to collect perceptions of their college experiences. | Indirect |
| Writing Assignment/Project | An evaluation of students’ work on written assignments or essays. The work is scored using a rubric. | Direct |
Closing the loop: The importance of dialogue in the assessment process
Closing the loop is often seen as the most valuable portion of assessment by faculty, administrators, and the ACCJC, but it is also the most problematic. If assessments are too generic or investigate processes at too high a level, the data cannot produce meaningful change. If the assessment looks at a part of the SLO that is too narrow, the results may only be applicable to your own section of a course. Discipline faculty should coordinate with other faculty teaching the same course to select the SLO(s) to be assessed and to develop assessments.
At the course- and program-level, assessment results are shared with and discussed by discipline faculty at department meetings, division meetings, and through e-mail where decisions about improvement plans are also discussed. A brief summary of the dialogue should be documented in the comprehensive program review or annual program review update. Appendix B contains an example form that can be used to capture that dialogue.
Actions that result from the dialogue can be anything from concluding that student performance meets expectations to making major curriculum changes. Other actions may include changing specific assignments in a course, adding prerequisites, or providing support services such as tutoring. Another action could be to rewrite the SLO.
Definitions
Program: A program is an organized set of courses and/or services that lead to a defined objective(s) in support of student learning. There are three types of programs: educational, student services, and administrative.
Program Review: A process to examine the effectiveness of an academic, administrative, or student services program. The process provides feedback (a) to the unit primarily responsible for the program, (b) to the appropriate administrators, and (c) to external units in the form of confirmation of the existence of the program review process and in the form of summaries of the outcomes.
Educational Program: An educational program is an organized sequence of courses leading to a defined objective: a degree, a certificate, a diploma, a license, or transfer to another institution of higher education.
Student Learning Outcomes: SLOs are the observable or measurable results subsequent to a learning experience. They may involve knowledge (cognitive), skills (behavioral), or attitudes (affective) that provide evidence that learning has occurred. SLOs encompass students’ ability to synthesize discrete skills using higher-level thinking skills and produce something that applies what they have learned; this is exemplified by gathering smaller objectives and applying analysis, evaluation, and synthesis in more sophisticated ways.
eLumen Step-by-Step Instructions (Coming soon)
Logging into eLumen
Entering SLOs into eLumen
Running reports of course SLO data and program SLO data
Instructions for managers and administrators to verify entry of SLO data
Appendix A: Excerpts from the ACCJC Standards for Accreditation (as revised June 2014)
Standard I: Mission, Academic Quality and Institutional Effectiveness, and Integrity
The institution demonstrates strong commitment to a mission that emphasizes student learning and student achievement. Using analysis of quantitative and qualitative data, the institution continuously and systematically evaluates, plans, implements, and improves the quality of its educational programs and services. The institution demonstrates integrity in all policies, actions, and communication. The administration, faculty, staff, and governing board members act honestly, ethically, and fairly in the performance of their duties.
B. Assuring Academic Quality and Institutional Effectiveness
Institutional Effectiveness
5. The institution assesses accomplishment of its mission through program review and evaluation of goals and objectives, student learning outcomes, and student achievement. Quantitative and qualitative data are disaggregated for analysis by program type and mode of delivery.
6. The institution disaggregates and analyzes learning outcomes and achievement for subpopulations of students. When the institution identifies performance gaps, it implements strategies, which may include allocation or reallocation of human, fiscal and other resources, to mitigate those gaps and evaluates the efficacy of those strategies.
Standard II: Student Learning Programs and Support Services
The institution offers instructional programs, library and learning support services, and student support services aligned with its mission. The institution’s programs are conducted at levels of quality and rigor appropriate for higher education. The institution assesses its educational quality through methods accepted in higher education, makes the results of its assessments available to the public, and uses the results to improve educational quality and institutional effectiveness. The institution defines and incorporates into all of its degree programs a substantial component of general education designed to ensure breadth of knowledge and to promote intellectual inquiry. The provisions of this standard are broadly applicable to all instructional programs and student and learning support services offered in the name of the institution.
A. Instructional Programs
- All instructional programs, regardless of location or means of delivery, including distance education and correspondence education, are offered in fields of study consistent with the institution’s mission, are appropriate to higher education, and culminate in student attainment of identified student learning outcomes, and achievement of degrees, certificates, employment, or transfer to other higher education programs. (ER 9 and ER 11)
- Faculty, including full time, part time, and adjunct faculty, ensure that the content and methods of instruction meet generally accepted academic and professional standards and expectations. Faculty and others responsible act to continuously improve instructional courses, programs and directly related services through systematic evaluation to assure currency, improve teaching and learning strategies, and promote student success.
- The institution identifies and regularly assesses learning outcomes for courses, programs, certificates and degrees using established institutional procedures. The institution has officially approved and current course outlines that include student learning outcomes. In every class section students receive a course syllabus that includes learning outcomes from the institution’s officially approved course outline.
- If the institution offers pre-collegiate level curriculum, it distinguishes that curriculum from college level curriculum and directly supports students in learning the knowledge and skills necessary to advance to and succeed in college level curriculum.
- The institution’s degrees and programs follow practices common to American higher education, including appropriate length, breadth, depth, rigor, course sequencing, time to completion, and synthesis of learning. The institution ensures that minimum degree requirements are 60 semester credits or equivalent at the associate level, and 120 credits or equivalent at the baccalaureate level. (ER 12)
- The institution schedules courses in a manner that allows students to complete certificate and degree programs within a period of time consistent with established expectations in higher education. (ER 9)
- The institution effectively uses delivery modes, teaching methodologies and learning support services that reflect the diverse and changing needs of its students, in support of equity in success for all students.
- The institution validates the effectiveness of department-wide course and/or program examinations, where used, including direct assessment of prior learning. The institution ensures that processes are in place to reduce test bias and enhance reliability.
- The institution awards course credit, degrees and certificates based on student attainment of learning outcomes. Units of credit awarded are consistent with institutional policies that reflect generally accepted norms or equivalencies in higher education. If the institution offers courses based on clock hours, it follows Federal standards for clock-to-credit-hour conversions. (ER 10)
- The institution makes available to its students clearly stated transfer-of-credit policies in order to facilitate the mobility of students without penalty. In accepting transfer credits to fulfill degree requirements, the institution certifies that the expected learning outcomes for transferred courses are comparable to the learning outcomes of its own courses. Where patterns of student enrollment between institutions are identified, the institution develops articulation agreements as appropriate to its mission. (ER 10)
- The institution includes in all of its programs, student learning outcomes, appropriate to the program level, in communication competency, information competency, quantitative competency, analytic inquiry skills, ethical reasoning, the ability to engage diverse perspectives, and other program-specific learning outcomes.
- The institution requires of all of its degree programs a component of general education based on a carefully considered philosophy for both associate and baccalaureate degrees that is clearly stated in its catalog. The institution, relying on faculty expertise, determines the appropriateness of each course for inclusion in the general education curriculum, based upon student learning outcomes and competencies appropriate to the degree level. The learning outcomes include a student’s preparation for and acceptance of responsible participation in civil society, skills for lifelong learning and application of learning, and a broad comprehension of the development of knowledge practice, and interpretive approaches in the arts and humanities, the sciences, mathematics, and social sciences. (ER 12)
- All degree programs include focused study in at least one area of inquiry or in an established interdisciplinary core. The identification of specialized courses in an area of inquiry or interdisciplinary core is based upon student learning outcomes and competencies, and includes mastery, at the appropriate degree level, of key theories and practices within the field of study.
- Graduates completing career-technical certificates and degrees demonstrate technical and professional competencies that meet employment standards and other applicable standards and preparation for external licensure and certification.
- When programs are eliminated or program requirements are significantly changed, the institution makes appropriate arrangements so that enrolled students may complete their education in a timely manner with a minimum of disruption.
- The institution regularly evaluates and improves the quality and currency of all instructional programs offered in the name of the institution, including collegiate, pre-collegiate, career-technical, and continuing and community education courses and programs, regardless of delivery mode or location. The institution systematically strives to improve programs and courses to enhance learning outcomes and achievement for students.
B. Library and Learning Support Services
- The institution supports student learning and achievement by providing library, and other learning support services to students and to personnel responsible for student learning and support. These services are sufficient in quantity, currency, depth, and variety to support educational programs, regardless of location or means of delivery, including distance education and correspondence education. Learning support services include, but are not limited to, library collections, tutoring, learning centers, computer laboratories, learning technology, and ongoing instruction for users of library and other learning support services. (ER 17)
- Relying on appropriate expertise of faculty, including librarians, and other learning support services professionals, the institution selects and maintains educational equipment and materials to support student learning and enhance the achievement of the mission.
- The institution evaluates library and other learning support services to assure their adequacy in meeting identified student needs. Evaluation of these services includes evidence that they contribute to the attainment of student learning outcomes. The institution uses the results of these evaluations as the basis for improvement.
- When the institution relies on or collaborates with other institutions or other sources for library and other learning support services for its instructional programs, it documents that formal agreements exist and that such resources and services are adequate for the institution’s intended purposes, are easily accessible and utilized. The institution takes responsibility for and assures the security, maintenance, and reliability of services provided either directly or through contractual arrangement. The institution regularly evaluates these services to ensure their effectiveness. (ER 17)
C. Student Support Services
- The institution regularly evaluates the quality of student support services and demonstrates that these services, regardless of location or means of delivery, including distance education and correspondence education, support student learning, and enhance accomplishment of the mission of the institution. (ER 15)
- The institution identifies and assesses learning support outcomes for its student population and provides appropriate student support services and programs to achieve those outcomes. The institution uses assessment data to continuously improve student support programs and services.
- The institution assures equitable access to all of its students by providing appropriate, comprehensive, and reliable services to students regardless of service location or delivery method. (ER 15)
- Co-curricular programs and athletics programs are suited to the institution’s mission and contribute to the social and cultural dimensions of the educational experience of its students. If the institution offers co-curricular or athletic programs, they are conducted with sound educational policy and standards of integrity. The institution has responsibility for the control of these programs, including their finances.
- The institution provides counseling and/or academic advising programs to support student development and success and prepares faculty and other personnel responsible for the advising function. Counseling and advising programs orient students to ensure they understand the requirements related to their programs of study and receive timely, useful, and accurate information about relevant academic requirements, including graduation and transfer policies.
- The institution has adopted and adheres to admission policies consistent with its mission that specify the qualifications of students appropriate for its programs. The institution defines and advises students on clear pathways to complete degrees, certificate and transfer goals. (ER 16)
- The institution regularly evaluates admissions and placement instruments and practices to validate their effectiveness while minimizing biases.
- The institution maintains student records permanently, securely, and confidentially, with provision for secure backup of all files, regardless of the form in which those files are maintained. The institution publishes and follows established policies for release of student records.
Appendix B: SLO Assessment Results Form
Table 1: Using assessment data from last year, describe the impacts of SLO practices on student learning, achievement, and institutional effectiveness. Describe the practices which led to the success.
Course:
Course SLO (CSLO):
Describe the quantitative or qualitative results:
Discuss any actions taken so far (and results, if known):
Discuss your action plan for the future:
Table 2: Using assessment data from last year, describe the impacts of SLO practices on student learning, achievement, and institutional effectiveness. Describe the practices which led to the success.
Student Services Area:
Student Area Outcome (SAO):
Describe the quantitative or qualitative results:
Discuss any actions taken so far (and results, if known):
Discuss your action plan:
Table 3: Using assessment data from last year, describe the impacts of SLO practices on student learning, achievement, and institutional effectiveness. Describe the practices which led to the success.
Degree/Certificate:
Program SLO (PSLO):
Describe the quantitative or qualitative results:
Discuss any actions taken so far (and results, if known):
Discuss your action plan:
Appendix C: The Genie in the Bottle: Disaggregation of Student Learning Outcomes Data
September 2015
Randy Beach, ASCCC Accreditation and Assessment Committee Chair
With the release of the revised ACCJC Standards in 2014, Standard I.B.6 has received a great deal of attention and prompted many discussions across the California Community College System, as well as an ASCCC resolution at the Spring 2015 Plenary (2.01 S15). This standard requires colleges not only to collect but also to disaggregate student learning outcomes (SLO) data, which is the practice of collecting an individual student’s SLO data and linking his or her scores to that student’s demographic data, especially gender, ethnicity, and other metrics related to student equity and disproportionate impact. Colleges are required to then analyze SLO data for disproportionate impact among subpopulations and make program changes according to the results.
With this change, the idea of a genie in a bottle fits fairly well when discussing disaggregated data and student learning outcomes. The most famous version of the Persian folktale of Aladdin and the genie in the lamp is told in the One Thousand and One Nights in this way: After Aladdin discovers the lamp and releases the genie, the genie helps Aladdin to become wealthy and powerful, and even helps him to marry the emperor's daughter Princess Badroulbadour, who was betrothed to another, and to build a grand palace. Other stories tell of genies, or the Jinn, whose intentions when released from the bottle are not benevolent but are very nefarious, in the same vein as the “trickster” character in western literature. Even in the One Thousand and One Nights tale, a sorcerer tricks Aladdin’s wife and steals the lamp, only to command the genie to take away all the riches Aladdin has gained. As in the tales, SLOs and disaggregation are fickle genies, and this duplicity raises the question of whether SLO data disaggregation will be a good genie, a bad one, or something in between.
The Good Genie
A 2012 brief by the National Center for Mental Health Promotion and Youth Violence Prevention, an organization that provides technical assistance and training to 106 federally funded Safe Schools/Healthy Students sites in K-12, argues in favor of disaggregation. The brief points out that aggregate data mask inequities in success rates among subpopulations, leaving those struggling subpopulations unrecognized and on their own in terms of improving success rates. The brief also argues that disaggregation informs and provides data support for changes in how programs are implemented in order to support all students. These changes can take the form of specific policy changes, funding augmentations, and more surgically precise program improvements that take into account the diversity in the classroom.
Student Equity Planning through the Student Success and Support Act at its core relies on disaggregated data for planning improvements in student achievement for subpopulations. Taking that philosophy to course-level and program-level learning outcome assessment is an extension of that effort, at the federal and state levels, to increase access, course completion, ESL and basic skills completion, degrees, certificates, and transfer for all colleges. Title 5 regulations require colleges to review and address disproportionate impact for American Indians or Alaskan natives, Asians or Pacific Islanders, Blacks, Hispanics, Whites, men, women, and persons with disabilities (§54220(d)) and to develop specific goals or outcomes and actions to address inequities. Action plans for improvement then evolve through the program review process. Disaggregation advocates say meaningful conversation about disproportionate impact cannot happen without disaggregation of course-level learning outcomes.
The Bad Genie
Later in the story of Aladdin, an evil sorcerer tricks Aladdin’s wife and takes the lamp. He uses the genie to take away from Aladdin the riches he attained with the genie’s help. Similarly, we might ask whether SLO disaggregation, like the Jinn from Persian lore, also has a bad side or whether this particular genie can be used for mischief and mayhem in the wrong hands.
The concerns over the disaggregation genie are wide-ranging. Student privacy concerns are real and require very precise data reporting practices that must be collegially agreed upon by faculty, administrators, and researchers at each college, in keeping with FERPA regulations. When data are disaggregated for courses that offer only one section or are rarely offered at all, publicizing results with demographic information may allow students to be identifiable, especially underrepresented minority students. Also, low sample sizes call into question the validity of the data collected in the first place. If only 20 Asian-American students are included in learning outcomes assessments out of 250 students total across two or three sections of a capstone course, that data may not really tell you anything significant about Asian-American students. Even if the data are longitudinal over several years, small sample sizes may not provide useful information.
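As a rough, hypothetical illustration of the sample-size concern, a simple normal-approximation margin of error for a proportion shows how much wider the uncertainty is for 20 students than for 250; the numbers below are invented for illustration only.

```python
# Rough illustration: margin of error for an outcome-attainment rate at small sample sizes.
# Numbers are hypothetical; the normal approximation is crude for n = 20.
import math

def margin_of_error(p_hat: float, n: int, z: float = 1.96) -> float:
    """Half-width of an approximate 95% confidence interval for a proportion."""
    return z * math.sqrt(p_hat * (1 - p_hat) / n)

# 70% of a 20-student subpopulation meets the outcome
print(f"n = 20:  70% +/- {margin_of_error(0.70, 20):.0%}")   # about +/- 20 points
# The same 70% rate observed across 250 students overall
print(f"n = 250: 70% +/- {margin_of_error(0.70, 250):.0%}")  # about +/- 6 points
```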
We have to also remember that SLO assessment frequently raises controversy in any context. Some faculty bargaining units, which may already be resistant to SLO assessment, will certainly ask relevant questions about additional workload associated with this type of data entry that may reinforce the opinion of local unions that ACCJC is imposing standards without deference to bargaining agreements. Local senates should approach the way they respond to this standard with their bargaining unit partners as part of the conversation, in the same way they would be involved in any discussion related to district policy or practice intended to address accreditation standards.
The Genie Is Out and He’s Not Going Back In
SLO assessment is here to stay, and the ASCCC has made statements regarding compliance with SLOs in the last decade. For better or worse, this genie is not going away.
In order to use the genie for good while acknowledging the arguments for and against, colleges should begin data disaggregation conversations slowly and in measured steps:
- Pick one course in a program, maybe the course with the most sections, and ask faculty in those sections to collect and input disaggregated data into their database systems.
- Review less controversial data attributes in reporting. For example, look at sections taught in the evening versus sections taught during the day, sections taught online versus sections taught on ground, or sections taught at a central campus versus at an education center or remote site (a minimal sketch of this kind of comparison appears after this list). Such a beginning may be a way to start the process while keeping in mind the requirement in the ACCJC Standards that data on subpopulations must be disaggregated by the time of your college’s next self-evaluation report to be in compliance, beginning Spring 2016.
- Look to Student Equity funding. If issues of workload are impeding the conversation over disaggregation, this funding may provide seed money to build an infrastructure where disaggregation is not a hardship or burden for faculty.
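For discipline faculty who want to see what such a comparison might look like, the following is a minimal sketch using Python and pandas. The file name, column names, and score threshold are hypothetical placeholders and do not reflect eLumen’s actual export format.

```python
# Minimal sketch: comparing SLO attainment across section attributes (e.g., modality).
# The file name, column names, and the threshold of 3 are hypothetical placeholders.
import pandas as pd

# Expected columns: section_id, modality ("day", "evening", or "online"), slo_score (rubric score 0-4)
results = pd.read_csv("slo_assessment_results.csv")

# Share of students meeting the outcome (rubric score of 3 or higher), by modality
summary = (
    results.assign(met_outcome=results["slo_score"] >= 3)
    .groupby("modality")["met_outcome"]
    .agg(students="size", met_rate="mean")
)
print(summary)
```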
So, How Does the Story End?
One cannot predict at this time how this story will end because it is just beginning. As more colleges begin adopting and revising processes in order to comply with the new standards in Spring 2016, questions over SLOs in general and disaggregation specifically will begin making their way to meeting rooms across the state. Community colleges throughout California must begin discussions of how they will address the SLO disaggregation requirement and consider the various implications of this practice regarding workload, student privacy, data relevance, and other issues in order to ensure that the ACCJC’s requirement turns into a good genie that can grant positive results for colleges and students.
National Center for Mental Health Promotion and Youth Violence Prevention. (2012, April). National Center Brief: The Importance of Disaggregating Student Data. Safe Schools/Healthy Students. Web. Accessed 10 Aug. 2015.
Appendix D: Responses to Questions from the ACCJC Accreditation Standards Symposium April 23-24, 2015
ACCREDITING COMMISSION FOR COMMUNITY AND JUNIOR COLLEGES, WESTERN ASSOCIATION OF SCHOOLS AND COLLEGES (ACCJC)
Responses to Questions from the ACCJC Accreditation Standards Symposium April 23-24, 2015
Below are responses to participant questions concerning the Accreditation Standards adopted in June 2014. Please also review the posted slides from the conference, as information from questions answered during presentations is included there.
Standard I
Standard I.B.1
Q. Is the term “student learning outcomes” different from “learning outcomes?”
A. Throughout the Accreditation Standards, the terms student learning outcomes and learning outcomes are used interchangeably.
In Standard I.B.1, the term “student outcomes” is used to address student learning and student achievement inclusively. Standards I.B.2-6 separately and specifically discuss both student learning outcomes and student achievement in the context of Assuring Academic Quality and Institutional Effectiveness.
Standard I.B.3 and federal regulations 34 C.F.R. § 602.16(a)(1)
Q. Does the job placement rate apply to vocational certificates only, or also to degrees such as nursing and teacher training? Does it not apply to liberal arts?
A. Colleges must have institution-set standards and assess program and institutional performance related to job placement rates and licensure examination passage rates (for programs whose graduates must pass an examination in order to work in that field) in their vocational/career-technical education programs, commonly referred to as CTE programs. These CTE programs include certificate programs and degree programs, as well as other “programs” defined by the institution.
Colleges must also have institution-set standards and assess performance with regard to course completion rates. Being mindful that the standards require institutions to have institution-set standards appropriate to their missions, it is assumed that institutions will set additional standards in a number of other areas. In that vein, a college could include job placement as a measure for all of its programs if it determines this would be appropriate for assessing whether it is meeting its mission.
I.B.6 – Several Questions
Q. Since the California Equity Plan disaggregates data, does that fulfill the disaggregation requirement of the standard?
A. We have seen a number of equity plans, and they have significant variations. As a general practice, it may be useful for an institution to look at all of its plans and reports, to identify synergies in the data gathered and analyses completed.
The Standard asks for disaggregation for subpopulations of students to identify performance gaps related to student learning and student achievement. The institution will want to determine relevant student populations for inclusion in institution-level analysis, and will also likely want to set criteria to aid programs in determining populations of students for analysis at the program level, based upon the institutional mission and programmatic emphases.
Please note that Standard I.B.5, related to assessing accomplishment of the institutional mission, also creates the expectation of disaggregated data for analysis by program type and mode of delivery.
Q. Does I.B.6 apply only to Gainful Employment or CTE programs?
A. No, the standard refers to “the institution” and applies to student learning and achievement gap identification across the institution and in all programs.
Q. Does I.B.6 require disaggregation of student learning outcomes data specifically by demographics, including race and gender? What does subpopulation mean? Can that mean DE versus on-ground, evening versus day, etc.?
A. Standard I.B.6 does not require disaggregation by specific demographic characteristics. Instead, the institution will want to determine the relevant student populations. The purpose of disaggregation is to provide information that will help the institution examine student learning and student achievement performance gaps and create strategies for addressing those gaps. The identification and disaggregation of relevant student populations should facilitate this work.
Standard I.B.5 addresses disaggregation by program type and mode of delivery.
Q. To what extent should data sets be disaggregated, and in what ways? What data sets did the Commission have in mind when drafting this standard? Are they retention/success oriented or course/program/institutional SLOs?
A. The institution should identify the populations of students, based upon the students it serves, for which to disaggregate data about student learning outcomes and student achievement. Per Standard I.B.5, it should also disaggregate by program type and mode of delivery to assess how the institution is meeting its mission, given the methods by which instructional services are delivered. As mentioned above, it will be helpful for the institution to identify criteria for disaggregation of data within programs, based upon the institutional mission as well as programmatic emphases.
Q. Does the Commission believe that collecting and analyzing disaggregated learning outcomes data, compared to interventions to improve outcomes, actually can be used to evaluate causal relationships?
A. We know there are many factors and causes for why individual students successfully complete classes, leave the institution, do or do not complete degrees or transfer, and why they learn or do not learn something. While there are some factors outside the control of the institution, we also know there are institutional factors (institution-wide, or perhaps only within a single classroom) which can negatively or positively impact multiple students’ learning and success. Some of those factors have disparate impact on particular populations of students. The purpose of disaggregated institutional data and analysis is to get to a level of detail that informs institutional choices pertinent to the populations of students it serves and to advancing their success (through strategies and decisions that may apply to them as individuals, as members of an identified group, or as part of the entire student body). Some analysis may prove to be most helpful to a particular department or program, and other analysis may provide institution-wide insight. Whatever the level, data analysis should be used to inform decisions and plans to improve student learning and achievement, and to meet the college’s mission.
Q. As to the requirement for disaggregating data on learning outcomes, there are faculty in small programs and researchers who are troubled by the idea of disaggregating data on learning outcomes in small classes—because it starts to get easy to identify students individually. Recognizing this concern, doesn’t it make sense to disaggregate at higher levels of power (degrees, or a combination of courses across years)?
A. Principles of good practice would support the expectation that instructors, in every class taught, are looking at the achievement of student learning outcomes at the individual student level, as well as aggregating results. This is an essential aspect of the continuous improvement that education professionals have practiced for many years. It is one reason for the promotion of embedded assignments for SLO practice, used in combination with assessment rubrics. Moreover, effective faculty practice (as instructors, advisors, and department members) has long included strategies for encouraging student behaviors which can lead to stronger student achievement as well (from attendance and course completion, to program and course selection for certificates, degrees, and transfers, to career planning and preparation).
When it comes to classroom, curriculum, departmental, and institutional planning and resource allocation, there are different levels of granularity which will be useful for decision making. The conversation at each level should include identification of the data and measures which should influence decisions at that level, and how the necessary level of information availability can be achieved.
The Standards have for years required that institutional credentials be based upon student learning outcomes. As credentials are assigned to individual students, many institutions and systems are realizing the value of accessing certain information at the student level. Of course, just as with any research or provision of services related to individual subjects, there must be scrupulous adherence to privacy and confidentiality safeguards.
Though unspoken in the question, concerns about workload, capacity, and the accountability of individuals involved in assessment and research can at times come into play. These are appropriate subjects for conversation at the institution, to ensure the focus of data analysis and decisions remains on student learning and achievement.
I.C.14
Q. This is a new standard. What are some examples of evidence that could be used to meet this standard?
A. Within complex higher education institutions, whether private for-profit, private nonprofit, or public, there can be competing values and priorities such as those described in the standard. The statement of purpose at the governing board or institutional level might stress that student achievement and student learning take higher priority than these other objectives. There may also be provisions within the conflict of interest and ethics policies, as well as demonstrated consideration of student learning and achievement in resource allocation and other decision making, which can show the requisite institutional commitment.
For more information please contact:
Tim Druley
Document Manager
(925) 424-1658
tdruley@laspositascollege.edu