Here’s a podcast recorded from a brown bag lunch session at Virginia Commonwealth University entitled “Assessment in a Web 2.0 Learning Environment.” Give a listen and let me know your thoughts. Let’s talk!
Good question. As we undertake curriculum mapping here at NVC, here’s some food for thought on the benefits, courtesy of Indiana University Southeast. The notations I, R/E, and M refer to Introduced, Reinforced/Emphasized, and Mastered.
For faculty, completed curriculum maps:
- Show how courses relate to one another
- Give adjunct and part-time faculty a voice in program curriculum and better communicate what is expected in their assigned courses
- Reveal courses that contribute little to program goals or learning outcomes (course rows in the map with few or no I’s, R’s/E’s, or M’s). This can lead to substantive curriculum revisions that benefit student learning and add to program cohesiveness.
- Reveal gaps in the curriculum (goal or outcome columns in the map with few or no I’s, R’s/E’s, or M’s). This can lead to substantive revisions of program goals or student learning outcomes that benefit student learning and better reflect faculty expertise.
- Flag potential redundancy: if every cell in a column is filled (i.e., a program goal or learning outcome has an I, R/E, or M in every course), there may be unnecessary overlap related to that goal or outcome in your curriculum. Reducing that overlap can free up opportunities to address gaps that emerge elsewhere.
- Often lead to more energized and engaged faculty
For students, viewing curriculum maps:
- Shows, at the start of a course and throughout the program, how courses interrelate and build on one another, grounding students in their discipline and helping them see how individual components fit together into a coherent whole
- Can help students better understand course sequencing and might lead to greater knowledge retention and transfer through a sequence or curriculum
I recently discussed assessment at NVC with Dr. Robyn Wornall, Director, Institutional Research, and Chris Farmer, Research Analyst, of the Office of Research, Planning and Institutional Effectiveness. Give a listen. Leave a comment. Let’s talk!
Indirect assessment is a great way to gauge student learning, and a survey is a good way to ask students directly. But to do that you have to create an account with a tool like SurveyMonkey and then write the survey yourself. I’ve found something that might work better: the Student Assessment of their Learning Gains, or SALG. Here’s some info from its About page:
The SALG site currently has 9686 instructors, 5296 instruments, and 207560 student responses.
The Student Assessment of their Learning Gains (SALG) instrument was developed in 1997 by Elaine Seymour while she was co-evaluator for two National Science Foundation-funded chemistry consortia (ChemLinks and ModularCHEM) that developed and tested modular curricula and pedagogy for undergraduate chemistry courses. The original SALG was used by over 1000 instructors in 3000 classes and by over 65,000 students. The instrument was subsequently revised by Stephen Carroll, Elaine Seymour, and Tim Weston in 2007 to better reflect the goals and methods used in a broader array of courses beyond chemistry.
The SALG instrument focuses exclusively on the degree to which a course has enabled student learning. In particular, the SALG asks students to assess and report on their own learning, and on the degree to which specific aspects of the course have contributed to that learning. The instrument has since been revised to include five overarching questions, each of which an instructor can customize to a course through sub-items. These questions are:
- How much did the following aspects of the course help you in your learning? (Examples might include class and lab activities, assessments, particular learning methods, and resources.)
- As a result of your work in this class, what gains did you make in your understanding of each of the following? (Instructors insert those concepts that they consider most important.)
- As a result of your work in this class, what gains did you make in the following skills? (A sample of skills includes the ability to make quantitative estimates, finding trends in data, or writing technical texts.)
- As a result of your work in this class, what gains did you make in the following? (The sub-items address attitudinal issues such as enthusiasm for the course or subject area.)
- As a result of your work in this class, what gains did you make in integrating the following? (The sub-items address how the students integrated information.)
The website is http://www.salgsite.org.
Here’s a podcast from Brigham Young University, recorded in 2009, discussing the relationship between grades, assessment, and learning. What do you think about the content discussed here? Let’s talk.
I recently heard a 2006 podcast by Dr. Joe Marolla, Vice Provost of Instruction at Virginia Commonwealth University. It really got me thinking about what a student-centered, or learning-centered, institution would look like and why we need to demonstrate that learning. Dr. Marolla’s talk was based on the 1995 article in Change magazine, “From Teaching to Learning – A New Paradigm for Undergraduate Education.” Give a listen to the podcast, take a look at the article linked below, and let me know your thoughts.
Below is the link to the article:
Welcome to the Assessment blog for Napa Valley College. On this blog you’ll find articles, podcasts, and links concerning Student Learning Outcomes, Program-Level Outcomes, and Institutional-Level Outcomes, all to advance student learning.