UC Davis is going about undergraduate program review in a new way, intending to streamline the process and provide more meaningful outcomes.
Previously, said plant sciences professor Dan Potter, "It was perceived as a lot of work, especially for staff, with little benefit."
Data are the backbone of the new process, Potter said, and should make it easier. The data cover everything from student-faculty ratio, number of teaching assistants and percentage of underrepresented students, to grade distribution and assigned space.
The data also include results of surveys of recent graduates, who are asked about their overall understanding of their majors; their satisfaction with their majors, instruction, advising and courses; and their access to small classes and to faculty outside of class, among many other questions.
Pat Turner, vice provost for undergraduate studies, said: "One of the heroes in all this is technology. We are able to get data now that we couldn't get as recently as 1990."
The data, from Student Affairs Research and Information, the Office of the Registrar, and the Office of Resource Management and Planning, provide the evidence that people here and nationally are looking for these days, evidence that educational programs are succeeding.
"ºÙºÙÊÓƵ faculty have long taken the caliber of their programs seriously," Turner said. "They want their undergraduate students to be well-served.
For this reason, the Academic Senate requires individual reviews and evaluations of every undergraduate teaching program and/or major every seven years. "In the past, there were delays at every step of the process," said Potter, adding that reviews often were not completed within the seven-year time frame.
He recalled "a pretty significant backlog" at the time he became chairman of the senate's undergraduate program review committee, part of the Undergraduate Council. One review dragged on so long that by the time it made its way to Potter's committee, the program no longer existed.
"We kind of decided we ought to take a look at this and try to fix and improve it," Potter said.
Around the same time, the Western Association of Schools and Colleges — which grants accreditation — asked for more evidence-based reviews of educational outcomes.
The evidence is in the data, Turner and Potter said. And now such data are easily provided to academic units at the start of the review process. "The old system was more anecdotal, more subjective," Potter said. "Now it is more data-driven."
The new system applies to three of UC Davis' four undergraduate colleges: Letters and Science, Agricultural and Environmental Sciences, and Biological Sciences. The College of Engineering has its own review system, in line with the external accreditation process that the college goes through.
For L&S, A&ES and Bio Sci, the seven-year interval for program review remains intact, but the university's goal now is to see the process completed in 20 months, with deadlines every step of the way.
Further, the new system standardizes the review process among the colleges, and establishes a cluster format under which similar programs are reviewed at the same time — say, all foreign language majors or all environmental sciences programs. Within each cluster, data are shared among the programs, for comparison purposes.
"So they can see how their majors fit with others that are similar, and look for overlap," Potter said. "They could see that we're teaching the same concepts in three required courses, for example."
Academic units with overlap could then explain how their programs are distinctive, or, if warranted, come up with recommendations for change, Potter said.
Most people agree that the greatest value of the old system was engaging departments in self-review of their teaching programs, Potter said. Self-review remains the most valuable part of the new process, he said, but by itself it does not engender participation.
"Now, with the data and comparative approach," Potter said, "we hope to see more people get involved, because they can see evidence of how they are doing."
For example, Potter said, "Faculty may ask, 'Wouldn't it be neat if we knew such and such about our graduates?' Well, now we can supply that."
The new system took effect in January, when department chairs and program directors received notice that their units were up for review. The chairs and program directors also received the applicable data for their units.
Potter explained that the new system comes with a standardized set of templates to be filled out. "We tell the departments to keep it brief unless they have something significant to comment on," he said, explaining how the new process is meant to be less time consuming.
During this first cycle under the new system, majors and programs had from January to June 1 to complete their reviews. College program review committees are due to complete their reviews this fall, and departments and deans have until the end of the winter quarter to respond and forward the reviews to the Academic Senate's Undergraduate Council.
In subsequent cycles, the review process is scheduled to begin in October and be completed by June 1 of the following year.
The final official step for every review is approval by the Academic Senate, after the Undergraduate Instruction and Program Review Committee has summarized each program's relative strengths and weaknesses, and commented on any overlap within clusters. The committee also is asked to include recommendations for change, if appropriate.
In recent years, after senate approval, the reviews have been sent along to the provost — and that will continue, Potter said. He added: "The provost can read the reviews and make informed decisions about allocation of resources."
The bottom line, according to Turner: "How do you know you're doing the best you can for your students in their majors?"
Media Resources
Clifton B. Parker, Dateline, (530) 752-1932, cparker@ucdavis.edu