Admins eye dean reviews

The University Provost’s office is exploring ways to refresh reviews of dean performance.

by Brian Edwards

The comprehensive review process for deans at the University of Minnesota could get a revamp this summer.
Executive Vice President and Provost Karen Hanson discussed the dean review process with faculty members last week, and she and others said the process could be improved in certain areas — like the survey questionnaire and timeline for the reviews. 
Conducting a comprehensive review earlier in a dean’s tenure — potentially within their first year — is one change the provost’s office is exploring.
At a Faculty Consultative Committee meeting, Hanson said the time frame could be adjusted for the next review depending on the results of the first. 
Updating the survey questions to further focus on academic and strategic areas is among the other ideas being considered.
Deb Cran, Hanson’s chief of staff, said the provost will likely undertake a more deliberative effort this summer to examine the dean review process.
“[It] seems like it is time to refresh [the reviews],” she said.
Before the current iteration of the review process, many University leaders found the process burdensome because of the large committee size and difficulty in selecting committee members, said Kate Stuckert, human resources director in the Office of the Provost.
Though some minor enhancements could be made immediately, Stuckert said, anything major — like the aforementioned changes — would need feedback and deliberation from groups around campus.
A process under review
Currently, the provost conducts three to five comprehensive dean reviews a year, in addition to annual reviews of deans.
Near the beginning of a comprehensive review, a dean is asked to prepare a background statement that lists things like accomplishments, goals and challenges they have faced.
A survey is then distributed to faculty, staff and certain students who can give relevant feedback on the dean. The survey generates quantitative data to inform the review.
The survey is open for three weeks, Stuckert said, but a low response rate sometimes prompts a weeklong extension. One survey this year was shortened as an experiment in increasing response rates.
The number of survey participants varies by the size of the college, Stuckert said, and she cited several reasons why people may not fill it out.
For example, she said that although the survey is completely anonymous, some employees may skip it because they are nervous about rating a superior.
Busy schedules and survey fatigue — the survey is about 40 questions — can also suppress participation, Stuckert said.
After the survey is completed, the provost meets with a standing committee composed of a dean, faculty from outside the college and the chairs of the FCC and the Professional and Administrative Consultative Committee.
Additionally, up to four members from within the college — who can be a mix of faculty and P&A staff — are chosen for the committee to help provide context about the college.
The provost wants to make sure the group is representative, Cran said.
This committee discusses the survey numbers with the provost to help make sense of the work the dean has done, said David Pui, a Distinguished McKnight University professor of mechanical engineering and standing committee member.
Pui has been a member of the committee since about 2005 and said standing committee members help provide the group with continuity and context gathered from conducting multiple reviews.
Having reviewed deans multiple times, Pui said he has witnessed the process improve facets of a dean’s work.
Some issues that arise can be quickly fixed through a procedural change, he said, like adding a student engagement office if a dean learns through their review that students don’t feel they can communicate properly with the head of their college.
Of all of the deans Pui has reviewed, he said, “Only two have been a disappointment, but they are gone.”
After the committee has finished, the provost discusses the results with the dean. The information isn’t public because of privacy laws, but deans are encouraged to share at least some of their review, Stuckert said.
Improving the process
Still, faculty offered several suggestions for improving the reviews.
Gary Gardner, a professor in the department of horticultural science, said it’s important to ensure deans are evaluated not just on their ability to balance a budget but also their intellectual leadership.
“The University produces knowledge,” he said. “The leadership isn’t what you sell, but what you produce for society.”
He cited MnDRIVE and the school’s Grand Challenges Research as two programs designed to encourage faculty to pursue original work.
Joe Konstan, a computer science professor, said a dean’s success is measured by individual managerial strengths as well as accomplishments of the college as a whole.
Since the personal information contained within the review is private, Konstan said, faculty can’t see the accompanying college review.
If possible, separating those two measures could help encourage members of the college to participate in reviews.
“In most cases, if the college is doing well, then the dean deserves credit. But there are some cases where the college may be missing out on things, or the dean is alienating people,” Konstan said.