Undergrads grade TAs in English proficiency

The University is planning to change the way it tests international TAs.

Fifteen University of Minnesota undergraduates spent Thursday afternoon listening to recordings of people taking an oral English test and grading their own ability to understand what was being said.

The Center for Teaching and Learning (CTL) brought the undergraduates in to help decide when an international teaching assistant's English is "good enough" for them to learn from.

"We're not going for perfect, just understandable," said Barbara Beers, the CTL educational specialist running the study.

The standards the University uses will not change; however, the test used to measure them will. The English speaking test the University has used since 2000, the Spoken Proficiency in English Assessment Kit (SPEAK), is outdated and no longer supported by its manufacturer, said Kate Martin, assistant program director of the University's International Teaching Assistant Program.

The University will switch to the Test of English as a Foreign Language (TOEFL) in fall 2010 and is trying to determine how scores on the new test relate to undergraduates' ability to understand TAs.

Teaching evaluations show the standard has been set at about the right place, Martin said. International TAs at the University showed virtually no difference in overall teaching satisfaction scores compared to other TAs.

The current English speaking standards are completely voluntary, she said. Despite a state law requiring the University to screen non-native English speakers before allowing them to teach, and University policies setting standards for that screening, there is no way to enforce those policies, Martin said. There is also no efficient way to track compliance, she said. However, compliance was high the last time it was measured, a few years ago, Martin said.

The undergraduates who attended were asked to listen to samples from seven people's TOEFL responses.
The undergraduates then rated their ability to understand each person.

Anton Zuponcic, a computer engineering student, said he found out about the event through an e-mail his adviser sent out. He said that, for him, it was more important to understand what words someone was using than whether they were putting them together correctly.

"If they're having trouble saying voltage or Kirchhoff's Law or Greek letters, that's going to be a problem," Zuponcic said.

Zuponcic, who spent a year in Japan teaching English as a foreign language, said he found it very hard to put a numerical rating on a person's language proficiency. A person can have great pronunciation and sentence structure but still have a limited vocabulary, he said.

Roxanne Corbin, a pre-dental sophomore, said the international TAs she has had were usually able to get their point across, but sometimes their inability to express an idea in more than one way made it harder to understand them.

Beers said she hopes to gather more data from undergraduates in the future to improve the interpretation of English proficiency test scores.