Groundbreaking Study from ACTFL Measures Listening and Reading

Fifty years ago, John B. Carroll undertook a landmark study analyzing the oral proficiency of students in language programs in the United States. The often-cited article on his research, “Foreign Language Proficiency Levels Attained by Language Majors Near Graduation from College,” appeared in Foreign Language Annals in 1967—the first year of the journal’s publication. The impact on the language education profession was widespread and has been key to our knowledge of second language acquisition over the past half-century.

Now a new study from ACTFL promises to be the next major milestone in our professional understanding of how students acquire language—this time focusing on interpretive listening and reading.

Conducted in 2014-2015, the ACTFL Listening and Reading Proficiency Benchmark Study gathered close to 4,000 assessments of college students in each of the two modalities (approximately 8,000 tests total) using the ACTFL Listening Proficiency Test (LPT) and ACTFL Reading Proficiency Test (RPT). The LPT and RPT—both developed in 2013—are among ACTFL’s assessment offerings administered by Language Testing International (LTI), the exclusive licensee of all ACTFL tests (see box on next page). LTI was a partner with ACTFL in this study, handling the test administration and providing the data to the researchers.

Among preliminary findings being reported:

  • Listening and reading are acquired at a faster rate than speaking skills; Advanced levels of listening and reading proficiency therefore appear to be realistic goals for Category I languages at graduation, even if students have not yet reached that level in oral proficiency. (For native speakers of English, Category I languages are those most similar to English; Category IV languages are those least similar.)
  • Contrary to the belief that both interpretive skills are acquired more or less in the same timeframe, listening proficiency actually develops more slowly than reading. For this reason, development of interpretive listening skills most likely needs additional attention in language education classes and curricula.
  • While labeled as a Category I language for English speakers, French does not behave like other Category I languages (e.g., Spanish, Italian, Portuguese) as far as listening proficiency is concerned, due to its deeper orthography: because spoken French does not map closely onto its written form, it is generally more difficult to understand spoken French than to read it.

Detailed findings and implications of the Listening and Reading Benchmark Study will be discussed in a technical report available from ACTFL this fall (sign up to get a copy at info.languagetesting.com/benchmark). An article highlighting the details of this research will also be featured in an upcoming issue of Foreign Language Annals.

Impact of Research

A greater focus on interpretive skills is an important advancement in the language profession. According to Erwin Tschirner, the study’s director, there has been a rediscovery of reading as a higher education goal, with a noticeable trend of more institutions trying to align their goals with the ACTFL Proficiency Guidelines 2012–Reading. Many educators and researchers have a renewed understanding of reading as a highly effective and efficient way of accumulating knowledge, according to Tschirner.

He also notes the importance of the role of listening comprehension “not as an indicator, but as scaffolding” for developing interpersonal speaking. “Right now we know that most language students, even at the postsecondary level, do not go beyond the Intermediate-Mid level in speaking. One factor may be not giving enough time to teach listening comprehension, a skill that comes first,” he says.

ACTFL Director for Professional Programs Elvira Swender says this research in listening and reading will have an impact on the entire language education profession. “Over time, there have been many studies on oral proficiency—and more recently on writing proficiency—but there is really little data on proficiency level expectations for listening and reading in college foreign language programs,” says Swender. “Of course, there was anecdotal information: ‘Oh, I think that after four semesters, my students can read at this level.’—but we didn’t have the actual data to support such statements.”

Background of the Study

For many years, ACTFL has had a reputation for focusing on speaking proficiency, says Swender. She notes that the ACTFL Proficiency Guidelines—Speaking (1989) underwent multiple revisions before the most recent version was published in 2012. ACTFL’s Oral Proficiency Interview (OPI) has also been a long-established, well-known method for assessing speaking skills.

In contrast, Swender says, the guidelines for listening and reading had not been updated from the original until the publication of the revised Proficiency Guidelines 3 years ago. As a result of that 2012 update, ACTFL began to look again more closely at listening and reading. “After the revisions of the Proficiency Guidelines for Listening and Reading, we were positioned to offer assessments for these skills, which we had never had before,” she says. The RPT and LPT became ACTFL’s first reading and listening assessments, joining the previously established speaking and writing assessments (OPI, OPIc, WPT).

The scope of the 2015 ACTFL Listening and Reading Proficiency Benchmark Study was extensive, including language students from 22 colleges and universities. Three institutions—Michigan State University, the University of Utah, and the University of Minnesota—were the biggest participants with funding through the Language Flagship Program (www.thelanguageflagship.org). Other schools taking part were UC-Berkeley, USC, Yale University, University of Delaware, University of Maryland-College Park, University of Pittsburgh, University of Wisconsin-Eau Claire, SUNY Plattsburgh, Middlebury College, Hunter College, Bowdoin College, Loras College, San Diego State University, Georgia Southern University, Lee University, North Carolina State University, Eastern Washington University, Grand Valley State University, and Old Dominion University.

Data were gathered from students of seven world languages—Spanish, French, German, Russian, Italian, Portuguese, and Japanese. Research questions explored were:

  • Are professional (advanced) levels of proficiency in the interpretive modes possible in college foreign language programs?
  • What is the relationship between the interpretive and productive modes in second language acquisition?
  • What role does language distance play (referring to similarity of the language to one’s native language, e.g., for native speakers of English, Spanish is a Category I language, and Korean a Category IV)?

The Language Flagship program was instrumental in funding the study, as it was offering proficiency initiative grants to support higher education language programs outside the Flagship umbrella. “They wanted to encourage other non-Flagship programs to use assessment data to make decisions about instruction,” says Swender. “This has also always been ACTFL’s intention: To be able to identify benchmarks that are realistic for students to meet and then to design instructional programs to move students towards these benchmarks.”

The three schools awarded grants all had the option of assessing their students in language programs for which ACTFL had listening and reading tests. “The timing of this was perfect in that we were able to recruit from these three programs a core of the students that were the test-takers,” says Swender.

Fernando Rubio of the University of Utah has great praise for his institution’s involvement: “The results of the study in our case had immediate implications on our curriculum. We were able to see what was reasonable to expect and to adjust our curriculum accordingly. In some cases we were assuming that students were able to do things at a higher level than they really were, but in other cases we saw that students could be pushed to do more than we thought.” Rubio says they also discovered that they needed to offer students additional structured opportunities for listening practice, in line with the finding that listening developed more slowly than reading.

Paula Winke of Michigan State University (MSU) notes: “One of the questions that applied linguists have is: ‘How realizable are these proficiency goals and are they different across different languages? What should we be expecting and are we on target?’” She says their school provided the test scores to both students and teachers to use for their own information and as they see fit in their programs.

“It’s been very informative and so far we’ve gotten great feedback,” she says. The study also collected anonymous demographic data connected to scores (e.g., study abroad, heritage languages) and Winke believes that there will be considerable research done in second language acquisition in the future using this large pool of data. “We’ll be able to do some robust statistical analyses predicting language performance,” she says.

Implications for the Profession

“I believe that this research, once it’s published—much like the Carroll study—will be cited in the literature when educators are talking about realistic expectations at the college level for reading and listening,” says Swender. “We can see how the best students were able to perform, how the majority of students were able to perform, and, in between, programs can set benchmarks at different milestones in the language learning process.”

“What I’m already seeing come out of this research is that different language departments within the same institutions are talking together, determining if they have a systematic approach to developing listening proficiency,” says Tschirner. “We also know vocabulary explains more than 50% of the improvement in one’s reading proficiency, so we will need to establish a more consistent approach to increasing vocabulary as well.”

The research can also be used to motivate students. “Most colleges and universities have a 1- or 2-year language requirement and most students never go beyond that,” Rubio notes. However, he says that if educators can point out all the things students can do in the language after 1 year—including receptive as well as productive skills—this may improve learner persistence. “Students can do quite a lot in terms of reading and listening after two semesters, and that helps them stay with the language, seeing progress in acquisition,” says Rubio.

The implications of the ACTFL Listening and Reading Proficiency Benchmark Study will certainly be felt across the profession and the data and findings will be used for many years to come. Look for the technical report and article in Foreign Language Annals in 2016, as well as additional updates from ACTFL online and in The Language Educator.

“We are very excited that ACTFL is conducting this research; it was really needed in our field. This is going to be helpful not just to know where our students are, but for the application of the theories of language proficiency and how they are actually realized in language programs.” —Paula Winke

“The most exciting thing I have seen in presenting this research is that more people are thinking anew about the role of reading and listening. The fact that an Advanced level of proficiency in these skills is achievable is something we in the language profession should be promoting.” —Erwin Tschirner

“The whole idea of setting benchmarks really helps set a roadmap for what it means to be proficient at a level, what reading tasks, what vocabulary to support the reading tasks, what knowledge of the language, and what kind of cultural knowledge you need. It really helps us to define and develop instruction.” —Elvira Swender

“We’ve never had a good idea of what to expect when someone has had 1, 2, or 3 years of a language and this study gives us that hard data about achievable levels of proficiency in listening and reading. This is going to help us organize our curriculum and understand what we’re able to do much better.” —Fernando Rubio

ACTFL Listening Proficiency Test

The LPT is a standardized test for the global assessment of listening ability in a language. LPTs measure how well a person understands spoken discourse as described in the ACTFL Proficiency Guidelines 2012–Listening. All listening passages are in the target language. The LPT is a carefully constructed assessment that evaluates Novice to Superior levels of listening ability. It is delivered by computer via the Internet. The test can assess a specific range of proficiency (e.g., Novice Low to Intermediate Mid; Intermediate Mid to Advanced Mid; and Advanced Low to Superior). For further information, go to languagetesting.com/listening-proficiency-test.

ACTFL Reading Proficiency Test

The RPT is a standardized test for the global assessment of reading ability in a language. RPTs measure how well a person spontaneously reads a language when presented with texts and tasks as described in the ACTFL Proficiency Guidelines 2012–Reading, without access to dictionaries or grammar references. The RPT is a carefully constructed assessment which evaluates Novice to Superior levels of reading ability. It is delivered by computer via the Internet. The test can assess a specific range of proficiency (e.g., Novice Low to Intermediate Mid; Intermediate Mid to Advanced Mid; and Advanced Low to Superior). For further information, go to languagetesting.com/reading-proficiency-test.

Learn more about the ACTFL Proficiency Guidelines–2012 at www.actflproficiencyguidelines2012.org.

Explore this topic further at ACTFL’s Pre-Convention Workshop Defining and Testing Proficient Reading, presented by Troy Cox and Ray Clifford, Brigham Young University (Thursday, November 19, 1:00-4:00 p.m.). Register at www.actfl.org/convention-expo.

-Sandy Cutshall