Teaching Topics: Assessment

Articles Index

The Foreign Language Assessment Directory by Colleen Gallagher, Megan Montee, and Margaret E. Malone
Unpacking the Standards: Informing Instruction through Performance Assessment by Jennifer Eddy
LinguaFolio – Where Are We Now? by Sheila W. Cockey
Assessing Speaking Skills in Young Language Learners: The Student Oral Proficiency Assessment (SOPA) by Helen Carpenter & Rachel M. Lunde
Online Learning Meets Assessment Training for Teachers: A New Twist in the Foreign Language Distance Learning Movement by Helen Carpenter & Meg Malone
Trends in Foreign Language Assessment: Simulated Oral Proficiency Interviews (SOPIs) by Helen Carpenter
Interview-Based Oral Proficiency Assessments by Dorry Kenyon


Articles

The Foreign Language Assessment Directory
By Colleen Gallagher, Megan Montee, and Margaret E. Malone, Center for Applied Linguistics.

Overview

The Center for Applied Linguistics (CAL) announces an online resource to assist foreign language professionals in selecting assessments. The Foreign Language Assessment Directory (FLAD) is a free, searchable directory of information on nearly 200 tests in over 80 languages. Previously, CAL hosted two databases of information about foreign language assessments. CAL received a United States Department of Education grant (P017A050033) to update, merge and expand these two resources. The resulting directory, the FLAD, was made available online in November 2007.

The FLAD serves as a starting point for teachers and educators looking for foreign language assessments. FLAD users can find information about assessments, including the grade and proficiency level for which a test is intended, the skills targeted by a test, information about a test’s development, and the publisher’s or developer’s contact information for further inquiries. The FLAD can be accessed at www.cal.org/flad. In addition to the updated directory, CAL staff members are working on two additional resources to help foreign language professionals locate and select assessments: 1) an online tutorial on language assessment and 2) a moderated user review of foreign language assessments. The online tutorial will provide educators with basic information about assessment and assist FLAD users in the test selection process. The moderated user review, which will be added to the FLAD, will allow users to provide comments and feedback about tests listed in the directory.

Background

The current version of the FLAD grew out of two separate databases of information on foreign language tests developed and maintained by CAL and partner organizations for nearly two decades. The need for a comprehensive resource arose when staff at CAL began receiving inquiries from educators searching for tests to use in their programs. To meet this need, in 1991 CAL staff members collaborated with the National Capital Language Resource Center (NCLRC) to develop a printed database of assessment information. This information was updated and published on the CAL website in 1999 as the Foreign Language Test Database. This database contained information on tests for secondary and post-secondary school students. A complementary resource, the Directory of K-12 Foreign Language Assessment Instruments and Resources, developed by CAL and Iowa State University’s National K-12 Foreign Language Resource Center (NFLRC), had been hosted on the CAL website since 2000. In 2004, CAL applied for and was awarded a United States Department of Education grant (P017A050033) to merge, update and expand the assessment databases.

Uses and Applications

The newly published FLAD serves two functions: it is a resource of information on specific language assessments, and it provides a way for test providers to submit new assessment entries and update existing ones.

Assessment Information

The FLAD contains information on nearly 200 tests in over 80 languages. The intended examinees for the tests listed in the directory are foreign language learners from grades PK through 16 and beyond. The directory also lists a limited number of assessments for elementary students in bilingual programs. FLAD users can search by any one of the following criteria or by any combination of them:

  • name of the test;
  • the skills tested;
  • the language tested; and
  • the grade level(s), proficiency level(s) and use(s) for which the test was designed.

After the initial search, detailed information available for most tests includes a general description, the cost, the author, information on the development and validation of the test, and administration details such as how long the test takes, the materials needed to administer it, and how it is scored.
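To make the search model concrete, the sketch below shows how directory entries and combination searches of the kind described above might be represented. It is purely illustrative: the FLAD is a web directory, not a programming interface, and every field name and record here is a hypothetical stand-in.

    from dataclasses import dataclass

    @dataclass
    class TestEntry:
        name: str        # name of the test
        language: str    # language tested
        skills: set      # skills tested, e.g. {"speaking", "listening"}
        grades: range    # grade levels the test was designed for
        uses: set        # intended uses, e.g. {"placement"}

    # Hypothetical records; real FLAD entries are maintained by CAL.
    CATALOG = [
        TestEntry("Sample Spanish Speaking Test", "Spanish",
                  {"speaking", "listening"}, range(9, 13), {"placement"}),
        TestEntry("Sample French Reading Test", "French",
                  {"reading"}, range(6, 9), {"achievement"}),
    ]

    def search(catalog, language=None, skill=None, grade=None, use=None):
        """Return entries matching every criterion supplied, mirroring
        the FLAD's search by one criterion or a combination of criteria."""
        hits = []
        for t in catalog:
            if language and t.language != language:
                continue
            if skill and skill not in t.skills:
                continue
            if grade is not None and grade not in t.grades:
                continue
            if use and use not in t.uses:
                continue
            hits.append(t)
        return hits

    # Example: Spanish tests that assess speaking, designed for grade 10.
    for entry in search(CATALOG, language="Spanish", skill="speaking", grade=10):
        print(entry.name)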

The newly revised directory lists information on tests for both commonly taught languages and less commonly taught languages for a range of ages. Table 1 provides an overview of tests in the FLAD.

[Table 1: Overview of tests in the FLAD]

Submit New Entries and Update Existing Entries

One goal of the FLAD is to provide accurate information to users. In order to do so, project staff members have added a way for test developers and administrators to submit new tests or update existing entries directly from the FLAD. With a few clicks, users can submit new tests and CAL staff can easily verify and upload this information to keep the directory up-to-date and accurate.

Development

Development of the Foreign Language Assessment Directory proceeded in several phases: updating information in the existing databases, merging the two, and applying teacher feedback to improve the new resource. In addition to updating entries, CAL solicited new entries through professional organizations, listservs, announcements at conferences and meetings, and contacts in the field of foreign language testing. To gather feedback from the language education community, project staff conducted a series of focus groups with educators in the Washington, DC metro area between fall 2006 and spring 2007. An attempt was made to include as much diversity within each group as possible in terms of position, experience, and language specialties. CAL staff members used the focus groups' suggestions as they designed the FLAD.

Future Directions

As mentioned in the overview, the FLAD project has three phases. With the completion of the updated and merged directory, the project team has turned its attention to the remaining phases. Throughout 2008, CAL will develop two companion resources to the FLAD: 1) an online test selection tutorial and 2) a moderated user review. The tutorial will assist FLAD users and other foreign language educators in selecting tests to meet their needs and those of their students, and will provide information about key testing concepts. The moderated user review will allow users to submit comments on tests they have used; CAL staff will review and post these comments so that other foreign language professionals can draw on information about actual test use when deciding which test to adopt. It is our hope that these resources will provide a valuable service to the field of language education and ultimately contribute to a positive learning experience for language students throughout the United States.

See the FLAD at www.cal.org/flad.



Unpacking the Standards: Informing Instruction through Performance Assessment
By Dr. Jennifer Eddy, Vice President of the National Association of District Supervisors of Foreign Languages (NADSFL) and Assistant Professor of World Language Education at Queens College, The City University of New York.


Standards

What are key indicators of true performance and success in a language? Are real-life situations always predictable? What do your assessment tasks ask students to do? Do your students forget, misunderstand concepts, or use material only in the way it was taught? Are you mired in the minutiae of a coverage-driven curriculum? How do you know when a student really understands? What contexts will students likely encounter in a given culture, and how will they respond given the practices and perspectives of that culture? How do you plan for instruction when the goal is performance?

These questions speak directly to what our National Standards imply for curriculum and assessment design. The Standards set the stage for language learning that goes beyond mastering language codes and lists of vocabulary in isolation; they signaled a paradigm shift in our field from reciting mechanized responses to an emphasis on meaningful communication, authentic real-life contexts, and cultural understanding. Individual states have further interpreted the Standards to include recursive unit themes and functions at benchmark levels (New York State Department of Education, 1986).

In our experiences across the lifespan, issues reprise and resurface. We spiral concepts, skills, and lessons learned, applying them to different contexts and situations with flexibility and adjustment (Eddy, 2007c). We generally remember learning experiences when we know what the goal is and what the end product should look like, especially when we are given multiple opportunities to refine, check, and fine-tune through quality feedback. We also remember more when we are engaged in hands-on, real-life application. Because contexts and variables do change in real life, we need to demonstrate flexibility to adapt to these changes and apply what we know to unanticipated situations on our own (Eddy, 2007a).

It is precisely this flexibility and high adaptability that is required when faced with new situations anywhere within a culture. Using a language appropriately in a given culture requires tolerance of new situations, dealing with incomplete information, and problem-solving with minimal or no cues all the time (Eddy, 2006b). To understand a language and culture, one cannot rely on rules or a well-rehearsed libretto; rather, understanding comes through exploration, true performance, and reflection via meaningful interaction (Eddy, 2007a). Tasks that echo these challenges will best prepare students for what people face using language outside the classroom. Performance assessment most closely resembles what the learner will be expected to do in real life, as is appropriate to the culture.

The National Standards (National Standards in Foreign Language Education Project, 1999/2006), or 5Cs, are Communication, Culture, Comparisons, Connections, and Communities. The Standards can guide teachers on curricular decisions and on what performance assessment should look like. They call for a shift that moves the teacher from random activity planner to assessor for performance (McTighe & Wiggins, 2004), constantly engaged in feedback with the learner. Many teachers still see instruction and assessment as separate and disconnected. Without a cultural anchor that could assign purpose and coherence to skills, they devote much time to predictable drill practice in isolation and random coverage rather than designing performance scenarios that demand flexibility and transfer of skills within a cultural context. This kind of assessment, albeit indicated in the Communication standard, is often misunderstood. What remains is the requisite paradigm shift from tests of rote memorization to assessments that require transfer of culturally embedded concepts and flexible application of a language repertoire for use in the real world (Eddy, 2007a).

One possible tool to facilitate that shift is UC ADAPT (Eddy, 2004, 2005, 2006a, 2006b, 2006c, 2007a, 2007b), a curricular design model that reveals cultural practices and perspectives within recurrent and reflective themes, using them as the purpose for performance assessment evidence and for greater selectivity of knowledge and skills. The model addresses the national standards and facilitates a performance-based assessment system using a unique framework based on backward design (McTighe & Wiggins, 2001/2005). UC ADAPT stands for Uncovering Content: Assessment Design Aligning Performance and Transfer (Eddy, 2007a). UC ADAPT also incorporates a two-tier feedback system that helps teachers plan to adjust, continuously informing their practice through diverse assessment evidence while giving the learner transparent criteria and strategies for continued progress. An articulated curriculum can explore these recursive themes that uncover cultural practices, perspectives, attitudes, and responses (Eddy, 2007b). Inquiry on these topics can continue across the lifespan of the learner and be meaningful beyond the classroom.

UNCOVER THROUGH CULTURE

For the time being, set the Communication standard aside; we will get back to it later. That standard determines the mode of assessment and is the backbone of the assessment system. Let us turn to Culture and what the culture thinks, believes, or says about a particular theme. What do you want learners to go away with and remember about the culture vis-à-vis the unit theme? How does the culture respond to ideas on family life, education, leisure time, or health? If that issue or theme had a moral, what would it be? Enduring Understandings are deliberately framed as the moral of the story. What would be a lesson learned regarding health care in the culture? What would you want your students to understand about how leisure time is spent in the culture? (Eddy, 2005)

This design model uncovers content by unpacking the Culture standard first in Stage One, because cultural response to recurring themes drives the curriculum and gives purpose and coherence to knowledge and skills. UC ADAPT guides teachers in uncovering key concepts inherent to the culture studied, within themes that appear along the lifespan of its people. Units are designed to explore how a culture responds to these themes (Eddy, 2007b). Cultural practices, perspectives, and the products created by the culture are revealed as learners uncover the themes across levels in articulation, allowing for comparisons with their own cultures and application of interdisciplinary content to real-life contexts encountered outside the classroom. From here, we develop Essential Questions that are at the heart of the culture, recur, and engage the learner in further inquiry. They are not meant to be answered in one class period but are large enough to be answered, revisited, and readdressed in greater depth and complexity as the learner's language develops over time. Once these are in place, you can design the performance assessment that will demonstrate learner understanding of these concepts.

COMMUNICATE FOR PERFORMANCE

What does true performance look like? Consider what happens anytime we communicate: interactions are non-scripted, and information has gaps. Materials are not filtered, arranged cleanly, or adapted. People must always sift through what they hear or see to get precisely what they need to solve a problem or create a product of any use (Eddy, 2007b; Gardner, 1983/1993). The Communication standard (Glisan, Adair-Hauck, Koda, Sandrock, & Swender, 2003; Adair-Hauck et al., 2006) determines the mode of assessment evidence, so we use that standard to design the performance assessment tasks in Stage Two (Eddy, 2004). The standard's three modes (Interpretive, Interpersonal, and Presentational) aid in the shift from rote memorization and four skills in isolation to authentic performance. Mirroring authentic communication, they form the context and purpose for putting the skills to actual use. Culturally authentic materials, made by and for the culture, provide the context for Interpretive Mode tasks. Active negotiation of meaning and solving information gaps via spontaneous, unrehearsed tasks characterize the Interpersonal Mode. In the Presentational Mode, students have time to consult resources, develop, and present an oral or written piece that has value or use to a culture. Performance assessment within this standard shows learners what it means to have their abilities tested in situations they may encounter in the target culture. This kind of assessment requires learners to take stock of what they know and are able to do, and to use that repertoire appropriately in a given situation. It also means that the learner understands that situations change and flexibility is essential. The ability to transfer knowledge and skills to new contexts not previously encountered is a key indicator of success in using a language (Eddy, 2007a).

DESIGN FOR TRANSFER

Transfer occurs when your students apply knowledge and skills on their own in new contexts and situations that have purpose or intent in the culture. Tasks designed within the three communicative modes assess for transfer. The Integrated Performance Assessment (IPA) (Adair-Hauck et al., 2006) is the ultimate demonstration of transfer, especially because the feedback loop moves the learner toward improved performance and closes the gap between assessment and instruction for the teacher. With alignment to transfer, instruction can make that paradigm shift to performance (Eddy, 2006d). Transfer tasks are those that ask the learner to process and use knowledge and skills in situations or contexts that are new, different, or unanticipated relative to how they were originally taught. Transfer requires inference, critical thinking skills, and negotiation of meaning, not just amassing facts and completing a drill (Eddy, 2006). Do your tasks assess for transfer, or are they drills? Drills have their place, but they cannot be confused with performance, nor do all the drills in the world combined indicate what is required in authentic performance. Without transfer, the language learner often forgets, misunderstands, or knows material only in the rigid, predictable context in which it was taught. Performance assessment design engages the learner in transfer tasks with less reliance on cues or repeated drill; such tasks teach the learner to expect variation. The evidence of understanding in a language class is flexible transfer, which depends upon recurrent ideas that connect otherwise isolated facts, skills, and experiences for the students. When learners can take these skills and apply them flexibly in new situations on their own, they provide the best evidence of understanding and the best indicator of success in a language (Eddy, 2004).

INFORMING YOUR INSTRUCTION

After designing the assessments, you can make informed decisions about the knowledge and skills most worthy of attention and most needed for the assessment and instruction. You can be much more selective about what is essential, pushing back from the heavily laden table of content in a coverage-driven curriculum (Eddy, 2006c). Now you can carefully choose what students need to know, be able to do, and transfer. With the performance assessment tasks designed and that goal in mind, your instructional plans will mirror those tasks, selecting what is required for the performance. The Communities, Comparisons, and Connections standards describe the knowledge, skills, structures, vocabulary, and resources required for doing the assessment. These outline what students should be able to do in situations outside the classroom, in communicating with people of the culture, and in comparing discoveries within the language and culture with their own.

MAKING THE SHIFT HAPPEN

The Standards provide reasonable yet challenging expectations for students. They emphasize performance, requiring assessment and instruction to move as one rather than as separate entities, shifting from preparing students for predictable responses toward a goal of authentic performance. UC ADAPT enables this paradigm shift, addressing those and other coverage issues that are the bane of activity-driven curricula and repetitive item testing. Grammar and vocabulary are there, but they are carefully chosen to support the learner toward a larger concept that assigns purpose and reason to the skills. Teachers can be more selective about the skills required for their assessments and plan instruction more mindfully, thus reducing the amount of unrelated material. In this model, the culture's story unfolds between grades and levels. As the learner develops in the language over time, essential questions about that culture continue to be answered. The National Standards are integrated goals that entail moving away from rote memorization of forms in isolation from context and toward flexible communication with other people in a variety of contexts and content areas, with an understanding of other cultures. These goals surpass past practices in language teaching of mastering codes or linguistic systems and move the learner toward using the language in real-life contexts within the culture. UC ADAPT offers teachers standards-based design tools to plan curriculum, assess for transfer, and inform future practice.

REFERENCES
Adair-Hauck, B., Glisan, E. W., Koda, K., Sandrock, S. P., & Swender, E. (2006). The Integrated Performance Assessment (IPA): Connecting Assessment to Instruction and Learning. Foreign Language Annals, 39(3), 359-382.

Eddy, J. (2004, October). LOTE by design: Assess for Understanding. Paper presented at the New York City Association of Foreign Language Teachers. New York, New York.

Eddy, J. (2005, May). Language assessment by Design: Understanding with the End in Mind. Paper presented at the CUNY Council on Foreign Languages, New York.

Eddy, J. (2006a). Sonidos, Sabores, y Palabras. Boston: Thomson Heinle.

Eddy, J. (2007a). Uncovering Content, Designing for Performance. Academic Exchange Quarterly, 11(1).

Eddy, J. (2007b). Children and Art: Uncovering Cultural Practices and Perspectives through works of art in world language performance assessment. Learning Languages, 12(2).

Eddy, J. (Ed.). (2007c). Editorial. NYSAFLT Journal, Spring.

Eddy, J. (Writer), & Couet, R. (Director). (2006b). What is performance assessment? [Television series episode]. In South Carolina Department of Education (Producer), Teaching and Language Learning Collaborative. Columbia, SC: ETV.

Eddy, J. (Writer), & Couet, R. (Director). (2006c). How do I define assessment? [Television series episode]. In South Carolina Department of Education (Producer), Teaching and Language Learning Collaborative. Columbia, SC: ETV.

Eddy, J. (Writer), & Couet, R. (Director). (2006d). What does transfer look like in language assessment? [Television series episode]. In South Carolina Department of Education (Producer), Teaching and Language Learning Collaborative. Columbia, SC: ETV.

Glisan, E. W., Adair-Hauck, B., Koda, K., Sandrock, S. P., & Swender, E. (2003). ACTFL integrated performance assessment. Yonkers, NY: ACTFL.

McTighe, J., & Wiggins, G. (2004). Understanding by Design Professional Development Workbook. Alexandria, VA: ASCD.

McTighe, J., & Wiggins, G. (2005). Understanding by Design (2nd ed.). Alexandria, VA: ASCD. (Original work published 2001)

National Standards in Foreign Language Education Project. (2006). Standards for foreign language learning in the 21st century. Lawrence, KS: Allen Press. (Original work published 1999)

New York State Department of Education. (1986). Modern Languages for Communication. Albany, NY.

Editor’s note: Dr. Jennifer Eddy is an Assistant Professor of World Language Education at Queens College, CUNY. Her program prepares teachers as assessors for performance using Understanding by Design and UC ADAPT. Dr. Eddy is consultant to states and school districts on curriculum and assessment reform issues in language education.

Dr. Eddy will be presenting workshops throughout this summer with Dr. Grant Wiggins on Understanding by Design for educators in the field of World Languages. See: http://www.authenticeducation.org/summerinstitutes




LinguaFolio – Where Are We Now?
By Sheila W. Cockey

In the Nov/Dec issue of the Language Resource you read about LinguaFolio, the exciting new project now being piloted in Virginia that enables individuals to document how they have learned languages. The document allows one to record both traditional and non-traditional ways of learning, as well as to document the lifelong learning of languages. There has been a lot of activity recently in the LinguaFolio world, so we are providing an update for you.

Teachers and students who participated in the pilot study in 2005-2006 enthusiastically received LinguaFolio Virginia. The results of the student pilot have been tabulated, and adjustments are being made to improve implementation of the program. Only one piece needed attention. According to Helen Small, Foreign Language Specialist at the Virginia Department of Education, the "results showed a strong need for professional development among the teachers, and the necessity of embedding LinguaFolio activities into the curriculum, rather than completing them in a short time frame."

To meet this need, a teacher manual and training opportunities have been developed. Virginia Commonwealth University (VCU) held a summer immersion program for teachers of French and Spanish that focused on improving teachers' knowledge of the target language culture while increasing opportunities to improve language skills through creating a personal LinguaFolio, designing lesson plans and activities that incorporate LinguaFolio elements, and developing ongoing support via a web site and a CD-ROM. As Pat Cummins, professor of French at VCU, states, "VCU continues to play a leadership role in K-16 use of LinguaFolio in Virginia and in our 5-state area."

Over the 2004-2005 academic year, a consortium of five states was formed to create a nationwide LinguaFolio based on the Virginia version. This version will be piloted in schools in the five states (VA, NC, SC, KY, GA) in the 2006-2007 academic year in grades 9-16. In addition, a LinguaFolio Jr. has been developed based on the Kentucky version and will be ready for grades 3-8 in the fall of 2006. Finally, the consortium has developed a 5-part series of lessons as a training mechanism for teachers who are implementing one of the LinguaFolio versions in their classes.

LinguaFolio is gaining ground at all levels of education and will be the topic of two half-day workshops this fall. In October, Helen Small and Pat Cummins will lead a workshop on LinguaFolio at the Foreign Language Association of Virginia (FLAVA) conference in Richmond, VA. They will lead a similar workshop, along with Jacqueline Bott van Houten of the National Council of State Supervisors for Languages (NCSSFL), at ACTFL in November. Stay tuned for further updates!

For information about LinguaFolio and to download a copy of the document, please go to: http://www.pen.k12.va.us/linguafolio/



Assessing Speaking Skills in Young Language Learners: The Student Oral Proficiency Assessment (SOPA)
By Helen Carpenter & Rachel M. Lunde Center for Applied Linguistics

Imagine an oral proficiency assessment for elementary school students that is fun for children yet based on national criteria and designed in a standardized format. Ideally, the assessment would be based on authentic pedagogical materials the children were accustomed to using in their language classes. Furthermore, it would meet the varying needs of the many types of elementary school second and foreign language programs. For adults and high school students, oral proficiency is usually assessed through some form of conversational interview that takes place in a face-to-face, interactive environment, but such assessments may not be appropriate for children. To answer the need for valid, reliable assessments for children based on national criteria, the Center for Applied Linguistics (CAL), in collaboration with Iowa State University's National K-12 Foreign Language Resource Center, has developed the Student Oral Proficiency Assessment (SOPA).

The SOPA: A Description

The SOPA was developed for use in various program types with children of different language proficiency levels. It targets children in the 2nd-5th grades (an oral proficiency assessment for students in Pre-K-3rd grade is currently under development). Because of its flexible design, the assessment can be adapted for use with FLES programs as well as immersion programs. The SOPA is currently offered in Chinese, French, German, Japanese, Russian, and Spanish. Studies conducted between 1997 and 1999 support the SOPA's claim to accurately assess the speaking and listening proficiency of young language learners (Thompson, Kenyon, & Rhodes, 2002).

Administration

Children participate in the assessment in pairs, in a non-stressful, interactive environment. The pairing of students helps to encourage interaction and reduce the pressure of the testing situation. The test length is 15-20 minutes, depending on the students' proficiency levels. Two administrators assess the students. One takes the role of the interviewer, guiding the students through the assessment, while the other serves as a rater, who takes careful notes on production and rates the students' proficiency. A more detailed description of the components of the SOPA follows.

The SOPA Warm-up: Like most direct measures of oral proficiency, the SOPA includes a warm-up in the target language. The SOPA warm-up is accomplished in two ways. As the students enter the room, the SOPA administrators greet them warmly and ask their names in the target language. Once students are comfortably seated, the interviewer presents them with a bag of appealing manipulatives such as colorful plastic fruits or animals. This first task of the SOPA focuses on students' listening skills so that students are not pressured to produce speech immediately. After students have successfully responded to listening comprehension probes, the interviewer begins assessing oral fluency by asking a few simple questions about the color of the objects, the number of each type, and the students' favorites.

Tasks: The initial warm-up task is followed by 2 to 4 additional tasks depending on student ability. These additional tasks are scaled so that examinees initially produce language associated with lower proficiency levels, but are soon encouraged to produce language associated with higher proficiency levels. Generally, once the warm-up task is completed, the interviewer uses informal questions to further assess fluency and comprehension. This second task is followed by tasks that employ visuals, additional manipulatives, or scenarios presented by the interviewer to assess students' abilities to follow or give directions, describe, retell a story, or persuade.

Wind-down: After the students have completed the assessment tasks, they participate in a wind-down activity. Importantly, this concluding activity is designed so that all children, regardless of proficiency level, can respond successfully and leave the assessment with a feeling of accomplishment.

The Assessment Criteria

The SOPA allows schools and school districts to assess their students' speaking abilities using holistic criteria based on the American Council on the Teaching of Foreign Languages (ACTFL) Guidelines-Speaking (Revised 1999) and Listening, and informed by the ACTFL Performance Guidelines for K-12 Learners (1998). The rating scale typically consists of nine levels, starting with Junior Novice-Low and scaling up to Junior Advanced-High. An adjusted six-level rating scale may be used with younger students and/or with programs that target lower proficiency levels. Assessment tasks are designed to allow students to show their abilities in oral fluency, grammar, vocabulary, and listening comprehension.

Training

Because the SOPA is a direct measure of oral proficiency, administering and rating it requires training. Usually, SOPA training is provided through workshops conducted by CAL staff, either in school districts or at regional or national conferences. A two-day training workshop prepares teachers to begin assessing students with the SOPA and familiarizes them with the rating scales. A particularly useful feature of the SOPA is that CAL staff can tailor it to any school district and then design training appropriate to that district's unique needs.

Benefits of Using the SOPA

There are many benefits associated with using the SOPA. For the students, the SOPA provides an interactive and engaging experience that is designed to minimize anxiety. Since the interviewer tailors the assessment to the level of the student, student language abilities, rather than deficits, are displayed. For teachers and administrators, the assessment provides a reliable method of measuring oral proficiency that can be standardized, quantified, and evaluated for program needs and reports to funders. A side benefit is that once teachers and administrators are familiar with the proficiency criteria, they can apply their knowledge to curriculum design and language teaching.

For more information on the SOPA and for SOPA training opportunities, go to http://www.cal.org/

Thompson, L., Kenyon, D., & Rhodes, N. (2002). A Validation Study of the Student Oral Proficiency Assessment (SOPA). Ames, IA, and Washington, DC: Iowa State University National K-12 Foreign Language Resource Center and the Center for Applied Linguistics.



Online Learning Meets Assessment Training for Teachers: A New Twist in the Foreign Language Distance Learning Movement
By Helen Carpenter & Meg Malone Center for Applied Linguistics

It's a common dilemma: foreign language educators want to learn how to apply the proficiency criteria of the American Council on the Teaching of Foreign Languages (ACTFL) guidelines to teaching and testing, but then find that learning these skills can be both time consuming and expensive. Since distance-learning initiatives are increasingly common in the field of foreign language education, it was only a matter of time before training in assessing student language proficiency was combined with online distance education.

The Center for Applied Linguistics (CAL), in collaboration with ACTFL, is developing a distance learning initiative to train language professionals to understand and apply the ACTFL Guidelines in assessing speaking skills. Through a grant from the United States Department of Education's International Research and Studies Program, CAL and ACTFL will develop and deliver a distance learning course on oral proficiency assessment that makes use of the Internet. Although the project specifically involves Spanish and Japanese, the goal is to design a distance learning course that can be tailored to any language.

Because of scarce resources, professional development on oral proficiency assessment is not available to all teachers in all languages, particularly those who teach less commonly taught languages like Japanese. Online learning is an ideal medium for conducting professional development training for both practical and pedagogical reasons. Logistically, distance learning unites a range of professionals who share the same interests but are located all over the country. Pedagogically, Internet-based distance learning combines self-access with personal interaction with a trained instructor. The self-access course materials will enable participants to study at times that suit their different schedules nationwide. At the same time, to garner the benefits of a live workshop, participants will also be supported by Internet discussions and online interaction with a qualified trainer. In addition to providing a new method of training oral proficiency assessment skills, using the Internet can chart new territory in strengthening our national language resources by reaching a wider audience.

The CAL-ACTFL Proficiency Assessment Training project will consist of three Internet-delivered distance learning courses on oral proficiency assessment. The first course will be a generic model of professional development on oral proficiency assessment, targeted to teachers and other language professionals from multiple language backgrounds, regardless of the language taught.

The additional courses are designed specifically for Spanish and Japanese foreign language professionals who wish to receive further training. The second course will train Spanish language teachers to assess the oral proficiency of their students, and will include authentic examples from Spanish oral proficiency tests. Similarly, the third course targets Japanese language professionals, using authentic examples from Japanese oral proficiency tests.

To ensure the quality of the courses, both pre-service and in-service teachers will participate in the courses' field tests. In addition, future participants in ACTFL's OPI workshops will evaluate how well the courses prepare them for the workshop. Because the courses are designed to be replicated for additional languages in the future, CAL staff, ACTFL's Director of Professional Programs, and ACTFL OPI Tester Trainers will work collaboratively to ensure that the courses meet high standards of professional development. For more information about CAL, go to www.cal.org.



Trends in Foreign Language Assessment: Simulated Oral Proficiency Interviews (SOPIs)
By Helen Carpenter

SOPIs, or Simulated Oral Proficiency Interviews, developed by the Center for Applied Linguistics with support from the National Capital Language Resource Center, are a variation of the Oral Proficiency Interview (OPI). While the OPI uses a face-to-face interview to elicit a speech sample, the SOPI elicits a ratable speech sample through a tape-mediated procedure. Examinees' responses are tape recorded and rated later by a trained SOPI rater.

SOPIs are beneficial and convenient for foreign language teachers for a number of reasons: (1) No additional training is necessary to administer SOPIs; (2) They are available in Spanish, French, German, Japanese, Chinese, Arabic, Hebrew, Hausa, Indonesian, Portuguese and Russian; (3) They are cost-effective; (4) They can be administered to several examinees at the same time; and (5) They are reliable and valid instruments for assessing student oral proficiency.

A typical SOPI consists of five of each of the following: picture tasks, discussion topics, and role-play situations. There are native speaker prompts, oral and written English directions, and built-in planning time for the examinee to think before giving his or her response. The 15 tasks provide opportunities for examinees to demonstrate speech at three general levels, based on the Speaking Proficiency Guidelines of the American Council on the Teaching of Foreign Languages (ACTFL): Intermediate, Advanced, and Superior. Once examinees have completed the test, the administrator has up to 15 different performance samples with which to assess their proficiency using the ACTFL Guidelines.
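As a concrete illustration of that fixed structure, the sketch below models a SOPI form as data. It is only a sketch: the field names and level assignments are hypothetical stand-ins, and real SOPI forms are developed and field-tested by CAL.

    from dataclasses import dataclass

    LEVELS = ("Intermediate", "Advanced", "Superior")
    TASK_TYPES = ("picture", "discussion", "role play")

    @dataclass
    class Task:
        kind: str           # one of TASK_TYPES
        level: str          # one of LEVELS
        prompt: str         # what the examinee is asked to do
        planning_secs: int  # built-in planning time before responding

    def validate_form(tasks):
        """Check that a form matches the structure described above:
        15 tasks in all, five of each type, each at a defined level."""
        assert len(tasks) == 15, "a typical SOPI form has 15 tasks"
        for kind in TASK_TYPES:
            assert sum(t.kind == kind for t in tasks) == 5
        assert all(t.level in LEVELS for t in tasks)

A trained rater would then score the recorded responses to these tasks against the ACTFL criteria described next.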

In the Guidelines, speaking proficiency is determined by evaluating eight different criteria: (1) ability to fulfill the communicative task; (2) use of the appropriate context (e.g., formal or informal); (3) provision of the appropriate content; (4) accuracy; (5) fluency; (6) vocabulary; (7) sociolinguistic ability; and (8) pronunciation. SOPI tasks are designed to allow examinees to demonstrate their abilities in these eight areas.

When it is time to rate examinee performances, the ACTFL Guidelines provide useful descriptions of prototypical speech at four main levels: Superior, Advanced, Intermediate, and Novice. To better illustrate these levels, abridged examples follow.

  • Performances that are rated Superior generally are marked by detailed explanations, hypotheses on concrete and abstract topics, and support for opinions about controversial issues. Such performances contain rhetorical devices to emphasize, compare and contrast, and organize ideas extensively.
  • Performances that are rated Advanced contain narration and description at the paragraph level.
  • In a performance at the Intermediate level, language is used creatively, though primarily in a reactive manner. The examinee uses simple, discrete, and often parallel sentence structures with little or no use of cohesive devices or connectors to link sentences into paragraphs.
  • In Novice level performances, examinees show minimal, if any, communicative abilities. These responses are marked by the use of memorized words and phrases and little creation with language.

To learn to rate the samples elicited by SOPI tasks, three options are available: a workshop, a rater training kit, or a multimedia rater training program. Workshops are available in all SOPI languages. Rater training kits are available in Spanish, French, German, Japanese, Chinese, Russian and Arabic, and the multimedia kits will soon be available in Spanish, French and German.

The skills teachers acquire while learning to rate SOPIs can also assist them in the classroom. Knowledge of the ACTFL Guidelines can help teachers assess students informally in the classroom in both day-to-day activities and in classroom oral proficiency tests. Teachers can better analyze the type of discourse their students are producing and monitor them for development. Furthermore, teachers can devise in-class oral proficiency tasks by applying their knowledge of how SOPI tasks are created. Finally, knowledge of the general characteristics of student performance at various ACTFL levels can help teachers devise syllabi and curricula that meet the needs of their students.

For more information on ordering SOPIs, go to http://www.cal.org/. For further reading on SOPI reliability and validity, see the reference below.

Reference

Stansfield, C. W., & Kenyon, D. M. (1992). Research on the comparability of the oral proficiency interview and the simulated oral proficiency interview. System, 20, 347-364.



Interview-Based Oral Proficiency Assessments
By Dorry Kenyon

The work of the American Council on the Teaching of Foreign Languages (ACTFL) has made the use of interview techniques in the assessment of oral language ability well known in K-16 education. This article examines ACTFL's Oral Proficiency Interview (OPI) and discusses the similarities and differences between the OPI and its variations: the MOPI, SOPI, VOCI, and COPI. All of these assessments are most appropriate for high school and college-level students.

OPI

The ACTFL Oral Proficiency Interview (OPI) is "a standardized procedure for the global assessment of functional speaking ability, or oral proficiency" (Buck, Byrnes, & Thompson, 1989, p. 1-1). It is a one-on-one interview procedure with the ultimate goal of assessing the examinee's ability to speak a foreign language. A trained interviewer poses questions to elicit a speech sample that is ratable by the criteria of the ACTFL Speaking Proficiency Guidelines (American Council on the Teaching of Foreign Languages, 1986). After concluding the interview, the rater listens to the audio-taped interview and awards a final rating on the ACTFL scale.

Although not its intended purpose, the technique does elicit information from examinees regarding their background, life experiences, interests, and opinions. This information is useful because testers draw on it to formulate questions that will engage examinees. These questions present speaking tasks in a way that allows examinees to demonstrate aspects of their oral language proficiency as outlined in the Guidelines. However, because the topics discussed in the interview are individualized for each examinee, each interview is unique in the specific topics covered. Thus, the goal of the OPI is to use information offered by the examinee to elicit speech samples that are ratable by the Guidelines' criteria.

For further information on the OPI and training in this procedure, see ACTFL's web site at www.actfl.org.

MOPI

The acronym MOPI refers to the modified version of the ACTFL OPI (Modified OPI). The modification does not actually occur in the assessment procedure, which is the same as the OPI. What is different is the tester training. ACTFL offers a modified version of the full OPI tester training workshop designed for language instructors who teach and test primarily at the ACTFL Novice and Intermediate levels. Following this workshop, participants may apply to become ACTFL OPI Testers with Limited Certification, which means that they may only officially test examinees at those levels. While for full certification OPI testers must themselves demonstrate proficiency at the Superior level on the ACTFL scale, for this limited certification testers need only be at the ACTFL Advanced level.

For further information on modified OPI tester training, see ACTFL's web site at www.actfl.org.

SOPI

SOPI stands for the Simulated Oral Proficiency Interview, a variation of the OPI developed by the Center for Applied Linguistics. While the OPI uses a face-to-face interview to elicit a speech sample, the SOPI elicits a ratable speech sample through a tape-mediated procedure. It presents examinees with speaking tasks through a Master Tape, from which examinees hear task directions, and a Test Booklet, which presents the written directions and provides any visuals used in the task. While in the OPI the interviewer uses topics nominated by the examinee, the SOPI presents a selection of tasks, at the various levels of the ACTFL scale, that cover a wide range of speaking functions and content domains. On any given SOPI form, each examinee responds to the same tasks. The carefully developed, field-tested tasks are open-ended, allowing examinees the freedom to draw on their own background and life experiences in responding. On the SOPI, examinees present their own thoughts, feelings, and opinions while imagining themselves in various real-life situations. While the SOPI can be individually or group administered, examinee responses are reviewed individually by trained raters, who apply the criteria of the ACTFL Guidelines to the speech sample. Like the OPI, each examinee receives a proficiency rating on the ACTFL scale. In terms of eliciting information, the SOPI is less direct than the OPI: except in an unrated warm-up section, no personal information is directly requested of examinees. Nevertheless, the goal of the SOPI and the OPI remains the same: to assess the oral proficiency of examinees by eliciting a speech sample ratable using the ACTFL scale.

The NCLRC provides training workshops in scoring the SOPI. Workshops are also provided in SOPI test development. For further information on these workshops, contact Susan Dirstine at susan@cal.org. The NCLRC also provides self-instructional materials to use in learning to score SOPIs in Spanish, French, German, Japanese, Chinese, and Arabic. For further information on SOPI self-instructional rater training kits, or on SOPIs developed by CAL in Spanish, French, German, Japanese, Chinese, Arabic, Hebrew, Hausa, Indonesian, Portuguese and Russian, contact Laurel Winston at laurel@cal.org. The NCLRC is currently developing a multimedia program to provide this rater training in Spanish, French and German. Additional information on the SOPI can be accessed at the web site for the Center for Applied Linguistics at www.cal.org, and at the NCLRC web site at www.cal.org/nclrc.

VOCI

The Visual Oral Communication Instrument (VOCI, pronounced "vochee") was developed by the NCLRC's sister Language Resource Center at San Diego State University. Like the SOPI, the VOCI uses technology rather than a face-to-face conversation to elicit a speech sample ratable on the ACTFL scale. The VOCI uses a videotape to present a variety of performance-based tasks to examinees, incorporating visual as well as audio input. As with the SOPI, examinee responses are recorded on a separate audiotape for later scoring by trained raters. In addition, like the SOPI, the VOCI may be group administered. Different versions of the VOCI are available, depending on the level of the examinees. VOCIs have been developed for Chinese, ESL, French, German, Japanese, Russian, and Spanish.

Like the SOPI, the VOCI is less of a true interview than the OPI. Depending on the task presented, more or less personal information is required of the examinee. Still, the ultimate goal of the VOCI is the same as for the OPI and SOPI: to elicit a speech sample ratable using the ACTFL scale.

For further information on the VOCI, see San Diego State University's Language Resource Center (LARC) web site at larcnet.sdsu.edu.

COPI

The Center for Applied Linguistics is currently developing a version of the oral proficiency test that uses multimedia to administer the test. The new test is currently known as the COPI, or Computerized Oral Proficiency Interview. The COPI, envisioned as the next generation of the SOPI, uses the advantages of computer technology to give examinees more control over the testing situation. It allows examinees to choose the language (English or the target language) in which test directions are given and to control the amount of preparation and response time for each task. The test will also permit examinees to select the topics of the tasks, as well as the difficulty level of alternate tasks.

The COPI's goal remains the same as that of the OPI, SOPI, and VOCI: to elicit a speech sample that can be assessed by trained raters using the criteria of the ACTFL Guidelines. What distinguishes the COPI from the SOPI and VOCI is the amount of control examinees have over how they are assessed: the topics selected and the preparation and response time. The COPI also begins with a self-assessment and with sample tasks and responses at the different ACTFL levels, so that examinees better understand the goals of the assessment. The COPI will also adapt to the level of tasks chosen by the examinee, as well as to the examinee's gender, grade level, and occupation (worker or student). Unlike in the OPI, examinees do not nominate topics for discourse; rather, they select them from various options.
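To illustrate the examinee control described above, here is a minimal sketch of how such a task-selection step might work. It is purely hypothetical: the real COPI is multimedia software developed by CAL, and the task pool, function names, and default timings below are illustrative assumptions, not its actual design.

    import random

    # Hypothetical task pool keyed by ACTFL level; real COPI tasks are
    # developed and field-tested by CAL.
    TASK_POOL = {
        "Intermediate": ["Describe your daily routine.",
                         "Ask for information about a train schedule."],
        "Advanced": ["Narrate what happened on a recent trip.",
                     "Explain a familiar process step by step."],
        "Superior": ["Support your opinion on a controversial local issue."],
    }

    def choose_next_task(level, directions_language="English",
                         preparation_secs=60, response_secs=120,
                         rng=random):
        """Select the next task at the examinee-chosen difficulty level,
        echoing the COPI's examinee control over level, directions
        language, and preparation/response time."""
        return {
            "level": level,
            "prompt": rng.choice(TASK_POOL[level]),
            "directions_language": directions_language,
            # Timings are examinee-controlled in the COPI; these defaults
            # are placeholders the examinee could override.
            "preparation_secs": preparation_secs,
            "response_secs": response_secs,
        }

    # Example: the examinee opts for an Advanced-level task with
    # target-language directions and a longer planning period.
    task = choose_next_task("Advanced", directions_language="target language",
                            preparation_secs=90)
    print(task["prompt"])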

For further information on the COPI, and to learn how you may participate in its field testing, visit the page on the COPI at CAL's web site at www.cal.org/projects/copi.html.

References

American Council on the Teaching of Foreign Languages. (1986). Proficiency guidelines. Hastings-on-Hudson, NY: ACTFL.

Buck, K., Byrnes, H., & Thompson, I. (Eds.) (1989). The ACTFL oral proficiency interview tester training manual. Yonkers, NY: ACTFL.



©2006 National Capital Language Resource Center