Testing Tips
National Capital Language Resource Center (George Washington University, Georgetown University, The Center for Applied Linguistics)
March/April 2014

Will my assessment work?
Victoria Nier, Center for Applied Linguistics

This month's Testing Tips answers a question from a university-level Arabic teacher. She recently attended a workshop on designing assessments for her students and left the workshop feeling confident in the assessment tasks she'd developed and excited to implement them. Back in the classroom, however, she was beginning to have some doubts. Would her assessments actually work?


January/February 2014

Beyond Testing Tips
Francesca Di Silvio, Center for Applied Linguistics

This month's Testing Tips answers a question that many readers of this column may have: How can I learn more about assessment and improve my assessment knowledge?

One way, of course, is to submit a question for a future Ask a Tester column by sending an email to opat@cal.org with the subject line Ask a Tester. Beyond this newsletter, there are a variety of resources to help increase your understanding about the fundamentals of assessment and ways to assess language in your specific context. The following list includes suggestions to fit different needs but is by no means comprehensive. Take a look in your area for other opportunities - local college and university ESOL, FL, or linguistics departments sometimes provide resources, host workshops, and offer courses about assessment as well!


November/December 2013

What Now?
Victoria Nier, Center for Applied Linguistics

As mentioned in the October 2012 newsletter, Testing Tips columns focus on questions from newsletter readers. To submit a question for a future Ask a Tester column, send an email to opat@cal.org with the subject line Ask a Tester.

As I write this article, it's mid-November 2013 - roughly the middle of fall semester, just after the end of the first marking period in many schools. One teacher wrote in to ask what he should be doing at this point regarding assessment in his Chinese language class. Despite his lofty ambitions, he wasn't able to set up an assessment plan for the year over the summer. His question: Is it too late to start now?


July/August 2013

Valid and Reliable Tasks and Rubrics
Victoria Nier, Center for Applied Linguistics

As we mentioned in the October 2012 newsletter, this year’s Testing Tips columns focus on questions from newsletter readers. To submit a question for a future Ask a Tester column, send an email to opat@cal.org with the subject line Ask a Tester. This month’s Testing Tips article comes from a teacher who wants to know how she can be sure that an assessment task and rubric are valid and reliable.

For readers who participated in formalized training in language testing or assessment, terms like “validity” and “reliability” may sound familiar. We often hear them in the context of large-scale, standardized tests and statistical analyses, where researchers and practitioners strive to ensure that a test is used for its intended purpose(s) with the appropriate population (i.e., the test is valid) and that the test administration and rating procedures are consistently implemented (i.e., the test is reliable).

While terms like validity and reliability are less commonly discussed in the context of small-scale assessment tasks and rubrics, they are no less important to consider in the classroom. This issue’s question comes from a teacher who wants to know if a task she developed is valid and reliable. The task is an ACTFL Intermediate-level interpersonal speaking task in which her students ask each other about their likes and dislikes. She wants to use the information she gathers from this task to provide her students and herself with feedback on their progress. She has designed an accompanying rubric to rate her students’ performance on this task.

The key questions for this teacher to ask herself with regard to validity are:
What am I trying to find out from this assessment task and rubric (e.g., what is my test purpose)?
Are the task and rubric going to give me this information?
If the teacher’s objectives include getting her students to the point where they can exchange information about preferences with each other in casual conversation (an example of an Intermediate speaking function), then this task aligns with that purpose. But if she wants to get other information from this task, such as the students’ interpretive reading abilities, or if her students are actually at the Novice level, then the task won’t provide the information that she needs. The teacher also needs to ask herself if her rubric is designed to give her the information she needs. Are the listed criteria (like fluency, accuracy, or complexity) relevant to the task? Are any criteria missing, or should any be excluded?


After determining whether the task and rubric are being used for their intended purposes, the teacher needs to question whether her task administration and rubric use procedures are consistent, to ensure assessment reliability. Is the task administered the same way each time to each student? Has the teacher trained herself and anyone else using the rubric to interpret the criteria the same way each time? Is the way that feedback is communicated to students also consistent and clear? Does she notice that her consistency in rating drifts as she gets tired, hungry, or cranky from rating so many student responses? How can she work to improve the consistency of her rating?

Like the teacher in this column, we must keep in mind that validity and reliability apply in all assessment situations, large and small. While one test may seem very small to us as teachers, our students (we hope!) take each test and the grade they receive on it very seriously! Don’t forget not only to Ask a Tester, but ask yourself and other teachers these important questions!

May/June 2013

What does Piloting a Test Entail?
Victoria Nier, CAL

When your trusty Testing Tips writers are not writing these missives to you, they work on many other projects, including test development and professional development on assessment for language educators. One question we are frequently asked regarding test development is, “How important is it, really, to pilot a new assessment and what do we learn from this process?”

When we talk about piloting, we mean trying out all of the processes of an assessment before implementing it in a way that will have consequences for the students, teachers, and language program. This does not just mean reviewing procedures for administration and grading. It also means asking the students what they think about the test, including preparation and fairness, the test instructions, and the feedback provided about test results.


Recently, we have been piloting language assessments for K-16 students. Many things went wrong! The computer interface froze from time to time. Test administration took longer than we had anticipated. Specific test questions seemed to completely stump more students than expected – a sure sign that the question or the directions for the question need to be changed.

At times, we felt discouraged by the pilot. But, in fact, the goal of a pilot is to catch these issues and fix them before an assessment is used for purposes that will affect students. Piloting a test helps to improve it in many ways. The test questions can be refined to ensure that the content and constructs that you want to measure are, indeed, being measured. The materials can be improved to ensure that all elements function as intended. The procedures can be revised to ensure that students and test administrators understand what to do at every point in testing.


Piloting is not only helpful for large-scale assessments but also for small-scale assessments like quizzes or exit tickets. Sometimes as educators, we know the content and skills we intend to teach so well that it is difficult to be objective. Often, what seems like an obvious answer or task is actually much more challenging than we imagined. Even for low-stakes assessments, ensuring that the students understand what they are being asked to do is critical so that students have the chance to show what they know and can do. Otherwise, we don’t know whether students cannot complete a task or if they simply don’t know what’s being asked of them.

Sometimes, we truly don’t have time to fully pilot every assessment tool we use. In such cases, even a quick check with a colleague or a student one-on-one to ascertain whether they understand the assessment’s content and procedures is helpful. We can also learn from each time an assessment is administered and use the experience to improve subsequent administrations of the same assessment. No matter how we fit it in, piloting ensures that we keep improving our teaching to better serve our students. It’s worth thinking about new ways to incorporate piloting into our curriculum planning and our daily practice.


March/April 2013

World Languages and the Common Core State Standards
Victoria Nier, Center for Applied Linguistics

As introduced in the October newsletter, this year’s Testing Tips columns are taking on questions from newsletter readers. To submit a question for a future Ask a Tester column, send an email to opat@cal.org with the subject line Ask a Tester. This month’s Testing Tips article comes from a recent query about how world languages fit into the Common Core State Standards.

The Common Core State Standards (CCSS) provide a unified set of educational standards in English language arts (ELA) and mathematics that have been adopted by many states in recent years. As part of a growing movement towards standards-based education and accountability, the CCSS clearly articulate high expectations for academic achievement in ELA and math. As world language teachers, it’s easy to feel that the CCSS are not particularly relevant to us. We should note, though, that the mission statement on the Common Core State Standards homepage states that:

The standards are designed to be robust and relevant to the real world, reflecting the knowledge and skills that our young people need for success in college and careers. With American students fully prepared for the future, our communities will be best positioned to compete successfully in the global economy.

This statement could easily describe our goals as world language teachers. At a foundational level, the CCSS and world language instruction are interrelated in both philosophy and practice.

The American Council on the Teaching of Foreign Languages (ACTFL) has focused a great deal of discussion and research on aligning the CCSS and the Standards for Language Learning in the 21st Century (National Standards in Foreign Language Education Project, 2006). ACTFL’s document “Alignment of the National Standards for Learning Languages with the Common Core State Standards” (2012) carefully connects the reading, writing, speaking, and listening strands of the English Language Arts and Literacy Common Core State Standards to ACTFL’s Interpersonal, Interpretive, and Presentational Communication standards. This document shows how similar principles are foundational to the CCSS and world language standards, but this isn’t the end of the conversation between world languages and the CCSS.

For years, researchers and educators have documented the academic benefits of foreign language study. In the elementary school context, Armstrong and Rogers (1997) found that third graders who had received Spanish instruction showed gains in math and language scores on a standardized achievement test after only one semester of Spanish. Interestingly, some of the students receiving language instruction scored higher in math than students in a control group who received no language instruction but had more weekly hours of math instruction. Also in the elementary school context, Saunders (1998) found that third graders who had received four years of world language instruction scored significantly higher on the math portion of a standardized test than students one year older who had received no language instruction. The data in the secondary school context are encouraging as well. For example, Robinson (1998) summarized the College Board’s 1992 College-Bound Seniors report, which indicated that students with four or more years of world language instruction had higher SAT verbal scores than students with four or more years of study in any other subject area.

These are only a handful of the many studies that link world language study to academic achievement, which is precisely what the Common Core State Standards aim to define, measure, and promote. At the same time, researchers and practitioners need to conduct additional studies to connect student achievement and cognitive benefits to specific standards within the CCSS. While some studies of this type are already underway, each of us as a world language professional can play a role in both gathering further evidence and advocating for world languages’ critical role in educating global citizens and supporting academic achievement within the framework of the Common Core State Standards.

References

American Council on the Teaching of Foreign Languages. (2012, April 3). Alignment of the National Standards for Learning Languages with the Common Core State Standards. Retrieved from the ACTFL website.

Armstrong, P. W., & Rogers, J. D. (1997). Basic Skills Revisited: The Effects of Foreign Language Instruction on Reading, Math and Language Arts. Learning Languages, Spring, 20-31.

National Standards in Foreign Language Education Project. (2006). Standards for foreign language learning in the 21st century, 3rd Ed. Yonkers, NY: Author.

Robinson, D. W. (1998). The Cognitive, Academic and Attitudinal Benefits of Early Language Learning. In M. Met (Ed.), Critical Issues in Early Language Learning (2nd ed., pp. 37-56). White Plains, NY: Longman.

Saunders, C. M. (1998). The Effect of the Study of a Foreign Language in the Elementary School on Scores on the Iowa Test of Basic Skills and an Analysis of Student-Participant Attitudes and Abilities (Unpublished doctoral dissertation). University of Georgia, Athens.


January/February 2013

Teaching and Testing Real World Language Skills
Teaching to the Test?

Victoria Nier, CAL

As introduced in the October newsletter, this year’s Testing Tips columns are taking on questions from newsletter readers. To submit a question for a future Ask a Tester column, send an email to opat@cal.org with the subject line Ask a Tester.

This month’s topic comes from a recent online course on assessment basics for language instructors conducted by the Center for Applied Linguistics. Participants talked about being frustrated by the (seemingly) competing interests of teaching and testing. One Spanish teacher (languages have been changed to protect participants’ anonymity) wrote,

“Because I do not believe in teaching to the test (rather testing what I teach), this poses a difficulty in how I use this [external] test as a reflection of what students have learned.”

A Mandarin instructor noted that the use of an imposed, summative assessment had negative washback on her students. She reported,

“They became more concerned with memorizing material that might be on the test than with actually building proficiency in the language.”

With the increased focus on assessment and accountability at all levels of U.S. education, many language teachers may feel pressure to “teach to the test.” In some districts and schools, graduation requirements stipulate that students achieve certain scores on standardized tests in order to graduate. Others may emphasize the role of test scores in determining whether or not to continue funding a foreign language program. In still other schools, students and parents may desire high marks on language tests as an advantage in the college application process. It seems that the higher the stakes of the test, the greater the pressure on students to receive good scores on it.

As an educator, you may at times be required to use a certain language test that you do not find particularly useful. However, there is a great push these days for summative language assessments to be aligned with the Standards for Foreign Language Learning in the 21st Century (National Standards in Foreign Language Education Project, 2006) and to be focused on general language proficiency. If the summative assessment your program uses is standards-based and proficiency-focused, you shouldn’t feel bad about “teaching to the test.” Teaching to the test in this case means that you’re teaching your students according to a set of nationally agreed-upon standards of what constitutes language learning, and that you’re helping them build real-world, functional communicative abilities.

The Standards for Foreign Language Learning in the 21st Century were developed by a national task force of language educators in order to provide a consensus view of the goals of foreign language education in the United States. While the Standards do not specify what should be taught or tested, they make clear the objectives of functional language learning, which include that students should be able to interpret and present information, as well as interact, in a foreign language.

A proficiency-based test in a Standards-aligned program attempts to measure a learner’s real-world language abilities. Teaching to this kind of test does learners a service, not a disservice; it gears teaching towards proficiency within a Standards-based framework and grounds teaching in authentic communicative purposes.

Not all language programs are Standards-based, nor are all summative assessments proficiency-based. However, a language teacher can still implement a Standards- and proficiency-based curriculum. Teaching for real-world, communicative ability will empower students to excel on most summative tests because they will have functional language skills and be capable of dealing with a multitude of topics and tasks in a language. Further, teaching in this way and then measuring students’ proficiency can provide instructors with evidence of learning that can be used to argue for aligning the language program with the Standards and proficiency-based goals.

In a Standards- and proficiency-based language program, teaching to the test shouldn’t be a dirty phrase. It should be part of the definition of high-quality language learning and instruction.

References
National Standards in Foreign Language Education Project. (2006). Standards for foreign language learning in the 21st century, 3rd Ed. Yonkers, NY: Author.


November/December 2012

Ask a Tester: Assessing Culture
Mackenzie Price, Center for Applied Linguistics

As introduced in the October newsletter, the Testing Tips column this year will be taking on questions from newsletter readers. To submit a question for a future Ask a Tester column, send an email to opat@cal.org with the subject line Ask a Tester.

This month’s question comes from an instructor who writes:
I find culture one of the most difficult standards to assess.  It is not difficult to assess knowledge or the use of certain culturally appropriate gestures or forms of politeness, etc. (products and practices).  However, it is a much more difficult task to guide students to understanding perspectives and then assessing them, and to do so in the target language even at the beginning levels.
This comment describes a common issue for language instructors seeking to assess culture, particularly at the Novice level.

Figure 1: Three Components of Culture
(Standards for Foreign Language Learning in the 21st Century, p. 47)

As this instructor points out, culture has multiple components to be assessed: cultural practices, cultural products, and cultural perspectives. As the model above shows, each dimension has a relationship with the other two. Therefore, in the same way that a task on culturally appropriate forms of politeness assesses cultural practices and perspectives, other assessment activities can combine cultural practices and products, or cultural products and perspectives. As the Standards for Foreign Language Learning in the 21st Century (2006) point out, “language is the primary vehicle for expressing cultural perspectives.” Cultural practices and products allow students to produce tangible aspects, or the “what,” of culture, while understanding perspectives allows them to understand why social interactions unfold the way that they do (pg. 491). To provide students with cultural perspectives is also to explain why particular tangible aspects of culture, like games or experiences, are significant.

To illustrate how cultural perspectives can be brought into an existing assessment activity, consider the instructor’s example of forms of politeness.  In an English assessment task, the polite way to ask a stranger for directions might be as follows:

Excuse me, how do I get to the art museum from here? Or Pardon me, where is the art museum?

Beyond signaling that you are about to make a request, the target phrases “excuse me” and “pardon me” show that speakers are sensitive to imposing on people they don’t know. This reflects the American cultural perspective that a person’s right to their personal space should be recognized at all times (Tannen, 1987). Sure, you may need directions, but before you ask, you must first apologize for the interruption.

Understanding that many forms of politeness in English are designed to protect people’s right to their own personal space, particularly when interacting with strangers or in formal contexts, will help students gain and integrate knowledge of cultural perspectives along with practices. To ensure that students have developed this understanding, assessment practices need to reflect not only a student’s ability to recognize and reproduce cultural practices and products, but also the ability to remark on the perspectives that relate to them.

Incorporating a discussion of cultural perspectives into language learning can be a successful complement to those assessment activities designed to highlight similarities and differences between cultures. And as the Standards point out, the cultural perspectives dimension of language learning is also a unique opportunity for heritage students to bring any extra understanding or insight they have to the classroom.

Language learners need an understanding of each of these three components of culture in order to be culturally competent. The ability to use language correctly in appropriate contexts becomes more important as students progress along the ACTFL scale; therefore, full cultural competence is not a goal of Novice-level tasks. Still, students at all levels can benefit from extending their examination of cultural products and practices by asking why.

References
National Standards in Foreign Language Education Project. (2006). Standards for foreign language learning in the 21st century, 3rd Ed. Yonkers, NY: Author.
Tannen, D. (1987). That’s not what I meant! New York: Harper Collins.


October 2012

Testing Tips: Introducing Ask a Tester

The Testing Tips column this year is going to take a new and more interactive approach. We are very pleased to introduce the Ask a Tester column. This column will give world language instructors and administrators the opportunity to ask specific questions of language testing experts.

Each issue, we will answer a question submitted by a reader of the NCLRC Language Resource. No question is too broad or narrow for this space.

  • Do you want feedback on an assessment activity that you have designed?
  • Do you want to know more about a testing term you’ve heard at a recent workshop?
  • Would you like to be pointed in the direction of particular assessment resources?
  • Are you having trouble understanding the results of an assessment?

We are excited to hear directly from you about your testing queries, curiosities, and challenges!

To submit your question, send an email to opat@cal.org with the subject line Ask a Tester. Be sure to check back in future issues to explore language testing questions of interest to the teaching community.

 


July 2012

Testing Tips: Performance task for assessing Communities
©Center for Applied Linguistics
Anne Donovan

This year, The NCLRC Language Resource is focusing each issue on one of the 5 C’s of the Standards for Foreign Language Learning in the 21st Century: Communication, Cultures, Connections, Comparisons, and Communities. Each Testing Tips column presents an assessment task that targets the C that is the focus of its issue. In September, we introduced a task template to serve as a guide for developing assessments. In the November issue, we presented a sample task that assesses communication. In January, we addressed assessment of cultural learning. The March issue focused on connections, and the May issue focused on comparisons.

This issue’s task focuses on the Communities standard. The Communities standard emphasizes students’ use of the language as part of a larger community of speakers. Students should use the target language outside of the school for purposes related to their own interests. By using the language to share and gather information, learners can find ways in which they can become active participants in target language activities and communities. This task is written for students of German, but could be adapted to other language contexts.

Name of task

A Day in Vienna

Communicative mode(s) assessed

Presentational writing

Target proficiency level

Advanced

For which grade/age levels is the task written?

High school students

Background and context:

In the unit that accompanies this assessment task, students learned about the history and geography of Vienna. The focus was on the significant historical events and architectural monuments of the city. Now, you would like to know what the students are interested in seeing in Vienna. This assessment asks students to imagine that they have spent a day in Vienna, after which they write an email to you describing the sites they visited and the activities they did. They will have to use German-language guidebooks and websites to select the sites and activities they will describe. While this assessment involves reading in the target language, reading comprehension is not the target activity. Thus, instructors should identify level-appropriate guidebooks and websites in German for students to use.

Materials for this task include German-language guidebooks to Austria and/or Vienna, and a list of helpful websites to learn more about places of interest. These could include German Wikipedia, the Michelin website, or the Vienna tourism website.

Instructions to students:

Now that you have learned about some of the important aspects of Viennese history, use the travel resources provided to identify two or more things you would like to do and see in Vienna.

Next, imagine that you are traveling in Vienna doing these activities and seeing these sites. It is the end of your trip and you decide to write an email to your German teacher. In your email tell him or her about what you did and why you chose to do it. Your email should include information about two sites, events, or activities that you experienced and why you enjoyed the experience. Your email should be at least four paragraphs: two paragraphs describing each of the sites, events, or activities and two paragraphs explaining what you liked about them. Each paragraph should be three or more sentences long.
Your email should show the ways in which you are interested in participating in German communities and activities, so take time to find actual activities that interest you. These could include athletic and musical events, shopping, dining, festivals, museums, tours, or anything else that you would actually do in Vienna. Remember to begin and end the email with appropriate greetings and conclusion.

Expected response:

Students should:

  • write at least four paragraphs of at least three sentences each.
  • include descriptions of two events, activities, or sites.
  • include a rationale for being interested in these two events, activities, or sites.
  • begin with a greeting and end with a conclusion that is appropriate for an email to a teacher.
  • use the past tense.
  • talk about things they would actually want to do!


May/June 2012

Testing Tips:  Performance task for assessing comparisons
©Center for Applied Linguistics
Mackenzie Price

This year, The NCLRC Language Resource is focusing each issue on one of the 5 C’s of the Standards for Foreign Language Learning in the 21st Century: Communication, Cultures, Connections, Comparisons, and Communities. Each Testing Tips column presents an assessment task that targets the C that is the focus of its issue. In September, we introduced a task template to serve as a guide for developing assessments, in the November issue we presented a sample task that assesses communication, in January we addressed assessment of cultural learning, and the March issue focused on connections.

This issue’s task is designed for learners of Spanish and focuses on comparisons between languages. The rationale behind the Comparisons standard is that identifying similarities and differences between language systems (in this case, English and Spanish) will help students gain a better understanding of both systems and cultures. In addition, students will be more aware of strategies for communication in the target language.


Name of task

Describing where things are

Communicative mode(s) assessed

Interpersonal speaking, interpretive listening, presentational writing

Target proficiency level

Intermediate

For which grade/age levels is the task written?

High school students

Background and context:


In the unit that accompanies this assessment task, students have learned to ask about and describe the locations of people and things. This lesson emphasizes that Spanish uses three relative distances to describe where things are located, while English differentiates between only two. The following task assesses students’ abilities to articulate this difference, using the vocabulary and phrases they have learned in the lesson. Below is a list of the vocabulary and phrases students will need to have learned to complete the assessment task.

Phrases
¿Dónde está la basura?
La basura está…

Location Words

  Distance         Spanish      English
  Close by         Aquí; Acá    Here
  Further away     Allí         There
  At a distance    Allá         There / over there

Materials for this task include a worksheet like the one below for indicating the relative distances of people and things. Students working in pairs are given different worksheets with a list of 5 people and things and 5 blank spaces.

  Student A       Response      Student B       Response
  1. Trash can                  1. Gym
  2. Flag                       2. My teacher
  3. Library                    3. Window
  4. My friend                  4. Spain
  5. Mexico                     5. Desk
  6.                            6.
  7.                            7.
  8.                            8.
  9.                            9.
  10.                           10.


Instructions to students:

Think about where certain people and things are located in your classroom, your school, and the world. In Spanish, what kinds of words do you need to use to describe where they are located? What kinds of words do you need to use in English?

You will work with your partner to determine where the people and things on your worksheet are. First, think of five additional people or objects that you can ask your partner about and write them in the blank spaces. Next, using the words and phrases that you learned in class, ask your partner where each item on your list is. Write down your partner’s response for each item. Then switch roles, with your partner asking where things are and you answering with a location word.

 

Expected response:

  • Students should ask and answer ten questions.
  • Students should write down their partner’s ten responses.
  • Every response should include a location word.
  • Students should try to be descriptive in their responses!


For example:

Student A: ¿Dónde está la basura?
Student B: La basura está allí, a la derecha.

Student A: ¿Dónde está la biblioteca?
Student B: La biblioteca está allá, al lado del gimnasio.

 

 

 


March 2012

Testing Tips: Performance task for assessing connections
©Center for Applied Linguistics
By Margaret E. Malone

This year, The NCLRC Language Resource is focusing each issue on one of the 5 C’s of the Standards for Foreign Language Learning in the 21st Century: Communication, Cultures, Connections, Comparisons, and Communities, and each Testing Tips column presents an assessment task that targets the C that is the focus of its issue. In September, we introduced a task template to serve as a guide for developing assessments; in the November issue, we presented a sample task that assesses communication; and in December, we addressed assessment of cultural learning.

This issue, we have provided a sample task to assess connections, which highlights how world language learning is connected to other content areas. This is a rich and complicated standard to assess because it is always a challenge to ensure that we are assessing not the content area but rather what students have learned in the target language about the content area.

The task below has been developed for novice learners of Spanish. When adapting tasks for another target language, it is critical to consider cultural and content implications as well as linguistic adjustments that must be made. For connections tasks, think about how other content areas are taught and learned in your language, school, and classroom. For instance, in immersion language classrooms, it is not only appropriate but a necessary part of the curriculum to assess a child’s math learning in a world language. Other classrooms may provide rich connections to other content areas, and we must be certain that we assess in the language class only what is relevant to that class. In other words, we won’t be giving students a math grade for counting in Spanish class! The following task is designed for foreign language in elementary school (FLES) classrooms but can be adapted for other language program models.

Name of task: My school community

Communicative mode(s) assessed: Presentational speaking, interpretive listening

Target proficiency level: Novice

What grade/age levels is the task written for? Early elementary school

Background and context:

The following task accompanies a unit that explores students’ communities and connects the students’ work in social studies to their learning in their Spanish Foreign Language in Elementary School class. Students have learned a variety of vocabulary words for different people in their school community. These include:

Verb phrases: ___ is a/the ______; This is the _____; ___ is at/in the _____

Key person nouns: teacher, principal, student, secretary, custodian

Key location nouns: school, cafeteria, classroom, office, playground

 

Students have practiced the vocabulary with pictures of places in the school and people who work in the school. Students are given flashcard-sized pictures of different places and people in their school so that they can show their understanding of what their partner conveys in this paired activity.

 

Instructions to students:

Think about the different people in your school community and where you can find them in the school.

 Work with a partner. Name a person in your school and say where she or he is located. Then, your partner should take the picture of the person you named and put it on top of the location you named.

Now switch roles and do the same task, with your partner naming a person and location, and you selecting the matching pictures.

 

Expected response: 

  • Students will use mostly words and short phrases, which is appropriate at this level.
  • Students will show both that they can produce language by speaking and understand it by matching words for people and locations with pictures.
  • Students, especially those new to the school, may confuse people, so partners can work together to point to different people and name who they are.
  • The Spanish teacher can collaborate with the grade-level teacher to reinforce specific community members. For example, students in higher grades may also identify community members outside of the school building.

 

For example:

Student A: Mrs. Brown is the principal.
Student B: Picks up photo of Mrs. Brown.
Student A: She is in the office. (Note: Mrs. Brown is an involved principal, so students will likely place her all over the school, as they will with their teachers, fellow students, and custodian!)
Student B: Puts photo of Mrs. Brown on top of the photo of the office.

 


January 2012

Testing Tips: Performance Task for Assessing Culture
Francesca Di Silvio, ©Center for Applied Linguistics

This year, The NCLRC Language Resource is focusing each issue on one of the 5 C’s of the Standards for Foreign Language Learning in the 21st Century: Communication, Cultures, Connections, Comparisons, and Communities, and each Testing Tips column presents an assessment task that targets the C that is the focus of its issue. In September, we introduced a task template to serve as a guide for developing assessments, and in the November issue we presented a sample task that assesses communication. In this issue, we have provided a sample task to assess culture learning, which we hope will help you generate more ideas for tasks that assess learners’ knowledge and understanding of other cultures. As always, we recommend that the task be adjusted to the appropriate level for individual programs and curricula.

The task below has been developed for learners of Italian. When adapting tasks for another target language, it is critical to consider cultural implications as well as linguistic adjustments that must be made. For culture tasks, think about the specific practices, products, and perspectives that you examine in your instruction and which authentic materials could be used to bring the task to life.

Name of task: Choosing a high school

Communicative mode(s) assessed: Presentational writing and speaking

Target proficiency level: Intermediate

What grade/age levels is the task written for? Middle school students

Background and context:

The following task accompanies a unit that examines school life in Italy. Students learn that at the end of middle school, students in Italy take an exam and choose among different types of high schools. In the previous unit, students learned how to express likes and dislikes. During this unit, students have learned vocabulary and phrases for talking about the school system in Italy. These include:

Verb phrases: To take an exam; To choose a school; To go to ____ school; To study a subject

Types of schools: Classical high school; Scientific high school; Human sciences high school; Linguistic high school; Artistic high school

Key nouns: Secondary school; High school; Exam; School subjects

Authentic materials that students have examined include websites for high schools in Italy. To aid in the task, students are given a list of key words and first-person sentence structures such as those above.

Instructions to students:

Think about which type of high school you would like to attend if you were an Italian student. In this assignment, you will pretend that you are a student at this school and will describe the school to other students.

Design a poster to advertise this type of high school, listing the school name, subjects offered, and other information that might interest a prospective student. The high school name you choose may be invented or taken from a school you have read about.

Then use the poster to present to the class information about the high school you are choosing to attend. Provide at least one reason why you choose that type of school or don’t choose another type of school.

Expected response: 

  • The poster should include the high school type and school name (e.g., Liceo Classico Cavour).
  • The poster should list at least three subjects offered at that high school.
  • In the oral presentation, say the full name of the school you choose.
  • Provide at least one reason you choose to attend that type of high school or don’t choose another type of high school.
  • Name at least three subjects that you study, using full sentences.
  • Be creative and have fun!

For example:
I attend Liceo Classico Cavour in Turin. I choose a classical high school because I like history. I study Latin, history, and philosophy. I like to read Italian literature.


November 2011

Performance Task for Assessing Communication
Aileen Bach - Center for Applied Linguistics

This year, Language Resource will be focusing each issue on one of the 5 C’s of the Standards for Foreign Language Learning in the 21st Century: Communication, Cultures, Connections, Comparisons, and Communities, and each Testing Tips column will present an assessment task that encompasses the C that is the focus of its issue. Last issue, we introduced a task template, which serves as a guide for developing assessments. This issue, we have provided a sample task that assesses communication, which we hope will help you generate more ideas for assessing communication.

The task below has been developed for learners of French, but could be adapted to another target language. When adapting tasks from one language to another, keep in mind the cultural implications and any non-linguistic adjustments that must be made. Think about:

  • Would this task still be applicable in the language to which you are adapting it?
  • How could authentic materials be used to bring this task to life for the students?

This task also provides the opportunity to make use of authentic materials from the target language’s culture. Including authentic materials is a great way to make tasks more culturally relevant and engaging for students.

Name of task: Buying a ticket at the train station

Communicative mode(s) assessed: Interpersonal speaking, interpretive listening

Target proficiency level: Intermediate

What grade/age levels is the task written for? High school students

Background and context:

The following task accompanies a unit that covers vocabulary and phrases related to travel. During this unit, students have learned vocabulary and phrases for talking about transportation, schedules, purchasing tickets, and related topics. These include:

Phrases
“I would like a ticket to…”
“At what time does the train to… leave?”
“Here is the train schedule.”
“Here is your ticket to…”
“The train to… leaves at… o’clock am/pm.”
“How much is a ticket to…?”
“A ticket to… costs… euros.”
“That makes… euros, please.”
“Do you have change?”
“At what time do I board?”
“Where does the train board?”

Vocabulary
times, am/pm, ticket, train, station, terminal, schedule, express train, euros, cents, change, one way, round trip

The task can be performed either between a student and the teacher or between two students, depending on your intended use of the task. Keep in mind that if the task is performed between two students, assessing interpretive listening should be treated differently than if the student’s interlocutor is the teacher. A visual aid such as the train schedule below or a similar authentic material from the target culture may help to facilitate the interaction.

Train schedule

Destination              Type     Boarding  Departure  Price (Adult / Student / Senior)
Montpellier - Paris      Local    8h00      8h20       60€ / 48€ / 30€
Montpellier - Paris      Express  11h25     11h45      90€ / 75€ / 60€
Montpellier - Paris      Local    11h43     12h03      60€ / 48€ / 40€
Montpellier - Paris      Express  15h36     15h56      90€ / 75€ / 60€
Montpellier - Lyon       Local    7h44      8h04       20€ / 15€ / 10€
Lyon - Paris             Local    11h10     11h30      25€ / 23€ / 17€
Montpellier - Toulouse   Local    7h33      7h53       20€ / 15€ / 10€

Instructions to students:

While studying abroad in Montpellier, France, you have decided to take the train to Paris for a long weekend. You go to the train station to ask the ticketing agent (your speaking partner) about the schedule and to buy a ticket. In your dialogue:

  • Both speakers should perform appropriate greetings and salutations.
  • The study abroad student should ask the ticket agent about the times and prices for tickets to Paris or another city on the schedule, with a return to Montpellier in 3 days.
  • The ticket agent should respond with answers to the student’s questions about schedule and prices, and ask the student which ticket he/she wants to purchase.
  • The study abroad student should then request a ticket for certain times and days, and pay the ticket agent with imaginary money.
  • Be creative and have fun!

Expected Response:

  • This task should last about one or two minutes and each speaker should speak six to ten times.
  • Speakers should use the phrases covered in the unit properly. Minor mistakes are permissible but utterances should be comprehensible.
  • The dialogue should follow a logical progression and should include appropriate greetings and salutations.

For example:

Ticket agent: Good morning, how may I help you?
Student: Good morning. I would like to go to Paris today.
Ticket agent: Express trains to Paris depart at 11h45 and 15h56. Local trains to Paris depart at 8h20 and 12h03.
Student: How much does it cost?
Ticket agent: For the express train, the adult rate is 90€ and the student rate is 75€. How many tickets would you like?
Student: I would like one ticket, please. [Hands money to ticket agent]
Ticket agent: [Hands change back to student] Two euros is your change. The train will board at 11h25. Have a nice day.
Student: Thank you, good bye.


September 2011

Developing Assessment Tasks: A Template
By Anne Donovan - Center for Applied Linguistics

With the new school year comes a fresh start for language programs across the country, and a fresh start for the Testing Tips column! In the coming issues of the Language Resource, there will be a focus on the 5 Cs: Communication, Cultures, Connections, Comparisons, and Communities, and each Testing Tips column will present an assessment task that encompasses the C that is the focus of its issue. We encourage you to take these assessment tasks as a starting point from which to build tasks that are relevant to your own programs and curricula, as you strive to incorporate all 5 Cs into your language program. In this issue, we will introduce you to the format that we use to develop assessment tasks, which is also the format we will use to present the assessment tasks accompanying the Testing Tips columns in the issues to come.

We use a task development template when developing assessment tasks. Using the template:

  • Reminds the task writer to consider the details of the task, ensuring that it is appropriate to the subject matter, curriculum, and audience.
  • Allows the task to contain the detail necessary for colleagues to use or adapt it.
  • Allows an instructor to develop parallel tasks for different languages or proficiency levels.
Below is the Task Development Template, complete with tips about the importance and purpose of each section. Stay tuned next month for a completed assessment task!

Task Development Template ©Center for Applied Linguistics

Instructions: This worksheet will help you develop assessment tasks. For each task, complete the matrix below.

Name of Task: The name of the task doesn’t need to contain every aspect of the task but should be a fun and easy way for you and your students to identify the task.

Communicative mode(s) assessed: Your task should target (a) specific communicative mode(s).

Target proficiency level: Bear in mind the proficiency level of your students so that the task you design allows them to show you how much they can do. This will help you to design a task that is both appropriate for your students and informative to you.

What grade/age levels is the task written for? You want to develop a task that is meaningful to your students! Asking students to perform tasks that are not realistic for them (even though they may be realistic for adults or students of a different age) will affect their ability to complete them.

Background and context: Use the space below to write the background and context.

A good task should be contextualized. The task instructions should include a thorough description of the background and context. Providing context also supports task authenticity. You should consider the following:

  • What information will you provide to give the student a foundation for completing the task? Describe the scenario in which the student is completing the task.
  • What materials will you provide to support task completion? Describe any input (written, oral, or graphic) that will be included in the task.

Instructions to students: Use the space below to write the instructions. You may also want to include additional materials such as worksheets or graphics that are a part of the task.

In this section, you will describe what instructions will be given to students. Good instructions should be explicit and clear, and should tell students what is expected in a response. You should consider the following.

What will students have to do? For example, will they:

  • read a text in the target language and respond to questions in English?
  • listen to a conversation and circle the picture that corresponds to the main topic?
  • write the name of a food next to a picture of it?
  • ask a simple question about a schedule?

To what questions or statements will students respond?

Expected Response: Use the space below to describe the criteria of an expected response. You may also want to write a sample response.

A good task includes clear expectations. Having clear criteria is important to the reliability of the assessment. In addition, these criteria should be shared with students so they have a clear idea of what is expected on the task and can benefit from feedback on their performance.

Look at the background, context, and prompt that you have written above and ask yourself the following:

  • Is there enough information for students to create the expected response?
  • Does the expected response match the target proficiency level of my students?
  • Is the expected response something that I would expect from students of this level in real life?


July 2011

Using Standards-Based Assessment to Promote Sustainability in Foreign Language Programs
By Aileen Bach - Center for Applied Linguistics

New approaches to alternative teacher certification programs have meant more diversity in the background teachers bring to the classroom. While many teachers still obtain certification through traditional university-based programs, alternative teacher certification programs are also emerging to accommodate the diverse needs of teacher candidates. The field of teacher accreditation has been receptive to such programs but has also reported a need for continuity across programs (see the National Council for Accreditation of Teacher Education’s 2010 report: Transforming Teacher Education through Clinical Practice; NCATE, 2010). Ultimately, regardless of training, all teachers come into the classroom with different strengths and weaknesses and make unique contributions to the learning environment.

Educational program directors can work to maintain continuity within their programs by promoting a culture where teachers’ rich diversity of experience is celebrated and teachers are motivated to continuously reflect and improve upon their teaching. Programs can cultivate this type of work environment by communicating clear educational objectives for teacher and student performance and encouraging teachers to periodically assess whether their classrooms are meeting these objectives.

There is a need for certified teachers in foreign language programs, especially in the less commonly taught languages, which have experienced a rapid increase in program development and enrollment in recent years (Furman, Goldberg, & Lusin, 2007; Rhodes & Pufahl, 2009). Because of this need, it is also very important for foreign language programs to exercise sustainable practices, not just by hiring highly trained teachers, but also by implementing program-specific measures, such as standards-based assessments, to promote the program’s educational goals concretely and consistently to all of its teachers.

Standards-based assessments are assessments that reflect previously established criteria for meeting educational objectives. For instance, a foreign language program may develop assessments based on the ACTFL Performance Guidelines for K-12 Learners (1999), or based on other criteria that reflect the program’s instructional goals. Standards-based assessments provide clear and concrete ways for teachers and program planners to tell if students are meeting the educational objectives of a program.

There are two main types of assessment that programs may find useful for promoting continuity within their program. Summative assessments are generally conducted at the end of a course to provide feedback to the teacher, learner, or program about the learner's achievement of course goals. Formative assessments help teachers and programs monitor teachers’ effectiveness throughout a course, offering multiple opportunities for teachers to self-assess and improve subsequent lessons. Below are some examples of summative and formative assessments, as well as ways in which each helps programs and teachers work together to foster continuity across classrooms.

 

 

Summative assessment

  • Examples: end-of-course standards-based assessments; projects, presentations, or portfolios
  • Role of program: articulate educational objectives to teachers concretely and clearly; monitor teachers’ yearly progress towards effectively implementing educational objectives
  • Role of teacher: assess yearly progress towards effectively implementing educational objectives; assess whether students meet the educational objectives

Formative assessment

  • Examples: periodic quizzes; student self-reflections
  • Role of program: encourage teachers to self-assess and make goals for improvement; monitor teachers’ progress throughout the course
  • Role of teacher: make improvements to teaching strategies based on feedback from formative assessments
    

  • It helps students and teachers measure language proficiency.
  • It promotes accountability among language programs.
  • It documents success and can make an argument for increased funding for these programs. (Jackson & Malone, 2009; Jensen, 2007)
  • Assessment results can be used for program advocacy, improvement, and expansion. (Carstens-Wickham, 2008; Morris, 2006; Reese, 2010)



May 2011

Collaboration and Backward Design Were Meant for Each Other
By Kate Riestenberg - Center for Applied Linguistics

When it comes to designing and implementing an assessment plan for a world language classroom, your peer teachers are a valuable resource. Thoughtful collaboration can make your language curriculum more coherent and effective. In particular, the three-step curriculum planning technique known as backward design is especially well-suited for use in collaborative environments. Through backward design, educators create and strengthen curricula by determining program goals and desired outcomes before planning specific classroom activities or tasks (Wiggins and McTighe, 1998). Read on to learn more about the three stages of collaborative backward design.

Stage 1:
Identify desired results.

Under the backward design model, teachers and administrators first work together to identify the desired results of the language program and each course within the program. For example, consider:

  • What language skills or communicative modes does your program/course want to emphasize?
  • What do you want students to be able to do in the language?
  • What level of proficiency would you like students to reach?

When educators and administrators work as a group to identify desired results, more viewpoints are heard, helping to ensure that no ideas or concerns are overlooked. This collaboration also ensures that courses progress from one to another in a logical and effective way.

Stage 2:
Determine what evidence will be used to show whether students have achieved the desired results.
Performance assessment is an effective way to gather evidence of student progress in world language classrooms. Performance assessment asks students to demonstrate what they can do with the language in real-world situations (Popham, 2011). Designing useful performance assessments means answering the following questions:

  • What types of student performances will be assessed?
  • What criteria will be used to assess these performances?
  • How will formative and summative assessments relate to each other?

When designing performance assessments, you can collaborate with peer teachers to write and share tasks and rubrics and establish common or related summative assessments. Additionally, you and your colleagues can share the burden of administering tests, especially individual proficiency interviews, by offering to manage each other’s classes during administration. You can also rate performance tasks according to a rubric in order to check that you are all interpreting the rating criteria in a similar way, and observe each other implementing formative assessments in the classroom in order to provide a basis for sharing tips and ideas.

For more information about using performance assessment in the language classroom, see Classroom Assessment: What Teachers Need to Know by W. James Popham, The Keys to Assessing Language Performance by Paul Sandrock, and A Collection of Performance Tasks and Rubrics: Foreign Languages by Deborah Blaz.

Stage 3:
Plan Instructional Activities.

It is important to design and implement instructional activities with your desired learning outcomes in mind. To do this, you will want to ask the following questions:

  • What knowledge and skills will students need in order to achieve established goals?
  • What instructional activities will help students acquire such knowledge and skills?

When desired results and acceptable evidence are determined through peer collaboration, it is easier to create and share instructional activities that build students’ skills and knowledge. This is because you and your colleagues will all be working with the same goals in mind. Having your desired outcomes aligned will mean that your classroom activities will be more parallel, as well.  Such collaborations are likely to improve the quality of activities developed, as well as to save time for all involved.

Backward design is most effective when teachers and administrators collaborate at all stages of the process. Collaboration helps ensure that instructional activities and assessments are effectively advancing students towards desired results as they progress through a world language program.

Blaz, D. (2001). A collection of performance tasks and rubrics: Foreign languages. Larchmont, NY: Eye on Education.

Popham, W. J. (2011). Classroom assessment: What teachers need to know. Boston, MA: Pearson.

Sandrock, P. (2010). The keys to assessing language performance. Alexandria, VA: American Council on the Teaching of Foreign Languages.

Wiggins, G., & McTighe, J. (1998). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.


April 2011

Testing Cultural Knowledge through Standards-Based Assessments
Amanda Hamilton, Center for Applied Linguistics

 “Globally-aware” language students must know more than just what to say. They must also know how, why, and to whom to speak—that is, they must understand the culture associated with the languages they study. Many educators have long acknowledged the importance of teaching culture in world language classrooms. The importance of assessing cultural knowledge, however, is often overlooked.

Assessment of cultural understanding serves the same purposes as assessment of more concrete skills: it allows instructors to monitor their students’ learning, adjust their own teaching accordingly, and motivate students to absorb and retain information (Sinicrope, Norris, and Watanabe, 2007: 50). Assessing cultural knowledge can also be a challenge, though. For one, instructors will probably have to design their own assessment instruments. In addition, because culture is naturally subjective, measuring students’ knowledge of it objectively requires great care. The following tips will help you incorporate effective, standards-based assessment into your teaching of culture.

  • Set your standards.

Broadly speaking, what do you want your students to know about the target culture? That is, what are your desired student learning outcomes? At this stage, think globally about the elements of cultural knowledge you want your students to come away with, making sure that any standards you develop are aligned with your curriculum.

The National Standards in Foreign Language Education Project has already developed a set of standards for teaching culture in world language classrooms that you might want to use. You can read them in the 2006 book Standards for Foreign Language Learning in the 21st Century. An online summary is available here.  

  • Decide on progress indicators.

What are the skills that a student must possess in order to show that he or she has met each standard? At this stage, break up your standards into concrete skills. For example, the National Standards in Foreign Language Education Project sets out the following standard: “Students demonstrate an understanding of the relationship between the products and perspectives of the culture studied” (2006: 9). Example progress indicators for this standard include students being able to discuss and practice artwork from the target culture (177) and identify products from that culture in their own homes (341).

  • Develop assessments (formal or informal) that will measure students’ performances on the progress indicators.

To assess whether students have developed the skills laid out through the progress indicators, you can try using

  • Portfolios: students submit collections of journal entries, essays, artwork, video or audio materials, etc. on specified cultural topics;
  • Self-assessments: students rate their own understanding of cultural topics based on very clear criteria that you set out;
  • Role-plays: you observe students as they act out a scenario in which they behave as if they were interacting with members of the target culture; or
  • Interviews: you discuss cultural topics individually with students in order to assess their understanding.

As always, there are several important factors to consider before you start using an assessment. First, before selecting or creating any assessment of cultural understanding, be sure that you know which skills and concepts you want to test. Second, make sure that the assessment you use clearly reflects the material you’ve taught in class. Third, be sure to explain the assessment clearly to your students, showing them why it’s important and how it relates to what they are learning. Fourth, be careful to choose an assessment that is practical for your situation. For example, if you have a large number of students, you may not have time to conduct individual interviews. Fifth, consider the effects, both positive and negative, that any potential assessment might have on your students and classroom. And finally, when you first implement a new assessment, be sure to use it only for informal, low-stakes purposes until you can be sure of how well it meets your assessment needs.

References
National Standards in Foreign Language Education Project. (2006). Standards for foreign
language learning in the 21st century (3rd ed.). Yonkers, NY: Author.

Sinicrope, C., Norris, J., & Watanabe, Y. (2007). Understanding and assessing intercultural
competence: A summary of theory, research, and practice (Technical report for the foreign language program evaluation project). Working Papers in Second Language Studies, 26(1), 1-58. Retrieved March 8, 2011, from http://www.hawaii.edu/sls/uhwpesl/26(1)/Norris.pdf


March 2011

Testing the Millennials
By Anne Donovan
Center for Applied Linguistics (CAL)

The generation moving through the school and university system now has been named the Millennial Generation, Generation Y, the Net Generation, and other such allusive monikers to distinguish it from other generations. The tie binding members of this generation together is their unique status as digital natives – those for whom access to the internet and computers is a right, not a privilege (let alone a technological miracle) (Gaston, 2006). Millennials’ level of comfort with and attachment to digital technology should be viewed as a strength rather than a detriment to classroom learning. Access to the Web provides opportunities for authentic language exposure that allow students to engage more directly with everyday use of the language relevant to their lives.

Authentic materials have long been esteemed in language education and assessment because they provide students with a clear and relatable example of real language use. Use of authentic materials can lead to positive washback on student motivation (Gilmore, 2007), and instructors are encouraged to use authentic tasks to evaluate a student’s language proficiency (Brown, Hudson, Norris, & Bonk, 2002). The required authenticity of such tasks is twofold: tasks must be authentic both to the target language culture and to the student’s own reality and experiences. For example, while a restaurant task may simulate an authentic interaction in the target language culture, a teacher would not construct a speaking task requiring elementary school students to order food, because that is not a task an 8-year-old would typically perform in his or her ordinary life.

A critical concern for authenticity is ensuring authentic language use – making sure that the language elicited is used to accomplish real-world communicative goals that are relevant to the learner. Ten years ago, a meaningful assessment task may have involved composing a letter to a customer service representative or penpal. Today, customer service complaints are frequently sent via Web forms, and written penpal exchanges have largely been replaced by online chats that may include audio or video. Just as past generations may have been required to maintain journals, journal-writing can be adapted for Millennials to include blogs. Asking a student to perform a language task using a medium that is unfamiliar or seemingly antiquated will benefit neither student motivation nor assessment outcomes in the language classroom (Gilmore, 2007).
           
In summary, an authentic assessment task needs to be consistent with the target language culture and the student's experience, while most importantly reflecting real world language use. Thus, for Millennial students, authentic assessment tasks may reflect the digital experience. The potential washback for student motivation could be tremendous when a Millennial student, for example, realizes that he or she can successfully interpret a French Facebook page. Teachers must be careful when incorporating authentic digital materials in the classroom, however. The wealth of authentic materials available online must still be culled for general appropriateness and suitability to students’ linguistic and developmental levels. With this in mind, integration of technology-based assessment tasks can benefit student motivation and involvement, as well as present new opportunities for authenticity in testing the Millennials.

 


February 2011

Using Formative Assessment to Inform Teaching in the Target Language
Francesca Di Silvio
Center for Applied Linguistics

Teaching in the target language can facilitate your students’ language development by establishing a meaningful context for use of the language. Students surrounded by the target language perceive how real communicative needs drive language use as they learn how to express themselves and solve problems in the newly acquired language. While communicating in the target language can push students outside of their comfort zone, instructors can support their comprehension and production by providing constant target language input and opportunities to communicate. With such extended use of the target language, it is very important to regularly check that students understand this input and are able to produce the language being modeled. Formative assessment is therefore a critical tool for teaching in the target language.

Formative assessment refers to assessment that is used during the course of instruction to collect evidence about student progress in order to plan subsequent instruction (Heritage, 2007). In contrast to summative assessment, it is by definition ongoing, low stakes, and forward-looking. Formative assessment is not used to determine course placement or for external accountability purposes. Rather, by eliciting feedback about students’ current levels of understanding, formative assessment provides the data teachers need to inform instruction. Formative assessment can be planned in advance or occur spontaneously as an instructor monitors student performances during class. In either case, its defining characteristic is that it is part of a continuous process (Heritage, 2007).

Developing a system of formative assessment that provides feedback to both instructors and students about progress and areas of struggle will contribute greatly to effective teaching in the target language. To maximize effectiveness, planned formative assessments should build on those best practices that help engage students and support their efforts to communicate in the target language. For example, instructors can incorporate authentic materials or hands-on activities into their assessments. Even at the beginning level, this can involve comprehension and performance checks in all Communicative Modes. For example, students might show their interpretive listening skills by moving in response to verbal commands (often called Total Physical Response or TPR). To demonstrate their reading skills, students might interpret authentic texts found in familiar forms such as street signs. Similarly, students might demonstrate interpersonal communication skills by participating in short dialogues and display presentational writing skills by designing a short menu. The feedback instructors receive from all of these assessment activities can indicate whether students have absorbed the material or whether it needs to be repeated with a different approach.

As students progress in their language development, formative assessments can become more complex. Crucially, though, the results are still used to adapt future instruction rather than to make evaluative judgments about learning that has occurred. Research has indicated that ongoing formative assessment has a greater positive effect on student achievement than summative assessments such as comprehensive final exams and standardized national or district tests (Black & Wiliam, 1998). Building formative assessment into the curriculum thus yields multifold benefits in the target language classroom. Through regular formative assessment, instructors can ensure that students are following along and staying motivated, thereby bolstering efforts to stay in the target language and ultimately improving long-term learning outcomes.

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-148.

Heritage, M. (2007). Formative assessment: What do teachers need to know and do? Phi Delta Kappan, 89(2), 140-145.


January 2011

Test Accommodations in the World Language Classroom
By Kate Riestenberg, Center for Applied Linguistics

Test accommodations, or changes to testing procedures designed to support students’ ability to show their skills, are intended to underscore the concept of “Education for All.” They are used to provide an opportunity for students with special needs to demonstrate their skills and knowledge without unfair limitations or restrictions. As a world language instructor, it is important to recognize that some students have learning disabilities that make it particularly difficult to learn a second language. Test accommodations are one way to ensure that these students are still able to show what they can do in the target language.

Under Section 504 of the U.S. Rehabilitation Act and the Americans with Disabilities Act, individuals with disabilities may not “be excluded from the participation in, be denied the benefits of, or be subjected to discrimination under any program or activity receiving Federal financial assistance” (Office of Civil Rights, 2009). Most state or local education authorities have established protocols for complying with Section 504, and it is important for instructors to be familiar with these procedures. Often, instructors will work with administrators to develop a “504 plan” for a student, and this plan may include test accommodations.

Students who could benefit from test accommodations in world language classrooms are those who show signs of distinct phonological, orthographic, or grammatical difficulties. For example, students who have a history of problems processing, retrieving, and producing sounds may find acquiring the new sounds of a foreign language extremely challenging. Some students have trouble identifying differences between similar sounds, while others may be able to cognitively sort sounds but have trouble transferring that understanding when using their speech organs to produce sounds. Other students struggle more with orthography; that is, making a connection between written symbols and the sounds or meanings that they represent. Still others have issues specific to the application of grammatical rules.

In order to address such potential problems among language learners, certain state or local educational policies allow for test accommodations and/or test modifications. Both are changes to the test administration procedures or testing environment, but test accommodations do not alter the test construct (i.e., what the test measures), whereas test modifications may actually alter the test construct. For example, a 504 plan might include a test accommodation that allows a student with orthographic issues more time to complete an assessment. This accommodation does not fundamentally alter what the test is measuring. Or, for a student with pronunciation issues, a 504 plan might include a test modification that emphasizes the parts of an assessment that are unaffected by the student’s disability (e.g., listening comprehension) and de-emphasizes the parts of an assessment that are directly affected by the disability (e.g., reading out loud). This modification does somewhat change what the test is actually measuring, but it is still a fair way to let students showcase their language abilities.

Recently, the question of which test accommodations are suitable for English Language Learners (ELLs) has been raised. It is important to distinguish the types of problems in learning a second language associated with the disabilities discussed above from difficulties ELLs sometimes face when learning a third language in school. Some ELLs may be very strong language learners; those who are already approaching bilingualism can often harness these skills when learning a third language (Sanz, 2000). However, for students who are still struggling with learning English, the additional burden of learning a third language may be overwhelming. When creating world language tests for ELLs, it is important to keep their level of English proficiency in mind. For example, if possible, you may choose to provide test instructions in the student’s native language instead of English (Abedi, Courtney, Mirocha, Leon, & Goldberg, 2005).

Understanding the purpose and proper implementation of test accommodations can be a determining factor in helping students succeed in your foreign language classroom. It is therefore critical to be familiar with your school’s 504 procedures and test accommodation policies. For more information on test accommodations in foreign language classrooms, you may want to consult your school’s health center or speech-language pathologist.

Abedi, J., Courtney, M., Mirocha, J., Leon, S., & Goldberg, J. (2005). Language accommodations for English Language Learners in large-scale assessments: Bilingual dictionaries and linguistic modification. Los Angeles: National Center for Research on Evaluation, Standards, and Student Testing, University of California, Los Angeles. Retrieved from http://www.cse.ucla.edu/products/reports/r666.pdf

Office of Civil Rights, U.S. Department of Education. (2009). Protecting students with disabilities: Frequently asked questions about Section 504 and the education of children with disabilities. Retrieved from http://www2.ed.gov/about/offices/list/ocr/504faq.html

Sanz, C. (2000). Bilingual education enhances third language acquisition: Evidence from Catalonia. Applied Psycholinguistics, 21, 23-44.


December 2010

Echoes from the 2010 East Coast Organization of Language Testers Conference
By Francesca Di Silvio, Center for Applied Linguistics

The East Coast Organization of Language Testers (ECOLT) held its ninth annual conference in October at Georgetown University in Washington, DC, supported by the National Capital Language Resource Center, Georgetown University, the Center for Applied Linguistics, and Second Language Testing, Inc. Consistent with this year’s theme, Innovations in Language Assessment, Dr. James Purpura of Teachers College, Columbia University, conducted a pre-conference workshop on assessing grammar and a plenary on learning-oriented classroom assessment. Over 130 language professionals, including test developers, employees of government agencies, researchers, and teachers, attended the one-and-a-half-day conference. ECOLT 2010 featured 12 papers and 12 posters on such topics as self-assessment, testing language for specific purposes, test accommodations, and testing policy, as well as reading, speaking, writing, and listening assessment.

One overarching theme of the conference was how specific tests focus on their target audience, a critical aspect of test validity that needs to be considered along with what teachers want to learn from test results. The various viewpoints of the presenters, whether they were discussing assessing adults, children, Native Americans, or English language learners, demonstrated that language professionals are striving to make fair and accurate assessments. It is heartening for language teachers to see the efforts being made in the field to ensure the respectful treatment of learners, and attending conferences like ECOLT provides the perfect opportunity to see such work in person. It is equally heartening for test developers to hear from classroom teachers at these events.

Readers of this column may be asking, “What can I as a language teacher take away from this testing conference now that it has passed?” First, you can visit the ECOLT 2010 website at http://www.cal.org/ecolt2010/ to view the full conference program and presentations available for download. Second, if you are in the DC area, please consider attending ECOLT 2011. It will be held in Washington in October or November, and more information will be posted in this newsletter and on the CAL website next year. Finally, think about other conferences and professional development opportunities that might be available to you to learn more about language testing.

You don’t have to attend a language testing conference to find interesting and informative presentations about testing materials and best practices. Be sure to check the programs of language education events to find opportunities to increase your understanding about testing, including conferences hosted by national and regional organizations of language teachers that target general audiences (American Council on the Teaching of Foreign Languages, Northeast Conference on the Teaching of Foreign Languages) as well as language-specific audiences (National Council of Less Commonly Taught Languages, American Association of Teachers of French, Chinese Language Teachers Association). Learning about what and how other language educators are testing, as well as the research underpinning the tests, will build your assessment literacy. This knowledge of testing concepts will help you to implement effective testing practices in your language classroom to accurately measure student outcomes. Then, using test results to inform your teaching will maximize student learning.


September 2010

Back to School: Sequencing and Prioritization
by Annie Donovan, Center for Applied Linguistics

A new school year presents an opportunity for a fresh start to design and implement an assessment strategy that will inform instruction and provide feedback to the instructor and the program. An assessment plan should mirror the instructional curriculum and what is happening in the classroom. Every classroom has instructional goals; assessment can measure student attainment of those goals and benchmark students’ progress along the way.

An assessment plan should involve both formative and summative assessment (Brown, 2004). A sample effective assessment strategy would involve formative assessments daily and weekly, and a summative assessment at the end of the marking period. Formative assessments, used during the course of instruction to check in on student progress and effectiveness of instruction, should be used often. The results can be put back to use in the classroom, showing areas on which to spend more time teaching, ways to build on student strengths, and ways to improve instruction. Short, daily formative assessments should target the key instructional goals toward which an instructor has been working. These quick checks should not seem like a “test” or even a “quiz” to students, but instead provide a snapshot of whether the students have grasped the material, which can be used to inform and adjust the subsequent lessons. The daily check-ins can culminate in a weekly formative assessment. Since this assessment is trying to capture the instructional goals of a full week as opposed to a single day, it will naturally be longer, but still should not be the focus of a single day’s instructional time. These assessments should blend seamlessly into instruction, becoming a part of the routine for students and aiding planning for teachers.

A summative assessment at the end of the marking period, semester, or school year captures the themes of the formative assessments. This is the assessment that will show to what extent the students have reached the overarching goals of instruction. An appropriate summative assessment is a reflection of the themes of the curriculum and the domains that have been covered. The results of the summative assessment may be used to determine if the program has been successful, if students can move on to the next level, and if a teacher has met his/her goals.

Frequent assessment can raise many red flags in a teacher’s mind. How will I find time for this? How will I assess everything that I teach? The key to answering these questions is prioritization. Going back to the instructional goals, examining lesson plans, deciding what needs to be known about students’ learning in order to plan for the next week and the next month – these actions will lead to assessments that are useful day-to-day as well as at the final stage. If a review of the curriculum shows that there is not enough time to assess everything that is being taught, it is a likely indicator that there is not enough time to teach everything that is planned.

When designing an assessment plan that will work for instructors and students, end goals should set the pace (Wiggins & McTighe, 2005). This strategy, referred to as backward design, uses a framework of overall instructional goals to inform how the pieces of an assessment plan fall into place. This framework should be shared with students to provide them with a roadmap of where they are starting and where they are going, enabling them to participate in their own personal goal-setting and self-assessment. Ultimately, the assessment plan that is implemented should be a collaboration between teachers and students, maximizing language learning.

References:

Brown, H.D. (2004). Language assessment: principles and classroom practices. White Plains, NY: Pearson Longman.

Wiggins, G., & McTighe, J. (2005). Understanding by design (2nd ed.). Alexandria, VA: Association for
Supervision and Curriculum Development.


June/July 2010

World Languages and Global Education
by Victoria Nier, Center for Applied Linguistics

“Global education” has become a buzzword in the news. Whether this should be seen as a sea change or one end of an unstoppable pendulum swing between isolationism and engagement is up for debate; the reality is that in the United States, right now, there’s a push for our children to get a “global education” that will prepare them to participate in the international marketplace of ideas and economics. For people who teach languages, this new or renewed push could be a very good thing. The push toward global education could be our chance to institute world language programs and principles in our local, state, and federal level educational policies – but we will only be able to accomplish our goals as language educators if we are smart about planning.

So what does this have to do with testing? Everything. Language assessment is one of the very best tools we have at our disposal to make the case for why world languages should be included in any global education program. Studies show the cognitive benefits of bilingualism,1 and the government has acknowledged the great national need for proficient speakers of languages other than English2, but as a nation, we have yet to commit our time, energy, and financial resources to teaching world languages in the K-12 curriculum. The necessary commitment will not happen without a groundswell of support for world language instruction within towns and districts and states and across the nation from parents, teachers, and community members. These arguments will be bolstered if they can prove one thing: world language instruction in the K-12 curriculum works to engage students, foster foreign language proficiency and stimulate greater academic success. And that’s where assessment comes in.

So often we think of assessment as a burden, a mandated practice that can be, at worst, a punitive force that increases educational and social stratification among members of our communities already at risk. But assessment is simply a tool, and like any tool, it can be wielded to positive and negative ends. In the case of proving the benefits of world language instruction in the K-12 curriculum, assessment can be our best friend, because assessment speaks a language that government officials understand. Planned and replicable assessment is simply the systematic and scientific gathering of information about a topic we want to know more about. In this case, it’s the organized and positive way of showing that world language instruction has benefits for students, families, schools, and communities. This kind of assessment won’t punish students or scare teachers – it will celebrate their success. It’s the kind of testing that everyone can get behind.

So what are you waiting for? Get behind it! If you are a teacher, administrator, parent, student, or concerned citizen, figure out a way to measure the great strides that your students are making and then share this information! Sharing the presentational component (a poster display or student skit, for example) of a summative, integrated performance assessment at a school board meeting or parents’ night would be one easy and powerful way to show that language instruction works. You can even help if you’re a language instructor at the post-secondary level. Students can ultimately attain more in university language classes if they progress through strong world language curricula in primary and secondary schools that are connected to university programs. Only when we all get involved will we be able to make a case for what we in the field already know – the benefits of making world language instruction a core subject in the global education of our nation’s children.

References:
1. Mechelli, A., et al. (2004). Neurolinguistics: Structural plasticity in the bilingual brain. Nature, 431(7010), 757.
2. U.S. Department of Education International Research and Studies Program. (n.d.). Consultation with federal agencies on areas of national need. Retrieved April 17, 2009, from http://www.ed.gov/about/offices/list/ope/iegps/consultation.pdf


May 2010

How learning other languages got me temporarily stranded in an English-speaking country
by Margaret Malone, Center for Applied Linguistics

I had all sorts of grand ideas for this month’s column, including serious references to the importance of testing students’ performance in world languages in order to support world language programs. Then, I got stranded in England for six days at the tail end of the annual Language Testing Research Colloquium (my favorite conference, ECOLT included, because this year I was not LTRC chair) in Cambridge, England, due to Eyjafjallajökull, also known as the Iceland volcano.

Being stranded is not fun, in part because it interferes with all the plans, timelines and deadlines that my Type-A, language testing persona makes, in part because my poor family missed me desperately (and I them), and in large part because I felt guilty complaining that I was “stuck” in a lovely place, with intelligent and stimulating (language testing) company.

Before the volcano interrupted my plans, I presented a poster on five years (and more) of efforts in language assessment literacy as well as a paper on CAL’s work with the Educational Testing Service to investigate user beliefs about the TOEFL ® iBT. I attended a number of papers and works in progress on topics ranging from paired testing of oral proficiency to validation of scores. Breakfast in the dorm featured tea and coffee with testers from around the world in various stages of jet lag, lunchtimes were spent at membership or editorial board meetings and dinners included project meetings and discussions of deadlines and arguments about current issues in language testing. And then the volcano literally stopped air travel for six days.

As we wandered around Cambridge in search of cheaper hotels and even cheaper wi-fi, many of us reflected, to each other and internally, on what had brought us there. And by “there” I mean: how did we end up in language testing? Many of us were fortunate enough to be born into bi- and multilingual families or began studying another language in childhood. As I ran by the River Cam every morning, I reflected on many things, but my language-related reflections surprised me. My primary reaction was how incredibly easy it was to be stranded in a country where I spoke the language and basically understood the culture. My second reaction was disappointment that I was stranded in another country where I had little opportunity to work on my second language proficiency. Although I did try to book a flight home from Madrid or Portugal (“Imagine how much Spanish I could practice!” I exclaimed to my skeptical spouse in D.C.), it turned out that such routes were financially and logistically impractical.

At the same time, being stranded with a group of language testing colleagues was an unexpected windfall. We were able to discuss in detail issues that we often must abandon after a three-day conference or three-hour dinner, and, more importantly, we were able to return to issues raised during the conference two, three, and even seven days after the issues emerged. I was lucky enough to have two of my early mentors, Charlie Stansfield and Charlene Rivera, in Cambridge (in fact, they got a group rate for us at their hotel), and we were able to reminisce about the work we’d done together in the 1990s.

My own interest in language brought me to linguistics and, later, my work with Charlie Stansfield at the first iteration of the Georgetown/CAL Language Resource Center brought me to language testing. Hired as a graduate intern, I thought I would “do” testing for a year and then return to language teaching. Twenty years later, I am still involved in language testing, because I believe it allows us to measure the progress of language study and find ways to improve programs as a result of these outcomes.


April 2010

Notes from a Novice Tester: Bilingualism, Heritage Language Learners, and Assessment
by Kate Riestenberg, Center for Applied Linguistics

Defining bilingualism is no easy task. Research on language acquisition has shown that the question, “Are you bilingual?” cannot be adequately answered with a simple yes or no. Language acquisition depends on an infinite number of factors that vary from person to person. No two language learning experiences are the same. As researchers such as Ortega (2010) have pointed out, it makes little sense to talk about native speakers versus non-native speakers when language abilities can fall anywhere along a vast and intricate spectrum. Is there a line that separates bilinguals from second language learners? Many language teachers in the U.S. wonder where on this spectrum their heritage language learners fall, for example. The term heritage language learner is used to describe anyone with family connections to a non-U.S. culture and/or language (Carreira, 2004); it encompasses a wide and varied array of language abilities.

How, then, can we fairly assess the language abilities of heritage language learners? Since starting my position as a Research Assistant in Language Testing at the Center for Applied Linguistics, I’ve thought about this question quite a bit. Fortunately, language testing experts have as well, and we can examine this question by thinking about best practices in language assessment. These practices take into account the complex nature of language. They stress the importance of ensuring that assessments are valid, meaning that they measure what they are supposed to measure within a given context.

To ensure that a test is valid for heritage language learners, we need to ask some questions about how it is being used. We can build a case for validity by addressing the following questions and considering how they relate to the tricky concept of bilingualism.

1) Does the test measure what it is supposed to measure? Tests meant for learners with little experience in the language may not be appropriate for learners who speak the language at home. We need to consider the backgrounds of the students we are testing along with the skills or abilities we want to find out about.

2) Do users take the test seriously? Heritage language learners may feel that a test meant for second language learners does not allow them to showcase their abilities. Students don’t take tests seriously when they are too hard or too easy, which can affect the validity of the test results.

3) Does the test reflect real-life use? The language a heritage learner uses may not reflect the language taught in a curriculum meant for second language learners. Second language curricula often focus on formal, standard language, whereas heritage learners may only be familiar with informal language or a certain dialect of the language. It is important that we value the heritage learner’s authentic language experience when we design tests for our classroom.

4) Is the test consistent with instruction? Someone who is familiar with more than one language cannot necessarily carry out all functions in all of the languages he or she knows. Therefore, it is always important that tests are aligned with what is taught in class.

5) Is the test reliable? All students, whether considered bilingual, heritage, or second language learners, deserve tests that give consistent results, regardless of when a test is administered and scored and who scores it.

Checking your test use against these five questions is always important in order to build a case for validity. It helps to address the needs of heritage language learners by ensuring that a test is really measuring what we want it to measure. In this way, conclusions that we draw from the test to inform instruction and evaluate our students are fair and beneficial to language learning.
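Reliability, the focus of question 5 above, can be checked concretely when two raters score the same set of student responses. Below is a minimal, illustrative sketch in Python of one simple consistency statistic, percent agreement; the function name and the sample scores are hypothetical, not drawn from the column:

```python
def percent_agreement(scores_a, scores_b):
    """Share of responses on which two raters assigned the same score.

    A rough, first-pass consistency check; values near 1.0 suggest
    the scoring guidelines are being applied consistently.
    """
    if len(scores_a) != len(scores_b):
        raise ValueError("Raters must score the same set of responses")
    matches = sum(a == b for a, b in zip(scores_a, scores_b))
    return matches / len(scores_a)

# Two raters score ten short speaking responses on a 1-4 scale.
rater1 = [3, 2, 4, 3, 1, 2, 3, 4, 2, 3]
rater2 = [3, 2, 3, 3, 1, 2, 3, 4, 2, 2]
print(percent_agreement(rater1, rater2))  # 0.8
```

More rigorous statistics exist for this purpose, but even a quick agreement check like this one can flag scoring inconsistencies before they affect students.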

Carreira, M. (2004). Seeking explanatory adequacy: A dual approach to understanding the term "heritage language learner". Heritage Language Journal, 2(1). Retrieved from http://www.international.ucla.edu/media/files/Carreira.pdf

Ortega, L. (2010). The bilingual turn in SLA. Plenary delivered at the Annual Conference of the American Association for Applied Linguistics, Atlanta, GA, March 6-9. Retrieved from http://www2.hawaii.edu/~lortega/

 


What does it take to be accepted as a professional in the language testing field?
by Margaret Malone, Ph.D. Center for Applied Linguistics

This month’s newsletter focuses on what it takes to be accepted as a professional in the language teaching field. Therefore, I’d like to explore what I believe it should take to be accepted as a professional in the language testing field; for language testing, that means following the norms and guidelines developed for this field.

The International Language Testing Association (ILTA) has developed Guidelines for Practice. This document outlines the basic responsibilities of all stakeholders—from test developers to test takers—in the testing process. For many in language testing, understanding and following these Guidelines is and should be expected in order to be accepted as a professional in the field. I’ll address the basics of the Guidelines below; you may also access them at http://www.iltaonline.com.

The first part of the Guidelines focuses on what test developers and the institutions that administer tests must do. For example, the Guidelines demand that the test developer understand and clearly state the test construct, or what the test is measuring. As we often discuss in this column, all tests take place in a specific context and for specific purposes. Therefore, test developers must understand what is being measured and identify it; this avoids the potential danger of developing a test for one purpose and having it later used for a different purpose! This part of the Guidelines also emphasizes the importance of reliability in testing--tests must yield consistent results.

I believe that the Guidelines take a responsible and sensible approach to addressing the responsibilities and obligations of all involved in the language testing process. These responsibilities, among others, include defining the construct of the test clearly, trying out the test before it is released, and providing adequate training to scorers. In addition, test developers should implement a simple and transparent way to report test results, a process that is often mystifying to test takers (and their parents).

However, test developers and item writers are not the only groups with responsibilities under the Guidelines. The Guidelines also address specific responsibilities of the institutions that administer tests and the organizations that make the tests available. Perhaps my favorite line in the Guidelines is that all who develop and administer tests should “make a clear statement of the construct the test is designed to measure in terms a layperson can understand.” This statement is important because it suggests that testing should not be a mysterious process but rather one that is accessible to a non-specialist audience. In any field, it is altogether too easy to get caught up in jargon and buzzwords and forget that it is important to explain what we do in a way that is helpful.

The second part of the Guidelines outlines the rights and responsibilities of test takers. While we may take for granted many of these rights (e.g., being treated with courtesy, and understanding the registration fees for a test), it is equally important to understand the responsibilities test takers have to treat others taking the test with respect during the testing process (for example, keeping interruptions to a minimum).

For me, being accepted as a professional in language testing means following the ILTA Guidelines in all aspects of test development and operation. For test users, good testing practice means being aware of why one is being tested and what one’s rights and responsibilities are with reference to the test.

International Language Testing Association (2007). Guidelines for practice. Accessed from http://www.iltaonline.com/code.htm on February 28, 2010.


February 2010

Notes from a Novice Tester: Authenticity in World Language Testing

By Francesca Di Silvio, Center for Applied Linguistics

This month’s NCLRC newsletter explores the term world languages that is being used in place of the term foreign languages. While foreign implies otherness and isolation, saying world languages embraces the interconnectedness of the global community and reflects the historical and growing multilingualism of the United States. Educational institutions across the country have adopted world languages in their curricula and standards to refer to any language used around the world.

How can we see the impact of this change in terminology and underlying perspective in what can be called world language testing? In assessing language learning, the understanding that foreign is always relative and often exclusionary is important to keep in mind. In order to reaffirm the world language theme of interaction among diverse peoples, teachers can use test tasks that are authentic to the language and culture of instruction. Authenticity reinforces the message that studying the languages of the world will help students in their future careers as well as their lives as global citizens (Jensen, Sandrock, & Franklin, 2007). To increase authenticity, test tasks should be similar to real-world tasks. They should feature natural language and topics to which test-takers can relate (Brown, 2004).

In the era of world languages, authentic test tasks must also account for different cultural norms and practices. For example, a test task involving ordering food in a restaurant is highly authentic because it represents a common real-world activity that is relevant and interesting for the language learner. Task authenticity increases when sequenced items build upon each other. For example, a test-taker first requests a menu, then asks a clarifying question about a dish, and finally places an order. However, a restaurant task that asks students to order dessert would not be authentic for Chinese learners because a sweet final course is not typical of the Chinese dining experience.

Another authentic test task might ask students to describe how they spent their weekend. This question is commonly asked in real-life situations among friends and co-workers every Monday in the U.S. However, many parts of the world have a typical workweek and weekend that differs from ours. In Arabic-speaking countries such as Egypt, Jordan, and Syria, the official workweek is Sunday through Thursday. It is important to take such differences into consideration when writing test tasks. Whenever possible, tasks should be reviewed by someone native to the target language and culture in order to ensure authenticity and appropriateness.

Incorporating authentic tasks into your assessment program better prepares students to perform the target language in real life. Authenticity can increase students’ understanding of the cultural context of the language as well as students’ motivation to learn the language, furthering the goals of today’s world language classroom.

Brown, H. D. (2004). Language assessment: Principles and classroom practices. White Plains, NY: Pearson Education, Inc.
Jensen, J., Sandrock, P., & Franklin, J. (2007). The essentials of world languages grades K-12: Effective curriculum, instruction, and assessment. Alexandria, VA: Association for Supervision and Curriculum Development.

 


January 2010

How technology can help foreign language testing
 by Meg Malone, CAL

The theme of this month’s NCLRC newsletter is how technology can enhance world language teaching and whether such innovations can replace the language teacher. This column examines the issue from a testing perspective: what changes have technological innovations made in language testing and how do they help the language teacher?
                 
I remember back in the 1990s when we imagined that computer-based language testing would change the way we tested--it would make testing more efficient, eliminate the need for proctors and allow us to test all our students across all skill modalities more frequently. The reality is that computer-based tests, like paper- or tape-based tests, take time and effort to develop and pose new concerns in addition to any challenges they alleviate. This column explores both the opportunities and challenges inherent in computer-based testing.
                 
Opportunities inherent in computer-based testing are enormous. Theoretically, a student can take a test at any time in any place where a computer is available. In addition, using computers for testing can save paper. The computer can often grade tests for the teacher, particularly multiple-choice tests, therefore increasing the test’s reliability and saving the teacher time and energy. The teacher can also use the computer to randomize test items to reduce the temptation of students to copy each other’s work. Additionally, the teacher can keep track of student test scores on the computer, compute grades more often and even look at the test data to determine which test items work well and which ones should be changed or eliminated for future years.
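The item-randomization idea mentioned above can be sketched in a few lines of Python. This is an illustrative assumption about how a teacher's tool might do it, not a description of any particular testing system; the function name and student IDs are hypothetical:

```python
import random

def item_order_for_student(item_ids, student_id):
    """Return a shuffled copy of the test items for one student.

    Seeding the generator with the student ID makes each student's
    item order reproducible (handy when scoring later) while still
    differing from the order a neighboring student sees.
    """
    rng = random.Random(student_id)
    items = list(item_ids)
    rng.shuffle(items)
    return items

items = ["item1", "item2", "item3", "item4", "item5"]
print(item_order_for_student(items, 101))
# Same seed, same order every time:
print(item_order_for_student(items, 101) == item_order_for_student(items, 101))  # True
```

Deterministic seeding is a deliberate design choice here: purely random shuffles would also discourage copying, but a reproducible order makes it far easier to match each student's answer sheet back to the right items.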
                 
At the same time that opportunities have arisen from the use of computer-based testing, a number of challenges exist. The first challenge is availability for students: are enough computers available at the same time for all students to take a test? A related challenge is the availability of computers for teachers to use to develop these tests. A second challenge is technological savvy: do teachers have enough technological skill to develop computer-based tests, and is developing such skill worthwhile in an era of constantly changing software? A third issue is test security: how can we be sure that test data is as secure on a computer as it was in a locked file drawer?

Clearly, there are more questions than those described above, but I hope I’ve given you a taste of some of the issues that arise with computer-based testing. It is clear that computer-based testing can provide both opportunities and challenges for world language teachers and learners. One way that technology has provided opportunities in assessment for language instructors is through online professional development and distance learning. Goertler and Winke (2008) published a volume addressing issues in using distance learning for language learning and assessment. At CAL, we have developed a number of online courses that address issues from providing basic background in language assessment principles to providing online training to rate specific language tests. We have found that this medium works very well for some kinds of professional development, such as general background on oral proficiency (Cavella and Malone, 2008). The same issues that challenge world language teachers in using computers for their students also emerge in using distance learning for language purposes, however: availability of the correct hardware and software as well as accessibility of the content in a computer-mediated format.

In short, technology is like any new invention. It gives us opportunities to reach different audiences in new ways, but it is not a panacea.
 
Cavella, C., and Malone, M. (2008). Teaching Principles of Assessment Online. In S. Goertler and P. Winke (eds.), Opening Doors through Distance Language Education: Principles, Perspectives, and Practices. CALICO Monograph Number 7.


November 2009

How language testing can (help) save your language program
by Meg Malone, CAL

We live in an age of accountability; since No Child Left Behind (NCLB) was enacted, this accountability has resulted in increased testing in U.S. schools. Because foreign (or world) languages are not considered “core” subject areas, many states do not require testing of foreign languages. At the same time, the planned 2004 National Assessment of Educational Progress for Foreign Languages (NAEP-FL), which was engendered by the passage of NCLB, has not occurred nationwide. Therefore, we have no national documentation of our students’ language capacity based on a standard test administered across the United States.

The testing-phobes among us may say, “Good! At least our kids aren’t being over-tested.” While they do have a point, without assessment--reliable, valid assessment of the kind we have discussed ad nauseam in this column--we can’t know whether our programs are effective or how they contribute toward developing the language capacity that the U.S. so desperately needs to meet our economic, diplomatic and security needs. Moreover, as language educators, we know that language learning is important above and beyond these needs. Through learning another language and culture, we appreciate the people, traditions and daily life of other citizens of the world and gain increased understanding of our own lives.

The cliché is: not everything that can be counted counts, and not everything that counts can be counted. This may be true in some cases, but until we start documenting and measuring our students’ progress (with reliable and valid instruments), we will be unable to make the case for the importance of foreign language learning in the United States.

What can you do, locally, to help assess your students?
· Examine your goals. What is your program’s goal in teaching languages? Is this clear to everyone--administrators, instructors, students, parents, and the greater community?
· Review your assessments. Are you assessing the goals of your program or simply using assessments that are available, whether they reflect appropriate outcomes or not?
· Communicate. Do administrators, students, instructors, parents and the community understand what the outcomes of assessment mean? More importantly, do they understand how each assessment builds to create a portrait of the goals of language education for your community?
· Tell the language community about what you’ve found. Present at local and national language conferences.
· Publish your results. Publish your results so that other programs can compare their progress and establish realistic goals. We shouldn’t be competing; instead, we should be working together to determine logical outcomes for programs of different lengths for different grade levels in different languages.

What we do counts, and we need to remind our communities why.

Please also refer to http://www.cal.org/resources/languageframework.pdf for the paper “Building the Foreign Language Capacity We Need: Toward a Comprehensive Strategy for a National Language Framework”


October 2009

Notes from a Novice Tester

by Francesca Di Silvio

You may notice some changes to this month’s Testing Tips column:  new title, new author and, as always, new content.  As a new Research Assistant in the Language Testing Division of the Center for Applied Linguistics, I will take turns writing monthly columns with fellow novice tester Kate Riestenberg.  In chronicling our journey to increased assessment literacy, we hope to share useful tips about testing terminology, trends and resources, written with an insider’s knowledge of the field of language testing but observed with an outsider’s eye.

In my early days as a language tester, I have been thinking about testing from the student viewpoint, an easy mindset for me to assume as an apprentice in this field not far removed from my days as a graduate student.  From the teacher’s perspective, understanding various test options and selecting a valid and reliable test appropriate to the learning situation is critical for informing subsequent instruction.  For the student, who deserves a fairly rated assessment that measures what it is intended to measure, foreign language educators’ assessment literacy is clearly important as well.

But language students are stakeholders in assessment beyond being test-takers who receive grades, promotions and placements before moving on to another class.  Working directly to increase students’ assessment literacy--by making transparent the test selection process and the principles and measures of the tests they face--can greatly benefit their learning.  As Hughes (2003) notes, students can maximize the positive effects of assessment when they understand their assessment results.  Given accessible information about test standards and results, students will be able to self-assess, tracking their language learning progress and understanding what elements they need to improve to move to the next level.

I suggest that in addition to applying what you learn from professional development in language testing to your assessment program, you take time in the classroom to share information about testing concepts, decisions and scoring at an age-appropriate level; doing so can ultimately improve language learning.  In contemplating the practicality, reliability, validity and impact of the assessments they undertake, foreign language students can feel engaged in the testing process and encouraged to connect their results to their learning.  Lessons on assessment are particularly fitting for the 21st century student, who may thrive on the added feedback, critical thinking and active learning involved in understanding the testing process.  Finally, opening students’ eyes to the world of testing will allow them to see language assessment not just as a standard to get through, but also as a professional field worth exploring.


Hughes, A. (2003). Testing for language teachers (2nd ed.). Cambridge: Cambridge University Press.

March 2009

Testing Tips

As I searched for a topic for this month’s Testing Tips, my mind was too full of the conference I’m co-chairing in Denver this month to come up with any ideas. Which conference, you ask? I’m co-chairing the Language Testing Research Colloquium in Denver from March 16-20. It occurred to me that perhaps some of you have never heard of this conference and might wonder:

  • What is this conference all about?
  • Who goes to a conference devoted to the issue of language testing?
  • What do you talk about there?
  • How does this impact me as a teacher?

What is the Language Testing Research Colloquium?

According to the conference website (www.cal.org/ltrc2009), the Language Testing Research Colloquium (LTRC) is the annual conference of the International Language Testing Association (ILTA). ILTA is an international group of scholars whose research and dedication to the field of language testing are respected both within and outside the profession.

Who goes to a conference devoted to language testing?

All kinds of people go to the conference! Psychometricians, language testers, language instructors, and linguists all come to LTRC. They are employed as university professors, researchers and test developers at for-profit and nonprofit organizations. Some are graduate students, others are employed and many are both pursuing a degree and working. This year, we expect about 200 participants. While we work in different areas--language teaching, research, test development, evaluation and measurement--we all care about language testing and how it affects different participants in the test-taking process: students, teachers, administrators, educational agencies, policy makers, test developers and researchers. Participants come from all over the world, which makes discussion even more interesting. After all, how many of us have ever thought about how Australian language policy might be different from and yet inform US policies for NCLB? Well, at LTRC 2007, we talked about this!

What do we talk about?

During sessions, we talk about everything from research design to reliability to validity to the social and political impacts of testing. Unlike many conferences, all sessions are plenary. This means there are no break-out or concurrent sessions, so all participants hear the same papers and comments. There are traditional papers, symposia, poster sessions and works in progress. These different formats allow presenters to gain insight into their work from colleagues around the world. After sessions, we sometimes leave the world of measurement to discuss other issues, like politics, food and music.

How does this impact me, as a teacher?

The work presented, discussed and debated at LTRC reflects the many realities and challenges of language learning and teaching throughout the world. Because the participants come from many different backgrounds, new ideas are shared and later implemented. As an applied linguist and test developer, I often hear about new approaches for obtaining information from stakeholders about test impact. In turn, I often try out new approaches for obtaining data to improve test development. For example, when we were working on the Foreign Language Tutorial, we conducted focus groups with stakeholders to help us figure out how to improve the site and make it more useful. Similarly, we obtain feedback on all the tests we develop so that we have confidence that they best serve the populations they are designed for.

Next month: Meg’s round-up of LTRC!



Introducing the Foreign Language Assessment Directory and companion tutorial, Understanding Assessment

This month’s Testing Tips focuses on a new, free (!) resource for language instructors. The Center for Applied Linguistics (CAL) is pleased to launch the Foreign Language Assessment Directory (FLAD) and its companion tutorial, Understanding Assessment. We developed these two free, online resources with a grant from the U.S. Department of Education (#017A050033). They represent the culmination of collaboration between testing specialists at CAL and language educators around the country. The goal of these two resources is to provide foreign language instructors and administrators with free, online access to information on hundreds of available language tests as well as the tools and concepts needed to select an appropriate assessment.

Current trends in language education place increased emphasis on assessment and evaluation. Language instructors and administrators need resources for locating and selecting tests in an informed manner to help ensure that students are assessed with valid and reliable instruments. Beginning in 2005, we conducted background research on which types of resources would be most useful for educators. We held focus groups with foreign language instructors and administrators on how to adapt existing resources and create new ones to meet the needs of language professionals. We conducted focus groups throughout the research and development process, thereby allowing language educators to shape both the content and form of the final products.

Focus group research showed that two existing resources could be updated to be more useful and accessible to educators: the Foreign Language Test Database (FLTD) and the Directory of K-12 Foreign Language Assessment Instruments and Resources (Directory). The FLTD was developed in the 1990s in collaboration with the National Capital Language Resource Center (NCLRC) and included information on foreign language assessments for secondary and post-secondary students. The Directory was developed in the late 1990s in collaboration with the National Foreign Language Resource Center (NFLRC) at Iowa State and consisted of information on foreign language assessments then being used in elementary, middle and secondary schools around the country. Our original work with the NCLRC and the NFLRC created two separate directories that were both available on CAL’s website, but the current project sought to unify the two, updating information, eliminating redundancies and expanding test information. In doing so, we created a searchable directory of nearly 200 tests in over 90 languages other than English.

Users can search the directory in different ways, including looking for tests by language, grade level, proficiency level, intended test use, skill tested and even by test name. The tests included in the directory are not endorsed by CAL, but represent the many options available for foreign language educators.

In addition to improving the FLAD’s format and content, focus group research also led to the creation of the interactive, moderated user review feature now available with the directory. The moderated user review increases the FLAD’s interactivity by allowing users to review a test that they have used, sharing their experience and providing advice to other users. Thus, the completed directory helps meet the needs of educators who must select assessments for their foreign language students, classes and programs.

Not only do educators need to know which tests are available, they also need to know how to select reliable tests that are valid for their purposes, practical to implement, and likely to have a positive impact on their students. This knowledge, often called language assessment literacy, describes what language educators need to know about assessment (Boyles, 2005; Inbar-Lourie, 2008; Stiggins, 1997; Stoynoff and Chapelle, 2005). Understanding Assessment, the companion tutorial to the FLAD, addresses this need, providing users with an introduction to key concepts in assessment literacy. Users work through four modules devoted to these key concepts, each of which contains a real-life testing scenario to which they can apply the knowledge they have gained from the tutorial. At the end of the tutorial, users arrive at the FLAD, ready to search for an appropriate test. Because the tutorial discusses features of assessments described in each FLAD entry, the two resources work together to help users choose assessments that fit their needs.

The resources section of Understanding Assessment includes a glossary of key assessment terms, links to online and print resources on language assessment and a repository of all of the forms from the tutorial. These forms include a needs assessment, a testing resources map and a methods of testing checklist, and we hope educators will use them for their own purposes. Because the tutorial is free and available online, users can work through it at their own pace.

Together, the FLAD and Understanding Assessment comprise a useful new resource for language educators who are interested in expanding their knowledge of assessment literacy. They provide users with an introduction to key concepts in language assessment and a chance to explore their own assessment needs, as well as a directory of foreign language assessments available to them. We hope that these resources provide opportunities to make well-informed decisions on test selection which will in turn improve assessment for instructors and students.

Boyles, P. (2005). Assessment Literacy. In Rosenbusch, M. (ed.) National Assessment Summit Papers. Ames, Iowa: Iowa State University, 11-15.
Inbar-Lourie, O. (2008). Constructing a language assessment knowledge base: A focus on language assessment courses. Language Testing, 25, 385-402.
Stiggins, G. (1997). Student Centered Classroom Assessment. Upper Saddle River, NJ: Prentice Hall.
Stoynoff, S., & Chapelle, C. A. (2005). ESOL tests and testing: A resource for teachers and program administrators. Alexandria, VA: TESOL Publications.

Back to Top


Testing Tips: Announcing a Free Course on Oral Proficiency Assessment
By Margaret E. (Meg) Malone, Ph.D. - Center for Applied Linguistics

By now, many of us have settled into a routine for the school year (at least until weather and illnesses disrupt it!). Along with these routines come opportunities to assess students’ progress, as well as the challenges involved in regular assessment.

For many reasons, assessing students’ Interpersonal Communication and Presentational Speaking is among the greatest challenges instructors face. One challenge is that of time and student management; assessing Interpersonal Communication in a class of 25 students is difficult. Similarly, knowing what to listen for and grade students on while keeping the remaining 24 students engaged in meaningful tasks can be exhausting. Many of us have little or no background in assessment and even less in assessing speaking.

To help give language instructors a background in a nationally used rating scale, the National Capital Language Resource Center, through the Center for Applied Linguistics, is offering a one-month, five module course on oral proficiency assessment. The course, Assessment Training Online (ATOL), will be available November 1-30, 2008 and is offered free to the first 20 participants who sign up. The course includes the following five modules:

Module 1 introduces the technology used throughout the course.

Module 2 provides an overview of oral proficiency testing.

Module 3 focuses on recognizing and describing the four major ACTFL levels.

Module 4 explores the structure of oral proficiency testing.

Module 5 reviews the nature of rating and holistic assessment.

You should expect to spend 3-5 hours per week on each module, including reading the material, listening to speech samples, participating in the weekly live chat and completing assignments. You must have a high-speed internet connection to participate.

Our hope in offering this course is to remove the mystery of oral proficiency testing and provide a solid background to language teachers.

Assessment Training Online (ATOL)

Purpose: The course provides language instructors with skills in oral proficiency assessment.

Dates: November 1 - 30, 2008

Cost: FREE to the first 20 participants

How to participate: Email Meg Malone. In the email, please include:

  1. Your name;
  2. Where you teach;
  3. The language(s) you teach; and
  4. One sentence about why you want to participate in the course.

Back to Top


Testing Tips Summer 2008
By Margaret E. (Meg) Malone, Ph.D. - Center for Applied Linguistics

Welcome to the latest issue of Testing Tips! In addition to other projects at CAL, I have spent much of the past couple of months conducting face-to-face and online workshops on assessments for language instructors. One issue that often emerges is what test (or tests) to use and whether one test can be used for every need you have. The more language teachers I meet, the more amazed I am by how much teachers accomplish every day, and how scarce the assessment resources are to help these teachers. As we’ve discussed before, it is important to align your purpose for testing with the test you choose. We’ve also talked about how to align the audience for testing with the test you choose. For example, if Mr. Enriquez wants to find out how well his students listen and read in Spanish, he should administer a test of Spanish listening and reading. But that’s not enough; we also need to make sure that the test is appropriate for his students: their age level, how much Spanish they have learned, and so on.

I receive calls from teachers like the mythical Mr. Enriquez every day. I often refer such calls to two major online resources. One resource is the Center for Advanced Research on Language Acquisition's Virtual Assessment Center (VAC). The VAC includes a series of modules that give you background, guidance, and hands-on practice for developing your own test or thinking through how to select one.
http://www.carla.umn.edu/assessment/VAC/index.html

The second resource is the Foreign Language Assessment Directory (FLAD), a free, searchable directory of language tests. It provides information on over 150 language tests in over 60 languages other than English.
http://www.cal.org/CALWebDB/FLAD/

I recommend that you look at both resources. Then go to the FLAD and find a test that you think you could use with your students, and ask yourself the following questions about it:

For each question, circle one response (Yes or No) and add any comments:

• Does the purpose of the test match my own purpose? Yes / No
• Is the test appropriate for students of the same age and background as my students? Yes / No
• Do I have the resources (time, equipment, materials, staff) to administer this test? Yes / No
• Do I have the resources (time, equipment, materials, staff) to score the results of this test? Yes / No
• Does this test measure the skills and knowledge that I want to measure? Yes / No
• Can I report the results of this test to my students, administration, and other stakeholders (such as parents)? Yes / No

Back to top


Testing Tips: Testing needs

Last month, I wrote about audiences for testing and focused on students and the testing process. This month, we're going to think about a larger range of audiences and their needs, both in the actual testing situation and in the reporting of student outcomes. To make the most of test results, it is important to plan assessment so that the needs of all groups (students, instructors, administrators, parents, the business community) will be met by the results. It's often easier to imagine how to include different audiences in the testing process if the purpose and context of a test are clearly defined. Here's an example: Ms. Parks, a middle school teacher, is testing her students. The purpose of the assessment is an end-of-year test in a sixth grade Spanish class to find out how well students have learned the content of the course. The context is one middle school in a mid-sized district that has been teaching a pilot Spanish course to sixth graders instead of waiting to begin language instruction in seventh grade.

Questions Ms. Parks may want to consider:
• Who might want information about the results of the test?
• How will the test outcomes potentially impact these groups?
Students, parents, instructors, and administrators are all connected to one another. To clarify the issue, we can list all the audiences for the test and its results, and describe what each needs from both the assessment process itself and the reporting of results.

Instructor

Assessment needs:
• Short assessment (one period or less)
• Can be conducted with multiple classes
• Limited computer access in class (2 computers)
• Limited time to administer and score tests because students go on a class trip for a week prior to the end of school

Information needs:
• Are sixth grade students ready for the next level of language?
• Did the students learn what I taught them?
• What will the results tell my boss about how well I did?
• How can I explain this information to my students and their parents?

Students

Assessment needs:
• Short assessment
• Hard to study the week before the sixth grade class trip
• School air conditioning is iffy, and sometimes it's hard to focus in class

Information needs:
• What's my grade?
• Can I take the next level of the language?

Parents

Assessment needs:
• Must make sure that kids get to school on the morning of the assessment

Information needs:
• Was this experiment worthwhile?
• Will my rising seventh grader be able to continue learning this language?

Administrator (Language Chair)

Assessment needs:
• Wants to select a valid and reliable test
• Wants to observe students taking tests
• Limited budget

Information needs:
• Did I make the right decision?
• Will the results support my request for additional faculty next year?

Administrator (Principal)

Assessment needs:
• Limited budget

Information needs:
• Was the experiment a success?
• How will we schedule all these students into classes next year?
• Are any of the language teachers dually certified so they can fill other needs in the school?

Administrator (Superintendent)

Assessment needs:
• Limited budget

Information needs:
• Why did we do this?
• Are the taxpayers happy?
• If the program is successful, should we spread it to the other middle schools?

Local Community/Business

Information needs:
• Will these kids be able to speak this language well enough to communicate with customers when they are old enough to work?

If you want more information on this topic, I have listed a few good references below for thinking about assessment and stakeholders. Now that we've discussed all the stakeholders' needs, do you think there is one test that will meet the purpose and address everyone's needs? Stay tuned...

Annotated References:
Stoynoff, S., & Chapelle, C. (2005). ESOL tests and testing. Alexandria, VA: TESOL.
A short, accessible book that introduces basic concepts of language testing and reviews 20 English language tests

Bachman, L. & Palmer, A. (1996). Designing and developing useful language tests. Oxford: Oxford University Press.
Book with in-depth information on measurement, language test uses and methods, reliability, and validity

Brown, H.D. (2003). Language assessment: principles and classroom practice. New York: Pearson ESL.
A practical guide to developing your own classroom assessments

Hughes, A. (2003). Testing for language teachers (2nd edition). Cambridge: Cambridge University Press.
A book which provides a thorough but accessible overview of foundational concepts in language testing

McTighe, J. & Wiggins, G. (2005). Understanding by design (2nd ed). Alexandria, VA: Association for Supervision and Curriculum Development.
Handbook which explains the principles of backward design for classroom assessment

Back to top


Audiences for Testing

Welcome to the latest issue of Testing Tips! I've met many of our newsletter readers face-to-face during the whirl of conferences that fills late winter and early spring. Meeting so many of you and discussing testing with colleagues has made me think a great deal about audiences for testing.

I promised to write about audiences for testing this month. The audience includes the students you're testing, in addition to all those who are impacted by the results: parents, you, other teachers, administrators, the community, and more. This month, I will focus on the student audience, and next month, we'll talk about everyone else.

WHOM am I testing? That may seem very straightforward, because we're usually testing our students. However, it's important to think about all of our students and all of their needs. Here are the questions we should ask about our students.

Number of students in a class. This matters for you and for them. It's important to design and select tests for the right number of students. With some of our oral proficiency tests, we find that students have difficulty concentrating if they are sitting too close to another student who is also responding aloud.

Student developmental level. In mid-March, I was trying out test items with students in grades 1-11. It amazed me, as it always does, to see how differently students understand what they are supposed to do on a test based on their age, developmental level, and experience. Developmental growth, especially in reading, matters most for directions. Sometimes students can't (or don't!) read the directions, so they respond incorrectly. Then we don't know whether the student didn't know the material or just didn't understand the directions. This is especially important if you are working with a new group of students or if you are using a standardized test. Make sure that students understand what they are supposed to do so that they can show you how much they have learned.

Experience. What kind of experience do students have with testing? If we're presenting them with the first oral proficiency test of their lives, we need to provide support and practice so that they can show what they can do. If they are taking a computer-based test, we need to make sure they know how to use the computer.

Special needs. If our students have special needs, we need to consider and accommodate them in our tests as well as our classes. This is a lot to think about. Next month, we'll talk about how to balance students' needs with your needs, as well as our other audiences: parents, other teachers, administrators, the community, the school board...

Back to top


The Process of Making a Test

As promised, this month's Testing Tips discusses the process of making a test. Teachers all over the world create tests, grade them, record the results, return the results to students and field questions from students, parents and administrators about these results. Organizations and assessment professionals also design tests. A great deal of time goes into test development, administration, scoring, recording, reporting and revising the tests.

So, what do you do first? This column deals with the first step in selecting or creating a test, which is to define your purposes for testing. Next, you can explore the type of test that might work for your situation.

Define your purpose. Before any person, group or organization makes a test, the first step is to define the purpose for testing. The first question is basic: why are you testing your students? Classroom teachers have a number of reasons for conducting assessments, and they may use test results to assign grades, place students in a course, reflect on their own teaching or make other decisions about their students and programs. Here are some example responses, based on emails I receive from language teachers:

  1. I want to know if students have learned what I taught;
  2. My program needs to place students from all over the country in the language program;
  3. I'm hiring someone, and I need to make sure they can communicate in the language; and
  4. I need to show the taxpayers that my program is working.

Before we talk about some basic types of tests, reflect on whether we should use the same kind of test for (1) as for (3). Would we use the same test for (2) as (1)? The answer: it depends.

Types of tests. Once you’ve defined why you are testing your students, you can begin to think about the type of test that might match your needs. There are many types of tests, and I won't describe them all (although the short bibliography refers to colleagues who have defined tests much better than I ever could).

  • Do you want to know how much students have learned (or to what extent they have internalized the wisdom of their teacher) in a day, week, chapter, unit or course? An achievement test provides information about how much students have learned and is directly related to the content of a course or program. Many classroom tests are achievement tests. #1 above is an example of an achievement test.

  • Do you want to know how students can perform a function in a close-to-real-life setting, such as (my personal favorite) ordering a meal or buying a ticket in the target language? Or, how about the example of interpreting for a doctor, nurse or other medical personnel to make sure that the patient understands her diagnosis? A performance test requires examinees to demonstrate knowledge or a skill through an activity or performance.

  • Do you want to know which level of language class is right for students entering one institution from various feeder institutions? Placement tests are used to make decisions about student placement in a course or program. #2 above is an example of a placement test; another is placing students from a number of middle schools into high school language classes.

  • Do you want to find out about students’ general ability in the target language? Proficiency tests measure general language ability. A proficiency test may be appropriate for example #4 because the results would show students’ general ability rather than achievement in a specific course.

As I mentioned in the first column, one size does not fit all in testing. It’s important to match the reason you’re testing with the type of test you select or create.

Next month, we'll discuss audiences for test results and how that may affect the testing process.

References

- Bachman, L. & Palmer, A. (1996). Designing and developing useful language tests. Oxford: Oxford University Press.
- Brown, H.D. (2003). Language assessment: principles and classroom practice. New York: Pearson ESL.
- Hughes, A. (2003). Testing for language teachers (2nd edition). Cambridge: Cambridge University Press.
- International Language Testing Association Code of Practice. http://www.iltaonline.com/code.htm
- McTighe, J. & Wiggins, G. (2005). Understanding by design (2nd ed). Alexandria, VA: Association for Supervision and Curriculum Development.
- Stoynoff, S., & Chapelle, C. (2005). ESOL tests and testing. Alexandria, VA: TESOL.

Back to top


Testing Tips
By Margaret E. (Meg) Malone, Ph.D. - Center for Applied Linguistics

Welcome to NCLRC’s newest offering "Testing Tips!" Offered every month in the NCLRC newsletter, it will provide insight into foreign language testing for teachers, administrators and researchers. I hope you will send me your questions, and I will try to address them here.

Why is the newsletter focusing on testing? Since long before the passage of No Child Left Behind, testing has been important to provide feedback to learners and teachers. Since No Child Left Behind, K-12 testing has moved from the classroom to the front page. In this column, I’ll write about good testing resources (many available from our sister Language Resource Centers), the basics of testing, and your questions. Note that I haven’t said I would answer your questions but address them, because testing is not a “one size fits all” endeavor. Instead, it should be tailored to the specific context in which it occurs. Next month, I’ll write about the testing process and what goes into making a test.

If you have a testing question for Dr. Malone, please email it to her at: info@nclrc.org

©2009 National Capital Language Resource Center

Home | Professional Development | Newsletter | Culture Club | Contact Us | Site Map