<?xml version="1.0" encoding="ISO-8859-1"?><article xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
<front>
<journal-meta>
<journal-id>1409-4703</journal-id>
<journal-title><![CDATA[Actualidades Investigativas en Educación]]></journal-title>
<abbrev-journal-title><![CDATA[Rev. Actual. Investig. Educ]]></abbrev-journal-title>
<issn>1409-4703</issn>
<publisher>
<publisher-name><![CDATA[Instituto de Investigación en Educación, Universidad de Costa Rica]]></publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id>S1409-47032014000200006</article-id>
<title-group>
<article-title xml:lang="en"><![CDATA[The testing of listening in bilingual secondary schools of Costa Rica: bridging gaps between theory and practice]]></article-title>
<article-title xml:lang="es"><![CDATA[Evaluación de la destreza auditiva en los colegios bilingües de Costa Rica: acortando brechas entre teoría y práctica]]></article-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Gamboa Mena]]></surname>
<given-names><![CDATA[Roy]]></given-names>
</name>
<xref ref-type="aff" rid="A01"/>
</contrib>
<contrib contrib-type="author">
<name>
<surname><![CDATA[Sevilla Morales]]></surname>
<given-names><![CDATA[Henry]]></given-names>
</name>
<xref ref-type="aff" rid="A02"/>
</contrib>
</contrib-group>
<aff id="A01">
<institution><![CDATA[Universidad de Costa Rica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<aff id="A02">
<institution><![CDATA[Universidad de Costa Rica]]></institution>
<addr-line><![CDATA[ ]]></addr-line>
</aff>
<pub-date pub-type="pub">
<day>00</day>
<month>08</month>
<year>2014</year>
</pub-date>
<pub-date pub-type="epub">
<day>00</day>
<month>08</month>
<year>2014</year>
</pub-date>
<volume>14</volume>
<numero>2</numero>
<fpage>155</fpage>
<lpage>179</lpage>
<copyright-statement/>
<copyright-year/>
<self-uri xlink:href="http://www.scielo.sa.cr/scielo.php?script=sci_arttext&amp;pid=S1409-47032014000200006&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.sa.cr/scielo.php?script=sci_abstract&amp;pid=S1409-47032014000200006&amp;lng=en&amp;nrm=iso"></self-uri><self-uri xlink:href="http://www.scielo.sa.cr/scielo.php?script=sci_pdf&amp;pid=S1409-47032014000200006&amp;lng=en&amp;nrm=iso"></self-uri><abstract abstract-type="short" xml:lang="en"><p><![CDATA[Despite listening being one of the most crucial skills in the process of communication, research shows that it has been neglected in most English as a Foreign Language (EFL) programs, both worldwide and in Costa Rica. Worse yet, mismatches between theory and practice often result in poor listening assessment in many institutions. Thus, this article examined current listening testing practices by Ministry of Public Education (in Spanish, MEP) in-service teachers ranked C1 according to the Common European Framework (CEF) in bilingual secondary schools of the West Area of Costa Rica. Listening tests created and administered by those teachers were analyzed for their compliance with both theory on listening assessment and MEP’s guidelines on assessment. The study revealed that even though teachers had previously received training on testing, the tests they created do not fully comply with either MEP’s guidelines or theoretical principles on listening assessment. Findings expand conclusions drawn by Gamboa and Sevilla (2013) in previous research on listening assessment and provide several contributions to the current body of literature on listening testing practices in Costa Rica. Such conclusions also reveal areas of listening assessment that need to be further tackled through teacher training.]]></p></abstract>
<abstract abstract-type="short" xml:lang="es"><p><![CDATA[Pese a su importancia en el proceso de la comunicación, la habilidad auditiva ha sido ignorada en muchos programas de inglés como lengua extranjera (en inglés, EFL). Peor aún, discrepancias entre teoría y práctica conllevan a un inadecuado proceso de evaluación de esta habilidad lingüística. Ante ello, el presente estudio examina las prácticas en la evaluación auditiva de docentes de inglés en servicio del Ministerio de Educación Pública (MEP) ubicados en la banda C1 según el Marco Común Europeo (en inglés, CEF) en colegios bilingües de la Región de Occidente de Costa Rica. Se analizaron pruebas de escucha diseñadas por el personal docente participante en la investigación en términos de su cumplimiento con la teoría sobre evaluación auditiva y los lineamientos de evaluación del MEP. El estudio reveló que, a pesar de que los docentes habían recibido capacitación en materia de evaluación auditiva, las pruebas que ellos diseñaron no cumplen a cabalidad con los lineamientos de evaluación del MEP ni con los principios teóricos en evaluación auditiva. Los resultados ampliaron conclusiones postuladas por Gamboa y Sevilla (2013) en estudios anteriores sobre el tema y vigorizan el estado de la cuestión sobre la evaluación del componente auditivo en Costa Rica. Dichos resultados también dilucidaron áreas de la evaluación de las destrezas auditivas que aún deben trabajarse mediante la capacitación de docentes.]]></p></abstract>
<kwd-group>
<kwd lng="en"><![CDATA[Listening Skills]]></kwd>
<kwd lng="en"><![CDATA[English]]></kwd>
<kwd lng="en"><![CDATA[Teaching English]]></kwd>
<kwd lng="en"><![CDATA[Teachers]]></kwd>
<kwd lng="en"><![CDATA[Bilingual High Schools]]></kwd>
<kwd lng="en"><![CDATA[West Area]]></kwd>
<kwd lng="en"><![CDATA[Ministry Of Public Education]]></kwd>
<kwd lng="en"><![CDATA[Costa Rica]]></kwd>
<kwd lng="es"><![CDATA[Destreza Auditiva]]></kwd>
<kwd lng="es"><![CDATA[Inglés]]></kwd>
<kwd lng="es"><![CDATA[Enseñanza del Inglés]]></kwd>
<kwd lng="es"><![CDATA[Profesorado]]></kwd>
<kwd lng="es"><![CDATA[Colegios Bilingües]]></kwd>
<kwd lng="es"><![CDATA[Región de Occidente]]></kwd>
<kwd lng="es"><![CDATA[Ministerio de Educación Pública]]></kwd>
<kwd lng="es"><![CDATA[Costa Rica]]></kwd>
</kwd-group>
</article-meta>
</front><body><![CDATA[ <div style="text-align: justify;">     <div style="text-align: center;"><font  style="font-family: Verdana; font-weight: bold;" size="4">The testing of listening in bilingual secondary schools of Costa Rica: bridging gaps between theory and practice</font>    <br> </div> <font style="font-family: Verdana;" size="2"></font>    <br>     <div style="text-align: center;"><font  style="font-family: Verdana; font-weight: bold;" size="4">Evaluaci&oacute;n de la destreza auditiva en los colegios bilinng&uuml;es de Costa Rica: acortando brechas entre toer&iacute;a y pr&aacute;ctica</font><font  style="font-family: Verdana; font-weight: bold;" size="3"> </font>    <br> </div> <font style="font-family: Verdana;" size="2"></font>    <br>     <div style="text-align: center;"><font style="font-family: Verdana;"  size="2">Roy Gamboa Mena<sup><a href="#1">1</a><a name="3"></a>*</sup></font><font  style="font-family: Verdana;" size="2"> Henry Sevilla Morales<sup><a  href="#2">2</a><a name="4"></a>*</sup></font>    <br> </div> <font style="font-family: Verdana;" size="2"></font>    <br> <small><span style="font-family: Verdana;"><a name="Correspondencia2"></a>*<a  href="#Correspondencia1">Direcci&oacute;n para correspondencia</a></span></small><a  href="#Correspondencia1">:</a>    ]]></body>
<body><![CDATA[<br> <hr style="width: 100%; height: 2px;"><font  style="font-family: Verdana;" size="2"></font><font  style="font-family: Verdana; font-weight: bold;" size="3">Abstract</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Despite listening being one of the most crucial skills in the process of communication, research shows that it has been neglected in most English as a Foreign Language (EFL) programs, both worldwide and in Costa Rica. Worse yet, mismatches between theory and practice often result in poor listening assessment in many institutions. Thus, this article examined current listening testing practices by Ministry of Public Education (in Spanish, MEP) in-service teachers ranked C1 according to the Common European Framework (CEF) in bilingual secondary schools of the West Area of Costa Rica. Listening tests created and administered by those teachers were analyzed for their compliance with both theory on listening assessment and MEP&#8217;s guidelines on assessment. The study revealed that even though teachers had previously received training on testing, the tests they created do not fully comply with either MEP&#8217;s guidelines or theoretical principles on listening assessment. Findings expand conclusions drawn by Gamboa and Sevilla (2013) in previous research on listening assessment and provide several contributions to the current body of literature on listening testing practices in Costa Rica. 
Such conclusions also reveal areas of listening assessment that need to be further tackled through teacher training.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2"><span  style="font-weight: bold;">Keywords</span>: Listening Skills, English, Teaching English, Teachers, Bilingual High Schools, West Area, Ministry Of Public Education, Costa Rica</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">Resumen</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Pese a su importancia en el proceso de la comunicaci&oacute;n, la habilidad auditiva ha sido ignorada en muchos programas de ingl&eacute;s como lengua extranjera (en ingl&eacute;s, EFL). Peor a&uacute;n, discrepancias entre teor&iacute;a y pr&aacute;ctica conllevan a un inadecuado proceso de evaluaci&oacute;n de esta habilidad ling&uuml;&iacute;stica. Ante ello, el presente estudio examina las pr&aacute;cticas en la evaluaci&oacute;n auditiva de docentes de ingl&eacute;s en servicio del Ministerio de Educaci&oacute;n P&uacute;blica (MEP) ubicados en la banda C1 seg&uacute;n el Marco Com&uacute;n Europeo (en ingl&eacute;s, CEF) en colegios biling&uuml;es de la Regi&oacute;n de Occidente de Costa Rica. Se analizaron pruebas de escucha dise&ntilde;adas por el personal docente participante en la investigaci&oacute;n en t&eacute;rminos de su cumplimiento con la teor&iacute;a sobre evaluaci&oacute;n auditiva y los lineamientos de evaluaci&oacute;n del MEP. 
El estudio revel&oacute; que, a pesar de que los docentes hab&iacute;an recibido capacitaci&oacute;n en materia de evaluaci&oacute;n auditiva, las pruebas que ellos dise&ntilde;aron no cumplen a cabalidad con los lineamientos de evaluaci&oacute;n del MEP ni con los principios te&oacute;ricos en evaluaci&oacute;n auditiva. Los resultados ampliaron conclusiones postuladas por Gamboa y Sevilla (2013) en estudios anteriores sobre el tema y vigorizan el estado de la cuesti&oacute;n sobre la evaluaci&oacute;n del componente auditivo en Costa Rica. Dichos resultados tambi&eacute;n dilucidaron &aacute;reas de la evaluaci&oacute;n de las destrezas auditivas que a&uacute;n deben trabajarse mediante la capacitaci&oacute;n de docentes.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"><span  style="font-weight: bold;">Palabras clave</span>: Destreza Auditiva, Ingl&eacute;s, Ense&ntilde;anza del Ingl&eacute;s, Profesorado, Colegios Biling&uuml;es, Regi&oacute;n de Occidente, Ministerio de Educaci&oacute;n P&uacute;blica, Costa Rica</font>    <br> <hr style="width: 100%; height: 2px;"><font  style="font-family: Verdana;" size="2"></font><font  style="font-family: Verdana; font-weight: bold;" size="3">1. Introduction</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">For many decades, teachers and researchers believed that listening was a passive skill because it could not be observed (Alderson and Bachman, 2001, in Buck, 2001); but for many decades, they were wrong. Such beliefs can be traced back to the audiolingual times, when listening was supposed to be assessed through the discrete point approach, where learners were assessed for their capacity to isolate language rather than to grasp its meaning in context (Coombe, Folse, and Hubley, 2007, p. 91). Today, however, second language researchers and academics agree that it is a very active skill, for &#8220;students receive, construct meaning from, and respond to spoken messages&#8221; (Coombe et al., 2007, p. 90). They also believe that, besides comprehending the language, learners should be able to take that input and bring it to use in real-life contexts. Unfortunately, these principles are not always brought to practice in L2 teaching; and to worsen the scenario further, gaps exist between listening assessment theory and practice, as well as between &#8220;listening research and practice&#8221; (Osada, 2004, p. 57).</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In the context of the Costa Rican public education system, research suggests serious mismatches between theory on language assessment, the assessment guidelines teachers are required to follow, and what they in fact do in their classrooms (Gamboa and Sevilla, 2013a, p. 24). Gamboa and Sevilla (2013) believe that the issue stems, in part, from the fact that concrete guidelines have not yet been provided by MEP; the only document available covers procedures for test design in general, not for language testing. Arguably, it is not surprising to find teaching scenarios where the testing of listening is conducted poorly and, more than that, often in an instinctive fashion. As a result, teachers face a number of limitations that range from not knowing the format type to follow in the construction of their tests to more serious issues such as poor content validity of their examinations. These limitations directly affect their teaching as a whole and, more critically, the learning process. If a test measures what it is not supposed to, for example, then learners are likely to perform poorly on the test. Thus, the study herein arises from the need to expand prior research on listening assessment conducted in Costa Rica.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Arguably, in an age of multilingualism, where efforts are being made to put Costa Rica at the forefront of international communication, and where English skills are paramount in reaching such a goal, &#8220;research on listening assessment proves not only relevant but also crucial as a way to provide insights on how to conduct better teaching in the context of English as a Foreign Language&#8221; (Gamboa and Sevilla, 2013b, p. 187). 
In this manner, this study will enrich the existing body of literature on the testing of listening and will, in turn, help teachers and academics confront the challenges L2 listening testing has brought about in recent decades.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In light of the importance of the topic, the question that emerges is: To what extent is training on listening assessment that includes both theory and MEP&#8217;s testing guidelines correlative to effective testing practices in bilingual secondary schools of the West Area of Costa Rica? In order to respond to that question, the goal of this paper is to study the extent to which training on listening assessment has an effect on the actual testing practices of teachers ranked C1 in bilingual secondary schools of the West Area of Costa Rica. To that end, teacher-created tests were analyzed quantitatively for their compliance with both theory on listening assessment and MEP&#8217;s guidelines on general assessment. The resulting data underwent a process of triangulation with qualitative annotations made to each of the twelve categories evaluated in the tests. Finally, the results of the test analysis were contrasted with those of previous studies on listening assessment practices in the West Area of Costa Rica.</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">2. Literature Review</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">2.1 A Brief History of Listening Assessment</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">The history of listening assessment can be traced back to as early as the mid-twentieth century, when experts devised two approaches to language teaching: the audiolingual and the communicative approaches (Gamboa and Sevilla, 2013b, p. 187). From the audiolingual times to the advent of the communicative approaches to language teaching (from the 50s to the 80s), three approaches to listening assessment were developed: the discrete point approach, the integrative approach, and the communicative approach. According to Coombe et al. (2007), the first departed from the notion that in order to be able to measure a learner&#8217;s mastery of the language, it was necessary to break it down into small segments or units (p. 91). Thus, typical exercises in a listening test would include phonemic discrimination or paraphrase recognition, and the learners were not expected to understand language in discourse. The second was based on the idea that &#8220;the whole of language was seen as being better than the sum of its parts&#8221; (as cited in Gamboa and Sevilla, 2013b, p. 187), and that learners had to be able to &#8220;use many bits [of language] at the same time&#8221; (Oller, 1979, p. 7). </font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Typical exercises using this approach included dictation and cloze. 
The last approach to listening assessment is found within the domains of the communicative approach, whose rationale was that language had to be comprehended in discourse and then used in a contextualized fashion (Oller, 1979, p. 7). Suggested exercises for assessment using this approach include, among many others, communicative question formats that are authentic in nature. As suggested by current theory, testing practices should be oriented by this last approach.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">2.2 Methods for Assessing Listening Comprehension</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In an attempt to find effective ways to assess listening, researchers have suggested a number of approaches. In this respect, Nunan (2002, p. 176) has suggested two: bottom-up processing and top-down processing. In bottom-up processing, comprehension occurs when the listener successfully decodes the spoken text. Thus, sounds can range from the smallest meaningful units to complete texts, and comprehension occurs when students take in a word, decode it, and link it with other words to form sentences and, ultimately, meaningful texts. In top-down listening, the listener is directly involved with constructing meaning from input. The student uses background knowledge of the context and situations to make sense of what is heard. According to Nunan (2002, p. 176), the two modes are important and must therefore be taught in class and later assessed.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">2.3 Theoretical and Practical Gaps</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Despite the bulk of emerging literature on listening comprehension and, in particular, on listening assessment, research concludes that much still remains to be done to bridge the gaps between theory and practice. Vandergrift (1997), for instance, referred to listening as being &#8220;the Cinderella of communication strategies&#8221; (in Gamboa and Sevilla, 2013a, p. 22). He has asserted that listening has been a neglected component in many EFL programs, and that answers must therefore be sought as to how to conduct better listening teaching practices. 
Along the same lines, it has been stated that listening in language teaching and learning is for the most part undermined (Brown, 1987, as cited in Osada, 2004). Osada (2004) has gone on to add that, despite recent awareness of its importance, listening &#8220;remains a somewhat neglected and poorly taught aspect of English&#8230;&#8221; in many ESL and EFL programs (p. 57). It follows then that, in light of Costa Rica&#8217;s goal for multilingualism, these issues deserve more attention than we generally realize. In this respect, Presidencia de la Rep&uacute;blica (2007) has stated that Costa Rica&#8217;s main goal regarding multilingualism is to provide the country with a population whose communicative competences enable its overall progress so that individuals have access to better education and employment.</font>    <br>     <br> <font style="font-family: Verdana;" size="2">Arguably, this ambitious goal poses challenges that need to be met through high-quality education and willpower, which have not yet been fully addressed in the context of the Costa Rican public education system. Moreover, when it comes to the assessment of listening skills in the program for III cycle in bilingual secondary schools, Presidencia de la Rep&uacute;blica (2007) has dictated six &#8220;principles for assessing listening&#8221; (p. 25), but, as argued previously by Gamboa and Sevilla (2013a), these principles do not concretely orient teachers in designing their listening tests.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">2.4 Recent Studies within the Context of Costa Rica&#8217;s Public Education System</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">It is evident that little research has been done in Costa Rica regarding current in-service MEP teachers&#8217; listening assessment practices. Recently, however, there has been an increased interest in conducting research within this area. Perhaps the most recent studies in this line have been conducted by Gamboa and Sevilla (2013). Their two investigations, <span style="font-style: italic;">Assessment of Listening Skills in Public Schools of Costa Rica: The West and Central Pacific Case</span> (2013a) and <span  style="font-style: italic;">The Impact of Teacher Training on the Assessment of Listening Skills</span> (2013b), have shed some light on what is being done in terms of assessment in the areas of San Ram&oacute;n, Palmares, Alfaro Ruiz, Valverde Vega, Esparza, Puntarenas, Barranca, and other parts of the West and Central Pacific regions of Costa Rica.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In the first study, the authors compared current listening assessment practices and beliefs of MEP teachers ranked B1 according to the CEF by analyzing listening tests created by them for their compliance with both MEP&#8217;s assessment guidelines and current theory on the assessment of listening skills. The researchers found, among other results, that mismatches exist &#8220;between what teachers think they do in terms of assessment and what their actual practices are&#8221; (Gamboa &amp; Sevilla, 2013a, p. 24) for the two areas under study. 
They also concluded that &#8220;further training on the application of assessment principles is needed so that [the gap] is closed between the teachers&#8217; beliefs and their current practices in terms of creation and administration of listening assessment&#8221; (Gamboa &amp; Sevilla, 2013a, p. 25). The results suggest that, as agreed by Vandergrift (1997) and by Osada (2004), listening is still being undermined and that more needs to be done to rescue the &#8220;Cinderella&#8221; of language teaching and learning.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In the second study, Gamboa and Sevilla (2013b) analyzed the impact that teacher training has on MEP teachers&#8217; listening assessment practices. To this end, they offered fifteen teachers ranked B1 from the West Area an eight-hour workshop where listening assessment theory and MEP&#8217;s assessment guidelines were studied. At the end of the workshop, these teachers created listening tests by applying the assessment principles discussed in the workshop. The researchers later analyzed the tests for compliance with both MEP&#8217;s guidelines on assessment and theory on listening assessment via an adaptation of the checklist used in their first study (i.e., Assessment of Listening &#8230;), and the results were later compared with those obtained by analyzing tests of a control group (which did not receive the training). In general, the researchers have concluded that &#8220;better listening test-design practices could be achieved by simply providing teachers with some training on listening assessment&#8221; (Gamboa &amp; Sevilla, 2013b, p. 196). 
Nonetheless, they have also concluded that there are areas that need improvement such as &#8220;the writing of general instructions, specific instructions, the inclusion of general test objectives, and the improvement of listening test techniques&#8221; (Gamboa &amp; Sevilla, 2013b, p. 196).</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">2.5 Gaps in Recent Research on Listening Assessment in the Context of Costa Rica</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Despite these recent efforts to elucidate listening assessment practices in Costa Rica&#8217;s public education system, it is evident that a dramatic gap still needs to be filled between theory and practice. Research has not yet explored the listening practices of teachers who are ranked C1 and who have received in-service training on listening assessment. Furthermore, Gamboa and Sevilla (2013) experienced some limitations in their previous studies, which included, among others, &#8220;examining the listening passages [of the tests], deal[ing] with the time constrictions experienced in [their] study&#8221;, and conducting similar studies with populations ranked at different levels (Gamboa &amp; Sevilla, 2013b, p. 196).</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Another important gap to highlight here is the fact that although Gamboa and Sevilla&#8217;s training workshop in their last study (i.e., 2013b) proved generally successful, no tests collected from the natural teaching setting have yet been analyzed in the West Area, namely because test design at the workshop occurred in a controlled environment that did not entirely resemble the conditions in which tests are usually created for regular classroom use. This implies that those tests may have been created merely to comply with the requirements of a workshop, and that results may have been influenced by what Porte calls the &#8220;Hawthorne effect&#8221;: a condition in which participants &#8220;react in a way that is related to their pleasure at being included in a study rather than to any treatment involved&#8221; (Porte, 2010, p. 103). 
The tests collected for the present study, on the contrary, were tests previously designed and administered by C1 in-service English teachers who were not told about their participation in any study before they created them.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">This section has presented the most relevant theory that comprises the backbone of the present study. Firstly, it presented a short historical account of listening assessment. Secondly, it reviewed two models of listening proposed by Nunan (2002), highlighting the need for more research on listening assessment. Lastly, it outlined two of the most recent contributions to the field of listening assessment in the context of national education in Costa Rica. Hence, the pages that follow will deal with the methodology and the procedures that support the development of the research project herein described.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">3. Methodology</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">3.1 The Research Design</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">The present study used a mixed paradigm, namely theoretical principles of both quantitative and qualitative research designs as proposed by Roberto Hern&aacute;ndez Sampieri (1991, p. 755). It presents features of the qualitative method because it involves &#8220;the collection, analysis and interpretation of comprehensive and [&#8230;] visual data to gain insights into&#8221; (Gay, Mills, &amp; Airasian, 2009, p. 7) the listening assessment practices of a group of teachers from the West Area of Costa Rica; also, because &#8220;the natural setting is the data source&#8221; (Tuckman, 1988, p. 388) in this study; or, as Gay, Mills, &amp; Airasian propose, in qualitative research &#8220;as much as possible, data collection occurs in a naturalistic setting&#8221; (p. 7). Our study also uses tools particular to the quantitative method to convey findings and results. </font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">According to Hern&aacute;ndez (1991), a mixed approach is one which combines tenets of both quantitative and qualitative approaches to conduct research. He explains that, in the past, researchers (mainly fundamentalists) believed that these two approaches were incompatible and that, therefore, research had to be conducted following one explicit approach. 
However, as he continues to explain, tendencies have changed over the past decades and, today, people see the value in using a mixed approach.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">3.2 Participants</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Participants in this study were fourteen MEP teachers who work in Costa Rican bilingual secondary schools of the West Area, are ranked C1 according to the CEF, and participated in a 15-week course on language assessment which included the study of listening assessment. This was a ninety-hour, teacher-fronted course offered by MEP-CONARE to MEP in-service teachers. The purpose of this training experience was to prompt the development of evaluation and assessment skills in in-service teachers so that they are able to objectively assess their students&#8217; proficiency throughout the English learning process. The main objective of the course was to enable participants to use theoretical and applied fundamentals of assessment to evaluate the students&#8217; acquisition and communicative use of English. Throughout the course, both theoretical and applied principles of assessment for each language skill were studied, including those pertaining to the assessment of listening skills. Thus, participants had the chance not just to discuss theory on the assessment of listening but also to create listening tests that were reviewed by both peers and the instructor as a way for them to become equipped with hands-on knowledge of listening assessment. The course took place during the second semester of 2011.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">3.3 Materials</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">The instrument used to analyze the teacher-created tests was a checklist (see<a href="/img/revistas/aie/v14n2/a06a2.jpg"> appendix 2</a>) previously adapted by Gamboa and Sevilla (2013a). In total, the checklist includes twelve criteria that assess the degree of compliance of the tests with the theoretical principles discussed in the workshop<sup>2</sup>. These criteria are: test format, test heading, general test objective, general instructions, credits, balance of item difficulty, specific instructions, listening test techniques, scoring key, face validity, beneficial backwash, and listening passage.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">3.4 Procedure</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">The first step in the design of the present research was a review of the literature pertinent to the research topic, in order to build a theoretical platform upon which to rest the study. Participants were then contacted and asked for tests that they had created and used in their actual classes (see letter of consent in <a  href="/img/revistas/aie/v14n2/a06a1.jpg">appendix 1</a>). The tests were collected and analyzed quantitatively using the checklist described in subsection 3.3 above. In order to cross-check information and ensure validity, triangulation was done at two different levels. First, the data resulting from the quantitative analysis were confronted with qualitative annotations in the tests. A table of codes was developed to ensure participants&#8217; confidentiality and to aid the inclusion of qualitative data. Then, the results were compared with recent studies on listening assessment in the West Area of Costa Rica. These analyses led to the interpretation of the findings and the drawing of conclusions.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">4 Analysis of the Results</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">4.1 Analysis of Tests Created by Teachers Ranked C1</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Data from the analysis of the tests were contrasted with the results of previous studies, as well as with current theory on listening assessment and the MEP guidelines on assessment as described in the materials provided by the MEP. Results are presented in the form of graphs and tables in the pages that follow. Lastly, conclusions were drawn by contrasting the results against the research question of the study.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">For the sake of analysis, data were grouped into three categories according to the degree of compliance of the tests with the assessment principles dictated by the MEP and by listening assessment theory. These categories are operationalized as follows: <span style="font-style: italic;">The highest third</span>, which groups criteria ranking between 85 and 100%, included test heading, test format, face validity, and beneficial backwash; <span style="font-style: italic;">the middle third</span>, which groups criteria ranking between 70 and 84%, comprised specific instructions and listening test techniques; and <span style="font-style: italic;">the lowest third</span>, which groups criteria ranking below 70%, included general test objective, credits, general instructions, scoring key, and balance of item difficulty. </font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-style: italic;" size="2">4.1.1. Table of Codes</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In order to ensure the anonymity of the informants, the researchers designed codes referring to the teacher-created tests analyzed in the study. In the chart below, the first column depicts the informants of the study; the second shows the name of the instrument for data collection; and the third, the code used for the sake of anonymity in the analysis of results, where T stands for Teacher, C for Created, and T for Test. The number on the right indicates the test number. Thus, TCT-001 refers to data gathered through the test provided by the first participant (numbers were assigned randomly); TCT-002, to the test provided by the second one, and so forth, up to TCT-014, which denotes the test provided by participant number fourteen. The details explained herein are summarized in <a href="/img/revistas/aie/v14n2/a06t1.gif">table 1</a> below.</font>    <br>     <br> <font style="font-family: Verdana;" size="2">Regarding the highest third, quantitative data show that the degree of compliance with theoretical assessment principles was 86.5% for test heading. Qualitative data, on the other hand, reveal that the areas of improvement for this criterion have to do with the inclusion of the data to be tested and of a line for the rater&#8217;s name. As recorded in TCT-001, TCT-003, and TCT-008, &#8220;areas to improve include data to be tested and a line for the rater&#8217;s name&#8221;. As for test format, the quantitative analysis shows a 95.19% degree of achievement. Qualitative data suggest that flaws in test format having to do with margins and the numbering of pages could be responsible for the tests not fully meeting the specifications for this criterion. As recorded in the annotations made by the researchers, &#8220;[&#8230;] margins need to be adjusted to testing requirements dictated by the MEP&#8221; (TCT-001, TCT-002, TCT-003, and TCT-008). As for page numbering, TCT-010 reveals that &#8220;pages were not numbered, which might cause test takers to have difficulty in following the test sequence&#8221;. Lastly, a 100% degree of achievement was recorded for both face validity and beneficial backwash. </font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Consequently, training on listening assessment positively impacted the teachers&#8217; practices regarding the categories in the highest third, which suggests that teachers internalized these assessment principles to an acceptable degree. Results for this category are depicted in <a  href="/img/revistas/aie/v14n2/a06i1.jpg">figure 1</a> below.</font>    <br>     <br> <font style="font-family: Verdana;" size="2">Taken together, the data show generally positive results in the categories ranked as the middle third. Quantitative data reveal an 84.61% degree of achievement for compliance with assessment principles for specific instructions. Insights derived from the qualitative analysis suggest that &#8220;not sufficient context for the task to be accomplished was provided&#8221; (TCT-002, TCT-005, and TCT-011) and that &#8220;the total number of points and individual value of each correct item are not included&#8221; (TCT-003, TCT-004, and TCT-005), which explains why a full degree of achievement was not reached for specific instructions. Regarding listening test techniques, the quantitative analysis shows a 78% degree of achievement, while qualitative information reveals that &#8220;advance organizers were not used to introduce each new section of the text&#8221; (TCT-001, TCT-003, TCT-005, TCT-010, and TCT-011) and that &#8220;tasks only partially reflect real-life situations&#8221; (TCT-004, TCT-005, TCT-008, and TCT-011), which, again, explains why listening test techniques did not meet the desired criteria as dictated by the MEP. </font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">The above suggests that training on listening assessment only partially impacted teachers&#8217; testing practices regarding the group of criteria comprising the middle third. This implies that even though the results are seemingly acceptable, improvement needs to be made in test creation, especially because both specific instructions and listening test techniques have a direct incidence on students&#8217; performance on tests. Results for this category are depicted in <a  href="/img/revistas/aie/v14n2/a06i2.jpg">figure 2</a> below.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">With regard to the lowest third, quantitative data show that the degrees of achievement are 0% for general test objective; 23.07% for credits; 30.76% for scoring key; and 53.85% for balance of item difficulty. By comparison, in qualitative terms the findings were as follows: Regarding test objectives, the data reveal that &#8220;no test objectives were included&#8221; (TCT-012, TCT-011, TCT-008, TCT-005, TCT-004). Concerning credits, it was concluded that &#8220;[this] criterion was not observed&#8221; (TCT-001, TCT-002, TCT-004, TCT-005). With reference to the scoring key, it was found that &#8220;no answer key was provided&#8221; (TCT-012, TCT-011, TCT-008, TCT-005). In relation to balance of item difficulty, it was evidenced that some of the &#8220;tests included only two parts (activities)&#8221; (TCT-003, TCT-004), which by principle hinders the achievement of balance of item difficulty.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Taken together, the data here suggest that: a) since this is the group of criteria with the lowest recorded degree of compliance, future teacher training could be oriented in this direction; and b) because balance of item difficulty directly affects student performance, improvement is paramount as a way to give students a greater chance of success in assessment. Results for this category are depicted in <a href="/img/revistas/aie/v14n2/a06i3.jpg">figure 3</a> below. </font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In order to provide an overall view of the degree of achievement for all the criteria examined, we have arranged the data shown in the three figures above in <a href="/img/revistas/aie/v14n2/a06t2.gif">table 2</a> below.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">4.2 Comparison between the results of the present study and those in the 2013 paper</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">This section contrasts the results of the present study with those of the 2013 paper by Gamboa and Sevilla, <span  style="font-style: italic;">The Impact of Teacher Training on the Assessment of Listening Skills</span>. In that paper the authors studied the correlation between teacher training and the listening assessment practices of MEP teachers in Costa Rica. To that end, they analyzed tests created by teachers who had never received any in-service training on listening assessment and tests created by a group of teachers who had participated in an eight-hour workshop on listening assessment.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">In that paper the authors concluded that tests created by teachers who had undergone training on listening assessment were significantly better than those created by teachers who had received no training. They reported significant improvements in beneficial backwash, face validity, test format, test heading, and listening test techniques in the tests created by teachers who had undergone training. Arguably, the results of the present study parallel those findings since, as in the former, results in the latter show high compliance (between 85 and 100%) in criteria such as test heading, test format, face validity, and beneficial backwash.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Contrastingly, in their 2013 study the authors reported the need for improvement of teacher-created tests in such areas as general instructions, specific instructions, general test objectives, and listening test techniques. Similarly, in the present study the authors found that general instructions, specific instructions, general test objective, and listening test techniques still need improvement, since none of these criteria reached full compliance.</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">5 Conclusions, Discussion and Implications</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">This study set out to examine the extent to which training on listening assessment has an impact on the actual testing practices of teachers ranked C1 in bilingual secondary schools of the West Area of Costa Rica. Based on the findings, the following conclusions are drawn:</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Firstly, it is concluded that training on listening assessment has had the greatest impact on test heading, test format, face validity, and beneficial backwash. This implies that training efforts have rendered the expected outcomes with regard to these test criteria. The researchers suspect that this will translate into better assessment instruments, which would, in turn, provide a more accurate impression of students&#8217; actual language competencies in secondary schools of Costa Rica.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Furthermore, training on listening assessment did not contribute as expected to improving specific instructions, listening test techniques, general test objective, credits, scoring key, and balance of item difficulty. This was contrary to expectations: since participants had undergone extensive training that included the testing of listening, they were expected to rank higher in all areas. It needs to be acknowledged, though, that there were different degrees of low compliance, which ranged from 0% to 84.61%. Thus, future teacher training should address these issues in more detail. Until renewed and theoretically informed assessment practices are incorporated, the uncertainties behind listening assessment will continue to hinder effective assessment practices in these institutions.</font>    <br>     <br> <font style="font-family: Verdana;" size="2">Another conclusion is that the training on listening assessment given to date to in-service teachers has proved successful for test components such as test heading, test format, face validity, and beneficial backwash, but insufficient or unsuccessful for other test criteria, namely general instructions, specific instructions, general test objective, and listening test techniques. This implies that new training efforts need to tackle the latter set of test criteria in more depth and by means of renewed strategies that prove more effective and handy to teachers as they create their listening tests. Because quality assessment cannot and should not respond to just some test elements at the expense of others, future training needs to equip teachers to prepare assessment instruments that comply more thoroughly with all of these testing criteria.</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Finally, listening assessment training programs help close the existing gap between neglected listening assessment in classrooms, MEP&#8217;s lack of listening assessment guidelines, and teachers&#8217; beliefs about what listening assessment involves.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="2">5.1 Limitations and Future Research</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">While this study has filled some of the gaps between theory and practice in listening assessment, it has some limitations that need to be acknowledged. First, the authors were not able to analyze the listening passages used by the participants in their tests. Also, the study did not examine the correlation between test quality and teacher proficiency level. Lastly, the researchers analyzed only one test per participant, which might not fully represent each participant&#8217;s ability to create tests that meet the standards proposed by theory and by the MEP. The authors recommend that all of these limitations be addressed in future research.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Future studies should also focus on three crucial areas of inquiry. The first deals with the assessment of other language skills such as speaking, reading, and writing. The second should look into the assessment of listening or of other skills at a national level, as a way to help curricular authorities better orient their training efforts and, eventually, guide the allocation of funding for them. Lastly, the assessment of culture should be explored. This would complete the entire spectrum of language assessment within the context of Costa Rica&#8217;s public education system.</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">Over the decades since the advent of communicative approaches to language teaching and learning, listening assessment has remained the &#8220;Cinderella&#8221; of the four macro skills of English. Upon completion of this paper, the researchers propose that teacher training programs on language assessment be, metaphorically speaking, the means to rescue this long-neglected Cinderella.</font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana; font-weight: bold;" size="3">Notes</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br> <font style="font-family: Verdana;" size="2">2. See Gamboa and Sevilla (2013b) for an expanded discussion of listening assessment principles.</font> <hr style="width: 100%; height: 2px;"><font  style="font-family: Verdana;" size="2"> </font><font  style="font-family: Verdana; font-weight: bold;" size="3">6. References</font>    <br> <font style="font-family: Verdana;" size="2"></font>    <br>     <!-- ref --><div style="text-align: left;"><font style="font-family: Verdana;"  size="2">Buck, Gary (2001). <span style="font-style: italic;">Assessing Listening</span>. New York: Cambridge University Press.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186838&pid=S1409-4703201400020000600001&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Coombe, Christine A.; Folse, Keith S. and Hubley, Nancy J. (2007). <span  style="font-style: italic;">A Practical Guide to Assessing English Language Learners</span>. Ann Arbor, Mich: University of Michigan.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186841&pid=S1409-4703201400020000600002&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Gamboa, Roy and Sevilla, Henry (2013a). 
<span style="font-style: italic;">Proceedings of the 11<sup>th</sup> Hawaii International Conference on Education: Assessment of Listening Comprehension in Public High Schools of Costa Rica: The West and Central Pacific Case</span>. 06-11 Jan. 2013, Honolulu, Hawaii.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186844&pid=S1409-4703201400020000600003&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Gamboa, Roy and Sevilla, Henry (2013b). <span style="font-style: italic;">Proceedings of the I Congreso Internacional de Lingu&iacute;stica Aplicada, Universidad Nacional: The Impact of Teacher Training&nbsp; on&nbsp; the Assessment of Listening Skills</span>. 04-06 Feb. 2013, P&eacute;rez Zeled&oacute;n, Costa Rica.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186847&pid=S1409-4703201400020000600004&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Gay, Lorraine R.; Mills, Geoffrey E. and Airasian, Peter W. (2009). <span style="font-style: italic;">Educational Research: Competencies for Analysis and Applications. </span>Upper Saddle River, N.J: Merrill/Pearson.    
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186850&pid=S1409-4703201400020000600005&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    ]]></body>
<body><![CDATA[<br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Hern&aacute;ndez,&nbsp; Roberto;&nbsp; Collado,&nbsp; Carlos,&nbsp; y&nbsp; Baptista,&nbsp; Pilar&nbsp; (1991).&nbsp; <span style="font-style: italic;">Metodolog&iacute;a de la Investigaci&oacute;n</span>. M&eacute;xico: McGraw-Hill.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186853&pid=S1409-4703201400020000600006&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --> </font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Nunan, David. (2002). <span  style="font-style: italic;">Listening in language learning</span>. In J.C. Richards &amp; W.A. Renandya (Eds.), Methodology in language teaching: An anthology of current practice. UK: Cambridge University Press.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186856&pid=S1409-4703201400020000600007&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Oller, John W. (1979). <span  style="font-style: italic;">Language Tests at School: A Pragmatic Approach</span>. London: Longman.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186859&pid=S1409-4703201400020000600008&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --> </font>    <br>     ]]></body>
<body><![CDATA[<!-- ref --><br> <font style="font-family: Verdana;" size="2">Osada, Nobuko (2004). Listening comprehension research: A brief review of the past thirty years. <span  style="font-style: italic;">Dialogue, 3</span>, 53-66.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186862&pid=S1409-4703201400020000600009&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Porte, Graeme. (2010). <span  style="font-style: italic;">Appraising Research in Second Language Learning: A Practical Approach to Critical Analysis of Quantitative Research</span>. Philadelphia: John Benjamins.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186865&pid=S1409-4703201400020000600010&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --> </font>    <br>     <!-- ref --><br> <font style="font-family: Verdana;" size="2">Presidencia de la Rep&uacute;blica (2007). <span style="font-style: italic;">Costa Rica Multiling&uuml;e</span>. Costa Rica.    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186868&pid=S1409-4703201400020000600011&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Tuckman, Bruce W. (1988). <span  style="font-style: italic;">Conducting Educational Research</span>. San Diego: Harcourt Brace Jovanovich.    
&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186871&pid=S1409-4703201400020000600012&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --></font>    <br> <font style="font-family: Verdana;" size="2"></font>    <!-- ref --><br> <font style="font-family: Verdana;" size="2">Vandergrift, Larry. (1997). The Cinderella of communication strategies: Reception strategies in interactive listening. <span  style="font-style: italic;">The Modern Language Journal, 81</span>, 494-505.</font>    &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;[&#160;<a href="javascript:void(0);" onclick="javascript: window.open('/scielo.php?script=sci_nlinks&ref=186874&pid=S1409-4703201400020000600013&lng=','','width=640,height=500,resizable=yes,scrollbars=1,menubar=yes,');">Links</a>&#160;]<!-- end-ref --><br> </div> <font style="font-family: Verdana;" size="2">    <br> <a name="Correspondencia1"></a><a href="#Correspondencia2">*</a>Correspondencia a:    <br> </font><font style="font-family: Verdana;" size="2"></font><font  style="font-family: Verdana;" size="2"> Roy Gamboa Mena:</font><font style="font-family: Verdana;" size="2">Profesor de la Carrera de Bachillerato y Licenciatura en la Ense&ntilde;anza del Ingl&eacute;s de la Universidad de Costa Rica, Sede de Occidente. Magister en la Ense&ntilde;anza del Ingl&eacute;s como Lengua Extranjera. Direcci&oacute;n Electr&oacute;nica: gamboa.roy@gmail.com</font>    <br> <font style="font-family: Verdana;" size="2">Henry Sevilla Morales</font><span  style="font-family: Verdana;">: </span><font  style="font-family: Verdana;" size="2">Profesor de la Carrera de Bachillerato y Licenciatura en la Ense&ntilde;anza del Ingl&eacute;s de la Universidad de Costa Rica, Sede de Occidente. Licenciado en la Ense&ntilde;anza del Ingl&eacute;s como Lengua Extranjera. 
Direcci&oacute;n electr&oacute;nica: al_deron@hotmail.com</font>    <br> <font style="font-family: Verdana;" size="2"><a name="1"></a><a  href="#3">1</a> Profesor de la Carrera de Bachillerato y Licenciatura en la Ense&ntilde;anza del Ingl&eacute;s de la Universidad de Costa Rica, Sede de Occidente. Magister en la Ense&ntilde;anza del Ingl&eacute;s como Lengua Extranjera. Direcci&oacute;n Electr&oacute;nica: gamboa.roy@gmail.com</font>    <br> <font style="font-family: Verdana;" size="2"><a name="2"></a><a  href="#4">2</a> Profesor de la Carrera de Bachillerato y Licenciatura en la Ense&ntilde;anza del Ingl&eacute;s de la Universidad de Costa Rica, Sede de Occidente. Licenciado en la Ense&ntilde;anza del Ingl&eacute;s como Lengua Extranjera. Direcci&oacute;n electr&oacute;nica: al_deron@hotmail.com</font>    <br> <hr style="width: 100%; height: 2px;">     
<body><![CDATA[<div style="text-align: center;"><font  style="font-family: Verdana; font-weight: bold;" size="2">Article received: September 5, 2013. Returned for revision: May 7, 2014. Approved: May 15, 2014.</font></div> </div>      ]]></body>
<ref-list>
<ref id="B1">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Buck]]></surname>
<given-names><![CDATA[Gary]]></given-names>
</name>
</person-group>
<source><![CDATA[Assessing Listening]]></source>
<year>2001</year>
<publisher-loc><![CDATA[New York]]></publisher-loc>
<publisher-name><![CDATA[Cambridge University Press]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B2">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Coombe]]></surname>
<given-names><![CDATA[Christine A.]]></given-names>
</name>
<name>
<surname><![CDATA[Folse]]></surname>
<given-names><![CDATA[Keith S.]]></given-names>
</name>
<name>
<surname><![CDATA[Hubley]]></surname>
<given-names><![CDATA[Nancy J]]></given-names>
</name>
</person-group>
<source><![CDATA[A Practical Guide to Assessing English Language Learners]]></source>
<year>2007</year>
<publisher-loc><![CDATA[Ann Arbor, Mich.]]></publisher-loc>
<publisher-name><![CDATA[University of Michigan]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B3">
<nlm-citation citation-type="confpro">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Gamboa]]></surname>
<given-names><![CDATA[Roy]]></given-names>
</name>
<name>
<surname><![CDATA[Sevilla]]></surname>
<given-names><![CDATA[Henry]]></given-names>
</name>
</person-group>
<source><![CDATA[Proceedings of the 11th Hawaii International Conference on Education: Assessment of Listening Comprehension in Public High Schools of Costa Rica: The West and Central Pacific Case]]></source>
<year>2013</year>
<publisher-loc><![CDATA[Honolulu, Hawaii]]></publisher-loc>
</nlm-citation>
</ref>
<ref id="B4">
<nlm-citation citation-type="confpro">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Gamboa]]></surname>
<given-names><![CDATA[Roy]]></given-names>
</name>
<name>
<surname><![CDATA[Sevilla]]></surname>
<given-names><![CDATA[Henry]]></given-names>
</name>
</person-group>
<source><![CDATA[]]></source>
<year>2013</year>
<conf-name><![CDATA[I Congreso Internacional de Lingüística Aplicada]]></conf-name>
<conf-date>04-06 Feb. 2013</conf-date>
<conf-loc>Pérez Zeledón</conf-loc>
</nlm-citation>
</ref>
<ref id="B5">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Gay]]></surname>
<given-names><![CDATA[Lorraine R.]]></given-names>
</name>
<name>
<surname><![CDATA[Mills]]></surname>
<given-names><![CDATA[Geoffrey E.]]></given-names>
</name>
<name>
<surname><![CDATA[Airasian]]></surname>
<given-names><![CDATA[Peter W.]]></given-names>
</name>
</person-group>
<source><![CDATA[Educational Research: Competencies for Analysis and Applications]]></source>
<year>2009</year>
<publisher-loc><![CDATA[Upper Saddle River, N.J.]]></publisher-loc>
<publisher-name><![CDATA[Merrill/Pearson]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B6">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Hernández]]></surname>
<given-names><![CDATA[Roberto]]></given-names>
</name>
<name>
<surname><![CDATA[Collado]]></surname>
<given-names><![CDATA[Carlos]]></given-names>
</name>
<name>
<surname><![CDATA[Baptista]]></surname>
<given-names><![CDATA[Pilar]]></given-names>
</name>
</person-group>
<source><![CDATA[Metodología de la Investigación]]></source>
<year>1991</year>
<publisher-name><![CDATA[McGraw-Hill]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B7">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Nunan]]></surname>
<given-names><![CDATA[David]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Listening in language learning]]></article-title>
<person-group person-group-type="editor">
<name>
<surname><![CDATA[Richards]]></surname>
<given-names><![CDATA[J.C.]]></given-names>
</name>
<name>
<surname><![CDATA[Renandya]]></surname>
<given-names><![CDATA[W.A.]]></given-names>
</name>
</person-group>
<source><![CDATA[Methodology in language teaching: An anthology of current practice]]></source>
<year>2002</year>
<publisher-name><![CDATA[Cambridge University Press]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B8">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Oller]]></surname>
<given-names><![CDATA[John W.]]></given-names>
</name>
</person-group>
<source><![CDATA[Language Tests at School: A Pragmatic Approach]]></source>
<year>1979</year>
<publisher-loc><![CDATA[London]]></publisher-loc>
<publisher-name><![CDATA[Longman]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B9">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Osada]]></surname>
<given-names><![CDATA[Nobuko]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[Listening comprehension research: A brief review of the past thirty years]]></article-title>
<source><![CDATA[Dialogue]]></source>
<year>2004</year>
<volume>3</volume>
<page-range>53-66</page-range></nlm-citation>
</ref>
<ref id="B10">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Porte]]></surname>
<given-names><![CDATA[Graeme]]></given-names>
</name>
</person-group>
<source><![CDATA[Appraising Research in Second Language Learning: A Practical Approach to Critical Analysis of Quantitative Research]]></source>
<year>2010</year>
<publisher-loc><![CDATA[Philadelphia]]></publisher-loc>
<publisher-name><![CDATA[John Benjamins]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B11">
<nlm-citation citation-type="">
<collab>Presidencia de la República</collab>
<source><![CDATA[Costa Rica Multilingüe]]></source>
<year>2007</year>
</nlm-citation>
</ref>
<ref id="B12">
<nlm-citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Tuckman]]></surname>
<given-names><![CDATA[Bruce W.]]></given-names>
</name>
</person-group>
<source><![CDATA[Conducting Educational Research]]></source>
<year>1988</year>
<publisher-loc><![CDATA[San Diego]]></publisher-loc>
<publisher-name><![CDATA[Harcourt Brace Jovanovich]]></publisher-name>
</nlm-citation>
</ref>
<ref id="B13">
<nlm-citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname><![CDATA[Vandergrift]]></surname>
<given-names><![CDATA[Laurens]]></given-names>
</name>
</person-group>
<article-title xml:lang="en"><![CDATA[The Cinderella of communication strategies: Reception strategies in interactive listening]]></article-title>
<source><![CDATA[The Modern Language Journal]]></source>
<year>1997</year>
<volume>81</volume>
<page-range>494-505</page-range></nlm-citation>
</ref>
</ref-list>
</back>
</article>
