This paper explores the letter-writing component of the English portion of Myanmar’s University Entrance Examination, also referred to as the Matriculation Examination. Very little has been published to date regarding this examination in general, and this paper appears to be the first to address the letter-writing portion of the exam in detail. A total of 165 letter-writing prompts used on the actual exams from 2009 to 2013, as well as six marking schemes used during that same period to assess examinee responses, were analyzed. Inconsistencies in the difficulty level of the prompts, as well as potential issues with the reliability of the current marking scheme(s), are discussed. The researchers ultimately advocate that the Ministry of Education and Myanmar’s Board of Education revise their current approach to creating letter-writing prompts and consider developing a more contextually specific and dynamic assessment instrument, one capable of serving a formative function in classroom-based instruction while also serving the assessment needs of examiners.

*MEXT in-service Teacher Training Course (2013)

Alternate forms of a test are expected to yield similar results; if these alternate forms do consistently yield similar results, the test is considered reliable (Cohen, Manion, & Morrison, 2011). Rater reliability includes inter-rater reliability and intra-rater reliability. Addressing inter-rater reliability, Stemler (2004) reminds us that it “must be demonstrated anew for each new study, even if the study is using a scoring rubric or instrument that has been shown to have high inter-rater reliability in the past” (p. 66). Addressing intra-rater reliability, Brown, Bull, and Pendlebury (1997) cite “the lack of consistency of an individual marker” (p. 235, as cited in Jonsson & Svingby, 2007) as a significant threat. In order to mitigate such threats to the reliability of scoring procedures, educators often turn to rubrics for assessment purposes.
The degree to which rubrics created in and for ESL/EFL contexts can be successfully adapted to meet the needs of other language learning environments is debatable (East, 2009; Sasaki & Hirose, 1999); however, there is significant empirical evidence suggesting that the use of rubrics in assessing writing increases the reliability of scoring. In a review of 75 empirical studies on rubrics, Jonsson and Svingby (2007) concluded that “the reliable scoring of performance assessments can be enhanced by the use of rubrics, especially if they are analytic, topic-specific, and complemented with exemplars and/or rater training” (p. 130). The latter part of this conclusion is critical, as it qualifies the benefits provided by rubrics as contingent upon both the type of rubric used (analytic as opposed to holistic) and the manner in which the rubric is introduced to assessors.

RESEARCH QUESTIONS

Very little research has been published regarding Myanmar’s National English Exam in general. Kirkpatrick and Hlaing (2013) provided a succinct overview of the English test overall, concluding, among other things, that, at the very least, “differences and disparities among regions and a multiple-version test creates doubt about the test’s ultimate reliability” (p. 14). Seemingly nothing about the letter-writing portion of the exam specifically has been published. Therefore, the following research is primarily exploratory and descriptive in nature as it investigates the letter-writing task on the national exam and how it is assessed. To that end, this paper examines the letter-writing prompts themselves, as well as the marking schemes used by exam raters, as primary sources of data capable of fleshing out the requirements of this portion of the exam.
The reliability of the letter-writing prompts, raters, and marking scheme(s) used by Myanmar’s Ministry of Education will be discussed in general terms, and some suggestions for improving both the task and the assessment process will be offered.

METHODS

Materials

The letter-writing component of the English exam is worth a maximum of 10 marks (10% of the total exam). There are 11 official testing states/regions in Myanmar, and a different version of the exam is used in each of these states/regions, so 11 different versions of the exam are created each year. Therefore, between 2009 and 2013, a total of 55 English tests were created for the Matriculation Examination. All matriculation examinations used in Myanmar from 2009 to 2013 have been published in their entirety by Myanmar’s Ministry of Education in book form (Ministry of Education, Myanmar Board of Examination, 2013). For this research, only the letter-writing prompts were analyzed. As each exam’s letter-writing task always provides three prompts for examinees to choose from, a total of 165 letter-writing prompts were used during this five-year span. Each version of the exam has a corresponding marking scheme. The researchers obtained six marking schemes from different states and years. These marking schemes are generally not published; however, their contents are not considered confidential after the matriculation exams have been evaluated, and the researchers have been given permission to reproduce and discuss the marking schemes in this report.

Aaron Sponseller, Hau Khan Mang and Seiji Fukazawa