The Sorry State of Thai Education – Part 2: Test Scores, Standards and Accountability

“F” for the explanations of ridiculous O-NET questions

This year’s ridiculous O-NET exam questions caused quite a kerfuffle, prompting the executives of the National Institute of Educational Testing Service (NIETS) to come out and give their now-yearly explanation for the ridiculous exam questions.

In reference to the question about transvestic behavior, the NIETS Chairman Mr. Somwang Phithiyanuwat conceded that some questions might have come across as “too strong” but that “wasn’t the intention of the NIETS.” That seems to be the extent to which the NIETS executive was willing to concede. He then affirmed that the content of the question was from the curriculum and “fell within the scope which the NIETS had announced before the exam…. The question evaluated students’ memory of the content from the textbooks.”

Well, if that wasn’t good enough, try this explanation by the NIETS Director Dr. Samphan Phanphruek. “The panels that designed the questions had clear objectives… but of course, we welcome all comments.” As to the most infamous question on what to do when having a sexual urge, he confirmed that choice a) Call a friend to go play football, was the “correct” answer.

The NIETS Director said the question was intended to check whether the students understood the nature of sexual desire and how to control or respond to it. The question was “a part of the content about sex education and family life.” Unfortunately, the reporter did not ask if the response applies to both boys and girls and if they could go out to the same football field.

Mocking such a line of reasoning might be amusing. But there is nothing funny about the rate of sexually transmitted infections among Thai teenagers, which has tripled over the past five years, or about the fact that teenagers are among the leading risk groups for HIV infection in Thailand. Ineffective sex education has been widely blamed for Thai teens’ lack of knowledge about safe sex and for their risky sexual behavior.

Evidently teaching denial and suppression of sexual urges is not the kind of sex education that is helping Thai youth to live a healthy life in the present reality. Are Thai youth being well served by the Thai education system? The answer at least from the perspective of sex education is a resounding “No.”

Sex education is just one among many problems in the Thai education system. It’s just the tip of the iceberg and a symptom of deeper problems of mindset, methodology, management, and competency.

National standardized tests’ credibility
Since their implementation, the national “standardized” tests like the O-NET (and the A-NET – Advanced National Educational Test – among others) have been widely criticized. There is much doubt whether the O-NET and A-NET are effective measures of students’ scholastic abilities. Thai students’ O-NET scores are so appalling that they raise questions about the validity of the tests themselves. How can students’ performance be so incredibly poor, so wildly inconsistent and dispersed? (See examples of O-NET score distributions at the primary P.6 and lower-secondary M.3 levels.)

A majority of students scoring in the 0-30% range in key subjects, as was the case for the M.6 O-NET, is simply unacceptable and suspicious. Scores at all three O-NET levels in core subjects have also been declining over the past three years. At the same time, students have been complaining that frequent changes in the style of the questions confused them. Many students say they feel like guinea pigs in one bad experiment after another. You can’t really blame them for feeling that way if you see the kind of scores achieved in the past three academic years. (Scores given in Tables 1-3 for the O-NET are average scores out of 100.)

Table 1. Primary-level (P.6) O-NET average scores (2009-2011)

[Table data not preserved: P.6 average scores (out of 100) by subject, including Social Science, Health Education, and Vocational Education.]

Note: Between 800,000 and 950,000 students took the tests each year.

 Table 2. Lower secondary-level (M.3) O-NET average scores (2009-2011)

M.3 (lower secondary) Level








Social Science
















Health Education






Vocational Education



Note: Roughly 800,000 students took the tests each year.

Table 3. Upper secondary-level (M.6) O-NET average scores (2009-2011)

[Table data not preserved: M.6 average scores (out of 100) by subject, including Social Science, Health Education, and Vocational Education.]

Note: Roughly 350,000 students took the tests each year.

As seen in Tables 1-3, scores in many subjects were quite erratic year on year. In the tables, differently marked scores indicate changes from the previous year as follows:

  • increase or decrease of 5-10 percentage points
  • increase or decrease of >10-20 percentage points
  • increase or decrease of >20 percentage points.

Consider the drastic drops of scores in English and Mathematics at all three levels. The scores for both subjects suffered a steep drop in 2011:

  • English: 10% points or one-third of the score in the previous year at the P.6 level; 15% points or half of the score two years before at the M.3 level; and 11% points or one-third of the score two years before at the M.6 level.
  • Mathematics: 9% points or one-fifth of the 2009 score at the P.6 level; 8.5% points or 26% of the 2009 score at the M.3 level; and 21% points or 58% of the 2009 score at the M.6 level.

There were huge fluctuations in the Science scores especially at the M.3 level and to a lesser extent the P.6 level. The scores for Health Education also had sharp drops and sharp increases at all three levels, involving as much as 39% points or a 120% increase over the previous year at the M.3 level.

Such highly volatile scores call into question the consistency and validity of the tests. Students’ scholastic abilities were unlikely to vary so drastically (by over 10 percentage points) in just one year or two years. Given consistent numbers of exam takers in each level, such widely fluctuating average scores (up to as much as 20-39% points!) scream methodological problems.
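The distinction between a drop in percentage points and the same drop expressed as a relative change can be sketched with a little arithmetic. The baseline below is back-calculated from the article’s own M.6 Mathematics figures (a 21-point drop described as 58% of the 2009 score) and is therefore approximate:

```python
def relative_change(point_drop: float, baseline: float) -> float:
    """Express a drop in raw points as a percentage of the baseline score."""
    return point_drop / baseline * 100

# A 21-point drop that equals 58% of the 2009 score implies a 2009
# baseline of roughly 21 / 0.58, i.e. about 36 points out of 100.
baseline_m6_math = 21 / 0.58
print(f"Implied 2009 M.6 Maths average: {baseline_m6_math:.1f}")
print(f"Relative drop: {relative_change(21, baseline_m6_math):.0f}%")
```

The same drop of 21 points would be a far smaller relative change against a healthy baseline of, say, 70 (a 30% drop), which is why both figures are worth reporting.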

O-NET – where are the standards and accountability?

O-NET was created to replace the old university entrance examinations (to public universities), which were considered to be tough. But as reported by The Nation:

Many universities became so worried about the O-NET ability to select qualified students for some of their fields that they allocated fewer and fewer seats for the central admission system, which has used O-NET scores as admission criteria.

So what, then, is the value of the O-NET, if universities are wary of relying on it as a measurement standard for students’ qualifications?

As we have seen in some O-NET questions, not only is the quality standard of the O-NET questionable, but the way the questions were framed also shows a glaring lack of skill on the part of the exam designers, as well as their inability to distinguish normative values from knowledge based on objective, scientific facts. How can Thai students be expected to develop the critical learning skills necessary for their intellectual development given the kind of knowledge being taught and tested? And how will such evaluation tools help or hurt students and the quality of Thai education?

Thailand Development Research Institute (TDRI) academic Dr. Dilaka Lathapipat said Thailand’s O-NET is “stuck in the 20th century.” The test designs of O-NET and PISA are very different, as seen in the graphic below.

(PISA is the OECD Programme for International Student Assessment  which assesses 15-year-old students in three key subject areas, Reading, Mathematics and Science, every three years. Part 3 of this series will discuss Thai students’ PISA scores.)

“Good tests should determine children’s ability to apply knowledge to their daily life,” said Dr. Dilaka. No argument there. “Tests are tools to evaluate not just students’ academic performance but also the performance of teachers and schools.” We might add that tests also reflect the quality of the testers and the education system as a whole.

“If the test designers cannot provide reliable and efficient tests, teachers and students will lack trusted indicators of their performance. Relevant agencies, in that case, will also find it hard to check which areas they should concentrate on to improve for the country’s educational system,” said Dr. Dilaka.

The question that’s on everyone’s mind after seeing the ridiculous O-NET questions is: how in the world did those kinds of questions manage to pass through so many brains and pairs of eyes? What was the question design process?

Over the past few years both the former and current NIETS directors gave similar (defensive) explanations on the O-NET test design process. The previous NIETS Director assured the public that O-NET questions were “well designed” and went through a rigorous process. The current NIETS Director similarly explained:

[The] NIETS had designed the questions for O-NET in line with the curriculum of the Office of Basic Education Commission (OBEC) and in response to indicators highlighted by OBEC. Based on [OBEC guidelines], specialists developed test blueprints and item specifications…. School teachers, school directors, chiefs of academic subjects at schools were recruited for workshops where they could improve the test blueprint and specific items. OBEC teachers have taken part in the designing of the tests in all subjects….

OBEC chose question designers from its pool of teachers from various regions. After the teachers design the questions, university lecturers with expertise in those subjects then step in as speakers to advise and screen the questions. After that, test design and screening committees for each subject will work out the final sets of questions for the tests.

So there were a test blueprint and indicators, and design and validation processes. But the explanation didn’t tell us how the quality standard was ensured. Did the specialists who developed the test blueprints have sufficient expertise? How exactly did the NIETS recruit teachers to review and validate the O-NET exam questions? What were the criteria for selection? Were the criteria based on the necessary competencies, or on rank?

And what has been done about the wild score movements of the past three years? One hopes that the NIETS or OBEC is at least secretly worried and working hard to put this highly problematic national standardized testing business in order—if not for the sake of the children and Thai education, then at least for their own credibility.

But the NIETS Director did not sound very worried. After having explained the test design process above, he assured the worried exam takers that right after this year’s O-NET exam the test design committee was given the O-NET exam again to review and double check the answer keys. He added that experts and “bright individuals” were invited to take the exam and they had “no problems at all.” He was “confident” that this time around the announcement of the O-NET test scores would be “smooth sailing and problem-free,” and there would be “no complaints about wrong answers.”

How is that for quality assurance?

The English-language brochure of the NIETS says that besides organizing testing systems and measurement tools for national education, the NIETS also provides professional capacity building services for teachers, including examinations and certification in education measurement and evaluation. (Care to imagine the types of exam questions for the teachers?)

Like the students, the schools and teachers are assessed by the Education Ministry and the NIETS. But who is assessing the assessors? Apparently the NIETS is subject to an internal audit. But has there been any proper audit? And if so, is the auditing process credible or sufficient?

Ammar Siamwalla, an economist at TDRI, recently said, “A key factor behind poor education quality in this country is a lack of accountability.” And this applies to teachers, school directors, executives, all the way up to education ministers, he stressed. The Bangkok Post reported:

Mr. Ammar said almost all schools have passed external quality assessment tests conducted by the Education Ministry, and teachers have received higher salaries and gained academic standing from presenting hundred-plus-page reports. “But the performance of students has become poorer in both national and international tests, especially in sciences and maths. A new approach is needed.”

Amen to that.

More articles in this Thai Education series:

Part 1: Ridiculous O-NET questions

Part 3: Thai students’ PISA Scores, a challenge for the 21st century (forthcoming)

Part 4: Thais’ dismal English, how to improve it? (forthcoming)


This article was originally published for Siam Voices on Asian Correspondent on 27 February 2012.
