by Joel Sawat Selway
Figure 1. The Nation's Predictions for Seats in the 2023 Thai General Elections
Source: https://www.nationthailand.com/thailand/40027280
Thai polling firms seem intent on breaking records for sample sizes this election year! The pollster with the lowest average sample size, NIDA, has polled between 2,000 and 2,533 respondents per survey. In contrast, the typical sample size for US polls, in a country almost five times larger, is around 1,000. But NIDA's sample size is modest in comparison to the others. Thairath has the next lowest average: a single survey with 14,140 respondents. Next comes Daily News/Matichon, which published two surveys with 39,687 and 78,583 respondents. The Nation has the second-highest average, with two surveys of 39,687 and 114,457 respondents. Finally, Suan Dusit tops the list, with one survey of 10,614 respondents and another of 162,454. From the perspective of US election polling, these sample sizes are simply staggering! But do larger sample sizes equate to better predictions?
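To see why sample size alone is not the whole story, consider the standard margin-of-error calculation for a proportion, assuming simple random sampling (which none of these firms necessarily uses). The sketch below is purely illustrative, with the sample sizes above plugged in:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error (as a proportion) for a proportion p
    under simple random sampling; ignores design effects and weighting."""
    return z * math.sqrt(p * (1 - p) / n)

# Sample sizes drawn from the surveys discussed above
for n in [1_000, 2_533, 14_140, 39_687, 114_457, 162_454]:
    print(f"n = {n:>7,}: +/- {100 * margin_of_error(n):.2f} points")
```

Going from 1,000 respondents to 162,454 shrinks the purely random error from roughly plus or minus 3 points to plus or minus 0.2 points, a gain that is tiny compared to the nearly eight-point gap between the pollsters' Pheu Thai numbers discussed below. Past a few thousand respondents, accuracy is driven almost entirely by how the sample was built, not how big it is.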
In my previous post, I rated the pollsters in the 2019 elections, concluding that Bangkok Poll had the lowest average absolute error (AAE), the standard measure for gauging how accurate a pollster's predictions were. None of the pollsters I examined then had a sample larger than 10,000 respondents. The sample sizes this year are monstrous in comparison!
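For readers new to the measure, AAE is simply the mean of the absolute gaps between each party's predicted and actual vote share. A minimal sketch (the parties and numbers below are made up for illustration, not real results):

```python
def average_absolute_error(predicted, actual):
    """Average absolute error in percentage points.
    Both arguments map party name -> vote share (%)."""
    return sum(abs(predicted[p] - actual[p]) for p in actual) / len(actual)

# Hypothetical example
predicted = {"Party A": 38.0, "Party B": 30.0, "Party C": 8.0}
actual    = {"Party A": 35.0, "Party B": 33.0, "Party C": 7.0}
print(average_absolute_error(predicted, actual))  # (3 + 3 + 1) / 3 = 2.33...
```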
What are the pollsters predicting, and which one should we believe the most? In the most recent survey from each of these pollsters, with fieldwork dates ranging from 6-20 April to 24 April-3 May, the percentage of votes for Pheu Thai (expected to be the largest vote getter) ranges from 33.65% (Daily News/Matichon) to 41.37% (Suan Dusit). The second largest party (according to the polls) is predicted to be Move Forward, ranging from 19.32% (Suan Dusit) to 50.29% (Daily News/Matichon). That is a huge difference amongst survey predictions! The Democrats range from 1.05% (Daily News/Matichon) to 7.30% (Suan Dusit); Bhumjaithai from 0.70% (Daily News/Matichon) to 9.55% (Suan Dusit); and Palang Pracharat, which leads the outgoing government coalition, from 1.28% (NIDA) to 7.49% (Suan Dusit).
In general, the Suan Dusit poll has reported the highest numbers for conservative parties, putting Palang Pracharat (PPRP) and the Democrat Party (DP) at a combined 14.79%. That is more than twice as high as the next pollster, Nation Poll, which puts their combined tally at 7.15%! Daily News/Matichon has those same two parties at just 3.51%. Not surprisingly, Suan Dusit also gives the two "progressive" parties, Pheu Thai and Move Forward, their lowest combined tally, just 60.69%, compared to Daily News/Matichon, which has them at 83.94%.
Clearly, there is huge disagreement amongst these polls! Pollsters often disagree, though not to this extent. When pre-election polls disagree, the discussion comes down to the fine details of each agency’s survey methodology. In this post, I look at four key technical aspects that determine survey accuracy and score the five polling firms that predicted the vote share of parties on these measures. In addition, Super Poll has regularly predicted the number of seats parties will win. While this will make their AAE hard to compare (votes do not translate easily to seats), we can still evaluate their survey based on these four criteria.
1. How did the firm compile the sample of respondents?
2. How was the survey administered?
3. What efforts were made to identify likely voters? Do they include post-survey weightings to maximize representativeness?
4. How transparent are they in their methodology?
1. How did the firm compile the sample of respondents?
How you identify the people to poll is crucial. Pollsters often use the term “nationally representative” to describe their samples, but this is no guarantee that their methodology conforms to accepted practices. Polling firms can’t survey everybody—that's what the election is for—so they have to rely on a sample of people. The quality of polling essentially comes down to this simple detail: how well does the sample represent those who will actually go out and vote on election day?
So, how do Thai pollsters compile their samples? The answers range from "we have no idea" to working with the civil registration database of the Department of Provincial Administration, Ministry of Interior. The two that do the latter are Super Poll and, most likely, Suan Dusit (I could not find this information for its surveys this year, but its surveys from the last election reported it). Perhaps a step below that is NIDA, which works from a proprietary "master" sample. This master sample may well be compiled from the same source, but we are given no further details on what it consists of. The Nation reports that it targeted "eligible voters". In one way this provides a little more information than NIDA, but since it gives us no idea whether the Nation works from any kind of master list or how it identified eligible voters, we can only guess. Fortunately, the Nation provides much more information on its sampling method, which suggests its voter list could be just as good as those of the other three.
Table 1. Scorecard of Polling Firms’ Voter Lists
How do firms select respondents from this preliminary voter list? Most pollsters do not provide this information. NIDA selects from its master list using simple random sampling. Suan Dusit provided similar information in its survey documents from the last election, though I could not find it in this year's documents. Super Poll provides no information on this, though we could assume it does something similar to Suan Dusit, given that the two work from the same source (the civil registration database). The Nation provides the best description of its sampling methodology and, as is appropriate for face-to-face interviews, uses multi-stage stratified random sampling. We could assume that Suan Dusit and Super Poll also use random sampling, but neither mentions this in its documents for this election.
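For readers unfamiliar with the term, multi-stage stratified sampling draws respondents in tiers rather than in one national lottery: for example, sample provinces within each region (the strata), then respondents within the chosen provinces. The sketch below is my own toy illustration of the idea, not the Nation's actual procedure; the frame and numbers are entirely hypothetical.

```python
import random

def multistage_stratified_sample(frame, provinces_per_region=2,
                                 respondents_per_province=5, seed=1):
    """Toy two-stage stratified sample.
    `frame` maps region -> {province -> [eligible voters]}.
    Stage 1: randomly pick provinces within every region (the strata).
    Stage 2: randomly pick respondents within each chosen province."""
    rng = random.Random(seed)
    sample = []
    for region, provinces in frame.items():
        chosen = rng.sample(sorted(provinces), k=min(provinces_per_region, len(provinces)))
        for province in chosen:
            voters = provinces[province]
            sample.extend(rng.sample(voters, k=min(respondents_per_province, len(voters))))
    return sample

# Hypothetical frame: two regions, a handful of provinces, numbered voters
frame = {
    "North":     {"P1": list(range(100)), "P2": list(range(100, 200)), "P3": list(range(200, 300))},
    "Northeast": {"P4": list(range(300, 400)), "P5": list(range(400, 500))},
}
print(len(multistage_stratified_sample(frame)))  # 2 regions x 2 provinces x 5 = 20
```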
Why did I score no polling firm an A? Mostly, I am comparing against the lists pollsters use in the US: voter files. These are datasets created by commercial organizations from official, publicly available government records of who is registered to vote and who cast ballots in past elections. This is perhaps unfair, because I am not sure whether Thai law allows either of those pieces of information to be made public. In any case, there are other methods firms can use to identify likely voters, a topic I cover in section 3 below.
2. How was the survey administered?
US polling firms no longer survey people face-to-face, preferring instead three methods: live telephone interviewing (which produces the least error), robocalls, and online surveying. One Thai firm (Nation Poll) still does face-to-face interviewing. This has pros and cons. The pro is that interviewers can likely reach demographics that the other methods cannot. However, respondents may be less likely to tell the truth in a face-to-face interview, especially in a political environment like Thailand's. For that reason, I grade NIDA's live telephone method as the best. Super Poll does not provide detailed information, but a previous tour of Super Poll's facilities and discussions with the firm revealed that it uses both face-to-face and telephone methods. This accords with its statement about contacting "respondents who have and don't have cellphones".
Table 2. Scorecard of Polling Firms’ Administration Method
3. What efforts were made to identify likely voters?
Although Thai polling firms do not have a voter file to work with, they could still ask whether a respondent intends to vote and exclude those who say they likely or definitely will not. Another option is to give more weight to respondents who say they will definitely vote than to those who say they are only likely to do so. The gold standard is to apply post-survey weights derived from a model of turnout likelihood built on demographic variables and perhaps political interest and engagement in other kinds of political activity.
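To make the gold standard concrete, here is a minimal sketch of the idea: score each respondent's probability of voting with a turnout model (a toy logistic function with made-up coefficients here; a real firm would fit it on validated turnout data) and weight their answers accordingly. Nothing below reflects any Thai firm's actual model.

```python
import math

def turnout_probability(age, interest):
    """Toy logistic turnout model; coefficients are invented for illustration.
    `interest` is political interest on a 0-3 scale."""
    score = -2.0 + 0.03 * age + 0.8 * interest
    return 1 / (1 + math.exp(-score))

def likely_voter_weighted_shares(respondents):
    """Party vote shares with each respondent weighted by turnout probability."""
    totals, total_weight = {}, 0.0
    for r in respondents:
        w = turnout_probability(r["age"], r["interest"])
        totals[r["party"]] = totals.get(r["party"], 0.0) + w
        total_weight += w
    return {party: round(100 * t / total_weight, 1) for party, t in totals.items()}

# Hypothetical respondents
respondents = [
    {"age": 22, "interest": 3, "party": "Party A"},
    {"age": 65, "interest": 1, "party": "Party B"},
    {"age": 45, "interest": 2, "party": "Party A"},
]
print(likely_voter_weighted_shares(respondents))
```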
No Thai polling firm appears to do this, and only one firm, Super Poll, even reports asking whether the respondent intends to vote. It is still not clear whether it excludes those who said they will not. Many of the firms do, however, provide information on the sample that could be used in post-survey weighting, though no model built on those variables alone would be very sophisticated. Two firms do mention that they engage in post-survey weighting. Nation Poll says specifically that it adjusts for sample weights and population data. Daily News/Matichon mentions that it worked with professors at Thammasat University to incorporate academic standards, but whatever weighting was done does not seem to deal with its most glaring problem: the extremely high number of Move Forward supporters. This accords with the bias of its sample towards young people in Bangkok and its environs. Some kind of post-survey weighting would surely have moderated that number.
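Nation Poll's description ("adjust for sample weights and population data") sounds like straightforward post-stratification. A minimal sketch of that adjustment, with invented numbers: give each respondent a weight equal to their group's population share divided by its sample share, so an over-represented group (say, young urban respondents) counts for less.

```python
def poststratification_weights(respondents, population_shares, key="age_group"):
    """Weight = population share / sample share for each respondent's cell,
    so the weighted sample matches the population margin on `key`."""
    n = len(respondents)
    counts = {}
    for r in respondents:
        counts[r[key]] = counts.get(r[key], 0) + 1
    return [population_shares[r[key]] / (counts[r[key]] / n) for r in respondents]

# Hypothetical: sample is 70% under-35, population is 40% under-35
respondents = [{"age_group": "under_35"}] * 7 + [{"age_group": "35_plus"}] * 3
population_shares = {"under_35": 0.40, "35_plus": 0.60}
print(poststratification_weights(respondents, population_shares))
# under-35 respondents get weight 0.40/0.70 = 0.57...; 35-plus get 0.60/0.30 = 2.0
```

This is exactly the kind of adjustment that should have pulled Daily News/Matichon's Move Forward figure back down if its sample really did skew young and urban.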
Table 3. Scorecard of Polling Firms’ Method to Deal with Likelihood of Voting
4. How transparent are they in their methodology?
In the US, polling firms can register with a couple of organizations that encourage transparency, and firms that do tend to have less error on average. Transparency is thus an important indicator of survey quality. On this measure, Thairath comes in dead last: it provides no information at all on any of these issues, not even the survey's margin of error, and reports no undecided voters. There is no reason to trust the Thairath survey at all. I place three firms next and score them a B-. Each of them omits some key piece of information. For Super Poll, it is the sampling method that is missing, and its administration method is also vague in publicly available information. For Suan Dusit, the preliminary voter list is vague and there is no information on its administration method. Daily News/Matichon provides no margin of error and makes only opaque reference to consultation with university professors. NIDA comes slightly higher, with more concrete answers in a couple of areas. Nation Poll scores highest on transparency, but the fact that it still only earns a B+ underscores a main takeaway: all Thai pollsters need to increase the transparency of their methodology.
Table 4. Scorecard of Polling Firms’ Transparency in Reporting Methodology
[1] https://nidapoll.nida.ac.th/data/survey/uploads/FILE-1683025930254.pdf
[2] https://www.thairath.co.th/news/politic/2689786
[3] https://www.matichon.co.th/politics/news_3950707
[4] https://www.nationthailand.com/thailand/40027280
[5] https://suandusitpoll.dusit.ac.th/UPLOAD_FILES/POLL/2566/PS-2566-1682739647.pdf
[6] https://www.thaipbsworld.com/super-poll-predicts-pheu-thai-will-not-win-a-landslide-but-will-win-most-seats/, and link to PDF posted here: https://www.superpollthailand.net/%E0%B8%9C%E0%B8%A5%E0%B9%82%E0%B8%9E%E0%B8%A5.
So, whose poll is best?
All the Thai pollsters seem to fall short in some important area of methodology. However, there are some results I would not trust at all. Thairath, for which we are given no methodological information, is as good as worthless to me. Daily News/Matichon seems to produce heavily biased predictions, given its voluntary online sample and the absence of any measures to adjust for those biases post-survey. Suan Dusit's failure to even report a margin of error feels misleading to me, and its results are the most unlike those of the three other polls that I like better.
Even though the other three score better, there are still serious concerns; nevertheless, these polls may provide good indications of the general trends in the Thai electorate. Though NIDA's live telephone method might be the best approach in the US, the remoteness of many rural Thai voters may mean that at least some face-to-face interviewing is worthwhile. There is a tradeoff, however, with the truthfulness of answers. NIDA's modest sample size actually inspires more confidence in me than Nation Poll's monstrous one, but Nation Poll's high transparency makes up for that. NIDA does have a very low undecided rate, which gets into issues I have not even addressed here, such as the precise wording of the question and the ease with which a respondent can decline to answer or respond "undecided". The undecided rates of Nation and Super Poll are more similar to Bangkok Poll's from the last election, which ended up with the lowest average error.
In sum, those three polls (NIDA, Nation, and Super Poll) are the ones I would rely on most for predictions. It is thus not surprising to me that Nation and NIDA produce fairly similar vote percentages. Both Nation and Super Poll were fielded slightly closer to the election, and as I showed in my last post, that can make a difference. Though Super Poll does not predict vote percentages, it calculates its seat predictions from constituency-based survey results. NIDA does not calculate seat numbers.
As such, I finish with a table on their predictions, both vote percentages and seats.
Neither Nation Poll nor Super Poll makes its seat calculations transparent. Presumably they have vote percentages for each constituency, but they do not reveal them, and they differ quite significantly in their seat predictions for most of the parties. Since Nation Poll's vote percentages are fairly close to NIDA's, two of the three polls are in rough agreement, which gives me the most confidence in those two in particular.
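My guess at what such a calculation looks like, for what it is worth: take each constituency's surveyed percentages as the expected result and award the seat to the leading party, since Thailand's 400 constituency seats are decided by first-past-the-post. The sketch below uses made-up numbers and is not either firm's documented method.

```python
from collections import Counter

def project_constituency_seats(constituency_shares):
    """Give each constituency seat to the party leading that constituency's
    (hypothetical) survey percentages, then tally seats per party."""
    tally = Counter()
    for shares in constituency_shares:
        tally[max(shares, key=shares.get)] += 1
    return dict(tally)

# Made-up shares for three constituencies, purely for illustration
constituency_shares = [
    {"Pheu Thai": 42.0, "Move Forward": 31.0, "Bhumjaithai": 12.0},
    {"Pheu Thai": 28.0, "Move Forward": 36.0, "Palang Pracharat": 20.0},
    {"Pheu Thai": 33.0, "Bhumjaithai": 35.0, "Move Forward": 18.0},
]
print(project_constituency_seats(constituency_shares))
# {'Pheu Thai': 1, 'Move Forward': 1, 'Bhumjaithai': 1}
```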
As always, polling in an environment that is not entirely free or fair introduces additional error that polling firms will struggle to account for, though the level of uncertainty seems lower this year than in 2019. As a final caveat, I am analyzing the firms' reported methodology; I have no way to assess whether they actually follow it.
In sum, a bigger sample is not necessarily better. A carefully constructed sample of 500 respondents is better than a poorly constructed sample of 10,000.
Table 5. Predictions of my top-three pollsters
The first number listed for vote percentages is for the constituency vote; the second is for the party-list vote.