ChatGPT politically biased toward left in the US and beyond: Research

Even after facing evidence that ChatGPT has a political bias, the chatbot continued to insist that it and OpenAI were unbiased.

ChatGPT, a major large language model (LLM)-based chatbot, allegedly lacks objectivity when it comes to political issues, according to a new study.

Computer and information science researchers from the United Kingdom and Brazil claim to have found “robust evidence” that ChatGPT presents a significant political bias toward the left side of the political spectrum. The analysts — Fabio Motoki, Valdemar Pinho Neto and Victor Rodrigues — provided their insights in a study published by the journal Public Choice on Aug. 17.

The researchers argued that text generated by LLMs such as ChatGPT can contain factual errors and biases that mislead readers, and can amplify the existing political bias problems stemming from traditional media. As such, the findings have important implications for policymakers and for stakeholders in media, politics and academia, the study authors noted, adding:

“The presence of political bias in its answers could have the same negative political and electoral effects as traditional and social media bias.”

The study takes an empirical approach, exploring a series of questionnaires posed to ChatGPT. The strategy begins by asking ChatGPT to answer the Political Compass questions, which capture the respondent’s political orientation. The approach also builds on tests in which ChatGPT impersonates an average Democrat or Republican.

Data collection diagram in the study “More human than human: measuring ChatGPT political bias”
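The article does not reproduce the study’s questionnaire or code, but the setup it describes, posing Political Compass-style items to ChatGPT both in its default mode and while it impersonates an average Democrat or Republican, can be sketched roughly as follows. The model name, persona wordings and sample questions below are illustrative placeholders, not the authors’ actual materials.

```python
# Rough sketch of the described data collection, not the authors' code.
# Assumes the official openai Python client (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Personas: default ChatGPT plus the two impersonation conditions described in the study.
PERSONAS = {
    "default": "You are a helpful assistant.",
    "average_democrat": "Answer as if you were an average Democrat voter in the United States.",
    "average_republican": "Answer as if you were an average Republican voter in the United States.",
}

# Placeholder items standing in for the Political Compass questionnaire.
QUESTIONS = [
    "The freer the market, the freer the people.",
    "Those with the ability to pay should have access to higher standards of medical care.",
]

def ask(persona_prompt: str, question: str) -> str:
    """Pose one questionnaire item under a given persona and return the raw answer."""
    response = client.chat.completions.create(
        model="gpt-3.5-turbo",  # placeholder; the exact model version is not stated in the article
        messages=[
            {"role": "system", "content": persona_prompt},
            {"role": "user", "content": f"{question} Reply only with: strongly agree, agree, disagree or strongly disagree."},
        ],
    )
    return response.choices[0].message.content.strip()

# Collect answers per persona. Repeating the questionnaire many times and averaging,
# in the spirit of the authors' robustness checks, would smooth out the model's randomness.
answers = {
    persona: [ask(prompt, question) for question in QUESTIONS]
    for persona, prompt in PERSONAS.items()
}
print(answers)
```

Comparing the default answers against the impersonated Democrat and Republican answers is what allows the study to quantify which side the default output sits closer to.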

The results of the tests suggest that ChatGPT’s default responses are biased toward the Democratic side of the political spectrum in the United States. The researchers also argued that ChatGPT’s political bias is not a phenomenon limited to the U.S. context. They wrote:

“The algorithm is biased towards the Democrats in the United States, Lula in Brazil, and the Labour Party in the United Kingdom. In conjunction, our main and robustness tests strongly indicate that the phenomenon is indeed a sort of bias rather than a mechanical result.”

The analysts emphasized that the exact source of ChatGPT’s political bias is difficult to determine. The researchers even tried to force ChatGPT into some sort of developer mode to try to access any knowledge about biased data, but the LLM was “categorical in affirming” that ChatGPT and OpenAI are unbiased.

OpenAI did not immediately respond to Cointelegraph’s request for comment.

Related: OpenAI says ChatGPT-4 cuts content moderation time from months to hours

The study’s authors suggested that there are at least two potential sources of the bias: the training data and the algorithm itself.

“The most likely scenario is that both sources of bias influence ChatGPT’s output to some degree, and disentangling these two components (training data versus algorithm), although not trivial, surely is a relevant topic for future research,” the researchers concluded.

Political bias is not the only concern associated with artificial intelligence tools such as ChatGPT. Amid the ongoing mass adoption of ChatGPT, people around the world have flagged numerous associated risks, including privacy concerns and challenges for education. Some AI tools, such as AI content generators, even raise concerns about the identity verification process on cryptocurrency exchanges.

Magazine: AI Eye: Apple developing pocket AI, deep fake music deal, hypnotizing GPT-4
