ChatGPT and Journalism in Pakistan

The increased use of artificial intelligence (AI) in journalism poses an unprecedented challenge for journalists in Pakistan. The potential of ChatGPT to spread misinformation and disinformation has raised concerns over the truth and reliability of AI-generated articles, and the role they may play in Pakistan's media, writes Lubna Jerar Naqvi.

Screens display the logos of ChatGPT and its developer OpenAI on January 24. Credit: Lionel Bonaventure / AFP

'A lie can travel halfway around the world while the truth is putting on its shoes' as the saying goes. Both disinformation and misinformation are increasingly dangerous across social media and the digital world; the spread of unreliable and untrustworthy information online has been seen to increase during elections, natural disasters, conflicts, wars, and the COVID-19 pandemic. Worryingly, people struggle to distinguish misinformation from facts.

Misinformation and disinformation are severe problems in Pakistan and are often used as tools to discredit people, including journalists. In the past, misleading and malicious stories and rumours targeting journalists, especially female journalists, have driven many to leave social media or to self-censor. Ambiguous and controversial cyber laws have proved inadequate in protecting journalists against personal attacks and disinformation.

Following up on a constant stream of defamatory content is a tedious job, and technological advancements continue to put journalists at a disadvantage. ChatGPT, a Generative Pre-trained Transformer (GPT) chatbot created by OpenAI, may pose unprecedented challenges for journalists working to counter rumours and fake news.

ChatGPT has been seen as a game-changer for many professions, especially journalism. OpenAI, founded in 2015, released its early GPT models in 2019 before formally launching ChatGPT in November 2022. Since its launch, the chatbot has captured the interest of professionals globally. Whether the technology proves a double-edged sword for journalists will depend on how they use it.

Jenna Burrell, Director of Research at Data & Society, discussed the benefits and challenges presented by AI in a recent ICFJ Pamela Howard Forum on Global Crisis Reporting webinar. Burrell noted that ChatGPT can assist in preparing materials and understanding information, but may also produce untrustworthy or false results.

Journalists need to understand that “AI cannot replace you. But it can remove some of the everyday tedium so you can focus on what you do best: actual journalism”, as the introduction to Marcela Kunova's article 'Eight tasks ChatGPT can do for journalists' announces. Comforting? Maybe!

Kunova outlined how ChatGPT can assist journalists in their work: generating summaries and context for large bodies of information, translating text, managing a social media presence, and writing emails, quotes and headlines.

For a country like Pakistan, where disinformation and misinformation have so far been largely confined to social media platforms, ChatGPT’s potential to create long-form articles and news is alarming. Such articles may read better and appear more comprehensive in language and facts, giving them wider appeal than well-crafted pieces by journalists.

ChatGPT can stitch pieces of information into an article far more quickly, and more often, than a human can produce a well-researched piece. Will journalists be able to compete with the AI, or will citizen journalism become ChatGPT-generated from now on?

If this happens, will it lead to an increase in biased content creation, adding to both deliberate disinformation and well-intended misinformation shared by the public? Journalism, particularly citizen journalism, may become chaotic, with facts becoming a matter of debate on social media platforms.

Many Pakistani journalists are yet to use ChatGPT for work, and it will be interesting to see how they react to it. Unless ChatGPT begins to support local languages such as Urdu, the language of Pakistan’s mainstream media, it may take some time before the tool becomes part of their everyday work.

However, content created by AI with human-like expression might become popular on social media. This makes it harder to tell whether content targeting journalists has been created and shared by humans or by AI. To further complicate the problem, if AI-generated content is spread by bots on social media platforms, it will be difficult to hold anyone responsible.

Many issues can arise with the increasing use of AI in media content creation. The legal position of content intended to spread disinformation, misinformation and even defamation against media personnel becomes ambiguous. If content is written by an AI program and spread by bots on social media, who should be held responsible? How will Pakistan’s legal system hold the spreaders of misinformation accountable?

It is interesting to consider AI’s capabilities and to imagine the media industry’s future reliance on it. Autocorrect and editing software are already ever-present, and help create quality work. Will ChatGPT only assist in writing, or will it become the writer and thinker, taking over not only the byline but the mind of the journalist? Will journalists give credit to ChatGPT instead?

Technology improves efficiency in the media industry by taking care of tedious mistakes. Journalism will probably improve as far as language is concerned, with better headlines and verified facts, which is a good thing. But will the journalism of the future be worth anything if journalists are not editing their own work? Will it be authentic, ethical, and balanced journalism if we rely solely on AI to check our work? Should AI content be considered plagiarism if it is ‘original’ in its generation?

ChatGPT may not only replace journalists’ original ideas and content but also, in a way, take the place of editors and fact-checkers. Only time will tell how ChatGPT will be used in the media industry, and whether disinformation and misinformation will be allowed to spread.

Lubna Jerar Naqvi is a freelance journalist and the Pakistan Gender Coordinator of the IFJ Asia Pacific.