Policy on Use of Artificial Intelligence Tools

Artificial Intelligence (AI) continues to bring revolutionary changes to many areas of society as well as to our personal and professional lives. Debates are ongoing about the proper use of AI, its potential risks, and how to draw healthy boundaries while recognizing that AI is here to stay.

In journalism, AI can bring great benefits by increasing efficiency in some areas, while posing serious risks to accuracy and credibility when misapplied. Christian Daily International’s editorial team therefore wants to be transparent about what constitutes permissible and impermissible use of AI in news reporting on its website.

Permissible use

AI tools can be very helpful for transcribing audio recordings of interviews, speeches, presentations, and the like. While accuracy can be very high, journalists must still ensure that quotes taken from AI transcripts match the actual recording of what was said.

Tools such as ChatGPT can also be helpful for proofreading or improving English expression, especially for journalists for whom English is a second language.

Similarly, AI can be helpful for summarizing long texts, such as reports, to help the journalist grasp their content more easily. The journalist is still required to study the relevant parts of the report to ensure that any information used in the article, whether as a summary or a direct quote, accurately reflects the report’s findings.

AI tools may be used to translate material from one language to another, but they must be used with great caution, as translations may not always accurately convey the tone or meaning of the content. If information, facts, or quotes touch on a sensitive subject, they should be verified with someone who is proficient in the original language. Complete article translations (such as between CDI’s different language editions) always require human review before publishing.

Journalists may also use various AI tools to help with everyday tasks that are unrelated to the actual writing of stories.

AI tools may be used to create visuals where no suitable photos are available, e.g. to convey a theme or to visualize something abstract. AI-generated visuals must not give the impression of being real photos and must be labeled as ‘AI generated’ where appropriate.

AI tools may also be used to generate voice-overs for general video narration, but the voice may not resemble or impersonate a particular individual.

Impermissible use

AI tools may not be used to write entire articles (news or opinion) from scratch. Apart from being an unethical practice for a professional journalist, columnist, or other contributor, there is an inherent risk that AI may reproduce copyrighted material and plagiarize existing sources without proper attribution.

Furthermore, information provided by AI, such as quotes, data, facts, and references, may not be used without checking original sources to verify its accuracy, as AI can ‘hallucinate’ and make up things that do not exist.

AI may not be used to create visuals that could mislead the audience, such as visuals that resemble photographs of real people or situations, nor to alter real photographs beyond simple corrective improvements (such as contrast or brightness).


V1.0, June 20, 2024
This policy will be reviewed and updated periodically as new AI tools and functionalities become available that are relevant for news media.