Working smarter with AI: Strengthening communications processes while protecting purpose
By Lisa Ritchie, Communications Strategist and Founder of EngagingDev
For many of us working in development program communications, the emergence of AI is both exciting and unsettling. It is a new tool at our disposal with the potential to improve communications, but it also raises concerns about quality, competency and factual accuracy.
AI is being used by program communications personnel across development programs, but we are all learning as we go and building our skills. We can deepen that learning, and consolidate how we best utilise AI in development program communications, by sharing our experiences.
So this is my first step towards that.
Curiosity over fear
When I first began testing AI tools, I was curious to know:
What could it support?
What efficiencies could it offer?
Would it risk undermining the quality or credibility of the work I produced?
To date, the most practical uses I have found have been in brainstorming and translating complex technical content into plain-language points. These points serve as an early ‘input’ into the process of developing communications products or tools, but not the ‘output’.
AI can be a game-changer for brainstorming, especially for communications personnel who work remotely or operate as a ‘team of one’ within a larger program team.
My suspicion of AI sat heaviest on the third question: quality and credibility. I was, and remain, wary of publishing anything produced solely by AI. Quality is undoubtedly an issue, and credibility takes care to build yet can be lost in an instant.
Process, not products
So far, I have found AI to add the most value to communications processes, not products.
For example, I have used ChatGPT to support the drafting of key messages for new program strategies. Roughly, that process looks like this:
Instruct ChatGPT to summarise publicly available, content-heavy documents such as program designs, donor and partner development strategies and policies, media releases and statements (I still read them all; the time is saved in summarising).
Instruct ChatGPT to suggest a set of key messages, with very clear prompts on who the senders and receivers are (an illustrative prompt follows below).
I cross-check each suggested key message word-for-word for accuracy against the relevant documents, and redraft to ensure alignment with conversations I have had with teams and donors.
This draft of key messages then serves as the basis for a workshop with the program team, donor and other relevant partners.
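To illustrate the second step, a prompt might look something like this (an invented example for illustration only, not a template drawn from any particular program):

“Using the document summaries above, suggest eight draft key messages for the program’s new strategy. The sender is the program team; the primary receivers are the donor and implementing partners. Keep the tone factual and plain-language, note which source document supports each message, and flag anything you are unsure of rather than inventing detail.”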
Through this workflow, AI does not produce the messages; it enhances the process of developing them.
Final decisions on what to emphasise, what to omit and which tone to strike can only come from those with the strategic vision of what the messages are designed to support, and a situational understanding of the program environment.
Learning how to prompt
Crafting a good prompt requires the same skills as crafting a strong message: clarity of purpose, understanding your audience and being explicit about tone and context. This is a skill that I am still learning, and one that programs should invest in through training – for all staff, not just communications teams.
Fact-checking is non-negotiable
Every AI-generated line, every word, must be fact-checked against program data, policy documents and donor guidance. Never include a reference or source data provided by AI without checking it. AI ‘hallucinates’: it makes things up, and the fabrications often read as polished and plausible.
While AI can support the content development process, communications specialists who choose to use it must be held accountable for the accuracy and ethics of what they produce.
Levelling the language playing field
Donor standards for reporting and visibility products are high. For communicators working in English for English-speaking donors, particularly those writing in a second or third language, AI tools can help refine grammar, adjust tone and improve readability while allowing them to focus on substance.
Fact checks, redrafting and reviews are still needed, but the process is now more inclusive. Programs should take steps to ensure their teams are equipped with the skills and knowledge needed to navigate AI and leverage it for good without risking program – or donor – credibility.
The takeaway
AI is not perfect, and it is not going away, but testing the tools, sharing lessons, building ethical approaches and supporting team development can ensure it supports, rather than substitutes for, good program communications practice. We must always think critically and never lose sight of the fact that AI tools are just that – tools.
So, while I appreciate the value of AI, I will not be handing over full control of my strategic or creative outputs anytime soon. I hope to support programs to maintain control of the voice, values and intent that give their communications strategic value, while improving communications processes to enhance quality and productivity.