
What role can Generative Artificial Intelligence play for not for profits?

Categories: Generative AI use cases, AI, Not for profit, LLM, NGO

Image: a colourful, stylised illustration of a busy office, with diverse people collaborating at desks, working with sticky notes, analysing data and brainstorming.

In our last blog aimed at the not for profit sector, we talked about how we can use Artificial Intelligence (AI) to help measure the impact of interventions. The techniques we talked about in that blog fall under the category of ‘extractive’ AI techniques. In this blog, we’ll focus instead on the latest generation of ‘generative’ AI tools, and how not for profits can make best use of these tools.

What is Generative AI?

Until relatively recently, almost all AI was ‘extractive’: given a collection of documents, the AI would extract relevant information, for example mentions of people, places, topics, and themes, or apply labels to each document, such as MeSH tags.

Generative AI is different. Generative AI models are capable of synthesising high-quality content or data, and the release of models such as OpenAI’s ChatGPT has unlocked a whole range of new use cases, from convincingly human-like chatbots to synthetic data generation.

We’ll look at three use cases in this blog post that are of particular relevance to not for profits. All involve the use of Large Language Models (LLMs) like ChatGPT: a type of Generative AI model that can be used to understand and synthesise text.

Support for Grants

For grant-giving organisations, there are a number of use cases for LLMs around the grant application and evaluation process.

The first of these is helping applicants to write grant applications. We’re a long way from LLMs being able to write applications from start to finish, and this may never be feasible or desirable. But these tools can certainly assist applicants in a number of ways, for example by synthesising drafts of generic text such as the problem background, or suggesting a structure for longer free-text fields that must be completed as part of the application.

The widespread use of ChatGPT by academics to help write parts of sometimes onerous research grant applications was the subject of a recent Nature article. Whilst many applicants are using these tools of their own accord, proactive grant funding organisations can build generative AI directly into the application process.


For example, the application process can be streamlined by building tools such as chatbots that walk applicants through it, suggest how to answer questions in a way that is helpful to the funder, and provide early, instant feedback to help applicants refine their application before it is seen by a human reviewer. LLMs are also very good at producing summaries and synthesising text from a broad range of supplied sources, and could produce lay summaries and other free-text answers from reference material automatically, or when prompted.
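To make this concrete, here is a minimal sketch of how instant feedback on a draft answer might be generated. It assumes the OpenAI Python client and an API key are available; the model name, prompt wording and helper function are illustrative, not a description of any funder’s actual system.

    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def draft_feedback(question, guidance, draft_answer):
        """Ask an LLM for brief, constructive feedback on a draft grant answer."""
        prompt = (
            f"Application question:\n{question}\n\n"
            f"Funder guidance:\n{guidance}\n\n"
            f"Applicant's draft answer:\n{draft_answer}\n\n"
            "Give brief, constructive feedback on clarity, structure and how well "
            "the answer addresses the guidance. Do not rewrite the answer."
        )
        response = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": "You are a helpful grants adviser."},
                {"role": "user", "content": prompt},
            ],
        )
        return response.choices[0].message.content

In practice the feedback would be shown to the applicant within the application portal itself, before a human reviewer ever sees the draft.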

This leads to the other obvious application: evaluating grant applications. There has been some debate recently surrounding the use of LLMs for reviewing academic articles — some positive, some negative.

The short answer is that LLMs appear to have a place in checking for consistency of argument, clarity and style (all of which could instantly be fed back to the applicant), but probably not in assessing novelty or providing a deep and nuanced critique of scientific methods. This is likely to change over time as more specialised LLMs for these tasks are produced.

We can extrapolate this to fields other than science: there will still need to be a human evaluator, but the LLM can assist by assessing grants against softer criteria, and can help the evaluator by summarising the application or improving its clarity.

Finally, generative tools can be used to identify the most relevant reviewer and provide them with a summary of similar grants that have previously been funded, saving considerable time searching databases of funded grants and helping to ensure that funding aligns with the organisation’s strategy.
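One way to sketch this is with text embeddings: embed the new application alongside summaries of previously funded grants and rank them by similarity. The embedding model, cosine-similarity approach and function names below are illustrative assumptions, not a prescribed design.

    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def embed(texts):
        """Embed a list of texts with an off-the-shelf embedding model."""
        response = client.embeddings.create(
            model="text-embedding-3-small",  # illustrative model choice
            input=texts,
        )
        return np.array([item.embedding for item in response.data])

    def most_similar_grants(new_application, funded_summaries, k=3):
        """Rank previously funded grant summaries by similarity to a new application."""
        vectors = embed([new_application] + funded_summaries)
        query, corpus = vectors[0], vectors[1:]
        # Cosine similarity between the new application and each funded grant.
        scores = corpus @ query / (
            np.linalg.norm(corpus, axis=1) * np.linalg.norm(query)
        )
        top = np.argsort(scores)[::-1][:k]
        return [(funded_summaries[i], float(scores[i])) for i in top]

The same similarity scores could equally be used to match the application to reviewers, by embedding reviewers’ areas of expertise instead of past grants.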


In summary, Generative AI can be used to chip away at the grant application process from both sides, helping to make the process smoother and quicker for everyone involved.

Dynamic Information Retrieval

One of the benefits of LLMs is that for many use cases they do not need to be trained, the process by which we ‘teach’ a model about a certain topic. Compare this to the WellcomeBertMesh model, which we developed with the Wellcome Trust to apply an ontology of tags (Medical Subject Headings) to free-text documents such as grant summaries: that model took multiple days to train and required very specific data to be prepared in advance.

Imagine a different scenario: a not for profit that has sent out a survey, or has a dataset describing the many ways its money is being used at the community organisation level. If we wanted to use AI to identify cases where vulnerable adults or minority groups are mentioned, we would traditionally need to collect and annotate data and train a model for the task, which would take days. With LLMs we can simply ask the model to find the information we’re looking for, and it will probably do a pretty good job out of the box.
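As a hedged illustration of ‘just asking the model’, the sketch below sends a survey response to an LLM with a zero-shot prompt and gets back a structured flag, with no training step at all. The prompt, model name and JSON fields are our own assumptions for the example.

    import json
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def flag_survey_response(text):
        """Zero-shot check: does this text mention vulnerable adults or minority groups?"""
        prompt = (
            "Does the following survey response mention vulnerable adults or "
            "minority groups? Reply with JSON in the form "
            '{"vulnerable_adults": true or false, "minority_groups": true or false, '
            '"evidence": "short quote or empty string"}.\n\n'
            f"Response:\n{text}"
        )
        completion = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            response_format={"type": "json_object"},  # ask for parseable JSON output
            messages=[{"role": "user", "content": prompt}],
        )
        return json.loads(completion.choices[0].message.content)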


That’s not to say we can be complacent and assume the model is right, or that it doesn’t need to be assessed for bias (it does), but the same is true of any model. What LLMs offer is a way to get started very quickly, and for generic use cases like this they are likely to perform very favourably against trained models or simpler naive approaches such as keyword searches.

Marketing and Education

The final use case for generative AI is probably the most popular across all industries at the moment: content creation. For not for profits this has two obvious applications: marketing and education.

For marketing, generative AI tools have proven to be very adept at producing content that helps get the organisation’s message out to a wider audience via social media. This is nothing new: many businesses are experimenting with LLMs for this use case.

A more specific use case for not for profits is education. Many organisations have a stated aim of increasing awareness of an issue, for example a medical condition or a humanitarian situation, while others wish to educate, for example by providing educational materials for sufferers of rare diseases.

In this educational use case, LLMs can be used to deliver content targeted to the user through channels they are familiar with, such as text message, WhatsApp, or even synthesised phone calls. For example, chatbots can be built on top of the organisation’s existing knowledge base of blog posts and other authoritative material to create a conversational interface that allows users to get answers to specific questions asked in their own words.


There are challenges, of course, in ensuring the accuracy and appropriateness of the messaging, but these are being addressed in a number of ways as the technology matures, such as through the use of Retrieval Augmented Generation (RAG) systems.
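To give a flavour of how a RAG system grounds answers in an organisation’s own material, the sketch below retrieves the most relevant passages from a knowledge base and asks the model to answer using only that context. The splitting of content into passages, the model names and the prompt are illustrative assumptions, not a production design.

    import numpy as np
    from openai import OpenAI

    client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

    def embed(texts):
        """Embed a list of texts with an off-the-shelf embedding model."""
        data = client.embeddings.create(
            model="text-embedding-3-small",  # illustrative model choice
            input=texts,
        ).data
        return np.array([item.embedding for item in data])

    def answer_from_knowledge_base(question, passages, k=3):
        """Retrieve the k most relevant passages, then answer using only that context."""
        vectors = embed([question] + passages)
        query, corpus = vectors[0], vectors[1:]
        scores = corpus @ query / (
            np.linalg.norm(corpus, axis=1) * np.linalg.norm(query)
        )
        context = "\n\n".join(passages[i] for i in np.argsort(scores)[::-1][:k])
        completion = client.chat.completions.create(
            model="gpt-4o-mini",  # illustrative model choice
            messages=[
                {"role": "system", "content": (
                    "Answer using only the provided context. If the context does "
                    "not contain the answer, say you don't know."
                )},
                {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
            ],
        )
        return completion.choices[0].message.content

Because the model is instructed to answer only from retrieved, authoritative passages, the risk of inaccurate or inappropriate messaging is reduced, though human review of the source material remains essential.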

This has been a brief tour of possible use cases for Large Language Models, a type of Generative AI, in the not for profit sector. LLMs are just one facet of Generative AI, and many more use cases are likely to present themselves, for example as large vision models (LVMs), which can process and interpret images and videos, become more widely available and useful.

We’ll continue this series by looking at more concrete examples in the coming weeks. In the meantime, if anything in this post has chimed with you or with what your organisation is trying to do, feel free to get in touch for a chat at hi@mantisnlp.com.

If you’re a decision maker and you’re wondering how you can plan for Generative AI at a strategic level in your organisation, you might also be interested in our workshop package Generative AI for Decision Makers, which will equip you with the tools you need to speak authoritatively about AI, and make informed decisions about how to integrate it into your organisation.



