AI-POWERED TOOLS AND GRANT WRITING

Our Elevate Colleagues Weigh In With Advice

At Elevate, we’ve been thinking a lot this year about Artificial Intelligence (AI) and how it could influence the way we perform our work supporting nonprofits in building more sustainable grants programs. AI is increasingly relevant in everyday business operations across numerous industries, and it has been at work for years in ways many people don’t realize: mapping apps like Google Maps, voice assistants like Alexa and Siri, and even the autocorrect feature in your texting app or word processing software are all powered by artificial intelligence.

Some interesting trends have emerged in 2023, which hint at AI’s potential while also raising significant ethical and information security considerations. A recent Forbes Advisor survey reports that: 

  • ChatGPT, an AI-powered large language model, reached 1 million users within the first five days of being available. 
  • 54% of survey respondents believe that AI tools like ChatGPT can improve written content by enhancing text quality, creativity, and efficiency in various content creation contexts. 
  • AI is expected to see an annual growth rate of 37.3% between 2023 and 2030. 
  • The majority of consumers are concerned about business use of AI. 

Accordingly, Elevate is treading optimistically – yet cautiously – when it comes to AI! While we are exploring how AI can create efficiencies in our work, we are also committed to maintaining the highest quality and privacy standards for our clients. 

I recently spoke with a handful of my most forward-thinking and tech-savvy Elevate colleagues, who offered their thoughts on what AI applications are and aren’t helpful in aspects of their work advising nonprofit clients. We share their insights here for your consideration as we all navigate this brave new world. 

ChatGPT is NOT your next grant writer 

Because many of the questions Elevate receives about AI are about ChatGPT specifically, we begin with the 411 on this tool. 

Even if you haven’t yet used it yourself, you’ve undoubtedly heard the buzz about ChatGPT. ChatGPT is a free, natural language processing tool that can answer questions and support users with tasks such as composing emails, essays, and code. It can spew out responses in a matter of seconds. It does this by analyzing your question or prompt, then – using the dataset it was trained on – predicting the next word or series of words based on what you’ve entered. 
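
For the curious, the same model that powers the ChatGPT website can also be reached programmatically, and doing so makes the mechanics plain: you send text in, and predicted text comes back. Below is a minimal sketch in Python, assuming OpenAI’s official openai package (v1.x); the model name and prompt are illustrative placeholders, not a recommendation.

  # Minimal sketch of prompting the model behind ChatGPT via OpenAI's API.
  # Assumes the "openai" Python package and an API key stored in the
  # OPENAI_API_KEY environment variable; the model name and prompt are
  # illustrative placeholders.
  from openai import OpenAI

  client = OpenAI()  # reads OPENAI_API_KEY from the environment

  response = client.chat.completions.create(
      model="gpt-3.5-turbo",  # illustrative model name
      messages=[
          {"role": "user",
           "content": "Condense this paragraph to under 75 words: <your text>"},
      ],
  )

  # The reply is simply the model's prediction of the most likely next words;
  # there is no built-in guarantee that any of it is accurate.
  print(response.choices[0].message.content)

Nothing in that exchange checks facts or exercises judgment, which is worth keeping in mind as you read on.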

But is it savvy enough to write sophisticated, nuanced, and winning grants? 

Our colleagues were unequivocal in their response: Not even close. 

This is because ChatGPT lacks the context, experience, and judgment to handle such complex work. Even when I asked ChatGPT “What are the pros and cons of using ChatGPT for grant writing?” it didn’t disagree! While ChatGPT praised its speed, its ability to maintain consistency in “tone, language, and messaging” across the sections of a grant proposal, and its ability to polish language, it cautioned that it may not fully appreciate the “nuances” of grant guidelines, that it has limitations in understanding the context of an organization’s work and history, and that it could produce text that plagiarizes existing sources.

The text that ChatGPT generates in response to a question or prompt might not even be factual – there are absolutely no assurances that the information is accurate or true. 

What’s more, because ChatGPT and other AI models draw upon existing content, AI can reflect underlying societal biases, perpetuating stereotypes and white supremacist notions. At Elevate, we know that historically marginalized communities are not “vulnerable” objects of charity, but agents and partners of the social change that they desire to see. This level of social context is far too complex for an AI-powered language model to appropriately reflect. 

So, what are the appropriate uses of ChatGPT? 

If you do want to experiment with ChatGPT in your writing tasks, we suggest using it for simpler, less analytical tasks, such as condensing word count, identifying alternative phrasing to avoid repetition, or summarizing the main points of your research into more readable language. 

ChatGPT also has the potential to provide administrative support for your work, and can be harnessed to: 

  • Organize a to-do list
  • Summarize meeting minutes 
  • Brainstorm ideas

But whatever you do, do NOT rely on ChatGPT to produce your next grant proposal. 

I know what you are thinking: It’s no surprise that a grant writing firm is telling me not to use an AI tool to write grants! But we are not just saying this because we want to be your grant writers. (Though we DO want to be your grant writers!) 

At Elevate, we firmly believe that good grant writing is a thoughtful, strategic exercise that requires skill, nuance, and informed decision-making. ChatGPT – like other AI tools – is neither thoughtful nor strategic. It lacks discernment of nuance, and it is incapable of making reasoned choices about how to present an organization’s work to a funding partner.

Simply producing large volumes of content – that may or may not be factual! – is NOT the point of grant writing. And this is truly all that ChatGPT is doing: generating text. 

AI Tools CAN Help You Take Notes, Summarize Content, and Find Information  

At Elevate, some of our staff are experimenting with the use of AI-powered tools such as Simon Says AI and Fathom Notetaker to capture meeting notes and provide summaries of important conversations that they need to refer back to later or share with colleagues who couldn’t attend meetings. By using AI tools for more administrative tasks, you can free up some of your own time and energy for tasks that require thought and strategy – something AI can’t do! 
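
To demystify what notetaker tools are doing under the hood, the sketch below shows the general transcribe-then-summarize pattern in Python. Simon Says and Fathom are commercial products with their own pipelines, so this is only an illustration of the approach (assuming OpenAI’s openai package, its Whisper transcription endpoint, and placeholder file and model names), not a description of how either product actually works.

  # Illustrative transcribe-then-summarize pattern behind AI notetakers.
  # Assumes the "openai" Python package and an OPENAI_API_KEY environment
  # variable; "meeting.mp3" and the model names are placeholders.
  from openai import OpenAI

  client = OpenAI()

  # Step 1: transcribe the recording with a speech-to-text model.
  with open("meeting.mp3", "rb") as audio_file:
      transcript = client.audio.transcriptions.create(
          model="whisper-1",
          file=audio_file,
      )

  # Step 2: ask a language model to turn the transcript into a summary.
  summary = client.chat.completions.create(
      model="gpt-3.5-turbo",  # illustrative model name
      messages=[{
          "role": "user",
          "content": "Summarize this meeting transcript as bullet-point "
                     "action items:\n\n" + transcript.text,
      }],
  )

  print(summary.choices[0].message.content)

Note that this pattern sends your recording and transcript to a third party, which is exactly the kind of privacy trade-off discussed later in this post.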

Because it is developed by Google, Bard can interface with Google Workspace tools if you choose to connect them. This means you can ask Bard to find dates, tasks, or other information in Gmail, or to summarize a report a colleague shared via Google Docs.

Interested in exploring more options for what you can do with AI tools? Check out FutureTools.io, which aggregates AI tools suited for different purposes. 

Please Please Please: Inform yourself about privacy! 

If you take only one thing away from this article, I hope it is this: get informed about the privacy of the information you share with AI tools, and take precautions to protect your information. 

When using any cloud-based technology platform, it is imperative to consider how the tool uses, stores, and shares your information. Depending on your privacy settings, information you share with tools like Bard and ChatGPT may be used to improve their underlying language models. This means your data may be available not only to the companies that built these tools (Google and OpenAI, respectively), but potentially also to others who use the platform.

For instance, when first accessing Bard, users are notified that Google will collect conversations and other information like the user’s location, store this data for a period of time, and use it to refine the tool. Furthermore, users are informed that “human reviewers read, annotate, and process your Bard conversations,” and they are warned to not share confidential information.

For these reasons, think carefully about what information you share with AI tools. Remember that a grant application may include information about your organization, programs, staff, and future plans that might be considered private. A good rule of thumb: if you wouldn’t want a piece of data or information on your public website for anyone to find, don’t share it with an AI tool.
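
As a purely illustrative way to put that rule of thumb into practice, here is a hypothetical pre-flight check in Python that scans for obvious red flags (email addresses, phone numbers, and a denylist of internal terms you define) before text gets pasted into an AI tool. The function name, patterns, and example terms are our own inventions, and no automated check is a substitute for human judgment.

  import re

  # Hypothetical denylist: terms your organization treats as confidential.
  CONFIDENTIAL_TERMS = ["board retreat", "salary", "donor list"]

  EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.-]+")
  PHONE_PATTERN = re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b")

  def screen_before_sharing(text):
      """Return a list of warnings; an empty list means no obvious red flags."""
      warnings = []
      if EMAIL_PATTERN.search(text):
          warnings.append("contains an email address")
      if PHONE_PATTERN.search(text):
          warnings.append("contains a phone number")
      for term in CONFIDENTIAL_TERMS:
          if term.lower() in text.lower():
              warnings.append(f"contains the confidential term '{term}'")
      return warnings

  draft = "Email jane@example.org about the donor list before Friday."
  for warning in screen_before_sharing(draft):
      print("WARNING:", warning)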

How is your organization using AI-powered tools, and what have you found useful, scary, hopeful, or exciting about these tools? We invite you to share! 

Are you still feeling overwhelmed, or do you want to learn more? Here are a few sources the team at Elevate is using to stay informed: 

About the Author:
