Why Write On champions human-created content amid the rise of AI

“Read Only” is our mantra for using AI

By Lindsay Jordan, Write On Founder + CEO

It seems like every day I receive a new email, article, or invitation to a webinar touting the benefits of AI in fundraising. And at each and every one, I crinkle my nose.

It’s not that I don’t believe AI can have benefits for my team and clients - we already use AI to streamline our work and proofread our writing. Tools like Grammarly have helped newer writers overcome long-standing habits (like misplaced commas and a penchant for verbose prose) and more quickly adopt AP Style, our preferred style guide. We also regularly use AI to build out project management forms, templates, and workflows.

But AI writing grant applications? Building relationships with donors? Actually making an ask? I scoff. And with good reason.

Recently, Write On Fundraising launched our own internal AI Usage Policy as a blueprint for exactly how and when AI is appropriate in fundraising. The policy was created after nearly 12 months of diligent research and discussion with other industry experts. While many on our team were initially excited to use AI in our processes and systems, our research led us toward a much more conservative adoption of the new technology.

Why the skepticism of a tool many others are flocking to?

  • First and foremost, because a central part of our mission is to bring equity to philanthropy, amplify voices in communities, and invest in people. Our mission demands a human-centric approach to AI. People are at the center of fundraising, and at the center of mission work.

  • Second, feeding Write On-produced content into the AI machine to see what it spits out is a breach of contract with our clients. Unlike other creative agencies, Write On clients own the final products (language, applications, marketing materials) created for them in fundraising - Write On owns the processes and tools. Right now, systems like ChatGPT keep the information they are fed (without consulting or compensating the owner), so submitting client work would break our own confidentiality policies.

  • AI is biased. Its output reflects only the information it has aggregated, and much of that information is discriminatory. Take, for instance, bias in algorithms used to set rents, screen tenants, or make loan decisions. As The Chronicle of Philanthropy points out, social service organizations see danger in AI-driven content.

  • AI “lies.” According to Gayle Roberts, a chief development officer interviewed by The Chronicle of Philanthropy, “It [AI] is kind of a sycophant. It wants to give you an answer, and if it doesn’t know the answer, it will make one up for you and won’t tell you it’s making it up.”

  • AI sources are untrustworthy. Perhaps the earliest and loudest criticism of the tool is that its sources cannot always be trusted. Diligent writers must verify AI-provided data against the original sources and never take an AI-generated statistic or figure at face value.

  • AI is already out of date. The information loaded into ChatGPT, for example, ends in September 2021.

  • AI threatens to decrease trust and people’s sense of agency. If donors know that ChatGPT could be used to write that warm and friendly thank-you letter, then why should they believe that you wrote it? Or that your nonprofit values them? Or that they should continue giving to an organization that has replaced genuine human empathy with cold, mimicking technology?

This doesn’t mean, of course, that AI has no place in fundraising. It simply requires intention and restraint. “Read Only” is Write On’s adopted mantra for any AI-produced content. AI can help writers overcome writer’s block, find inspiration, or surface new sources to research when drafting content from scratch. It can be useful for comparing and contrasting theories of change or gaining historical context on a given subject.

AI cannot, however, provide lived experience. It does not tell a story. It does not demonstrate impact. It does not project vision into the future. It does not draw context or meaning. It is not a subject matter expert. It has no markers of originality. It creates no long-term value for the reader.

AI is, in short, just another tool in our toolbox. It is, IMO, the next cupcake or yogurt shop (does anyone else remember when there was a cupcake shop on every corner and then - poof! - all gone?): useful on occasion, but not that disruptive to fundraising on the whole.

Write On Fundraising content is proudly human-created. For the purpose of this blog, I asked ChatGPT my first-ever question: Is human-generated content better than ChatGPT content? While the system coyly sidestepped naming one over the other, this is what it shared with me:

“Human-generated content brings with it expertise, nuance, creativity, judgment, and personalization. ChatGPT-generated content offers scalability, speed, consistency, and assistance.”

Thanks, ChatGPT. We agree. And we know which pathway is a better fit for nonprofits, donors, and fundraisers (all humans, by the way). It’s people. It’s always people.
