Appropriate AI for nonprofits
We have a spectrum of people, from those who use it often to those who don't use it at all. How do the tools fit into our worldview? AI is barreling into mass integration. We're at the beginning of something that's going to have a big impact.
What makes us angry about it? Concerned about it?
- Verifiability. How do you know what's true?
- Complexity of policy creation.
- Bias: LLMs were trained on data shaped by colonial perspectives.
- AI used in predictive policing.
- Replacing writers and other threats to jobs.
- Environmental cost of GPU power consumption.
- Equitable distribution of tools, so AI doesn't exacerbate the digital divide.
- Advantages big tech and widens the big-tech wealth gap.
- Want to have the ability to say no to new technologies.
- Creating tools that remove jobs rapidly without an equitable system to reskill those folks.
- Enforced groupthink. Homogenization of voices and lack of originality. Breeds complacency.
- It's connected to dominant tech companies competing for the biggest share.
Nonprofits have rollouts for CRMs, but have not had anything comparable for AI. How do we create frameworks, and what do we think about nonprofits using AI? Is AI going to meet their needs? How do we get staff involved and sharing their opinions? How do we assess platforms?
Having the scaffolding to know what an effective donor letter or other output looks like is a necessary skill before using generative AI. We need to make sure folks have the skills to evaluate the output.
- As a developer, something that would take 3 months, took me a week.
- Give Indigenous people access to funding. Transcribed audio and crossed it with a grant form, which resulted in direct funding to the community.
- Someone on the spectrum uses it to edit his emails.
- Generate material for a funder on a short timeline. The output was excellent.
There are certain kinds of tasks that AI can excel at, but humans need to be in the loop. Sometimes humans can suspend judgment in the face of a machine. We need to manage expectations that AI is a tool and needs heavy review.
How about ways to check for bias?
Image generators will repeat the bias included in their training data. Nonprofits don't have large datasets to counteract the large public ones.
Shift in the power of the individual performing a job. It saves time but introduces risk.
You might not have the knowledge to do a thing yourself. You've just got a prompt that generates the output to do the thing.
Writers' original work is being used to generate derivative work.
When working with vulnerable populations, will the machine have the same concerns a human would? Would it protect privacy?
Writers are suing LLM owners for appropriation. AI can analyze the speech patterns of famous authors and poets and copy them. Will this extend to anyone who uses AI?
Need to create guidelines for using AI.
How did the AI get to an answer? Citations can also be incorrect.
Why is authenticity of data important? Why do we want to assess this?
This opens onto a larger historical problem.
Perspectives can influence what the truth value is for each person.
Does it take more time to validate the output than to just create it yourself?
What is the provenance of the data? Understand the systems behind the collection of the data. The original purpose of the data model would help evaluate it.
This seems a lot like an acceptable use policy. AI should be used more as a writing tool than a research tool.
Need to educate everyone about the risks.
Promote the understanding of co-botting, so it always stays human-centered.
Need to have clear policies, like HR policy or acceptable use.
Retraining if people's jobs are affected.
Professionalism: there is not enough caution about deployment and not enough testing of usage.
Standards that may be acceptable when using AI as an individual are too lax for a professional context. AI can be used to start a draft but shouldn't produce the final draft.
Equity issues are very glaring for POC, queer, and poor folks. The most privileged people can use it much more easily.
Need to generate a collection of recipes.
Because we're in the early stages of this new tool's introduction, it's harder to know what to do.