ALT:

Artificial intelligence doesn't do things optimally; it does them with the biases programmed into it.

Have actual people do the synthesis. They'll be biased too, but they might at least be biased differently from each other. Or have multiple AIs (say, Google's and Elon Musk's) run the same analysis and then have people synthesize the AI responses...

I rather like your ideas on data collection. Most parishes don't have flocknote, but a link and individual codes could be passed out to churchgoers on Sunday, effectively excluding those who don't care enough to show up for an hour a week.

meh:

AIs have biases, but those come out more if you ask one "what do you think about x" and it draws on everything it's been trained on to respond. If you give it a set of comments and ask it to find common themes, it's probably going to be less biased than your average human.
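
For concreteness, here is a minimal sketch of what that kind of theme-finding pass might look like, assuming an OpenAI-style chat API via the openai Python package; the sample comments and the model name are illustrative placeholders, not anything from this discussion:

```python
# Minimal sketch: hand the model a fixed set of comments and ask only for
# common themes, rather than asking it for its own opinion.
# Assumes openai >= 1.0 and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

# Made-up example comments standing in for real survey responses.
comments = [
    "Mass times are hard for families with young kids.",
    "We need more adult catechesis, not just children's programs.",
    "Parking is a bigger problem than anything else.",
]

prompt = (
    "Here are parishioner comments, one per line:\n"
    + "\n".join(f"- {c}" for c in comments)
    + "\n\nList the common themes, each with a one-sentence summary. "
    "Do not add opinions that are not in the comments."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder; any capable chat model would do
    messages=[{"role": "user", "content": prompt}],
)

print(response.choices[0].message.content)
```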

ALT:

I have heard of ChatGPT repeatedly lying about whether various statements existed in a searchable document.

Additional biases can be programmed in if desired. It will then be more biased than your average human.
