
"When a scientist feeds a data set into a bot and says "give me hypotheses to test", they are asking the bot to be the creator, not a creative partner. Humans tend to defer to ideas produced by bots, assuming that the bot's knowledge exceeds their own. And, when they do, they end up exploring fewer avenues for possible solutions to their problem."
"I've been working with a test of rapid creative thinking called the divergent association task (DAT), which I think points to a way forwards. Participants have four minutes to come up with ten nouns that differ from each other semantically as much as possible. This is a hard task. Once someone thinks of 'queen', for example, they tend to call to mind related words such as king or knight - a phenomenon known as thought anchoring."
Artificial intelligence can function as a creative partner by triggering ideas, visualizing concepts and exploring information across domains, but bots often flatten creativity when treated as sole creators. When scientists ask bots to generate hypotheses, they tend to defer to the bot-produced ideas and end up exploring fewer avenues to a solution. The divergent association task (DAT) measures rapid creative thinking by asking participants to list ten semantically diverse nouns in four minutes, a task that exposes thought anchoring (the pull of related words once one word comes to mind). On average, bots and humans score similarly on the DAT. But when a bot supplies a process rather than the answers, such as generating ten categories and letting the person pick one word from each, human performance improves: the structure reduces anchoring while preserving the diversity of human ideas.
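A minimal sketch of how a DAT-style score could be computed, assuming the commonly described approach of averaging pairwise semantic distances between word embeddings. The toy embedding table, the function names and the 0-100 scaling here are illustrative assumptions, not the published scorer, which uses large pretrained embeddings over the participant's actual words.

```python
# Hedged sketch: average pairwise cosine distance over a set of nouns,
# in the spirit of the DAT. Embeddings below are made up for illustration.
from itertools import combinations
import numpy as np

# Hypothetical toy embeddings; a real scorer would load pretrained vectors
# (e.g. GloVe) covering whatever words the participant produced.
TOY_EMBEDDINGS = {
    "queen":  np.array([0.90, 0.80, 0.10]),
    "king":   np.array([0.95, 0.75, 0.15]),  # close to "queen" -> low distance
    "gravel": np.array([0.10, 0.20, 0.90]),
    "violin": np.array([0.30, 0.90, 0.50]),
}

def cosine_distance(a: np.ndarray, b: np.ndarray) -> float:
    """1 - cosine similarity; larger means more semantically distant."""
    return 1.0 - float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def dat_style_score(words, embeddings=TOY_EMBEDDINGS) -> float:
    """Average distance across all word pairs, scaled to roughly 0-100."""
    vectors = [embeddings[w] for w in words]
    distances = [cosine_distance(a, b) for a, b in combinations(vectors, 2)]
    return 100.0 * sum(distances) / len(distances)

# Anchored words ("queen" plus "king") drag the score down; dropping one
# and keeping only semantically distant words raises it.
print(dat_style_score(["queen", "king", "gravel", "violin"]))
print(dat_style_score(["queen", "gravel", "violin"]))
```

Run as a script, the second score comes out higher than the first, which mirrors the anchoring effect the article describes: semantically clustered answers lower the average pairwise distance.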
Read at Nature