How to Teach Critical Thinking When AI Does the Thinking
Briefly

"Big Four consulting firm Deloitte just repaid $291,000 to the Australian government after admitting it used ChatGPT to produce a compliance review riddled with errors. The report contained nonexistent references, fabricated citations, and invented court cases. University of Sydney academic Christopher Rudge said that there were multiple "hallucinations" that appeared unsupported by any actual evidence. This wasn't a student cheating on a homework assignment with ChatGPT."
"AI wasn't the failure here. It did what it always does and completed the user's request. The consultants failed because they didn't know how to think with the tool. They treated it like what Paulo Freire called a "banking education" system: deposit your request, withdraw your answer, never question the transaction."
Deloitte repaid $291,000 after admitting that a ChatGPT-produced compliance review contained nonexistent references, fabricated citations, invented court cases, and multiple hallucinations unsupported by evidence. Highly paid consultants outsourced their expertise to an algorithm, producing professionally formatted but erroneous work. The AI performed as instructed; the failure lay in the users' lack of critical engagement. The consultants treated AI like a "banking education" system: deposit a request, withdraw an answer, never question the transaction. Many educational institutions automate in the same way: professors automate 48.9% of grading interactions despite rating AI-assisted grading as least effective. Dialogic prompting, by contrast, requires interrogating AI outputs, turning the AI into a thinking partner.
Read at Psychology Today