Scientists create AI that 'watches' videos by mimicking the brain
Briefly

"The brain doesn't just see still frames; it creates an ongoing visual narrative," says senior author Hollis Cline, PhD. "Static image recognition has come a long way, but the brain's capacity to process flowing scenes requires a much more sophisticated form of pattern recognition." This statement underlines the critical need for AI to go beyond still images and emulate human-like scene interpretation actively, which MovieNet successfully offers.
To create MovieNet, Cline and first author Masaki Hiramoto examined how the brain processes real-world scenes as short sequences, similar to movie clips. Their research drew on studies of tadpole neurons, which form a sophisticated visual system that responds efficiently to moving stimuli and served as the biological inspiration for MovieNet's design.
Read at ScienceDaily