
"While the service binds to the localhost address at 127.0.0[.]1:11434 by default, it's possible to expose it to the public internet by means of a trivial change: configuring it to bind to 0.0.0[.]0 or a public interface."
"Of the observed hosts, more than 48% advertise tool-calling capabilities via their API endpoints that, when queried, return metadata highlighting the functionalities they support. Nearly half of observed hosts are configured with tool-calling capabilities that enable them to execute code, access APIs, and interact with external systems, demonstrating the increasing implementation of LLMs into larger system processes," researchers Gabriel Bernadett-Shapiro and Silas Cutler added."
An unmanaged, publicly accessible layer of AI compute spans 175,000 unique Ollama hosts across 130 countries, concentrated most heavily in China at just over 30%. These systems operate across cloud and residential networks and run outside default platform guardrails and monitoring. Ollama binds to localhost by default (127.0.0.1:11434) but can be exposed publicly by configuring it to bind to 0.0.0.0 or a public interface. More than 48% of observed hosts advertise tool-calling capabilities that enable code execution, API access, and interaction with external systems. Local hosting outside enterprise perimeters creates new security concerns and requires methods to distinguish managed from unmanaged AI compute.
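The two technical points above, the default localhost bind and the tool-calling metadata exposed over Ollama's HTTP API, can be illustrated with a short probe. The sketch below is illustrative only: the target address is a placeholder, `/api/tags` and `/api/show` are Ollama's standard endpoints for listing models and fetching model metadata, and the `capabilities` field is assumed to be present only on recent Ollama builds.

```python
import json
from urllib.request import Request, urlopen

# Placeholder address (TEST-NET range), for illustration only. Ollama listens
# on port 11434 and binds to 127.0.0.1 by default; a host is only reachable
# like this if it was reconfigured (e.g. via the OLLAMA_HOST environment
# variable) to bind to 0.0.0.0 or a public interface.
BASE = "http://203.0.113.10:11434"

def get_json(url, payload=None):
    """GET (or POST when a payload is given) and decode a JSON response."""
    data = json.dumps(payload).encode() if payload is not None else None
    req = Request(url, data=data, headers={"Content-Type": "application/json"})
    with urlopen(req, timeout=5) as resp:
        return json.loads(resp.read().decode())

# /api/tags lists the models pulled onto the host.
models = get_json(f"{BASE}/api/tags").get("models", [])

for m in models:
    name = m.get("name", "")
    # /api/show returns per-model metadata; recent Ollama builds include a
    # "capabilities" list (e.g. "tools"). The field is assumed here and may be
    # absent on older versions, so default to an empty list.
    info = get_json(f"{BASE}/api/show", {"model": name})
    caps = info.get("capabilities", [])
    flag = "tool-calling" if "tools" in caps else "no tools advertised"
    print(f"{name}: {flag}")
```

A scan like the one described in the article would repeat this kind of query across candidate hosts; here it is shown for a single placeholder address only.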
Read at The Hacker News