
"A publicly exposed database, without password protection or encryption, was discovered by Cybersecurity Researcher Jeremiah Fowler and contained 1.6 million audio files. These audio files included phone calls and voice messages collected between 2020 and 2025, belonging to a range of gyms and fitness centers across the United States and Canada. However, the database itself appears to belong to Hello Gym, a third-party contractor."
"Many exposed audio recordings contained personally identifiable information (PII) such as names and phone numbers. If such information was leaked to a malicious actor, it could leave individuals at risk of targeted spear-phishing or social engineering campaigns. Fowler also noted the potential for a "man-in-the-middle" attack, in which a malicious actor could call a target, impersonate gym or health facility staff, and convince the target to share their credit or debit card information."
"The fact that AI models are capable of cloning voices with a high level of accuracy is terrifying, especially considering that many companies often record our conversations. (We have all called a business or organization and heard the message 'Your call may be recorded for quality and training purposes.') In such instances, consumers or clients generally have little or no control over how long those files are kept, not to mention the potential outcomes should…"
An unprotected, unencrypted database containing 1.6 million gym and fitness center audio recordings from 2020–2025 was discovered, reportedly owned by third-party contractor Hello Gym. The recordings include phone calls and voice messages from facilities across the United States and Canada. The researcher notified the contractor and access was restricted within hours, but the duration of exposure and potential malicious access remain unknown. Many recordings contain personally identifiable information such as names and phone numbers, enabling spear-phishing and social engineering. Exposed calls included alarm system credentials, creating physical security risks. The recordings also raise concerns about AI voice cloning and impersonation abuse.
Read at Securitymagazine