
"Anyone at the university, or a large number of people at least-including me-can see a number of projects [people have] been working on with ChatGPT. In addition to the projects, Rocher says he could see how many times users interacted with ChatGPT on a given project and when those conversations began. From that metadata, Rocher was able to piece together that an Oxford student was working on an article for submission using OpenAI's tools."
"In terms of the width of different people that can access each other's behavioural data, that is quite worrying. However, the researcher acknowledges that the data exposure is internal and, while broad, limited in depth. I suspect that might be why the data protection team haven't reacted as quickly as if it was a public-facing thing."
A University of Oxford researcher discovered that ChatGPT Edu users' metadata is visible to large numbers of colleagues at their institution through Codex Cloud Environments. The exposure includes names and metadata from public and private GitHub repositories connected to ChatGPT Edu accounts, along with interaction frequency and conversation timestamps. While the private code itself remains protected, the visible metadata reveals meaningful patterns of user activity and project work: the researcher identified specific instances where it disclosed what students were working on, including article submissions. Although the exposure is limited to internal university access rather than being public-facing, the breadth of visibility across institutional colleagues raises significant privacy concerns about behavioural data accessibility.
#chatgpt-edu-privacy #metadata-exposure #university-data-security #behavioral-data-visibility #github-repository-access
Read at Fast Company