College student's "time travel" AI experiment accidentally outputs real 1834 history
Briefly

"For the past month, Grigorian has been developing what he calls TimeCapsuleLLM, a small AI language model (like a pint-sized distant cousin to ChatGPT) which has been trained entirely on texts from 1800–1875 London. Grigorian wants to capture an authentic Victorian voice in the AI model's outputs. As a result, the AI model ends up spitting out text that's heavy with biblical references and period-appropriate rhetorical excess."
"Grigorian's project joins a growing field of researchers exploring what some call 'Historical Large Language Models' (HLLMs). Similar projects include MonadGPT, which was trained on 11,000 texts from 1400 to 1700 CE and can discuss topics using 17th-century knowledge frameworks, and XunziALLM, which generates classical Chinese poetry following ancient formal rules. These models offer researchers a chance to interact with the linguistic patterns and thought processes of past eras."
Hayk Grigorian developed TimeCapsuleLLM, a small AI language model trained exclusively on London texts from 1800–1875 to reproduce an authentic Victorian voice. The model generates language rich in biblical references and period-appropriate rhetorical flourishes. A prompt beginning "It was the year of our Lord 1834" produced a passage describing London streets filled with protest and petition, which Grigorian later verified corresponded to real 1834 events. TimeCapsuleLLM joins a class of Historical Large Language Models such as MonadGPT and XunziALLM that enable interaction with historical linguistic patterns, period knowledge frameworks, and era-specific literary forms.
Read at Ars Technica