Research
I am interested in natural language processing, language modeling (both large and small models),
and deep learning in general. My PhD research is primarily focused on separating semantic and
episodic information in language models, as well as developing improved memory architectures
for modeling long sequences with Transformers.
In addition, I am an active member of the Turkish Data Depository (TDD) project,
an open-source repository of data and tools for the Turkish language. I am responsible for managing
TDD's dataset hub and for overseeing
Mukayese, a collection of Turkish language benchmarks, including
the newly released MukayeseLLM leaderboard.