About me
I am a PhD student at the Language Technology Lab at the University of Amsterdam, supervised by Dr. Vlad Niculae. My research focuses on controllable generation, language modeling, and (in part) code generation. I study distributional properties of language model outputs, aiming to make generations more diverse and to satisfy given constraints.
News
- Sep 2025 » I presented our work Control the Temperature: Selective Sampling for Diverse and High-Quality LLM Outputs at COLM 2025 in Montreal! We observe that there are “risky” generation timesteps, where sampling is likely to degrade output quality. As a solution, we develop an inference-time sampling method that dynamically switches between greedy and high-temperature sampling depending on the timestep (a minimal sketch of the idea appears after this list).
- Aug 2025 » Our paper, On the Low-Rank Parametrization of Reward Models for Controlled Language Generation, was accepted to TMLR! By looking closely at the parameterization of reward models (q-style vs. v-style), we show that q-style reward models have limited rank capacity; namely, they cannot model all reward functions in context.
- Jun 2025 » I am excited to start my internship at Qualcomm AI Amsterdam, where I will work on agentic code fixing, diving into multi-turn RL techniques!
- Jul 2024 » I attended the LOGML summer school (deep learning and geometry), where I worked on a project on Bayesian inference of the parameters of differential equations.
- 5 Dec 2023 » My paper was accepted to TMLR! We propose a new non-isotropic distribution for probabilistic modeling on Riemannian manifolds.
- Jul 2023 » I had a great time at the LxMLS 2023 summer school in Lisbon!
- Jun 2023 » I attended the OxML summer school (MLx Finance & NLP), which took place in Oxford.
- 8 Dec 2022 » We presented our workshop paper at BlackboxNLP 2022, where we probe deep learning models for source code.
- 1 Dec 2022 » I started my PhD at the Language Technology Lab at the University of Amsterdam, supervised by Dr. Vlad Niculae!
- 29 Apr 2022 » We presented our spotlight paper CodeBPE, on subtokenization for source code models, at the DL4Code workshop at ICLR 2022 (online)!
- 22 Aug 2021 » Nadezhda Chirkova and I presented our paper Empirical Study of Transformers for Source Code at ESEC/FSE 2021 (online). Here is our code.
- 7 Jun 2021 » Our paper A Simple Approach for Handling Out-of-Vocabulary Identifiers in Deep Learning for Source Code, co-authored with Nadezhda Chirkova, was presented at NAACL 2021.
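
For the curious, here is a minimal sketch of the greedy/high-temperature switching idea from the COLM 2025 paper announced above. The switching criterion (a threshold on the top-token probability), the specific threshold and temperature values, and the function name `selective_sample` are illustrative assumptions; the paper's actual rule for flagging risky timesteps may differ.

```python
import numpy as np

def selective_sample(logits, tau_high=1.5, conf_threshold=0.5, rng=None):
    """Choose the next token by switching, per timestep, between greedy
    decoding and high-temperature sampling.

    The risky-step test used here (top-token probability below
    ``conf_threshold``) is a hypothetical stand-in; only the idea of
    switching between greedy and high-temperature sampling comes from
    the paper announcement above.
    """
    rng = rng or np.random.default_rng()
    # Softmax with the usual max-subtraction for numerical stability.
    probs = np.exp(logits - logits.max())
    probs /= probs.sum()
    if probs.max() < conf_threshold:
        # "Risky" step: the distribution is flat, so sampling could easily
        # pick a poor token; fall back to the greedy choice.
        return int(np.argmax(probs))
    # Confident step: sample at high temperature to increase diversity.
    scaled = np.exp((logits - logits.max()) / tau_high)
    scaled /= scaled.sum()
    return int(rng.choice(len(logits), p=scaled))

# Toy usage: a peaked distribution is sampled at high temperature,
# while a flat one triggers the greedy fallback.
peaked = np.array([5.0, 1.0, 0.5, 0.2])
flat = np.array([1.0, 0.9, 0.8, 0.7])
print(selective_sample(peaked), selective_sample(flat))
```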