Publications

On the Probability–Quality Paradox in Language Generation
Estimating the Entropy of Linguistic Distributions
Analyzing Wrap-Up Effects through an Information-Theoretic Lens
Naturalistic Causal Probing for Morpho-Syntax
Revisiting the Uniform Information Density Hypothesis
Phone-level Uniform Information Density across and within Languages
On Homophony and Rényi Entropy
Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
Conditional Poisson Stochastic Beams