Analyzing Wrap-Up Effects through an Information-Theoretic Lens
Estimating the Entropy of Linguistic Distributions
On the probability–quality paradox in language generation
Conditional Poisson Stochastic Beams
Keyword2Text: A Plug-and-Play Method for Controlled Text Generation
On Homophony and Rényi Entropy
Phone-level Uniform Information Density across and within Languages
Revisiting the Uniform Information Density Hypothesis