Offering a wide range of expressions to support natural interaction is a key feature of a social robot. This is why developing systems that allow controlling a robot's expressive capabilities has become a critical task. In this work, we present our approach to expressivity management in the robot Mini. We have developed a library of gestures, which are sets of expressive actions (for example, delivering an utterance or moving a joint). Our Expression Manager activates these gestures on request, checks that there are no conflicts between them (multiple gestures trying to use the same interfaces at the same time), and controls the execution of the expressions. All gestures are modelled as state machines and can be designed through a graphical user interface. The proposed system has been integrated and tested in Mini, a social robot designed to assist older adults.
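The conflict check described above (rejecting a gesture whose interfaces are already claimed by an active gesture) can be sketched as follows. This is a minimal illustrative example, not the actual implementation; the class names, the interface labels such as "speech" or "neck_joint", and the activation API are all assumptions made for clarity.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Gesture:
    """Hypothetical gesture: a named set of expressive actions, each
    bound to one or more robot interfaces (speech, joints, LEDs...)."""
    name: str
    interfaces: frozenset


class ExpressionManager:
    """Illustrative manager: activates a gesture only if none of its
    interfaces are in use by an already-active gesture."""

    def __init__(self):
        self.active = {}  # gesture name -> Gesture

    def interfaces_in_use(self):
        # Union of the interfaces claimed by all active gestures.
        used = set()
        for gesture in self.active.values():
            used |= gesture.interfaces
        return used

    def activate(self, gesture):
        conflict = gesture.interfaces & self.interfaces_in_use()
        if conflict:
            # Another gesture already holds one of these interfaces.
            return False
        self.active[gesture.name] = gesture
        return True

    def finish(self, name):
        # Release the gesture's interfaces when its execution ends.
        self.active.pop(name, None)
```

For example, a "greet" gesture holding the speech interface would block a second speech gesture until it finishes, while a gesture using only disjoint interfaces (e.g. an LED animation) could run concurrently.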