Recent advances in virtual, augmented, and mixed reality (MR) technologies present new opportunities for artists in theater, dance, and performance to experiment with new ways of presenting their art and engaging the audience. In the field of dance especially, there are numerous examples of technology-driven performances. The relationship between dance and technology is far from new, and it is rather to be expected given the similarities in the workflows of digital artists and choreographers. It is this common ground that enables them to work together, and also to evolve by exchanging ideas and borrowing from one another. This is further evidenced by the numerous dance projects in education and entertainment that experiment with body-capture technologies. Nevertheless, a closer look at these cases reveals limited use of mixed reality technologies and real-time digital characters. Most of the time, interactivity is limited to 2D graphics or to predefined visual and/or audio responses to the dancers’ movements. Even in cases where 3D graphics, avatars, and motion capture are used, the result is a pre-choreographed digital dance. But what about digital content created in real time, and how can the virtual environment respond to it? This research focuses on that question by examining an MR dance performance in which digital dancers react and create dance content in real time, triggered by real-life dancers on stage, forming an ensemble of digital and real-life movers.