The convergence of human-computer interaction (HCI) and Big Data has introduced new demands for scalability, responsiveness, and cognitive alignment in interactive systems. As data complexity grows and machine learning (ML) models become central to decision-making, user interfaces (UIs) must evolve from static dashboards into dynamic, adaptive environments that support real-time exploration, transparency, and trust. This survey reviews recent advances at the intersection of HCI and Big Data along four core dimensions. First, we synthesize scalable visual interaction techniques that address overplotting, high-dimensional embeddings, and cross-modal coordination. Second, we discuss system architectures that prioritize interaction responsiveness through adaptive processing pipelines, decentralized execution, and UI-centered data shaping strategies. Third, we examine cognitive modeling strategies for intent inference, cognitive load detection, and adaptive view composition. Finally, we evaluate mechanisms for explainability and trust, including interactive explanations, selective transparency, and auditable system behavior. Together, these contributions define a design agenda for future systems that are not only data-intensive but also human-aware, accountable, and ethically aligned.