MIDAS (Multisensory Data Analytics System) is a set of interactive tools for data visualization, hypothesis generation, and hypothesis testing. It provides visualizations and sonifications of data from ballistic interception simulations, allowing the user to explore relationships between parameters within a dataset and across datasets.
My role in MIDAS was to create a number of VR environments for data visualization, under the supervision of Professor Remco Chang and Professor James Intriligator, for exploring data from ballistic interception simulations. These environments are intended to eventually be integrated with forms of data sonification. Each VR environment reads data from a CSV file and generates its visualizations automatically from that data, as sketched below.
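A minimal sketch of that CSV-to-scene step, under stated assumptions: the actual environments run inside a VR engine, so this Python stand-in only illustrates the idea of mapping rows to renderable objects, and the column names (intercept_range, closing_speed, miss_distance, warhead_mass) and file name are hypothetical.

```python
# Hypothetical sketch: turn each row of a simulation CSV into a glyph spec
# that a VR scene could instantiate, rather than hand-authoring the scene.
import csv
from dataclasses import dataclass

@dataclass
class Glyph:
    position: tuple  # (x, y, z) placement in the scene, driven by data columns
    scale: float     # visual size, driven by another column

def load_glyphs(path: str) -> list:
    """Read a simulation CSV and map each row to a glyph specification."""
    glyphs = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            glyphs.append(Glyph(
                position=(float(row["intercept_range"]),
                          float(row["closing_speed"]),
                          float(row["miss_distance"])),
                scale=float(row.get("warhead_mass", 1.0)),
            ))
    return glyphs

if __name__ == "__main__":
    # Hypothetical file name; each printed glyph stands in for a spawned object.
    for g in load_glyphs("interception_runs.csv"):
        print(g)
```

Keeping the data-to-glyph mapping separate from the rendering code is what lets the environments regenerate their visualizations automatically whenever a new CSV is supplied.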