MIDAS

Multisensory Data Analytics System

MIDAS (Multisensory Data Analytics System) is a set of interactive tools for data visualization, hypothesis generation, and hypothesis testing. It provides visualizations and sonifications of datasets from ballistic interception simulations, allowing the user to explore relationships between parameters within a dataset and across datasets.

My role on MIDAS was to create a number of VR environments for data visualization, supervised by Professor Remco Chang and Professor James Intriligator, for exploring data from ballistic interception simulations. These environments are intended to eventually be integrated with forms of data sonification. Each environment reads data from a CSV file and generates its visualizations automatically from that data.
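As a rough illustration of that loading step, the sketch below parses a CSV file into rows of floats inside a Unity MonoBehaviour. The file path, column layout, and class name are hypothetical stand-ins, not the actual MIDAS code.

```csharp
using System.IO;
using System.Linq;
using UnityEngine;

// Hypothetical loader sketch: reads a CSV where each row holds the
// numeric values for one data point (column layout assumed, not from MIDAS).
public class CsvDataLoader : MonoBehaviour
{
    [SerializeField] private string csvPath = "Assets/Data/interceptions.csv"; // placeholder path

    public float[][] LoadRows()
    {
        return File.ReadAllLines(csvPath)
                   .Skip(1) // assume a single header row
                   .Select(line => line.Split(',')
                                       .Select(float.Parse)
                                       .ToArray())
                   .ToArray();
    }
}
```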

The first of these environments produced a relatively standard 3D graph, in which seven values from each row of the file were represented by the x, y, and z position, length, width, height, and color of a cube. Cubes could be grabbed and moved around to allow for comparison, and all data could be re-scaled using a UI within the virtual environment.
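A minimal sketch of that seven-value-to-cube mapping is shown below; the column order and the hue-based color mapping are assumptions made for illustration, and `cubePrefab` stands in for whatever grabbable cube the environment actually uses.

```csharp
using UnityEngine;

// Sketch of the seven-value-to-cube mapping described above. The column
// order and the hue-based color mapping are assumed, not taken from MIDAS.
public class CubeGraphBuilder : MonoBehaviour
{
    [SerializeField] private GameObject cubePrefab; // placeholder grabbable cube prefab

    public void Build(float[][] rows)
    {
        foreach (var r in rows)
        {
            // r[0..2] -> position, r[3..5] -> scale, r[6] -> color (assumed layout)
            var cube = Instantiate(cubePrefab,
                                   new Vector3(r[0], r[1], r[2]),
                                   Quaternion.identity,
                                   transform);
            cube.transform.localScale = new Vector3(r[3], r[4], r[5]);
            cube.GetComponent<Renderer>().material.color =
                Color.HSVToRGB(Mathf.Clamp01(r[6]), 1f, 1f); // assumes value normalized to [0,1]
        }
    }
}
```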
Other visualization methods used the Visual Effect Graph within Unity's High Definition Render Pipeline. Using particle systems rather than static 3D graphs allowed more variables to be represented simultaneously.
Data bubblers represented the same information as the 3D graph, but as particle systems driven by the Visual Effect Graph, allowing the user to stand inside and be fully immersed in the data.
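One common way to hand per-point data to a VFX Graph, sketched below as an assumption rather than the confirmed MIDAS pipeline, is to bake the rows into a float texture that the graph samples to position and color its particles. The exposed property names "DataTexture" and "PointCount" are hypothetical.

```csharp
using UnityEngine;
using UnityEngine.VFX;

// Assumed data-feeding sketch: bakes CSV rows into a float texture for a
// VFX Graph to sample. Property names are hypothetical, not from MIDAS.
public class DataBubblerFeeder : MonoBehaviour
{
    [SerializeField] private VisualEffect bubblerEffect;

    public void Feed(float[][] rows)
    {
        // One texel per data point; RGBA holds x, y, z, and a color value.
        var tex = new Texture2D(rows.Length, 1, TextureFormat.RGBAFloat, false);
        for (int i = 0; i < rows.Length; i++)
        {
            var r = rows[i];
            tex.SetPixel(i, 0, new Color(r[0], r[1], r[2], r[6]));
        }
        tex.Apply();

        bubblerEffect.SetTexture("DataTexture", tex);
        bubblerEffect.SetInt("PointCount", rows.Length);
    }
}
```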