Data analysis experts tasked with analyzing large data sets (e.g., to build complex predictive models) do not always have the domain expertise to know the goals of the analysis or what measurements to take against those data sets. Such experts need a tool that ensures the accuracy and completeness of their data, supports discovery and classification of that data, enables what-if analysis, and identifies trends and outliers. More specifically, a tool providing superior data quality is needed where information is often siloed, stored in multiple locations, and kept in different formats. Also needed is a tool that provides an understanding of the different data types across multiple data sets and of the relationships between those types, together with the ability to conduct deep analytical dives into consolidated data sets to uncover actionable insights. As such, there is a need for a tool that rapidly aggregates, reconciles, analyzes, and visualizes data. Such data analysis may be an iterative process, with visualizations that prompt further analysis.
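The aggregate-reconcile-analyze sequence described above can be illustrated with a minimal sketch. All names and values here (the CSV-style rows, the JSON export, the field names `customer_id`, `monthly_spend`, `cust`, and `spend`) are hypothetical, and the outlier rule (more than one standard deviation from the mean) stands in for whatever analysis the tool would actually run:

```python
import json
import statistics

# Hypothetical siloed sources: the same customer metric stored in two
# locations and two formats (CSV-style rows and a JSON export).
csv_rows = [
    {"customer_id": "1", "monthly_spend": "120.0"},
    {"customer_id": "2", "monthly_spend": "95.5"},
]
json_blob = '[{"cust": 3, "spend": 110.0}, {"cust": 4, "spend": 430.0}]'

# Aggregate and reconcile: pull both sources into one list, mapping the
# differing field names and value types onto a single schema.
records = [
    {"customer_id": int(r["customer_id"]), "spend": float(r["monthly_spend"])}
    for r in csv_rows
]
records += [
    {"customer_id": r["cust"], "spend": r["spend"]}
    for r in json.loads(json_blob)
]

# Analyze the consolidated set: flag as outliers any records whose spend
# lies more than one standard deviation from the mean.
spends = [r["spend"] for r in records]
mean, stdev = statistics.mean(spends), statistics.stdev(spends)
outliers = [r for r in records if abs(r["spend"] - mean) > stdev]

print(f"mean spend: {mean:.2f}")
print(f"outlier customers: {[r['customer_id'] for r in outliers]}")
```

In practice the visualization step would render the consolidated records and flagged outliers graphically, and an analyst noticing an outlier could refine the schema or the threshold and rerun the pipeline, reflecting the iterative process noted above.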