As illustratively defined by the International Telecommunication Union in ITU-T Recommendation Y.2060 (June 2012), the “Internet of Things” (IoT) is a global infrastructure for the information society, enabling advanced services by interconnecting (physical and virtual) things based on existing and evolving interoperable information and communication technologies. For example, a typical IoT deployment comprises a large number of electronically interconnected devices with different capabilities. The IoT devices can form a heterogeneous ad-hoc network wherein diverse devices such as, for example, sensors, actuators, radio frequency identification (RFID) tags, and smartphones interact with each other to achieve a common goal with little or no underlying infrastructure support.
In one illustrative IoT deployment, a large number of sensors may continuously produce significant amounts of time-series data, which in turn creates a significant demand for time-series data analysis such as, e.g., pattern recognition and visualization. Such time-series analysis can be useful in a variety of scenarios including, but not limited to, economic forecasting, sales forecasting, budgetary analysis, stock market analysis, yield projections, workload projections, and process quality control. However, managing such large volumes of time-series data can be a significant challenge for existing time-series analysis systems.
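As a minimal sketch of the kind of time-series pattern recognition mentioned above, consider flagging sensor readings that deviate sharply from a trailing rolling mean. The function name, window size, and deviation threshold below are illustrative assumptions, not part of any standard or of the systems described in this document.

```python
from collections import deque

def rolling_mean_anomalies(readings, window=5, threshold=2.0):
    """Return indices of readings that deviate from the trailing
    rolling mean by more than `threshold` times the trailing rolling
    standard deviation. `window` and `threshold` are illustrative
    parameters chosen for this sketch."""
    buf = deque(maxlen=window)  # trailing window of prior readings
    anomalies = []
    for i, x in enumerate(readings):
        if len(buf) == window:
            mean = sum(buf) / window
            var = sum((v - mean) ** 2 for v in buf) / window
            std = var ** 0.5
            # Flag the reading if it lies far outside the trailing window.
            if std > 0 and abs(x - mean) > threshold * std:
                anomalies.append(i)
        buf.append(x)
    return anomalies

# Steady sensor readings with one spike at index 5
data = [10.0, 10.1, 9.9, 10.0, 10.2, 25.0, 10.1, 9.8]
print(rolling_mean_anomalies(data))  # → [5]
```

In a real deployment this simple trailing-window approach would typically be replaced by streaming analytics over the full sensor population, but it illustrates why volume matters: every incoming reading must be compared against recent history, so storage and access patterns for the time-series data directly constrain what analysis is feasible.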