Sensor Fusion Visualizer

This is a WebGL visualization for demonstrating the concept of sensor fusion in a distributed sensor network. View the code.

Sensor fusion synthesizes data from multiple sensors in order to reduce uncertainty about the quantity being sensed.
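To make this concrete, here is a minimal sketch (not taken from this demo's source; the names are illustrative) of the simplest fusion strategy, averaging independent readings:

```ts
// Minimal sketch, not the demo's actual code: averaging N independent
// noisy readings of the same quantity yields an estimate whose noise
// shrinks roughly as 1/sqrt(N).
function fuse(readings: number[]): number {
  return readings.reduce((sum, r) => sum + r, 0) / readings.length;
}

// Hypothetical example: true temperature 20, each sensor off by up to ±2.
const noisy = Array.from({ length: 100 }, () => 20 + (Math.random() * 4 - 2));
console.log(fuse(noisy)); // lands much closer to 20 than a typical single reading
```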

A grid of temperature sensors is simulated below. A red box indicates a hotter temperature reading; a blue box, a colder one.

The slider labeled "Sensor Fusion Iterations" controls how many iterations of sensor fusion to run. Sensor readings are run through a spatial recursion algorithm (each sensor's reading is averaged with those of its nearest neighbors; a sketch of one iteration appears below). This is reflected in the visualization - notice how "hotspots" get "smoothed" out as the number of iterations increases.
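For illustration, one fusion iteration might look like the following, assuming readings live in a row-major width × height array and that "nearest neighbors" means the 4-connected neighbors (both assumptions, not confirmed against the demo's source):

```ts
// Sketch of one fusion iteration over a width × height grid of readings.
function fuseOnce(readings: number[], width: number, height: number): number[] {
  const next = new Array<number>(readings.length);
  for (let y = 0; y < height; y++) {
    for (let x = 0; x < width; x++) {
      let sum = 0;
      let count = 0;
      // Average the cell with its in-bounds 4-connected neighbors.
      for (const [dx, dy] of [[0, 0], [1, 0], [-1, 0], [0, 1], [0, -1]]) {
        const nx = x + dx;
        const ny = y + dy;
        if (nx >= 0 && nx < width && ny >= 0 && ny < height) {
          sum += readings[ny * width + nx];
          count++;
        }
      }
      next[y * width + x] = sum / count;
    }
  }
  return next;
}
```

Running this function repeatedly corresponds to moving the iterations slider: each pass diffuses local extremes into their neighborhoods, which is why hotspots fade.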

There is a slider to control the environment temperature. Each simulated sensor "reads" the environment temperature, with some amount of pseudo-random noise applied to the reading to simulate real-world sensor noise.

The "Sensor Noise" slider controls the amount of psuedo-random noise in sensor "readings". A low slider value means sensor readings are more consistent, whereas max slider value is essentially giving random readings.

The last slider simulates offline/malfunctioning sensors. With no sensor fusion (0 iterations), a "dead" sensor shows up as a black box in the grid; one way to model this is sketched below.
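One plausible model, building on the readSensor sketch above (representing a dead sensor as a null reading is a hypothetical choice, not necessarily how the demo does it):

```ts
// Sketch only: sample a grid of `size` sensors, knocking a fraction of
// them offline. readSensor is the helper sketched above; a null reading
// marks a dead sensor, which the renderer could draw as a black box.
function sampleGrid(
  size: number,
  environmentTemp: number,
  noiseAmplitude: number,
  offlineFraction: number
): (number | null)[] {
  return Array.from({ length: size }, () =>
    Math.random() < offlineFraction
      ? null // dead sensor: no reading
      : readSensor(environmentTemp, noiseAmplitude)
  );
}
```

A natural way for fusion to handle dead sensors is to skip null readings when averaging neighbors, so the surviving neighbors fill in the gap after a few iterations.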

Using the different sliders, you can experiment and observe the effects of different conditions. For example, lower sensor noise should require fewer sensor fusion iterations to produce a smooth visualization.

[Interactive demo: a WebGL grid of simulated sensors, with sliders for Sensor Fusion Iterations, Environment Temperature, Sensor Noise, and % of Offline Sensors.]