The first time around...

For our first system, we used a webcam to sense the modulated brightness of a computer monitor. The resulting brightness values (from 0 to 255) were then mapped to a series of levels in post-processing, allowing the signal to be interpreted.
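As a rough sketch of that post-processing step, the snippet below quantizes an averaged brightness reading back to the nearest transmitted level. It is illustrative only: the linear level spacing and the example calibration values are assumptions, not taken from our actual code.

    def brightness_to_level(brightness, levels, minimum, maximum):
        """Map an observed brightness (0-255) to the nearest of `levels`
        evenly spaced symbols between the calibrated minimum and maximum."""
        # Assumes the transmitter spaces its levels linearly.
        step = (maximum - minimum) / (levels - 1)
        level = round((brightness - minimum) / step)
        # Clamp readings that fall outside the calibrated range.
        return max(0, min(levels - 1, level))

    # Example: with 4 levels calibrated between 40 and 220,
    # a reading of 160 decodes to level 2.
    print(brightness_to_level(160, levels=4, minimum=40, maximum=220))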

Transmitting Data

Program Specifications

Data transmission is accomplished by a Python program using the Pygame package for graphics. The program takes four options: levels, the size of the transmit alphabet (a power of 2, i.e., 2^bits); speed, which specifies how many characters from the alphabet will be transmitted per second (in Hz); minimum, the pixel value used for the lowest character; and maximum, the pixel value used for the highest character. Both minimum and maximum are given in arbitrarily scaled but monotonically increasing units of intensity, and range between 0 and 255.
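A sketch of how these options might be parsed and turned into pixel values is shown below. The flag names mirror the option names above, but the command-line form and the linear spacing between minimum and maximum are assumptions for illustration, not a copy of the actual program.

    import argparse

    # Hypothetical command-line interface mirroring the four options above.
    parser = argparse.ArgumentParser(description="monitor-to-webcam transmitter")
    parser.add_argument("--levels", type=int, default=4,
                        help="size of the transmit alphabet")
    parser.add_argument("--speed", type=float, default=2.0,
                        help="characters transmitted per second (Hz)")
    parser.add_argument("--minimum", type=int, default=0,
                        help="pixel value for the lowest character (0-255)")
    parser.add_argument("--maximum", type=int, default=255,
                        help="pixel value for the highest character (0-255)")
    args = parser.parse_args()

    def level_to_pixel(level):
        # Assumed: levels are spaced linearly between minimum and maximum.
        span = args.maximum - args.minimum
        return args.minimum + round(level * span / (args.levels - 1))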

Transmission Protocol

True black (regardless of the setting defined by minimum) is held for at least 5 seconds, indicating that a transmission follows. The program first transmits a calibration sequence, which consists of all of the characters in ascending order. It then loops, reading one line at a time from stdin; each line is expected to contain a single integer between 0 and levels - 1, and nothing else. The level specified by this integer is then transmitted.
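A simplified Pygame loop implementing this protocol might look like the sketch below. The option values are hard-coded placeholders, the level-to-pixel mapping is assumed to be linear, and window-event handling is reduced to a bare minimum; it is not our actual transmitter code.

    import sys
    import time
    import pygame

    LEVELS, SPEED, MINIMUM, MAXIMUM = 4, 2.0, 0, 255  # placeholder option values
    SYMBOL_TIME = 1.0 / SPEED  # seconds each character is displayed

    pygame.init()
    screen = pygame.display.set_mode((640, 480))

    def show(pixel_value, duration):
        """Fill the window with a single gray level for `duration` seconds."""
        pygame.event.pump()  # keep the window responsive
        screen.fill((pixel_value, pixel_value, pixel_value))
        pygame.display.flip()
        time.sleep(duration)

    def level_to_pixel(level):
        # Assumed linear spacing between minimum and maximum.
        return MINIMUM + round(level * (MAXIMUM - MINIMUM) / (LEVELS - 1))

    # 1. Hold true black for at least 5 seconds to announce a transmission.
    show(0, 5.0)

    # 2. Calibration sequence: every character of the alphabet in ascending order.
    for level in range(LEVELS):
        show(level_to_pixel(level), SYMBOL_TIME)

    # 3. Read one integer per line from stdin and transmit that level.
    for line in sys.stdin:
        show(level_to_pixel(int(line.strip())), SYMBOL_TIME)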

Receiving Data

For the receiving end of the camera-based system, we wrote code that used information from a webcam to capture a region of interest and record its luminance to a file. This was handled by a Logitech 905 webcam connected to a PC running LabVIEW. All automatic features of the camera were disabled, allowing for manual adjustment of white balance, contrast, exposure, and other camera attributes in National Instruments Measurement & Automation Explorer (MAX). This ensured that all data sets were collected under consistent camera settings. Individual frames were extracted from the video at 15 Hz using LabVIEW's IMAQ toolkit. These frames were then displayed on the front panel, where the developers selected the region of interest in each frame corresponding to the transmitter. The image was then cropped and converted to grayscale, enabling rapid calculation of the region's brightness by averaging the pixel values over the entire region. (Although the division always returned a floating-point number, it is worth noting that averaging over more pixels yields more significant figures and lower-noise data; this effect is shown in the figure below.) Each of these average brightness values was then stored in an array and eventually logged to a .csv file.

The live video is cropped to a user-specified region of interest. This region of the image is then converted to grayscale and its pixel brightnesses are averaged, yielding a "discretized analog brightness value" for the frame. Although this value is always a floating-point number, its range is 0.0 to 255.0 and its minimum increment is 1/(xy), where x and y are the dimensions of the region of interest. These values are stored in an array while the code runs and are logged to a file at its completion.
This diagram illustrates the effect of averaging the brightness across the entire screen compared to taking the brightness of a single pixel. Notice the decrease in noise, despite the increased distance of transmission, as more brightnesses are factored into the received data.
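Our receiver was implemented in LabVIEW, but the same crop-grayscale-average pipeline can be sketched in a few lines of Python with OpenCV. The camera index, region-of-interest coordinates, and output filename below are placeholders, and no attempt is made to lock the loop to 15 Hz.

    import csv
    import cv2

    # Placeholder region of interest (x, y, width, height) and camera index.
    ROI = (100, 80, 200, 150)
    cap = cv2.VideoCapture(0)

    brightness_log = []
    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                break
            x, y, w, h = ROI
            roi = frame[y:y + h, x:x + w]
            gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
            # Mean over w*h pixels: range 0.0-255.0, minimum increment 1/(w*h).
            brightness_log.append(float(gray.mean()))
    except KeyboardInterrupt:
        pass  # stop logging on Ctrl-C
    finally:
        cap.release()
        with open("brightness.csv", "w", newline="") as f:
            csv.writer(f).writerows([[v] for v in brightness_log])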