Thanks to the Chart.js documentation, I was quickly up and running with a simple bar chart to visualize AS7341 data. My first draft was done late at night, so the chart showed the spectral distribution of my room's ceiling light.
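For anyone starting from the same place, here is a minimal sketch of that kind of bar chart, assuming Chart.js is already loaded on the page. The canvas ID, dataset label, and colors are placeholders of my own, not necessarily what the app uses.

// A minimal Chart.js bar chart for one AS7341 reading. "spectrumChart"
// is a placeholder canvas ID; the colors roughly approximate each
// channel's wavelength.
const spectralBands = ["415nm", "445nm", "480nm", "515nm", "555nm", "590nm", "630nm", "680nm"];

function drawSpectrumChart(reading) {
  const ctx = document.getElementById("spectrumChart");
  return new Chart(ctx, {
    type: "bar",
    data: {
      labels: spectralBands,
      datasets: [{
        label: "AS7341 raw counts",
        data: spectralBands.map((band) => reading[band]),
        backgroundColor: [
          "#7F00FF", "#0000FF", "#00BFFF", "#00FF00",
          "#9ACD32", "#FFD700", "#FF7F00", "#FF0000"
        ]
      }]
    },
    options: {
      scales: { y: { beginAtZero: true } }
    }
  });
}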

With an advertised color temperature of 2700K, its "warm white" showed a strong response in the yellow and orange areas of the spectrum. The next morning, a ray of sunshine came in through a window, and I set my AS7341 sensor in its path.

The first lesson was that sunlight -- even just a tiny beam at an oblique angle -- is significantly stronger than my ceiling light. Direct exposure saturated the sensor (pegging readings at the ADC full-scale value) no matter what I did. I ended up placing a sheet of printer paper at my sunlit spot and aiming the sensor at that reflected light.
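For reference, the saturation point is easy to compute from the sensor's integration settings, so a quick check in code can flag clipped readings. This is just a sketch, using the field names from the reading below:

// ADC full scale per the AS7341 datasheet: (ATIME + 1) * (ASTEP + 1),
// capped at the 16-bit maximum of 65535.
function adcFullScale(atime, astep) {
  return Math.min(65535, (atime + 1) * (astep + 1));
}

// A reading at or above full scale has clipped and can't be trusted.
function isSaturated(count, atime, astep) {
  return count >= adcFullScale(atime, astep);
}

// With the settings in the reading below (atime = 30, astep = 3596),
// (31 * 3597) exceeds 65535, so saturation shows up as exactly 65535.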

This spectrum includes whatever distortion the sheet of paper adds, but it is still interestingly different from my ceiling light. There is a huge response on the NIR sensor, and the peak on orange is not as strong. My brain sees both of these light sources as white, but the sensor sees very different spectra. The raw sensor data (with the clear channel hitting saturation) are as follows:

{
  "415nm": 5749,
  "445nm": 6342,
  "480nm": 9533,
  "515nm": 10746,
  "555nm": 11245,
  "590nm": 12577,
  "630nm": 12633,
  "680nm": 15217,
  "clear": 65535,
  "nir": 34114,
  "settings": {
    "atime": 30,
    "astep": 3596,
    "gain": 64,
    "led_ma": 0,
    "read_time": 674
  }
}
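The settings block matters because raw counts scale with gain and integration time. To compare readings captured at different settings, I can reduce them to counts per unit of gain and integration time (what the AMS documentation calls basic counts). Here is a sketch of that conversion, assuming the JSON layout above and assuming the gain field is the multiplier (64x) rather than a register code:

// Integration time per the datasheet:
// (ATIME + 1) * (ASTEP + 1) * 2.78 microseconds.
function integrationTimeMs(settings) {
  return (settings.atime + 1) * (settings.astep + 1) * 2.78e-3;
}

// Basic counts = raw counts / (gain * integration time in ms), which makes
// readings comparable across different gain and timing settings.
function toBasicCounts(reading) {
  const tintMs = integrationTimeMs(reading.settings);
  const gain = reading.settings.gain;
  const bands = ["415nm", "445nm", "480nm", "515nm", "555nm", "590nm", "630nm", "680nm"];
  const basic = {};
  for (const band of bands) {
    basic[band] = reading[band] / (gain * tintMs);
  }
  return basic;
}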

According to the AMS AS7341 calibration application note, the NIR level is important for properly compensating the values of spectral sensors F1-F8: they are sensitive to whatever NIR leaks past their filters, so that leakage has to be accounted for to get precise color accuracy. The clear and flicker channels likewise have their own impact on color accuracy. But since I'm just goofing around and not concerned with utmost accuracy, I'm choosing to ignore them and dropping NIR from my visualization.

I will, however, make use of this sunlight spectrum to compensate for the differing sensitivities across spectral sensors F1-F8. Using sunlight as my reference for a light source emitting all visible wavelengths, my readings are consistent with the AS7341 datasheet: 415nm is the least sensitive channel and 680nm is the most sensitive. I can selectively boost sensor values so that F1-F8 all return the same value under direct sunlight. This is crudely analogous to a camera's color balance (or "white balance") feature, and I implemented the following normalization options in my app (there's a sketch of the shared scaling step after this list):

  • A default normalization curve based on these sunlight values.
  • A direct data option skipping the selective boost.
  • An option to use the next sensor reading as reference. I can point the sensor at something and activate this option to tell my app: "treat this color as white".
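Here is a sketch of the scaling step behind all three options. The factor-building function is my own illustration of the idea, not necessarily how the app implements it.

const bands = ["415nm", "445nm", "480nm", "515nm", "555nm", "590nm", "630nm", "680nm"];

// Build per-band boost factors from a reference reading (sunlight by
// default, or whatever reading the user marks as "white") so that every
// band would report the same value under that reference light.
function buildNormalization(reference) {
  const peak = Math.max(...bands.map((band) => reference[band]));
  const factors = {};
  for (const band of bands) {
    factors[band] = peak / reference[band];
  }
  return factors;
}

// Apply the factors to a new reading. Factors of all 1s correspond to the
// "direct data" option; factors built from the next reading correspond to
// "treat this color as white".
function normalize(reading, factors) {
  const normalized = {};
  for (const band of bands) {
    normalized[band] = reading[band] * factors[band];
  }
  return normalized;
}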

Each of these options has a corresponding button onscreen. Functional, but the jumble of controls is starting to cause usability problems. I built this app, and if I get disoriented by it, how bad would it be for everyone else? It's time to put some effort into layout with CSS.


Code for this project is publicly available on GitHub.