Now that our circuit board from OSH Park is populated and running, it's time to evolve the PIC code beyond displaying a test pattern. The objective of the exercise all along has been to display data sent to the unit over I²C, but the exact details of what to send haven't been finalized.

Stage 1: Raw Bits

This was the easiest, and so was the first thing we implemented. It is the lowest-level way to communicate between the host and the display: the host sends data in the form of raw bytes, where each bit in a byte corresponds to one of the 8 segments in a single digit. The host sending 4 bytes will fill 4 digits. We send each byte directly out to PORT C on the PIC microcontroller, whose pins are connected to the 8 segments of an LED digit.
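As a rough illustration, here is what driving the display in this mode might look like from a Linux host using the smbus2 Python library. The I²C address (0x42) and the bit-to-segment assignment are hypothetical stand-ins for illustration, not the actual values used in this project.

    # Sketch of the "raw bits" protocol from the host side.
    # Assumes a Linux host with the smbus2 library installed;
    # the address 0x42 and the segment bit order are made up for illustration.
    from smbus2 import SMBus, i2c_msg

    DISPLAY_ADDR = 0x42  # hypothetical I2C address of the display

    # One byte per digit, one bit per segment. The host has to know
    # exactly which bit drives which segment of the LTC-4627JR.
    frame = [0b00111111,  # digit 1: pattern intended to show "0"
             0b00000110,  # digit 2: pattern intended to show "1"
             0b01011011,  # digit 3: pattern intended to show "2"
             0b01001111]  # digit 4: pattern intended to show "3"

    with SMBus(1) as bus:
        bus.i2c_rdwr(i2c_msg.write(DISPLAY_ADDR, frame))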

This method is powerful in the sense that it allows the host to display arbitrary patterns, but it is terribly unfriendly to use. The program running on the host has to be tailored to the implementation details of our display: the host has to know which bit corresponds to which segment, and whether a bit value of 0 or 1 corresponds to lit or dark. This means that if the user wants to swap in a different display, they would have to rewrite the host code.
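To make that concrete, here is the kind of display-specific lookup table the host program ends up carrying around in this mode. The bit assignments below are invented for illustration; a different display, or even a different wiring of this one, would force the host to rebuild the whole table.

    # Hypothetical segment-to-bit assignment the host must hard-code
    # for the "raw bits" protocol (invented values, not the real wiring).
    SEG_A, SEG_B, SEG_C, SEG_D = 0x01, 0x02, 0x04, 0x08
    SEG_E, SEG_F, SEG_G, SEG_DP = 0x10, 0x20, 0x40, 0x80

    # The host also has to know how to compose segments into characters.
    DIGIT_PATTERNS = {
        "0": SEG_A | SEG_B | SEG_C | SEG_D | SEG_E | SEG_F,
        "1": SEG_B | SEG_C,
        "7": SEG_A | SEG_B | SEG_C,
        # ... and so on for every character the host ever wants to show.
    }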

This system served its purpose of proving we could light the LEDs, but it is not a good way forward.

Stage 2: Hexadecimal Decode

In this system, the bytes sent by the host are decoded into their hexadecimal representation and displayed on the digits. Since a hexadecimal digit represents 4 bits, the host sending 2 bytes (16 bits) will fill all 4 digits. This is actually a pretty useful mode for certain debugging operations where we do want to see those raw values. However, it is very difficult to represent human-friendly information this way. We're also unable to make full use of the LTC-4627JR: there's no way to represent the decimal point or the colon in the middle.
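Under this scheme the host only has to pack its value into two bytes. A sketch, again with the same hypothetical address and helper library as before:

    # Sketch of the "hexadecimal decode" protocol: the PIC turns each
    # nibble into a hex character, so 2 bytes fill all 4 digits.
    # Address 0x42 and the smbus2 host setup are assumptions.
    from smbus2 import SMBus, i2c_msg

    DISPLAY_ADDR = 0x42

    value = 0xBEEF                 # 16-bit value we want to inspect
    frame = [(value >> 8) & 0xFF,  # upper byte -> left two digits
             value & 0xFF]         # lower byte -> right two digits

    with SMBus(1) as bus:
        bus.i2c_rdwr(i2c_msg.write(DISPLAY_ADDR, frame))  # shows the hex digits of 0xBEEF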

Since it is useful for machine-level (not human-readable) data debugging, we might want to retain this capability in the form of a special mode later. In the meantime, let's move on.

Stage 3: Binary-Coded (Hexa)Decimal

The next evolution resembled a binary-coded decimal system, separating each digit out into its own byte. This makes the host program easier to write because each character can be treated individually instead of having to pack 2 characters into the upper and lower 4 bits of a byte. Unfortunately it shares the same limitation: we still can't represent the decimal point or the colon.
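In this mode the host sends one byte per digit, each holding just the value to show. A sketch under the same assumptions as the earlier examples:

    # Sketch of the byte-per-digit ("binary-coded hexadecimal") protocol:
    # each byte carries one digit's value (0-15), no nibble packing required.
    # Address 0x42 and the smbus2 host setup are assumptions.
    from smbus2 import SMBus, i2c_msg

    DISPLAY_ADDR = 0x42

    digits = [0x0B, 0x0E, 0x0E, 0x0F]   # the same 0xBEEF, one digit per byte

    with SMBus(1) as bus:
        bus.i2c_rdwr(i2c_msg.write(DISPLAY_ADDR, digits))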


Which brings us to the latest approach:

Stage 4: String

Since most computer operations that result in human-readable information end up with the string data format, we're going to try using that as our I²C protocol. It is established, well-understood, and a piece of cake to use from high-level programming languages like Python. The downside is that it is much more verbose: the character sequence to light up all the LEDs is 8.8.:8.'8. which requires 10 bytes to represent, a five-fold increase in bandwidth relative to the 2-byte hexadecimal decode. Will this added bandwidth cause problems? I don't know yet; we'll find out.

But for now, we have a very user-friendly interface. Sending the string "79.2F" resulted in the picture attached to this post. The easy interface also enabled a very short example program.
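A minimal sketch of what such a program could look like, assuming the same hypothetical address and a Linux host with smbus2 (the actual example lives in the project repository):

    # Sketch of the string protocol: send the text as ASCII bytes and
    # let the PIC map each character (and '.' / ':') onto the display.
    # The address 0x42 is a placeholder; see the project repository
    # for the real example program.
    from smbus2 import SMBus, i2c_msg

    DISPLAY_ADDR = 0x42

    def show(text):
        with SMBus(1) as bus:
            bus.i2c_rdwr(i2c_msg.write(DISPLAY_ADDR, text.encode("ascii")))

    show("79.2F")   # the readout captured in the photo below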

(The project discussed in this blog post is publicly available on GitHub.)

[Photo IMG_5272: the display showing "79.2F"]