Future Electronics – FLIR Lepton LWIR Thermal Imaging Camera Demo on Microsemi IGLOO2 Creative Board

Often engineers choose microcontrollers over Field Programmable Gate Arrays (FPGAs) because they assume MCUs are cheaper, easier to program and more reliable. Yet an FPGA can implement a microcontroller, while the reverse is not possible. An FPGA is a device that lets a developer synthesize digital circuits: it can be reconfigured, reprogrammed and redesigned in millions of different ways to fit your needs. The key difference between the two devices is that FPGAs are optimized for parallel, pipelined designs, while microcontrollers are optimized for serial designs. Nevertheless, an FPGA can also implement serial systems, as in the demo presented here: thermal imaging on the Microsemi IGLOO2 Creative Board.

FLIR Lepton LWIR Thermal Imaging Camera and Microsemi IGLOO2 FPGA
The Future Electronics System Design Center (SDC) developed a thermal imaging video streaming demo using the FLIR Lepton LWIR Thermal Imaging Camera and the Microsemi IGLOO2 Creative Board (Figure 1).


Figure 1: FLIR Lepton LWIR Thermal Imaging Camera mounted on Microsemi IGLOO2 FPGA

The video streaming pipeline is a cascade of three main blocks (Figure 2):
1) continuous collection of pixels captured by the FLIR thermal sensor over
a Video Over SPI (VoSPI) protocol,
2) data processing by the FPGA, and
3) display on a PC Application (GUI).


Figure 2: FLIR Lepton Thermal Module and Microsemi IGLOO2 FPGA Block Diagram

FLIR Lepton Thermal Module
The thermal camera module, with a resolution of 80 x 60 pixels, is the most compact longwave infrared (LWIR) sensor available as an OEM product. The LWIR camera module is smaller than a dime and ten times less expensive than a traditional IR camera. The camera is controlled by the FPGA: following a synchronization event triggered by the FPGA, it streams a continuous sequence of VoSPI frames. Provided that synchronization is maintained, a VoSPI stream can continue indefinitely. The Libero SoC project therefore includes a dedicated Camera Control Interface to establish and maintain synchronization.

IGLOO2 FPGA Libero SoC Project
The biggest challenge of the project is the system's synchronization. Designing with an FPGA is like playing with Lego: your idea is a collection of elementary blocks which, combined together, build the system. The first block is the Camera Control Interface (Figure 3), which provides the timing reference for the system's control unit. This block is enabled after a 185 ms delay, to guarantee the thermal sensor's synchronization, and only when the FIFO is empty and the UART is ready to transmit. These last two conditions matter because the data must be buffered in SRAM long enough to acquire one full frame before it is transmitted to the UART. The FIFO consists of a FIFO Controller and an SRAM. The FIFO Controller transfers one byte at a time from the Camera Control Interface to the SRAM; once the SRAM has accumulated one video frame, the FIFO Controller transmits everything to the UART, emptying the memory. This fill-then-drain design can be seen as a water-filling scheme, like the one used in communication systems design.


Figure 3: Block Diagram of FLIR Lepton Thermal Imaging on Microsemi Creative Board IGLOO2

Another aspect to take into account is the speed of the Camera Control Interface and of the UART. The Camera Control Interface has to work within the range of 2 MHz to 20 MHz specified in the FLIR Lepton datasheet, and the UART needs to operate at a frequency where packets are not overwritten and the Camera Control Interface is enabled at the right time. Therefore, the Camera Control Interface clock is set to 20 MHz and the UART clock to 24 MHz, with a baud rate of 460800.

To simplify the management of synchronization, the Camera Control Interface communicates with the thermal sensor and executes four main synchronization actions, as specified in the FLIR Lepton datasheet:
• De-assert the Chip Select and idle the SPI clock for at least 5 frame periods (>185 ms), which times out the VoSPI interface and puts the Lepton in the proper state to establish (or re-establish) synchronization.
• Assert the Chip Select and enable the SPI clock to allow the Lepton to start transmitting the first packet.
• Examine the ID field of each packet to identify discard packets.
• Continue reading packets. When a new frame is available (should be less than 39 msec after asserting the Chip Select and reading the first packet), the first video packet will be transmitted. The master and slave are now synchronized.
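As a software sketch, the four steps above can be modelled as follows. The FakeVoSPI class and its method names are hypothetical stand-ins for the FPGA's SPI logic, used only to illustrate the control flow:

```python
class FakeVoSPI:
    """Hypothetical stand-in for the Lepton's SPI link, for illustration only."""
    def __init__(self, packets):
        self.packets = iter(packets)
        self.cs_asserted = False

    def deassert_cs_and_idle_clock(self, duration_ms):
        self.cs_asserted = False          # real hardware waits here (>185 ms)

    def assert_cs_and_enable_clock(self):
        self.cs_asserted = True

    def read_packet(self, length):
        return next(self.packets)


def establish_sync(spi):
    # Step 1: de-assert CS and idle SCK for at least 5 frame periods (>185 ms).
    spi.deassert_cs_and_idle_clock(duration_ms=185)
    # Step 2: assert CS and enable SCK so the Lepton starts transmitting.
    spi.assert_cs_and_enable_clock()
    # Steps 3-4: examine the ID field of each packet and skip discard
    # packets (ID = xFxx) until the first video packet arrives.
    while True:
        packet = spi.read_packet(164)
        if (packet[0] & 0x0F) != 0x0F:
            return packet                 # master and slave are now in sync
```

In the FPGA this loop is a state machine rather than software, but the decision logic is the same.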

The Camera Control Interface starts the communication with the thermal module and, as packets are received, processes them. In its default configuration, the camera transmits 164-byte packets: 4 bytes for the ID and CRC, and 160 bytes for the payload (Figure 4). The payload carries the temperature values of 80 pixels, each encoded on 14 bits; every pixel of a packet (one video line) occupies 2 bytes.

ID        CRC       Payload
2 bytes   2 bytes   160 bytes

Figure 4: Generic Video Packet
ID     CRC    Payload
xFxx   xxxx   Discard data (same number of bytes as video packets)

Figure 5: Discard Packet

As specified by the Lepton module, at the beginning of SPI video transmission, until synchronization is achieved, and also in the idle period between frames, the Lepton transmits discard packets until a new frame is available from its imaging pipeline. The 2-byte ID field of a discard packet is always xFxx, where 'x' signifies a "don't care" nibble (Figure 5).

If a discard packet is detected, the Camera Control Interface disables communication with the thermal sensor, raising the SPI Chip Select and stopping the SPI clock for the length of the packet. If, instead, a valid packet is detected, it is sent to the FIFO, stored in the SRAM, and forwarded to the UART as soon as a full video frame (60 packets) has been acquired.
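In software terms, this filtering and buffering stage behaves roughly like the sketch below (names are illustrative, not taken from the Libero project):

```python
FRAME_PACKETS = 60     # packets (video lines) per frame
PACKET_BYTES = 164     # ID + CRC + payload

def accumulate_frame(packets):
    """Drop discard packets and collect 60 valid packets into one
    60 x 164-byte frame buffer, mimicking the FIFO Controller + SRAM."""
    frame = bytearray()
    for packet in packets:
        if (packet[0] & 0x0F) == 0x0F:    # discard packet (ID = xFxx): skip
            continue
        frame += packet
        if len(frame) == FRAME_PACKETS * PACKET_BYTES:
            return bytes(frame)           # full frame, ready for the UART
    raise ValueError("stream ended before a full frame was acquired")
```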

A loss of synchronization can be caused by three main violations (Figure 6):
• Intra-packet timeout. Once a packet starts, it must be completely clocked out within 3 line periods.
• Failing to read out all packets for a given frame before the next frame is available.
• Failing to read out all available frames.


Figure 6: Synchronization Diagram

Given all of these synchronization conditions, managing each block is quite tricky. Furthermore, due to the complexity of the synchronization logic, the design is low-level oriented, minimizing multi-paths and combinatorial processes.

In cascade with the Camera Control Interface, the FIFO interfaces the system control unit with the UART. One video frame is stored before being transferred to the UART, reducing the complexity of the synchronization logic. The FIFO consists of a FIFO Control unit, with the Prefetch option enabled, and an external SRAM. The goal of this design is to make the system fast and efficient so that the Lepton can keep its synchronization. Therefore, the FIFO Control and the SRAM are both set to read and write 60 x 164 bytes (one full video frame). Both have a write frequency matching the Camera Control Interface clock (20 MHz) and a read frequency matching the UART clock (24 MHz). The speeds differ because writing is driven by a serial interface (the Camera Control Interface), while reading is driven by the Universal Asynchronous Receiver Transmitter (UART). Once the FIFO is full, the UART, clocked at 24 MHz, starts receiving the data and sending it to the PC through the USB cable at a baud rate of 460800 bps.
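A back-of-envelope check of these numbers shows why a whole frame is buffered before transmission (assuming the usual 10 bits per UART byte: one start, eight data, one stop):

```python
FRAME_BYTES = 60 * 164            # one full VoSPI frame held in SRAM
BITS_PER_UART_BYTE = 10           # 1 start + 8 data + 1 stop bit
BAUD = 460800

frame_time_s = FRAME_BYTES * BITS_PER_UART_BYTE / BAUD
print(f"{FRAME_BYTES} bytes per frame, "
      f"~{frame_time_s * 1000:.0f} ms to transmit over the UART")
```

At 460800 baud, each 9840-byte frame takes roughly 214 ms to send, so under these assumptions it is the UART link, not the camera, that bounds the displayed frame rate.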

PC Application
The USB throughput is a continuous sequence of 164-byte packets, while each video line is 80 pixels (160 bytes) long (Figure 7), ordered by increasing ID number within one video frame (a 60-row table, Figure 8); therefore, as its first action, the GUI decimates each video frame to 60 x 160 bytes.

Byte 0    Byte 1    Byte 2    Byte 3    ...   Byte 158   Byte 159
Line m    Line m    Line m    Line m    ...   Line m     Line m
Pixel 0   Pixel 0   Pixel 1   Pixel 1   ...   Pixel 79   Pixel 79

Figure 7: 1 Video Line per 160 byte payload
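The decimation step can be sketched in a few lines of Python (a software model of what the GUI does; the function and constant names are ours):

```python
PACKET_LEN = 164   # bytes per USB packet
HEADER_LEN = 4     # ID (2 bytes) + CRC (2 bytes)
LINES = 60         # packets per frame

def decimate_frame(raw):
    """Strip the 4-byte ID+CRC header from each of the 60 packets,
    keeping only the 160-byte pixel payload of every video line."""
    return b"".join(
        raw[i * PACKET_LEN + HEADER_LEN:(i + 1) * PACKET_LEN]
        for i in range(LINES)
    )
```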

The image is built like a brush sweeping the screen from left to right and from top to bottom, ending at the bottom-right corner (Figure 8).


Figure 8: 1 Video Frame, 60 x 80 pixels
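Decoding one 160-byte payload into its 80 pixel values can be sketched as below, assuming the two bytes of each pixel arrive most-significant byte first with 14 significant bits (our illustration of the byte layout described above):

```python
def payload_to_pixels(payload):
    """Decode one 160-byte video line into 80 pixel values.
    Each pixel spans 2 bytes; only the low 14 bits are significant."""
    return [((payload[2 * i] << 8) | payload[2 * i + 1]) & 0x3FFF
            for i in range(80)]
```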

The video streaming contains information about the temperature of each section of the image, represented with different colors (Figure 9: GUI Video Streaming).


Figure 9: GUI Video Streaming

The Lepton uses a histogram-based AGC algorithm, which converts the full-resolution thermal image into a contrast-enhanced image suitable for display. The simplest alternative is a linear mapping from 14-bit to 8-bit; however, when a scene includes both cold and hot regions, linear AGC can produce an output image in which most pixels are mapped to either full black or full white, with very little use of the greyscale (8-bit) values in between (Figure 9). By default, the histogram used to generate the Lepton's 14-bit to 8-bit mapping function is collected from the full array.
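A minimal sketch of the plain linear 14-bit to 8-bit mapping, the simple approach whose shortcomings motivate the histogram-based AGC (our illustration, not FLIR's implementation):

```python
def linear_agc(pixels14):
    """Linearly map 14-bit pixel values onto the 0-255 display range.
    With both very cold and very hot regions in the scene, most values
    collapse toward 0 or 255, the artefact described above."""
    lo, hi = min(pixels14), max(pixels14)
    span = max(hi - lo, 1)            # guard against a flat scene
    return [(p - lo) * 255 // span for p in pixels14]
```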


Figure 10: Histogram for a 3 x 3 Pixel Area

In conclusion, a serial system with complex synchronization requirements can be designed by building elementary blocks and then combining them into a puzzle of synchronization signals, processing and control processes. For this type of design, it is very important to carefully analyze the system and the hardware (the Microsemi IGLOO2 Creative Board and the FLIR Lepton LWIR Thermal Imaging camera) to optimize the final solution.
