MEDUSA: An open-source and webcam based multispectral imaging system

Graphical abstract


Hardware description
This multispectral camera (MSC), hereafter called MEDUSA, is a tabletop, customizable and cost-efficient multispectral imaging system built on a budget of under 300 USD. It runs on an open Python software framework and an Arduino board for hardware interfacing. Building costs and the user learning curve are a fraction of those required by any comparable multispectral technology. The system illuminates samples with LEDs at multiple wavelengths spanning the visible (VIS) and near infrared (NIR) spectrum.
The quality of the lighting and imaging components was tested by experimentation with research-grade and custom-built standards. Two initial proof-of-concept tests used sets of spectral band images to analyze ink differences on a counterfeit bill and surface bruises on Hass avocado fruits. These are followed by a scatter plot analysis relating red band light reflectance to the nitrogen content of dry plant samples provided by a commercial plant analysis service. Moving beyond simple visual scoring of sample images, potential quantitative approaches are also illustrated by applying principal component dimension reduction to multispectral images of Petri dish grown bacterial colonies. If properly calibrated, this device might find use for non-invasive analyses in areas such as:
Food quality
Counterfeit analysis
Microbial colony phenotyping
Plant chemical analysis
Preliminary spectral analyses on multiple surfaces
The general appearance and main components of MEDUSA are shown in Fig. 1.

Design files
Design files summary
ControlMedusaPC: Arduino firmware with routines to turn the LEDs on and off from the PC; this file is also used to modify the LED light intensity.
PCBSHD and PCBLGH: the first carries the connectors and electronics and is installed onto the Arduino MEGA (AMEGA); the second holds the LED pads.
FRP1 and FRP2: both are part of a hinge mechanism that allows the user to open and close the semisphere.
Baffle: an accessory, needed or not depending on the user's configuration, used to block direct light to the camera. We provide a simple design, but a plain sheet of cardboard serves the same purpose. To avoid undesirable reflections on the sample area, it is painted white on its LED-facing side and black on its sample-facing side.

Build instructions
Full assembly of MEDUSA requires: 1) building the circuitry and the semisphere for even lighting and imaging, 2) installing the software provided in the file repository, and 3) programming the Arduino MEGA board with the firmware provided in the file repository.

Building the lighting and imaging unit
Each major component and its placement is best visualized in Fig. 2. The system requires a semisphere, two PCBs (PCBSHD and PCBLGH) and electronic supplies available from online stores like Amazon. The documentation includes a detailed guide for assembling both circuits. The web camera is a Microsoft HD3000 videoconferencing camera whose IR filter was removed to widen its spectral sensitivity range. The camera control is written in Python using the free library OpenCV [6], whereas the LEDs are controlled by the AMEGA shield.
Building the electronics hardware
We posted a guide for assembling MEDUSA on the source file repository (OSF | MEDUSA: An ultra-low-cost webcam based multispectral imaging system) covering the following topics: 1. Soldering all components onto PCBSHD and PCBLGH. 2. Installing PCBSHD on the AMEGA and both onto PCBLGH. 3. Putting MEDUSA together. 4. Adjusting the light intensity. 5. Removing the camera IR filter.
We strongly recommend that the reader check the assembly guide, especially the section on adjusting the LED light intensity. MEDUSA uses a PWM signal to modulate LED intensity, as do many electronic devices that control screen brightness this way. Adjusting this intensity may be needed for some specific band/sample combinations. For example, when analyzing a plant leaf sample you may need to increase the intensity of the blue or red bands, which are strongly absorbed by chlorophyll.
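As an illustration of the PWM adjustment, the duty cycle sent to the board can be derived from a percent intensity. The `I,<band>,<duty>` command format below is a hypothetical stand-in, not the actual ControlMedusaPC protocol, which is defined in the firmware and assembly guide.

```python
def led_intensity_command(band_nm, percent):
    """Map a percent intensity to an 8-bit PWM duty cycle and pack it
    into a serial command for the Arduino firmware.

    The 'I,<band>,<duty>' format is an illustrative assumption; the
    actual protocol is defined in the ControlMedusaPC firmware.
    Arduino's analogWrite() expects a duty cycle between 0 and 255.
    """
    if not 0 <= percent <= 100:
        raise ValueError("percent must be within 0-100")
    duty = round(percent * 255 / 100)
    return f"I,{band_nm},{duty}\n".encode("ascii")
```

Once connected, such a payload would be sent with pyserial's `Serial.write()`; here we only build the bytes.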
The semisphere was built by hot molding a 2 mm thick Plexiglas sheet over a 14 cm radius CNC machined semisphere piece of solid wood. The inner reflective surface was prepared by first adding two layers of black paint which were further covered with three layers of white paint (Viniltex TM ''Blanco puro 1520").
This lighting device differs from an integrating sphere as presented elsewhere [10]. There, a light ray strikes the wall and lights up the entire inner surface after multiple scatterings. Such a device is bulky for our purposes and difficult to build. We instead used half of an integrating sphere, a semisphere, which is easier to build from any container with a spherical geometry and whose effectiveness for uniform lighting has been tested before [11]. To improve light distribution in the region of photography (ROP), three light sources were placed 8 cm from the center in a triangular pattern around the ROP. This light distribution was experimentally tested by comparing lighting variability inside a region of interest (ROI) of growing radius on images taken with one, two or three LED sources. Fig. 3 presents a lateral view of the semisphere and the relative positions of its components. The camera was inserted directly above the ROP, as seen in Fig. 3. The orientation of the camera should be tested by the user, by previewing an image, right before fixing it to the semisphere.
The baffle shown in Fig. 3 may differ: the design provided in the file repository as MEDUSA_Baffle is just one possible solution for users getting direct light onto the camera sensor. Different geometries or alternative solutions must be defined by the user.
Installing the software and camera configuration
MEDUSA G is the software for easy control of the camera and LEDs. Python is initially installed through Anaconda [1]. Then two additional libraries must be installed: Pyserial [13], which connects the user interface with the AMEGA board, and OpenCV [6], which connects the user interface with the camera. The camera runs in auto mode, except that Python scripting is used to convert RGB images to grayscale. Additional settings can be customized by skilled Python users. The AMEGA board must be programmed with the firmware Medusa_PC_Arduino. The final step is to run the script UI_Medusa_PC_Python within the Anaconda environment, which is bundled and runs along with Spyder [21].
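The RGB-to-grayscale step can be sketched without OpenCV: the stdlib-only function below applies the ITU-R BT.601 luma weights, the same formula OpenCV's `cvtColor(..., COLOR_BGR2GRAY)` uses (the function name and plain-list image format are ours, for illustration only).

```python
def bgr_to_gray(img):
    """Convert a 2-D list of (B, G, R) pixel tuples to 8-bit grayscale
    using the ITU-R BT.601 luma weights (0.299 R + 0.587 G + 0.114 B),
    matching OpenCV's COLOR_BGR2GRAY conversion."""
    return [[round(0.114 * b + 0.587 * g + 0.299 * r) for b, g, r in row]
            for row in img]
```

In MEDUSA G itself the conversion is done on OpenCV arrays captured from the webcam.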

Connection to PC and powering MEDUSA
The connection only requires an AC adapter delivering a 12 V supply voltage and two USB cables plugged into the PC, one for the camera and the other for the AMEGA board.
Using the software
MEDUSA G is presented in Fig. 4. It was written in Python using the graphics library Tkinter [20], which is included by default with any Python distribution. Additionally, it depends on the free libraries pyserial [13], to communicate with the AMEGA board, and OpenCV [6], to communicate with the camera.
MEDUSA G was developed under a Windows environment, but since it was written in Python it can be modified by the user to run under a different operating system. Its functions can be divided into 5 groups:
1. Cam. Port and Ardu. Port are the connecting port numbers for the camera and the AMEGA board. These are integers ≥ 0; for instance, in the Arduino port name COMX, X is an integer ≥ 0. Connect links the camera and the AMEGA to the PC. Preview pops up a preview of a default white-LED-illuminated image. Note that these functions will work only if the camera and AMEGA board are connected to the PC.
2. The Check all/Uncheck all buttons allow mass selection of bands instead of manual band checking.
3. The NOT and WHT checkboxes within the light bands mark, respectively, positions with no LED installed and a white LED used to make a reference VIS image.
4. The ''Do a PCA per set?" checkbox will perform a Principal Component Analysis (PCA) over each set of captured images, producing a dimension-reduced PCA image that recovers most of the variation across each set of images and the pixels therein.
5. No. of sets and Interval (s) are used together to run planned time-lapse experiments: the user defines sets of images in the selected bands, separated by a time interval specified in seconds. Finally, the BEGIN button starts the experiment and turns into a STOP button once clicked.
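The time-lapse behavior of group 5 amounts to a capture loop; a minimal sketch, where `capture_set` is a placeholder for the per-band capture routine, not an actual MEDUSA G function:

```python
import time

def run_timelapse(n_sets, interval_s, capture_set):
    """Sketch of the 'No. of sets' / 'Interval (s)' experiment loop.

    capture_set(i) stands in for capturing one image per selected LED
    band; the name is ours, not from MEDUSA G. The real software also
    flips the BEGIN button to STOP while this loop runs.
    """
    results = []
    for i in range(n_sets):
        results.append(capture_set(i))  # capture set i in all chosen bands
        if i < n_sets - 1:
            time.sleep(interval_s)      # wait the user-defined interval
    return results
```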

Lighting and imaging performance
Below we describe the process used to characterize these components of MEDUSA and its validation through some laboratory experiments.

Quality of LED lighting
Together, the LED bands cover the visible (VIS) and near infrared (NIR) electromagnetic spectrum, as seen in Fig. 5. These bands have an average FWHM of about ±10 nm and were captured in a dark room by reflection on a Labsphere TM certified reflectance standard (CRS), using a FLAME-VIS spectrometer and a halogen light source (both from OceanOptics TM ).

Testing the semisphere and the inner light reflecting surface
The reflectance spectrum from a halogen lamp (HL-2000-FHSA, OceanOptics TM ) on ''Blanco puro 1520 Viniltex TM " white paint matched the spectrum reflected by a Labsphere TM CRS well. Fig. 6 presents the superposed reflected spectra, which were almost identical. To better visualize the differences, an absolute difference plot between the standard and the white paint reflectance is also included.
Fig. 5. Standardized reflectance spectra of the available LEDs. Spectra were obtained by reading the reflection of LED light upon a certified reflectance standard from Labsphere TM . Such readings were captured using an OceanOptics TM FLAME. LED sources are named by the wavelength value at their spectral peak.
Testing the uniform lighting in the sample area and the spectral band sensitivity of the USB camera
Four parameters required to guarantee image consistency were identified, and three tests were designed to evaluate some of these parameters.
1. Absence of light in dark conditions (background noise): To remove this source of error, the camera was programmed to capture a dark, LEDs-off image, which is then subtracted from its subsequent LED-illuminated image.
2. Even light distribution: This guarantees the spatial consistency of the lighting system. Here, a pink paper disk 6 cm in diameter was positioned in the center of the ROP. Three tests were run, lighting with 1, 2 or 3 632 nm centered LEDs respectively. The images were converted to grayscale and then false colored with a 6 Shades color palette in ImageJ to visualize the differences. These differences were in turn quantified by calculating the mean gray value (MGV) and standard deviation (SD) of the pixels inside a circle whose radius grew in successive 0.1 mm steps, from the ROI center up to a final radius of 70 mm; such measurements were made with simple ImageJ scripting.
3. Sensor spectral band sensitivity: The camera sensor should be sensitive to the available LED bands. To test this, the CRS from Labsphere TM was placed in the center of the ROP and a set of images was captured using the available LED bands. Then, the MGV of each image was obtained from a circular ROI inside the CRS. Sensor-saturating bands, evident in images with an MGV of 255, were intensity reduced to reach MGV values lower than 240. As a comparison, the intensity perceived by a FLAME-VIS spectrometer from OceanOptics TM , through an optical fiber placed beside the camera, was also measured.
4. Consistent lighting and imaging performance over time: Time-lapse images of a custom standard, captured under the same lighting conditions, must show the same reflectance. To test this, a pink paper disk 6 cm in diameter was placed in the center of the ROP. Then 100 images were sequentially captured at 10 s intervals using a 525 nm band centered LED. In each image, a circular ROI was located at the exact same position to get its MGV.
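The dark-frame correction of step 1 is a clamped per-pixel subtraction; a minimal sketch on plain 2-D lists (the real software operates on OpenCV image arrays):

```python
def subtract_dark_frame(lit, dark):
    """Remove background noise by subtracting a LEDs-off 'dark' image
    from its LED-illuminated counterpart, clamping at zero so 8-bit
    pixel values never go negative. Images are 2-D lists of ints."""
    return [[max(l - d, 0) for l, d in zip(lit_row, dark_row)]
            for lit_row, dark_row in zip(lit, dark)]
```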
Even light distribution
Fig. 7 visualizes the light distribution over a pink paper disk 7 cm in diameter as the number of lighting LEDs is increased from 1 to 3. The images were taken using a 632 nm band centered LED. Additionally, a light distribution profile was obtained by measuring MGV values, along with their SD, inside a growing circular ROI centered on the paper.
Light distribution improved as the number of LEDs was increased from 1 to 3. Subtle lighting differences are visualized in the images using the 6 Shades color palette available in ImageJ. This is reinforced by the MGVs and SDs represented by the distribution profile plots below each image. Since the LEDs are wired in a parallel circuit configuration, their light intensity (measured as MGV) does not change as the number of sources is increased. The decrease in the SD values is thus linked to the increased number of LEDs. Even though the decrease does not appear to follow a proportional relationship, a 3-LED configuration distributed in a triangular pattern was chosen here, as the variation seems to become constant.
Fig. 7. Light intensity over a 7 cm disc of pink paper as the light sources are increased from 1 to 3 LEDs. For each case, the light intensity profiles were obtained by measuring the MGV (represented by the orange line) inside a growing circular ROI; the shaded region represents the corresponding SD for each MGV measurement. The relative position of the LEDs is represented by a yellow star. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
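The MGV/SD profile inside a growing circular ROI can be sketched in pure Python; the function and argument names are ours, and the actual measurements were made with ImageJ scripting:

```python
import math

def radial_profile(img, cx, cy, max_r, step=1):
    """Mean gray value and SD inside circles of growing radius, as in
    the even-light test. img is a 2-D list of gray values; (cx, cy) is
    the ROI center in pixel coordinates. Returns (radius, mean, sd)
    tuples, one per radius step."""
    profile = []
    r = step
    while r <= max_r:
        vals = [img[y][x]
                for y in range(len(img)) for x in range(len(img[0]))
                if (x - cx) ** 2 + (y - cy) ** 2 <= r * r]
        n = len(vals)
        mean = sum(vals) / n
        sd = math.sqrt(sum((v - mean) ** 2 for v in vals) / n)
        profile.append((r, mean, sd))
        r += step
    return profile
```

A perfectly even light field yields a flat mean and zero SD at every radius, which is the ideal the 3-LED layout approaches.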
It is worth noting that further improvements to the general lighting design, such as diffusive domes, are already available [11] and could be added to MEDUSA or its images by interested users. Geometric and optical adjustments are also user customizable to obtain multispectral transmittance and fluorescence imaging of diverse objects. All possible configurations and improvements are, however, beyond our purpose of deploying the basics of an open and customizable multispectral imaging and analysis platform.

Sensor spectral band sensitivity
The camera sensor must be sensitive to the available LED bands. A set of images was captured targeting the CRS from Labsphere TM . Since some LEDs saturated the camera sensor, their reflectance intensity was reduced to a maximum of 240 MGV to avoid data loss; this process is detailed in the assembly guide. Fig. 8 presents a reference reflectance ''spectrum" for MEDUSA, which might be consulted in case this system is replicated. For comparison, the intensity perceived by a FLAME-VIS spectrometer from OceanOptics TM was also captured. It clearly shows a relatively high sensor response in most bands, although the response is lower for the bands at 419, 592 and 950 nm. Experimentation and a thorough literature review will be required to define the spectral bands to use for specific targets. Nonetheless, the lighting pads are freely customizable. Please consider also that the camera resolution is 8 bits while the FLAME's resolution is 16 bits; hence, the scale difference is not proportional to their specific sensitivities.
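The saturation check described above can be sketched as follows; the function name and the exact use of the 240 threshold are our illustrative reading of the procedure, not code from the repository:

```python
def needs_intensity_reduction(img, threshold=240):
    """Flag a band whose image saturates the 8-bit sensor (pixels at the
    255 ceiling, or a mean gray value above the 240 working threshold);
    such bands had their LED PWM duty lowered until the MGV dropped
    below 240. img is a 2-D list of gray values."""
    pixels = [p for row in img for p in row]
    mgv = sum(pixels) / len(pixels)
    return mgv > threshold or max(pixels) >= 255
```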

Consistent lighting and imaging performance over time
The webcam HD3000 produced a consistent set of 100 images over time. These images were automatically captured at 10 s intervals and converted to grayscale. Fig. 9 shows that the SD across the set was lower than 0.5% of the MGV of a 7 cm radius circle, indicating that this camera can consistently capture sequential images, at least for short-term experiments. Additional experimentation over longer periods of time might be needed for specific users.
Fig. 8. LED band reflectance intensity as perceived by the Webcam HD3000 from Microsoft TM with the infrared filter removed and by a FLAME VIS-NIR spectrometer from OceanOptics TM . These mean gray values were taken by placing a Labsphere TM certified reflectance standard on the measuring area.
Fig. 9. Consistency test where the mean gray value was measured in 100 sequential images under the same lighting conditions. The LED band was centered at 525 nm and a pink paper disk 7 cm in diameter was used as a target. The yellow circle shows the region of interest for the test measurements. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
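The 0.5% stability figure corresponds to the coefficient of variation of the ROI mean gray values across the series; a minimal sketch, assuming a population SD:

```python
import math

def mgv_cv_percent(mgvs):
    """Coefficient of variation (%) of the ROI mean gray values across
    a time-lapse series; the stability test reported values below 0.5%.
    Uses the population standard deviation."""
    n = len(mgvs)
    mean = sum(mgvs) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in mgvs) / n)
    return 100 * sd / mean
```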

Proof of concept tests
We did not make comparative tests against commercial desktop multispectral imaging systems since these were not available. The overall performance of MEDUSA, beyond its lighting and imaging components, was experimentally tested through four proof-of-concept tests. Test one compares the full set of bands taken from a fake and an authentic bill, as well as the regular RGB image. Test two comprises a multi-band image montage of avocado fruits of varying quality; it compares the regular RGB image with several IR imaging bands. Test three shows the potential analytical value of single-band imaging: a 660 nm band was used on a set of laboratory-analyzed plant samples, to quantitatively relate red band light reflectance to their nitrogen content. Finally, a fourth test used multiple imaging bands dimensionally reduced by multivariate statistical analysis. PCA images were obtained by Python scripting, which is provided here, and are presented as dimensionally reduced first principal component (PCA1) images. Final pseudo-color images were obtained in the free software ImageJ [19].
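The PCA1 images of test four reduce a stack of band images to one component value per pixel. The stdlib-only power-iteration sketch below illustrates the same idea on plain 2-D lists; the function name and image format are ours, and the provided scripts may use a standard numerical library instead.

```python
import math

def pca1_image(bands):
    """First-principal-component image from a stack of band images.

    bands: list of equally sized 2-D lists, one per spectral band.
    Each pixel is a point in band-space; we center the data, build the
    band-by-band covariance matrix, find its leading eigenvector by
    power iteration, and project every pixel onto it (PCA1 score)."""
    n_bands = len(bands)
    h, w = len(bands[0]), len(bands[0][0])
    # One row per pixel, one column per band.
    X = [[bands[b][i][j] for b in range(n_bands)]
         for i in range(h) for j in range(w)]
    n = len(X)
    means = [sum(row[b] for row in X) / n for b in range(n_bands)]
    Xc = [[row[b] - means[b] for b in range(n_bands)] for row in X]
    # Band x band covariance matrix.
    C = [[sum(r[a] * r[b] for r in Xc) / (n - 1) for b in range(n_bands)]
         for a in range(n_bands)]
    # Power iteration for the leading eigenvector.
    v = [1.0] * n_bands
    for _ in range(200):
        u = [sum(C[a][b] * v[b] for b in range(n_bands)) for a in range(n_bands)]
        norm = math.sqrt(sum(x * x for x in u)) or 1.0
        v = [x / norm for x in u]
    # Project each centered pixel onto the eigenvector.
    scores = [sum(r[b] * v[b] for b in range(n_bands)) for r in Xc]
    return [[scores[i * w + j] for j in range(w)] for i in range(h)]
```

The resulting score image can then be rescaled and pseudo-colored, e.g. in ImageJ, for visual inspection.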

Detection of counterfeit bills
MEDUSA was used to detect general surface ink differences. Fake and authentic 20,000 COP bills were multispectrally imaged and a comparative montage was made using the available bands and the color image, along with its corresponding red, green and blue channels (Fig. 10). Some of the bands (e.g., 880 nm or 950 nm) clearly show differences between the authentic and the fake bill. These are not easily noticeable in the RGB image.

Avocado fruit damage detection
Five avocado fruits (Persea americana cv. Hass) with different amounts of damage and increasing ripening stage (bottom to top) were obtained from the local market and externally imaged with MEDUSA. A montage was made using the RGB and IR bands. The latter highlighted different surface damages (Fig. 11). Red arrows pinpoint damages better seen in specific bands. Fig. 11 suggests that a color image provides few clues about general fruit quality. However, specific band images show remarkable differences in the fruit skin. Some of these differences might be explained by differences in fruit ripening. Previous research has also shown the feasibility of non-contact estimation of avocado ripeness at harvest, as reported by [23], a comprehensive study of avocado ripening detection using a scientific-grade hyperspectral camera, industrial lighting and refined image analysis algorithms in an external software platform.

Red light reflectance and nitrogen content in leaf plant samples
Nitrogen in plant leaves can be estimated with a standard meter of 650 nm leaf light transmittance known as a SPAD meter [25]. Transmittance is tightly tied to surface reflectance. To preliminarily check the possibility of estimating nitrogen concentration in dry ground plant samples by red light reflectance, we made 120 reflectance images using a 660 nm centered LED band in MEDUSA. These samples comprised leaves from four commodity crops (cocoa, natural rubber, chrysanthemums and bananas), which were provided by a commercial plant analysis laboratory and had previously been analyzed by a standard Kjeldahl nitrogen test. The test showed a strong negative correlation: as leaf nitrogen content increased, the red light reflectance of the dry ground sample images decreased (see Fig. 12). This was expected from well-established theory on optical spectroscopy for plant nutritional analysis, in which increased red light reflectance is associated with decreased leaf nitrogen content [17]. It suggests a potential for MEDUSA in dry analysis of nitrogen, and possibly other plant nutrients, if further statistical models are properly developed and validated.
Fig. 11. Montage of five Hass avocado fruits of increasing quality and ripening (from bottom to top), placed in the same position and at the same distance from the camera. Red arrows highlight band-specific damages unseen in the standard RGB color image and its separate RGB bands. Previous advances in fruit quality detection by multispectral imaging of thick-rind fruits like avocados were presented before [2]. (For interpretation of the references to color in this figure legend, the reader is referred to the web version of this article.)
Fig. 12. MEDUSA-obtained 660 nm light band reflectance of dry ground plant samples and its gross association with their Kjeldahl-extracted nitrogen content. Mean gray values were extracted in ImageJ using a macro function freely available elsewhere [16].
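The association in Fig. 12 can be quantified with a Pearson correlation coefficient; a stdlib-only sketch (the paper's MGVs were extracted with ImageJ, and any predictive model fitting is left to the user):

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation between two equal-length series, e.g. 660 nm
    band mean gray values versus Kjeldahl nitrogen content. Returns a
    value in [-1, 1]; a strong negative r matches the trend in Fig. 12."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    return sxy / math.sqrt(sxx * syy)
```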

Bacterial colony analysis
Tools for automatic analysis of bacterial colony growth might help us understand strategic shifts in microbial metabolism, with clear biological [14] and human health consequences [3]. These shifts are also linked to chemical cues or media changes which can be detected by sophisticated, yet invasive, techniques [24]. Here, we multispectrally imaged bacterial colonies seeded on a 4-compartment Petri dish (see Fig. 13). The colonies were originally obtained from a sterile water dilution of a garden soil sample. The test was conducted on 5 mL of nutrient agar in each dish section.
The inoculum was a 5 µL drop of a bacterial cell suspension that had been stored in a 0.85% sterile NaCl solution. The Petri dish was placed open inside the multispectral chamber at room temperature. The test was carried out in the Soil Microbiology Laboratory of the National University of Colombia at its Medellín campus, which has an average temperature of 22 °C and a relative air humidity ranging between 63% and 73%. A set of images was then obtained at 10 min intervals during 33 h. We chose 4 specific times to highlight the differences between the RGB images and the PCA multispectral images (Fig. 13).
Colonies in the upper left and right compartments were UV fluorescent on King B agar and are made of gram-negative bacilli, presumptively identified as Pseudomonas sp. The diffusely grown colony in the lower right compartment is made of gram-positive, spore-forming bacilli, presumptively identified as Bacillus sp. These colonies are used only to test the system; we expect other microorganisms to exhibit a variable development which can in turn be traced by this multispectral imaging. Some areas in which differences are detectable over time are marked with vertical black arrows, and some areas where changes were only visible in the PCA1 image are marked with horizontal black arrows.

Device limitations
Here we disclose a fully open spectral imaging system which can be adjusted for specific purposes and sample types. Not without drawbacks, our experimental tests show that the sensing and illumination system, based on regular LEDs and a common web camera, can perform well in time-lapse runs of sample imaging. As for any analytical tool, MEDUSA must be calibrated by the user. It might then be necessary to photograph a reflectance standard with all light bands to assure operating consistency and to check for suspicious reflection values before using the system. To ease that task, instructions on LED intensity adjustment are also included in the assembly guide.
It is worth noting that in spectroscopy applications in general, the adequate light intensity for a specific band depends on both the optical properties of the sample being analyzed and the sensor sensitivity in that band. Biases are avoided by keeping a consistent optical setup, so we recommend that the user do some prior experimentation before settling on the most convenient band intensities. The proof-of-concept tests presented here support our point about the discovery potential of this open tool, but they are only glimpses, which must be extended with larger sample numbers before drawing any conclusions. The PCA images yielded easy examples for visual scrutiny, which may inspire the design of further standard analysis systems for samples of interest to builders and users of this system.
For instance, it is apparent from the correlation between light absorption and laboratory-measured N concentration that further plant analysis work with this system may contribute to building new tools for dry plant nitrogen analysis, based on a few, or even a single, light band. These, however, are ambitious goals which lie beyond the scope of this system design and its preliminary tests. Specific enhancements could be directed to:
Adding more bands with a narrower FWHM.
Using a monochromatic camera with increased spectral sensitivity and speed.
Custom software programming and processing guided by hypotheses might also provide new insights into many of nature's hidden clues.
Further use may also reveal additional downsides requiring attention. A remaining challenge for a system like MEDUSA is the speed at which some essential components can change on the market. For instance, there is a good chance of not finding the very same LEDs used in MEDUSA if we were to look for them tomorrow or the day after, although the same is true of many DIY projects. PCA images might therefore not be exactly reproduced in a different build of this system, since changes in LED intensity and spectrum, general hardware differences and even environmental conditions might translate into a different image of the same object. However, dry and contactless variations on sample surfaces can readily be discovered through the specific interactions between light energy and the objects of interest, provided this multispectral tool is available and its components properly tested.
True access to tools fuels advances in science and technology. Despite its downsides, MEDUSA is an open and affordable alternative to speed up discovery, and it can be built and used in many fields by small transdisciplinary teams, without the hassle of rigid and cost-prohibitive access to commercial devices.

Declaration of Competing Interest
The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.