The Power Of Seeing Beyond The Capabilities Of The Human Eye

The different colors that we can see correspond to different wavelengths of light. The human eye can detect and differentiate wavelengths in three bands (red, green, and blue) covering a range of roughly 450 to 650 nanometers, but we cannot see the light that falls outside that range. A technology called hyperspectral imaging can give an enhanced view of what is going on in the world around us. Specialized cameras use prisms to separate incoming light into as many as 300 bands and then digitize the energy detected in each band. These cameras have a huge range of potential applications: monitoring greenhouse gas emissions, telling the difference between mixed clear plastics, or measuring the ripeness of fruit on a packing line.
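To make the idea concrete, here is a minimal sketch of how hyperspectral data is typically organized: a "cube" with two spatial dimensions and one spectral dimension, so every pixel carries a full spectrum rather than three color values. The frame size, wavelength range, and NumPy representation below are illustrative assumptions, not the specification of any particular camera.

```python
import numpy as np

# A hyperspectral "cube": two spatial dimensions plus one spectral dimension.
# Frame size and wavelength range are illustrative assumptions.
HEIGHT, WIDTH, BANDS = 480, 640, 300

# Assumed wavelength axis: 400-1000 nm sampled in 300 bands.
wavelengths_nm = np.linspace(400, 1000, BANDS)

# Simulated sensor output: one radiance value per pixel per band.
cube = np.random.rand(HEIGHT, WIDTH, BANDS).astype(np.float32)

# Unlike an RGB image (3 values per pixel), every pixel here carries a full
# spectrum that can be matched against known material signatures.
pixel_spectrum = cube[240, 320, :]                                # (300,)
band_650nm = cube[:, :, np.argmin(np.abs(wavelengths_nm - 650))]  # (480, 640)
print(pixel_spectrum.shape, band_650nm.shape)
```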

There are several manufacturers of these hyperspectral cameras, but at least for now they are quite expensive, starting at around $20,000. The camera-specific software they use is not easy to integrate with other systems. The other challenge that comes with this expanded view of the world is the sheer volume of data: these cameras generate around one gigabit of data per second!
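That figure is easy to sanity-check with back-of-the-envelope arithmetic. Only the 300-band count comes from the article; the frame size, bit depth, and frame rate below are assumptions chosen for illustration.

```python
# Rough check of the ~1 gigabit-per-second figure.
# All parameters except the band count are assumptions, not camera specs.
width, height = 640, 480    # spatial pixels per frame (assumed)
bands = 300                 # spectral bands (from the article)
bit_depth = 12              # bits per sample (assumed, typical for such sensors)
frames_per_second = 1       # full-cube capture rate (assumed)

bits_per_second = width * height * bands * bit_depth * frames_per_second
print(f"{bits_per_second / 1e9:.2f} Gbit/s")  # ~1.11 Gbit/s
```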

A company called Metaspectral is seeking to expand the potential of hyperspectral imaging by offering a combination of hardware and software that makes this data source more user-friendly. Its "device-agnostic" edge devices run compression algorithms, can be connected to any hyperspectral camera, and turn the camera's data output into a manageable flow. The company's proprietary Fusion AI platform can be used to interface with familiar user software, drive robotics, or feed artificial intelligence and deep learning systems.
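Metaspectral's algorithms are proprietary, so the sketch below shows only one generic, well-known idea of the same flavor: neighboring spectral bands are highly correlated, so a cube can often be projected onto a small number of spectral principal components with little loss. The synthetic data and the retained component count are assumptions for illustration, not the company's method.

```python
import numpy as np

rng = np.random.default_rng(0)
pixels, bands, k = 10_000, 300, 8   # k = retained spectral components (assumed)

# Synthetic scene: each pixel's spectrum is a mixture of a few smooth
# "endmember" spectra plus noise, which is what makes real hyperspectral
# data so compressible along the spectral axis.
wavelengths = np.linspace(0, 1, bands)
endmembers = np.stack(
    [np.exp(-((wavelengths - c) ** 2) / 0.02) for c in (0.2, 0.5, 0.8)]
)
abundances = rng.dirichlet(np.ones(3), size=pixels)
cube = abundances @ endmembers + 0.01 * rng.standard_normal((pixels, bands))

mean = cube.mean(axis=0)
centered = cube - mean
_, _, vt = np.linalg.svd(centered, full_matrices=False)
basis = vt[:k]                      # top-k spectral principal components

scores = centered @ basis.T         # (pixels, k): the compressed stream
restored = scores @ basis + mean    # approximate reconstruction downstream

print(f"~{bands / k:.0f}:1 spectral compression, "
      f"RMS error {np.sqrt(((restored - cube) ** 2).mean()):.4f}")
```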

Metaspectral recently raised $4.7 million in a seed round of funding from SOMA Capital, Acequia Capital, the Government of Canada, and angel investors including Jude Gomila and Alan Rutledge. The company was co-founded by Francis Doumet (CEO) and Migel Tissera (CTO). Tissera describes their offering as follows: “We have developed novel data compression algorithms which allow us to shuttle hyperspectral data better and faster, whether from orbit-to-ground or within terrestrial networks. We combine that with our advances in deep learning to perform sub-pixel level analysis, allowing us to extract more insights than conventional computer vision because our data contains more information on the spectral dimension.”

Indeed, hyperspectral imaging can be employed at very different scales. One of the most developed applications of Metaspectral’s system uses close-up cameras on sorting lines for mixed recycling material, where it differentiates clear plastics by chemical composition so that they can be sorted into the extremely pure streams required for re-processing.
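As a hedged illustration of how spectra can separate materials that look identical to the eye, here is a classical baseline technique, the spectral angle mapper. Metaspectral's actual system relies on deep learning, and the polymer "library" below is a synthetic placeholder rather than real plastic spectra.

```python
import numpy as np

def spectral_angle(a: np.ndarray, b: np.ndarray) -> float:
    """Angle between two spectra; a small angle means similar material."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return float(np.arccos(np.clip(cos, -1.0, 1.0)))

bands = 300
rng = np.random.default_rng(1)
# Hypothetical reference library: one averaged spectrum per polymer type.
library = {name: rng.random(bands) for name in ("PET", "HDPE", "PP")}

# A noisy observation of a PET pixel from the sorting line (simulated).
observed = library["PET"] + 0.05 * rng.standard_normal(bands)

best = min(library, key=lambda name: spectral_angle(observed, library[name]))
print(best)  # expected: PET
```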

Canada’s largest waste recycler is now using this system, and there are other close-up applications for quality assurance on assembly lines or fruit-sorting lines.

At the other extreme, the camera can generate data from a satellite, where each pixel of the image represents a 30 m x 30 m square (900 square meters). The Canadian Space Agency is using that approach to track greenhouse gas emissions and even to estimate soil carbon sequestration in farmed or forested land by comparing flux rates over time. The technology is also slated for future deployment on the International Space Station. Forest wildfire risk assessment is another potential application, for example to guide actions such as prescribed burns.

Another option, of particular use for agriculture, is to deploy the cameras on drones flying at 50-100 meters. At that altitude, each pixel of data can represent an area of just 2 cm by 2 cm, and the ability to monitor so many different wavelengths could allow early detection of invasive weeds, insect activity, and fungal infections before they are visible to humans, as well as early indications of water or nutrient deficiencies or crop maturity parameters to guide harvest timing. It might also be possible to track greenhouse gas or ammonia emissions from farmed soils to better understand how those are influenced by specific farming practices such as reduced tillage, cover cropping, variable-rate fertilization, or “controlled wheel traffic.” At this time, what is needed is a good deal of “ground truthing” research to connect the imaging data with measurements of the variables in question, but that will be much easier with the data compression and interface capabilities available from Metaspectral.
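As a sketch of the kind of per-pixel crop indicator such drone imagery enables, the example below computes NDVI, a standard vegetation index built from one red and one near-infrared band. The band centers, the stress threshold, and the simulated frame are illustrative assumptions, not values from the article.

```python
import numpy as np

bands = 300
wavelengths_nm = np.linspace(400, 1000, bands)   # assumed spectral axis
cube = np.random.rand(200, 200, bands).astype(np.float32)  # simulated frame

def band_at(cube, wavelengths, target_nm):
    """Extract the image plane for the band nearest a target wavelength."""
    return cube[:, :, np.argmin(np.abs(wavelengths - target_nm))]

red = band_at(cube, wavelengths_nm, 670.0)   # chlorophyll absorbs here
nir = band_at(cube, wavelengths_nm, 800.0)   # healthy leaves reflect strongly

# NDVI ranges roughly from -1 to 1; higher values indicate denser,
# healthier vegetation. The flagging threshold is an assumption.
ndvi = (nir - red) / (nir + red + 1e-8)
stressed = ndvi < 0.3
print(f"{stressed.mean():.1%} of pixels flagged for inspection")
```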

One hope is that the diverse applications of hyperspectral imaging facilitated by the Metaspectral platform will create sufficient demand for the cameras to push manufacturing further down the cost-learning curve.

Source: https://www.forbes.com/sites/stevensavage/2022/12/14/the-power-of-seeing-beyond-the-capabilities-of-the-human-eye/