Metaspectral, a software company advancing computer vision with deep learning and hyperspectral imagery, has closed a $4.7 million seed round from SOMA Capital, Acequia Capital, the Government of Canada, and notable angel investors including Jude Gomila and Alan Rutledge.

The company plans to use the investment to scale its team and support the continued development and refinement of its Fusion platform, which is set to launch publicly this fall.

Fusion makes it easy for users with or without technical expertise to train and deploy deep learning models that analyze hyperspectral imagery in real time. Hyperspectral images capture information from across the electromagnetic spectrum, making it possible for computer vision to identify the chemical composition and other invisible properties of materials.
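To make the idea concrete, the following is a minimal, illustrative sketch, not Metaspectral's actual method, of how a hyperspectral data cube stores a full spectrum at every pixel and how those spectra might be matched against a reference material signature. The array dimensions, the placeholder data, and the cosine-similarity matching step are assumptions for demonstration only.

```python
import numpy as np

# Illustrative hyperspectral "cube": height x width x spectral bands.
# A conventional RGB image would have only 3 values in the last axis.
height, width, bands = 128, 128, 224          # assumed dimensions
cube = np.random.rand(height, width, bands)   # placeholder reflectance data

# Assumed reference spectrum for a material of interest (e.g. a plastic polymer).
reference = np.random.rand(bands)

# Cosine similarity between every pixel's spectrum and the reference:
# pixels whose spectra closely match the reference are flagged.
flat = cube.reshape(-1, bands)
similarity = (flat @ reference) / (
    np.linalg.norm(flat, axis=1) * np.linalg.norm(reference) + 1e-12
)
mask = (similarity > 0.95).reshape(height, width)   # arbitrary threshold
print(f"Pixels matching the reference spectrum: {mask.sum()}")
```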

“The platform can visually detect defects on a manufacturing line, classify plastic polymers, and quantify greenhouse gas levels on the Earth’s surface, among countless other applications,” said Francis Doumet, Metaspectral CEO and Co-Founder. “We have spent the last three years developing this technology, and it is already being used in aerospace, defense, agriculture, manufacturing, and other significant industries.”

“Metaspectral is perfectly positioned to service the diverse needs of both enterprise and government clients to inform better, more immediate decision-making. The team has a clear vision and we are excited to support this next stage of the company’s growth,” said Aneel Ranadive, Managing Director and Founder of SOMA Capital.

The technology is also planned for deployment on the International Space Station to demonstrate real-time compression, streaming, and analysis of hyperspectral data from Low Earth Orbit (LEO). The company’s client list includes organizations such as the Canadian Space Agency, Defence Research and Development Canada (DRDC), and one of the largest recyclers in Canada.

“Hyperspectral images include up to 300 unique spectral bands instead of the usual three that conventional color cameras capture. This results in a tremendous volume of data that our technology is uniquely designed to handle,” added Migel Tissera, CTO and Co-Founder of Metaspectral. “We have developed novel data compression algorithms that allow us to shuttle hyperspectral data more efficiently, whether from orbit to ground or within terrestrial networks on Earth. We combine this with our advances in deep learning to perform subpixel-level analysis, allowing us to extract more insights than conventional computer vision because our data contains more information in the spectral dimension.”
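As a rough back-of-the-envelope illustration of that data volume (the spatial resolution and bit depths below are assumptions for the sake of arithmetic, not figures from Metaspectral), a single 300-band hyperspectral frame works out to roughly two hundred times the size of the equivalent RGB frame:

```python
# Back-of-the-envelope comparison of raw frame sizes (assumed parameters).
height, width = 1024, 1024        # assumed spatial resolution
rgb_bands, hs_bands = 3, 300      # conventional color vs. hyperspectral
rgb_bytes, hs_bytes = 1, 2        # assumed 8-bit RGB vs. 16-bit hyperspectral samples

rgb_size = height * width * rgb_bands * rgb_bytes
hs_size = height * width * hs_bands * hs_bytes

print(f"RGB frame:           {rgb_size / 1e6:.1f} MB")    # ~3.1 MB
print(f"Hyperspectral frame: {hs_size / 1e6:.1f} MB")     # ~629.1 MB
print(f"Ratio:               {hs_size / rgb_size:.0f}x")  # 200x
```

At sizes like these, moving uncompressed frames from orbit to ground or across terrestrial networks quickly becomes impractical, which is why compression figures so prominently in the quote above.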

Metaspectral is currently hiring deep learning engineers and scientists, remote sensing scientists, and full-stack engineers. A full list of open positions can be found at Metaspectral.com.

About Metaspectral

Metaspectral delivers the next generation of computer vision software, capable of remotely identifying materials and determining their composition, condition, abundance, and other properties, such as defects, that are invisible to conventional cameras. It achieves this by leveraging hyperspectral sensors and analyzing the captured data in real time with artificial intelligence (AI) via its scalable, cloud-based platform. The software is already deployed across a range of industries, including aerospace, defense, agriculture, and manufacturing.