Intelligent Machines

How Moss Helped Machine Vision Overcome an Achilles’ Heel

There are some things machine vision just cannot recognize well. Now a research project to identify moss has found a way to overcome this limitation.

In recent years, deep-learning algorithms have revolutionized the way machines recognize objects. State-of-the-art algorithms easily outperform humans in identifying ordinary things such as tables, chairs, cars, and even faces.

But these algorithms have an Achilles’ heel: there are some things they just cannot see. For example, machine vision is not good at recognizing things like grasses and herbs, because they have amorphous forms that are hard to define.

A table generally has four legs and a flat surface, features that machine learning is good at identifying. By contrast, grasses and herbs of the same species can be different sizes and have different numbers of leaves, seeds, and so on, depending on the growing conditions. That makes it hard for machine vision to recognize them, particularly if they aren’t in flower.

Machines find it similarly hard to identify trees from aerial imagery or crops from satellite images. What’s needed is a new approach that can train deep-learning algorithms to work their magic on objects with ambiguous form.

Enter Takeshi Ise and pals at Kyoto University in Japan. These guys have developed a simple technique that helps deep-learning machines to recognize these amorphous plants. They’ve put the new technique through its paces by teaching it to recognize different types of moss, a plant with a hard-to-define form.

The team is well placed to study moss, given Kyoto’s famously warm and wet climate, which promotes its growth. Ise and co began by photographing moss at a traditional Japanese garden in Kyoto, called Murin-An, where it is cultivated.

They identified three kinds of moss and photographed each one individually, as well as in areas where all three grow together alongside other non-moss plants and features. Each picture was taken with a digital camera, such as an Olympus OM-D E-M5 Mark II, with a 50 mm lens (or equivalent) from a distance of 60 centimeters directly above the moss mats. Each image measures 4608 x 3456 pixels.

The goal for their deep-learning algorithm is to identify the different types of moss in a single image and to distinguish the moss from other objects and plants.

Their method is straightforward. To train the algorithm, the team divides up each image of a specific moss into much smaller regions of 56 x 56 pixels, with 50 percent overlap. In this way, the original photographs generate some 90,000 smaller images, of which they use 80 percent for training their algorithm and the rest for testing it.
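The chopping step itself is easy to reproduce. Here is a minimal sketch of what such patch extraction might look like in Python; the file paths, output naming, and helper function are illustrative assumptions, not code from the paper.

```python
# Minimal sketch of the "chopped picture" idea: slide a 56 x 56 window
# across a large photo with 50 percent overlap (stride 28) and save each
# patch as its own small image. File paths and names are hypothetical.
from PIL import Image

PATCH = 56           # patch edge length in pixels
STRIDE = PATCH // 2  # 50 percent overlap

def chop_picture(path, out_dir, label):
    img = Image.open(path)  # e.g. a 4608 x 3456 moss photograph
    w, h = img.size
    count = 0
    for top in range(0, h - PATCH + 1, STRIDE):
        for left in range(0, w - PATCH + 1, STRIDE):
            patch = img.crop((left, top, left + PATCH, top + PATCH))
            patch.save(f"{out_dir}/{label}_{count:06d}.png")
            count += 1
    return count

# Hypothetical usage: chop one uniform moss-mat photo into training patches.
# n = chop_picture("polytrichum_mat.jpg", "patches/polytrichum", "polytrichum")
```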

Although the training images were taken of uniform mats of a specific type of moss, these mats can contain small regions of other mosses. So the team examined all the training images and removed the images of alien mosses by hand. That left images of three types of moss (Polytrichum, Trachycystis, and Hypnum) as well as non-moss features. All of the training images could then be labeled as one of these types and fed into the deep-learning machine.
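At this point the task reduces to ordinary image classification on labeled patches. The paper's exact network is not described here, so the small convolutional classifier below is an assumed stand-in for illustration, not the authors' model; the variable names and training call are likewise hypothetical.

```python
# Illustrative patch classifier: a small convolutional network that maps
# 56 x 56 RGB patches to one of four labels (three mosses plus "non-moss").
# This architecture is an assumption for illustration only.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4  # Polytrichum, Trachycystis, Hypnum, non-moss

model = models.Sequential([
    layers.Input(shape=(56, 56, 3)),
    layers.Rescaling(1.0 / 255),          # scale pixel values to [0, 1]
    layers.Conv2D(32, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Conv2D(64, 3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(128, activation="relu"),
    layers.Dense(NUM_CLASSES, activation="softmax"),
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Hypothetical training call: x_train holds the hand-cleaned 56 x 56 patches,
# y_train the integer labels; roughly the 80/20 split the article describes.
# model.fit(x_train, y_train, validation_split=0.2, epochs=10)
```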

The results are impressive. Using this method, the algorithm quickly learned to recognize each type of moss with good accuracy. When the researchers let the algorithm loose on a single image showing various types of moss, it was able to accurately identify the mosses in different areas of the image. “The model correctly classified test images with accuracy more than 90%,” they say.
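Applying the trained classifier to a mixed scene follows the same chopping logic: slide a window across the photo, classify each patch, and paint the prediction back onto a coarse label map. The sketch below reuses the assumed `model`, `PATCH`, and `STRIDE` from the earlier snippets and is, again, an illustration rather than the authors' code.

```python
# Sketch of patch-wise inference on a mixed scene: classify every 56 x 56
# window and record the predicted class on a coarse grid, which can then be
# rendered as a per-region moss map. Assumes `model`, PATCH, and STRIDE
# from the earlier sketches.
import numpy as np
from PIL import Image

def classify_scene(path):
    img = np.asarray(Image.open(path).convert("RGB"))  # H x W x 3 array
    h, w = img.shape[:2]
    rows = (h - PATCH) // STRIDE + 1
    cols = (w - PATCH) // STRIDE + 1
    label_map = np.zeros((rows, cols), dtype=np.int64)
    for i in range(rows):
        for j in range(cols):
            top, left = i * STRIDE, j * STRIDE
            patch = img[top:top + PATCH, left:left + PATCH]
            probs = model.predict(patch[np.newaxis], verbose=0)
            label_map[i, j] = int(np.argmax(probs))
    return label_map  # each cell holds the predicted class of one region
```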

The algorithm does better for some types of moss than others. “The estimated performance for Polytrichum is 99% [recognition accuracy], Trachycystis is 95%, and Hypnum is 74%,” say Ise and co.

The lower accuracy for Hypnum is because this plant is more amorphous than the others, with less well-defined forms of growth. By contrast, Polytrichum has a distinctive, well-defined shape.

The team says there are various ways of improving the accuracy, such as building a training set of photographs taken at different times of the year, when the Hypnum moss in particular can look more distinctive. Or the white balance on the digital camera could be standardized to get more accurate color rendition for each moss.

In any case, the results show significant promise for the future. The technique could be applied to aerial imagery to better identify trees and plants in images taken from above. That would be hugely useful for stock-taking in the wild or in large managed areas such as farms and forests.

In the meantime, Ise and co say they plan to develop an app that allows people to identify moss using a smartphone. That could prove popular with gardeners.

Ref: arxiv.org/abs/1708.01986: Identifying 3 Moss Species by Deep Learning, Using the "Chopped Picture" Method
