Atima Lui grew up in Kansas as the descendant of American slaves and the daughter of a Sudanese refugee, and she remembers trying on makeup with a friend for the first time as a child. Her friend had lighter skin. “As soon as she put it on my face,” Lui says, “there was nothing we could do to make it look good.” She’d discovered the cosmetic industry’s long-running assumption that “nude” means white or light.
Lui is now deploying an AI-based app called Nudemeter to try to fix that problem. Through photos and a short quiz, it determines a user’s skin color, accounts for how the skin is illuminated, predicts changes in skin tone through the year, and helps consumers of any complexion choose makeup colors that work with their skin.
Lui has managed to build a business around Nudemeter, but her goals go beyond the technology itself. Growing up, she says, she was shaped and hurt by society’s assumptions about “who gets to be an entrepreneur, or who gets to be a technologist.” That’s something else she’s trying to fix.
Photo by Ashley Soong
Jiwei Li applies deep reinforcement learning—a relatively new technique in which neural networks learn by trial and error—to natural-language processing (NLP), the field of computer science in which programs are made to manipulate human languages.
By using deep reinforcement learning to identify syntactic structures within large pieces of text, Li made machines better at extracting semantic information from them. Syntax refers to the grammatical relationship between words, while semantics refers to their meaning.
In written language, words with a close semantic relationship are not always close together on the page. A verb and its object can be separated by a string of adjectives or a subordinate clause, for example. Previous attempts at getting machines to parse natural language often overplayed the importance of proximity, leading to obvious mistakes. Li’s machine-learning algorithms find the grammatical structure of a sentence to get a much more reliable sense of the meaning. They have become a cornerstone of many NLP systems.
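The idea can be sketched in miniature. The dependency parse below is hand-annotated for an invented sentence (Li's algorithms learn such structure automatically rather than being given it); extracting a verb's object from the parse succeeds where a naive nearest-noun proximity heuristic fails.

```python
# Each token: (word, index of its head token, dependency relation).
# Hand-annotated toy parse of: "The senator criticized, after weeks of
# silence, the new budget proposal."
sentence = [
    ("The", 1, "det"),
    ("senator", 2, "nsubj"),
    ("criticized", 2, "root"),
    ("after", 2, "prep"),
    ("weeks", 3, "pobj"),
    ("of", 4, "prep"),
    ("silence", 5, "pobj"),
    ("the", 10, "det"),
    ("new", 10, "amod"),
    ("budget", 10, "compound"),
    ("proposal", 2, "dobj"),
]

NOUNS = {"senator", "weeks", "silence", "budget", "proposal"}

def nearest_noun_object(sent, verb_idx):
    # Proximity heuristic: take the first noun after the verb.
    for word, _, _ in sent[verb_idx + 1:]:
        if word in NOUNS:
            return word

def dependency_object(sent, verb_idx):
    # Syntax-aware: take the token attached to the verb as direct object.
    for word, head, rel in sent:
        if head == verb_idx and rel == "dobj":
            return word

print(nearest_noun_object(sentence, 2))  # proximity picks "weeks" -- wrong
print(dependency_object(sentence, 2))    # grammar picks "proposal" -- right
```

The intervening prepositional phrase ("after weeks of silence") is exactly the kind of separation between a verb and its object that defeats proximity-based parsing.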
Li grew up in China and studied biology at Peking University before moving to the US, where he began a PhD in biophysics at Cornell. But he soon switched fields, turning to NLP first at Carnegie Mellon and then at Stanford, where he became the first student ever to obtain a computer science PhD in less than three years.
Li has also explored other ways to teach artificial intelligence how to spot patterns in linguistic data. In 2014 he and his colleagues correlated Twitter posts with US meteorological data to see how weather affected users’ mood. First he labeled 600 tweets by hand as happy, angry, sad, and so on. He used this labeled data to train a neural network to assess the mood of a tweet, then matched the inferred moods against local weather, via geolocation data, for about 2% of all the tweets published in 2010 and 2011.
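The label-then-train pipeline can be sketched in miniature. Li trained a neural network on his 600 hand-labeled tweets; the sketch below substitutes a tiny naive Bayes classifier and an invented nine-tweet training set, purely to show how hand-labeled examples become a mood predictor for unseen text.

```python
import math
from collections import Counter, defaultdict

# Invented miniature stand-in for Li's 600 hand-labeled tweets.
train = [
    ("love this sunny day", "happy"),
    ("great mood today", "happy"),
    ("so happy and sunny", "happy"),
    ("miss home in the rain", "sad"),
    ("tired and sad tonight", "sad"),
    ("gloomy rain again", "sad"),
    ("hate this traffic", "angry"),
    ("so hot and angry", "angry"),
    ("this heat is awful", "angry"),
]

# Per-mood word counts, plus the vocabulary for add-one smoothing.
word_counts = defaultdict(Counter)
class_totals = Counter()
vocab = set()
for text, label in train:
    for tok in text.split():
        word_counts[label][tok] += 1
        class_totals[label] += 1
        vocab.add(tok)

def classify(text):
    """Return the most likely mood label under a naive Bayes model."""
    best_label, best_score = None, float("-inf")
    for label in word_counts:
        # Uniform prior over moods; sum smoothed log-likelihoods per token.
        denom = class_totals[label] + len(vocab)
        score = sum(
            math.log((word_counts[label][tok] + 1) / denom)
            for tok in text.split()
        )
        if score > best_score:
            best_label, best_score = label, score
    return best_label

print(classify("tired of this rain"))  # an unseen "tweet"
```

A neural network replaces the hand-built word counts with learned features, but the structure is the same: labeled examples in, a mood predictor out.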
His results were not surprising. Moods worsened when it rained; people expressed anger when it was hot. But for Li it was a lesson in how hidden information could be extracted from large amounts of text.
After finishing his studies in 2017, he moved back to Beijing and founded an NLP startup called Shannon.ai, which now has dozens of employees and $20 million in funding from venture capitalists. Li’s company is building on the pattern-matching work demonstrated in the Twitter weather study to develop machine-learning algorithms that extract economic forecasts from texts including business reports and social-media posts.
Li has also applied deep reinforcement learning to the challenge of generating natural language. For him it is the obvious next step. “Once you have learned to read, you can learn to write,” he says.
Even the best chatbots still make obviously stupid mistakes, spewing out non sequiturs or displaying a lack of basic common knowledge about the world. The longer a conversation, the harder it is for an AI to keep track of what’s been said. Li’s techniques give AI a good grasp of linguistic structure. In a conversation, keeping track of subjects and objects is easier if the syntax of utterances is explicit. For example, given the question “Shall we get started?” a bot might answer “Of course!”—but that response could follow any question. Li’s technique can instead give responses more like “Yes. We’ve got a lot of work to do here,” referencing the content of the original query.
Photo by David Vintiner
Modern Electron has applied a modern twist to an old technology. By using computer simulations and novel materials, the Seattle startup has made a new type of thermionic converter, a heat engine first developed in the 1950s, that’s more efficient than the old model at turning heat into electricity.
Cofounder and CEO Tony Pan believes his company can use the technology to convert home boilers and furnaces, which generally use natural gas or oil to heat water and homes, into mini residential power plants that produce electricity on site. He says this would be a far cheaper and more efficient way of generating residential power than a central power plant, particularly when coupled with home solar panels and batteries.
A thermionic converter consists of a pair of metal plates separated by a vacuum. Heat—from, say, the flame of a furnace—agitates and excites the electrons on one plate to the point that they leap across the gap to the cooler plate, generating an electric current. In one application, Modern Electron has rolled the metal plates into a tube that resembles a lightsaber handle and fits over a gas burner.
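The physics of that electron "leap" is captured by the Richardson–Dushman law for ideal thermionic emission, J = A·T²·exp(−W/kT): emitted current density rises steeply with plate temperature T and falls with the material's work function W. The work function and temperature below are illustrative textbook-scale values, not Modern Electron's (unpublished) material parameters.

```python
import math

A = 1.20173e6   # Richardson constant, A / (m^2 K^2)
K_B = 8.617e-5  # Boltzmann constant, eV / K

def emission_current_density(work_function_ev, temp_k):
    """Ideal emitted current density (A/m^2) from a hot cathode,
    per the Richardson-Dushman law."""
    return A * temp_k**2 * math.exp(-work_function_ev / (K_B * temp_k))

# A low-work-function emitter (~2 eV) at a flame-scale 1500 K:
print(emission_current_density(2.0, 1500.0))
```

The exponential term is why material choice matters so much: lowering the work function even slightly raises the emitted current dramatically, which is where the startup's novel materials come in.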
Homeowners could rely on rooftop solar panels much of the time, turning to Modern Electron’s system during the night, on cloudy days, or in the winter months. If adopted widely, the product could reduce our reliance on electricity from centralized coal or natural-gas plants, which waste vast amounts of energy between burning fuels and delivering power over hundreds of miles of transmission lines. That, in turn, could reduce greenhouse-gas emissions from the power sector, Pan says.
The company’s technology also works with other fuels. So if residential heating systems eventually shift toward low- or zero-emissions sources like hydrogen, a change some companies and regions are exploring, the thermionic converter could make a bigger dent in pollution.
Pan believes his device could have an even bigger impact in developing countries. Enabling communities to set up their own mini power plants would allow them to skip the massive investments of money and time required to build centralized generation and distribution systems. That could bring electrification to rural areas faster.