The news: IBM has built RoboRXN, a new chemistry lab that lives in the cloud. It combines AI models, a cloud computing platform, and robots to help scientists design and synthesize new molecules while working from home.
How it works: The online lab platform allows scientists to log on through a web browser. On a blank canvas, they draw the skeletal structure of the molecular compounds they want to make, and the platform uses machine learning to predict the ingredients required and the order in which they should be mixed. It then sends the instructions to a robot in a remote lab to execute. Once the experiment is done, the platform sends a report to the scientists with the results.
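The round trip described above — draw a target structure, let a model predict an ordered recipe, dispatch it to a remote robot, and get a report back — can be sketched in code. This is a minimal illustration only: the class names, the fixed example recipe, and the `predict_recipe`/`run_on_robot` functions are all hypothetical stand-ins, not IBM's actual API.

```python
from dataclasses import dataclass, field

@dataclass
class SynthesisStep:
    action: str      # e.g. "add", "stir", "heat"
    ingredient: str  # reagent, solvent, or intermediate

@dataclass
class Recipe:
    target: str                        # target molecule as a SMILES string
    steps: list = field(default_factory=list)

def predict_recipe(target_smiles: str) -> Recipe:
    """Stand-in for the ML model that maps a drawn structure to an
    ordered list of synthesis instructions. A real model would run
    retrosynthesis; here we return a fixed illustrative recipe for
    aspirin (acetylsalicylic acid)."""
    return Recipe(
        target=target_smiles,
        steps=[
            SynthesisStep("add", "salicylic acid"),
            SynthesisStep("add", "acetic anhydride"),
            SynthesisStep("heat", "reaction mixture"),
        ],
    )

def run_on_robot(recipe: Recipe) -> dict:
    """Stand-in for sending the instructions to the remote lab robot
    and collecting the final report for the scientist."""
    log = [f"{step.action}: {step.ingredient}" for step in recipe.steps]
    return {"target": recipe.target, "executed": log, "status": "complete"}

# Submit a target drawn in the browser (here, aspirin's SMILES string).
report = run_on_robot(predict_recipe("CC(=O)Oc1ccccc1C(=O)O"))
```

The key design point the sketch captures is that the scientist only supplies the target structure; the ordering of ingredients and the execution itself are handled by the platform.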
Why it matters: New drugs and materials traditionally require an average of 10 years and $10 million to discover and bring to market. Much of that time is taken up by the laborious repetition of experiments to synthesize new compounds and learn from trial and error. IBM hopes that a platform like RoboRXN could dramatically speed up that process by predicting the recipes for compounds and automating experiments. In theory, it would lower the costs of drug development and allow scientists to react faster to health crises like the current pandemic, in which social distancing requirements have caused slowdowns in lab work.
Not alone: IBM is not the only one hoping to use AI and robotics to accelerate chemical synthesis. A number of academic labs and startups are working toward the same goal. But the ability to submit molecules remotely and receive an analysis of the synthesized compound is a valuable feature of IBM’s platform, says Jill Becker, CEO of the startup Kebotix: “With RoboRXN, IBM takes an important step to speed up discovery.”