New Language for Programming in Parallel
A new programming language has been designed to get the most out of the latest multicore computer processors. If it finds favor among coders, it could provide more powerful software for many computers.

Over the last few years, as they’ve run up against the physical limits of miniaturization, microchip makers have shifted from increasing the power of processor cores—the part of a chip that handles data and instructions—to adding more cores to a single chip. For example, Intel’s i3 and i7 processors have two and four cores, respectively.
This presents a challenge for programmers. Most programming languages were designed for single-core chips, so it can be tricky to divide a task into pieces and run those pieces on several cores in parallel. If a coder isn't careful, the result is subtle bugs such as race conditions, in which multiple cores access the same shared section of memory in conflicting ways.
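To make the problem concrete, here is a minimal sketch in Java (one of the languages the article compares ParaSail to) of the kind of shared-memory bug described above. The class and variable names are illustrative, not drawn from the article: two threads increment the same counter without any coordination, so some updates are silently lost.

```java
// Illustrative example of a data race: two threads update shared memory
// without synchronization, so the final count is usually lower than expected.
public class RaceConditionDemo {
    static int counter = 0; // shared state touched by both threads

    public static void main(String[] args) throws InterruptedException {
        Runnable work = () -> {
            for (int i = 0; i < 100_000; i++) {
                counter++; // read-modify-write is not atomic; increments can be lost
            }
        };
        Thread t1 = new Thread(work);
        Thread t2 = new Thread(work);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        // Expected 200000, but the printed value is typically smaller.
        System.out.println("counter = " + counter);
    }
}
```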
Tucker Taft, the chief technology officer and chairman of the Boston-based software company SofCheck, designed the new language, called Parallel Specification and Implementation Language (ParaSail), specifically for writing software for multicore processors. The language is intended to avoid the pitfalls that programmers typically run into when working with multicore chips.
To a programmer, ParaSail looks like a modified form of Java or C#, two leading languages. The difference is that it automatically splits a program into thousands of smaller tasks that can then be spread across cores. This trick, called pico-threading, maximizes the number of tasks being carried out in parallel, regardless of the number of cores. ParaSail also checks code for many common errors automatically, work that would otherwise be done by hand during debugging, which makes code safer. “Everything is done in parallel by default, unless you tell it otherwise,” Taft says.
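For readers who want a feel for the underlying idea, the following Java fork/join sketch shows one conventional way to split work into many small tasks that a runtime spreads across cores. This is only an analogy: ParaSail's pico-threading does this splitting implicitly in the language itself, whereas here the programmer must write the splitting by hand, and the class names and threshold below are assumptions for illustration.

```java
// Rough sketch of "split work into many small tasks and let the runtime
// schedule them across cores" using Java's fork/join framework.
import java.util.concurrent.ForkJoinPool;
import java.util.concurrent.RecursiveTask;

public class SumTask extends RecursiveTask<Long> {
    private static final int THRESHOLD = 1_000; // below this size, compute directly
    private final long[] data;
    private final int lo, hi;

    public SumTask(long[] data, int lo, int hi) {
        this.data = data;
        this.lo = lo;
        this.hi = hi;
    }

    @Override
    protected Long compute() {
        if (hi - lo <= THRESHOLD) {
            long sum = 0;
            for (int i = lo; i < hi; i++) sum += data[i];
            return sum;
        }
        int mid = (lo + hi) / 2;
        SumTask left = new SumTask(data, lo, mid);
        SumTask right = new SumTask(data, mid, hi);
        left.fork();                     // run the left half as a separate task
        long rightSum = right.compute(); // compute the right half in this thread
        return left.join() + rightSum;   // wait for the left half and combine
    }

    public static void main(String[] args) {
        long[] data = new long[1_000_000];
        for (int i = 0; i < data.length; i++) data[i] = i;
        long total = new ForkJoinPool().invoke(new SumTask(data, 0, data.length));
        System.out.println("total = " + total);
    }
}
```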
Over the next decade, the number of cores on computer chips is expected to increase even further. “There are some machines out there with dozens or hundreds of cores now,” says Taft.
ParaSail uses a number of other tricks, some of which draw on languages developed in the late 1980s and early 1990s for supercomputers, machines running many individual computer chips networked together. “The design of the language itself is essentially complete,” says Taft, who presented details of the language on Wednesday at the O’Reilly Open Source Convention. “The first version of the compiler will be released in the next month or so.” The language will work on Windows, Mac, and Linux computers.
With Microsoft and Intel putting $20 million into adapting existing languages for multicore processors, it’s difficult to say whether ParaSail will become widely adopted. “There are a lot of people chipping away at the problem, taking existing languages and trying to make them better at handling parallel processing,” says Taft.
Taft already has a proven track record in the world of computer language development, says Denis Nicole of the Dependable Systems and Software Engineering Group at the University of Southampton. But he adds that “it usually takes companies the size of Sun to push new languages on the community.”