As communication, entertainment, and navigation technologies evolve, consumers increasingly expect to access those technologies from anywhere, at any time—including when they’re driving, which can create obvious safety challenges. Ford Motor Company’s SYNC 3 system meets customers’ expectations for reliable, convenient access while ensuring that they keep their eyes on the road and hands on the wheel.
Dearborn, Michigan-based Ford was the first automaker to deliver a fully integrated, updatable platform for mobile devices in its vehicles. The first generation of Ford SYNC debuted in 2007—the same year Apple’s iPhone hit the market, followed by the first Android phone in 2008. In 2010, building on its SYNC platform, Ford developed a smarter, safer way for drivers to access their smartphone apps. The company launched AppLink, the industry’s first solution for voice-controlled access and operation of smartphone apps in the car.
INTRODUCING SYNC 3
The SYNC 3 system, which has already rolled out in select 2016 models, addresses the ever-growing use of mobile devices, particularly smartphones. The number of smartphones in use worldwide doubled in just three years, from 1 billion in 2012 to 2 billion in 2015, according to research by Andreessen Horowitz; the venture-capital firm estimates the user base will double again by 2020.
SYNC 3 offers a fully integrated smartphone, navigation, entertainment, and app experience. In developing SYNC 3, Ford focused on optimizing the driving experience for customers, while prioritizing safety and simplicity. Its screen display is designed for ease of use. SYNC 3 is voice-activated, so using it is as easy as saying “call Mom” to make a phone call or “95.5” to change the radio station.
INCORPORATING DRIVER PRIORITIES
SYNC 3 is built on the capabilities of the existing SYNC technology already installed in more than 12 million Ford vehicles worldwide. In developing the latest version of SYNC, Ford enlisted designers, ergonomists, psychologists, anthropologists—and, of course, engineers—to collaborate in developing a system that’s faster, more intuitive, and easier to use than previous versions.
Most importantly, SYNC 3 was designed with customers in mind. In designing the system, Ford collected feedback from more than 22,000 drivers. “We really understand what people’s priorities are, what matters most to them,” says Parrish Hanna, Ford’s global director of interaction and ergonomics.
Ford engineering teams, including Hanna’s design team, based every decision on both safety factors and ease of use. As just one example, SYNC 3’s background and foreground illumination varies depending on the time of day or night; its daytime screen features a bright background, large buttons, and high-contrast, glare-reducing fonts.
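The day/night behavior described above amounts to selecting a display theme from the local time. The sketch below is purely illustrative—the theme names, daytime window, and function are assumptions for demonstration, not Ford's actual implementation.

```python
from datetime import time

# Illustrative themes only; attributes are hypothetical, not SYNC 3's real settings.
DAY_THEME = {"background": "bright", "buttons": "large", "font": "high-contrast"}
NIGHT_THEME = {"background": "dark", "buttons": "large", "font": "high-contrast"}


def select_theme(now: time) -> dict:
    """Pick a display theme based on the local time of day.

    The 6:00-19:00 daytime window is an assumption; a production system
    would likely use an ambient-light sensor or sunrise/sunset data.
    """
    daytime = time(6, 0) <= now < time(19, 0)
    return DAY_THEME if daytime else NIGHT_THEME
```

A call at noon returns the bright daytime theme; a call at 10 p.m. returns the dark one.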
Hanna’s human-machine interface group designs everything from the physical to the digital, including vehicle controls and adjustments such as door handles and seat controls. Like other Ford engineering teams, his group focuses on putting the user experience first.
“It’s really about understanding humans, and understanding their metric movements,” says Hanna. “So if you think about ergonomics and metrics, what do they reach for? What do they feel? What reaction do you have to the lighting? How readable are things while you’re riding? All that has to be phased in; it all has to be integrated into the car itself.”
FUNCTIONALITY THAT’S FAMILIAR—AND SMART
In short, Ford’s SYNC 3 system offers fully integrated connectivity. Drivers can use simple voice commands or an intuitive touchscreen to control audio and phone functions, vehicle climate control, navigation, and smartphone app functions. They can also use familiar smartphone-like gestures, such as swipes and “pinch-to-zoom,” to navigate the touchscreen. Bottom line, Hanna says: SYNC 3 is about getting people to interact with their vehicles just as they would with a computer or smartphone.
Like a tablet or smartphone, Ford’s SYNC 3 system saves the user’s place: if someone was searching through classic rock songs just before turning off the car, for example, SYNC 3 will display the same screen the next time the driver starts up. “It’s what we call ‘state retention,’ so you never have to use a ‘save’ button,” Hanna says. “It just always remembers who you were, where you were, what you were doing. That’s something that works very well.”
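The “state retention” idea—persisting the last screen automatically so the driver never presses a save button—can be sketched in a few lines. Everything here (class name, fields, the JSON file store) is a hypothetical illustration of the concept, not Ford's code.

```python
import json
from pathlib import Path


class StateRetention:
    """Persist the last UI screen and its context automatically,
    so no explicit 'save' action is ever needed."""

    def __init__(self, store: Path):
        self.store = store  # file used as a simple persistent store

    def save(self, screen: str, context: dict) -> None:
        """Called whenever the screen changes (and at shutdown)."""
        self.store.write_text(json.dumps({"screen": screen, "context": context}))

    def restore(self) -> dict:
        """On startup, return the last saved state, or a default home screen."""
        if self.store.exists():
            return json.loads(self.store.read_text())
        return {"screen": "home", "context": {}}
```

In the article's example, the system would `save("browse_music", {"genre": "classic rock"})` as the driver browses, and `restore()` at the next startup would bring back that same screen.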
Ford engineers designed SYNC 3 for faster performance and seamless AppLink integration, which lets drivers control apps via voice commands while displaying an app’s brand and information on the touchscreen for easier control. SYNC 3 also works with Apple’s Siri Eyes Free for hands-free text messaging on the iPhone. Wi-Fi delivers software updates automatically, requiring no action by the driver.
SYNC 3 also includes advanced safety features such as enhanced 911 Assist, which provides subscription-free, automatic emergency calling in the event of serious accidents, alerting first responders to the vehicle’s location and furnishing other pertinent details such as vehicle speed and number of passengers.
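The kind of information 911 Assist furnishes—location, vehicle speed, occupant count—can be pictured as a small structured payload. The fields and message format below are assumptions for illustration only; Ford's actual data format is not public in this article.

```python
from dataclasses import dataclass


@dataclass
class EmergencyCallPayload:
    """Hypothetical crash-notification record; field names are illustrative."""
    latitude: float
    longitude: float
    vehicle_speed_kph: float
    passenger_count: int


def format_alert(p: EmergencyCallPayload) -> str:
    """Render the payload as a short message for first responders."""
    return (
        f"Crash at ({p.latitude:.4f}, {p.longitude:.4f}); "
        f"speed {p.vehicle_speed_kph:.0f} km/h; "
        f"{p.passenger_count} occupant(s)"
    )
```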
Ultimately, Hanna wants SYNC 3 to be so intuitive that drivers won’t notice how they’re using it—or even that they’re using it at all. “That simplicity and comfort level and speed of responsiveness—people are going to establish that during their first couple of days with the vehicle,” he says. “They will learn the things they want to do really quickly and then forget about it.” The system should be so thoroughly integrated it almost vanishes, he says: “It’s seamless. It’s not intrusive. In the user-interface world, that is our end goal, to some extent: the interface disappears.”
THE FUTURE FOR FORD SYNC SYSTEMS
SYNC 3 made its debut in the summer of 2015 on the 2016 Ford Escape and Ford Fiesta models. Ford plans to integrate the system into its other North American vehicles by late 2016.
As technologies—and the way humans interact with them—evolve, the SYNC system will grow and change as well, Hanna says. The next versions could include even more tightly integrated intuitive controls. “The future is more about intelligent sensors and really triangulating sensors with data,” he says. “The more challenging part is around human cognition. How many features can you present? How many steps can it be? What do you call it? Do they recognize what it’s called? We talk a lot about how people think, what they feel and what they do, so we account for their cognition and their emotions and their behaviors.”