Several years back, Amnon Shashua was merely a very eloquent voice for self-driving car technology popping up at trade shows, usually to packed crowds.
Then came Intel’s roughly $15 billion purchase of Mobileye, the company Shashua co-founded, completed in April of last year. Now part of one of the biggest semiconductor companies in the world, Shashua leads a key effort of Intel Chief Bob Swan’s drive to diversify the company beyond its server- and PC-chip focus.
Last week, Shashua was in New York for an intellectually stimulating conference about AI called Nexus:Israel, put on by the Hebrew University of Jerusalem, where Shashua holds a teaching post. Shashua was the morning keynote speaker.
He took some time out from the conference to tell ZDNet about how Intel is planning the future of autonomous vehicles.
He also offered his thoughts on the general state of AI.
A lot of what’s going on these days, both in academia and in industry, with AI, Shashua opined, is “tweaking.”
“They’re coming up with another topology here, another there,” the kinds of things that don’t make profound advances, he believes. “The big players, such as Google, they have things that have been tremendously successful, like BERT, like the Transformer,” he observed, referring to two popular neural network approaches to natural language processing. But, he said, “there are no big inventions there.”
Shashua thinks much of the current course of AI will “reach a glass ceiling at some point.” What’s needed, he suggested, is for researchers to look at “areas where these networks are not successful, things like natural language understanding,” and not just keep improving in areas such as image and text processing where they have had success.
Some of the old approaches will come back in a big way, he predicted. “There can be a combination of knowledge representations, and temporal logic, things of the past that we forgot because they weren’t effective before,” he said, referring to ideas such as symbolic logic that dominated before the advent of modern machine learning.
Shashua’s own academic work has offered interesting angles on things such as what deep learning can teach quantum computing.
When it comes to autonomous cars, the work of Mobileye consists, broadly speaking, in understanding, “What are the key issues to put a self-driving car on the road from a commercial perspective,” as he puts it.
The over-arching requirement, what an AI researcher would call the “objective function,” is to not have an accident, of course.
That goal involves lots of work by Intel and others, said Shashua, in understanding the human ability to assess risk on the road, such as the sense of whether a braking incident is going to lead to a collision. For a long time, autonomous vehicles will be sharing the road with human drivers, so they must have an ability to cope with human behavior.
“We need to be able to model that human ability,” he said.
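The kind of risk estimate Shashua is describing can be illustrated with a toy calculation: given the gap to a braking lead car and both cars’ speeds and deceleration rates, check whether the following car can stop before closing the gap. A minimal sketch in Python; the function, its parameters, and the numbers are all illustrative, not Mobileye’s actual safety model (which, among other things, would account for driver or system reaction time):

```python
def will_collide(gap_m, v_follow, v_lead, a_follow, a_lead):
    """Toy risk check (illustrative only, not Mobileye's model).

    Compares the following car's stopping distance against the gap
    plus the lead car's stopping distance, using d = v^2 / (2a).
    Speeds in m/s, decelerations in m/s^2 (positive values).
    """
    stop_follow = v_follow ** 2 / (2 * a_follow)
    stop_lead = v_lead ** 2 / (2 * a_lead)
    return stop_follow > gap_m + stop_lead

# Both cars at 25 m/s (~90 km/h); lead car brakes harder (8 m/s^2)
# than the follower can (6 m/s^2). At a 10 m gap, trouble:
print(will_collide(10.0, 25.0, 25.0, 6.0, 8.0))  # True
# At a 30 m gap, the follower stops in time:
print(will_collide(30.0, 25.0, 25.0, 6.0, 8.0))  # False
```

Even this crude version shows why the problem is hard: the answer flips on the follower’s assumptions about how hard the car ahead might brake, which is exactly the human judgment Shashua says must be modeled.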
On a more prosaic level, as a business for Intel, “autonomous driving is unfolding in two phases,” he said.
Phase one is moving Intel into the business of mobility-as-a-service, or MaaS. Intel can operate networks of public transportation that will move citizens around “smart cities.”
While Intel was giving its analyst day presentation last week, Shashua was in London, at an event that featured a deal to gather road data from vehicles to build those smart cities. “It’s just the tip of the data opportunity,” said Shashua in a video to promote the work.
The big picture for Intel is a $160 billion worldwide market for autonomous fleets such as taxis.
In this first phase of autonomy, fleets as small as a “few thousand” robo-taxis will be sufficient to provide transportation for a city. “It’s not a huge fleet,” he observed.
The first such services are expected to go live in 2022. Within a decade, Shashua said, there will be hundreds of cities worldwide with such fleets. In his view, they will rely on Intel for cloud-based services such as the routing of all those vehicles.
In other words, it’s not the typical “Intel inside” model of just selling chips, though chips will be part of it. It’s also the software and the infrastructure that can be sold on a usage or traffic basis. “The full stack,” as Shashua puts it.
Already, such efforts will boost Mobileye’s sales of its system-on-a-chip into the tens of millions of units, starting this year, he said. Sales of Mobileye processors totaled $209 million last quarter, and were rising at a nice 38% rate as the company continues to sign up automakers.
That’s phase one. Phase two is when ordinary people buy an autonomous vehicle. Yes, people will still own cars, Shashua believes. By 2025, he said, Mobileye will produce the essential parts in volume to make economical self-driving vehicles a reality for a consumer audience.
A key part of bringing down costs is to provide an alternative to a dominant form of car vision, LIDAR.
“LIDAR needs to come down in cost significantly,” he said. LIDAR works by time-of-flight calculation: laser pulses are sent out, and the round-trip time of their reflections is used to map the road ahead. “It’s all based on a burst of laser, and then steering it, and it’s costly,” he observed.
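The time-of-flight arithmetic itself is simple: the pulse travels to the target and back at the speed of light, so range is half the round trip. A short sketch (function name and example figures are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def tof_distance_m(round_trip_s):
    """Range from a laser pulse's round-trip time.

    The pulse covers the distance twice (out and back),
    so divide the total path c * t by two.
    """
    return C * round_trip_s / 2

# A pulse that returns after 200 nanoseconds:
print(tof_distance_m(200e-9))  # ~30 m
```

The hard, expensive part is not this calculation but what Shashua points to: generating the laser bursts and mechanically or optically steering them across the scene.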
Using Intel’s technology for what’s known as silicon photonics, the company will provide a different approach, which he described as “mimicking radar using coherent light.”
“If you can imagine it, you can put all of the laser functions on a single chip,” he said. “That tells you there is a horizon for new technology that’s going to make this possible.”
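“Mimicking radar using coherent light” suggests a frequency-modulated continuous-wave (FMCW) scheme, in which range is recovered from the beat frequency between the outgoing chirp and its echo rather than from pulse timing. Assuming that is the technique meant, a hypothetical sketch (all parameter values are illustrative):

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def fmcw_range_m(beat_hz, chirp_s, bandwidth_hz):
    """FMCW range estimate (illustrative sketch, not Intel's design).

    A chirp sweeping `bandwidth_hz` over `chirp_s` seconds is delayed
    by 2R/c on the round trip. Mixing the echo with the outgoing chirp
    gives a beat frequency proportional to range:
        f_beat = (B / T) * (2R / c)  =>  R = c * f_beat * T / (2 * B)
    """
    return C * beat_hz * chirp_s / (2 * bandwidth_hz)

# Example: a 1 GHz sweep over 10 microseconds, measuring a 20 MHz beat:
print(fmcw_range_m(20e6, 10e-6, 1e9))  # ~30 m
```

Because the measurement reduces to detecting a frequency rather than timing a pulse, the signal chain resembles radar electronics, which is what makes an on-chip, silicon-photonics implementation plausible.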
What will people do in their autonomous vehicles? Mainly sit in the back seat and maybe read or watch a video, he predicted.