
Why Cars Will Become The Ultimate Mobile Device

A look at autonomous cars and all the software and solutions they will require.

If fully automated cars are the future, Silicon Valley will have to do away with at least one infamous adage: move fast and break things. When coding for cars, it isn’t as easy to test and pull back features that don’t work as expected in the wild. All of a car’s features have to operate bug-free.

Forecasts abound on when fully automated cars will hit public roads, but Andreessen Horowitz partner Benedict Evans says very few expect full autonomy within the next five years, and most estimates tend closer to ten. Boston Consulting Group estimates annual global sales of 12 million fully (not partially) autonomous vehicles by 2035, but heavyweights like Ford and GM are already offering sneak peeks at their plans. Elon Musk claimed Tesla would have a demo vehicle able to drive from California to New York with “no controls touched at any point during the entire journey” by the end of 2017.

In a sense, automated cars are the ultimate “mobile device,” requiring both new interfaces and new intuition behind controlling the vehicle. As this industry grows, so too does the need for engineers who understand how to build software for the entire platform: from the applications in the dashboard, to the overall intelligence and image-processing algorithms that help the car navigate the unpredictable aspects of driving, to sensors for the engine and parts.

“They’re trying to create a car that not only replicates human behavior, but can be smart and above human behavior,” says Autonet founder and veteran automotive software executive Sterling Pratz. These skills will (at least) include developing software for high-powered sensors, understanding computer vision algorithms, deep learning and neural networks, kinematics, automotive hardware, and parts. None of those are new — but it’s the combinations that are paving the way for new types of engineers.


Building up the stack

Every year since the Model T, cars have taken work from the driver: key-start ignition, automatic transmissions, cruise control, automatic doors, power steering, anti-lock brakes, onboard navigation, distance sensors for parking, automatic parallel parking, and home charging stations for electric vehicles.

Those have been the incremental changes, and they still leave cars at Level 2 on the Society of Automotive Engineers’ zero-to-five scale of driving automation. At Level 3 the system monitors the environment and drives, but the driver must be ready to take over when prompted; at Level 4 the car monitors and drives itself within defined conditions, with no driver intervention needed; Level 5 means the car handles everything, everywhere, on its own. Leaping to the point where people can trust taking their hands off the wheel offers engineers a huge opportunity: this field requires a combined understanding of deep learning techniques, data science, software engineering, and sensor processing. Engineers in each of these disciplines will need to be comfortable merging “resting data,” like buildings and intersections, with streaming data, such as a child running into the road; a minimal sketch of that merge follows.
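
To make that distinction concrete, here is a minimal C++ sketch. The types, numbers, and toy lane model are all hypothetical, not from any production autonomy stack: the static map supplies lane geometry at rest, while a streaming detection is extrapolated forward to ask whether it will enter the lane.

```cpp
#include <cmath>
#include <iostream>
#include <string>

struct MapLane {                 // "resting" data: pre-surveyed lane geometry
    double centerY;              // lane center in a local frame (meters)
    double halfWidth;            // half the lane width (meters)
};

struct Detection {               // "streaming" data: a live sensor track
    std::string label;           // e.g. "child"
    double y;                    // lateral offset from lane center (meters)
    double vy;                   // lateral velocity toward the lane (m/s)
};

// Merge the two: will this moving object enter the mapped lane soon?
bool entersLane(const MapLane& lane, const Detection& d, double horizonSec) {
    double predictedY = d.y + d.vy * horizonSec;  // constant-velocity prediction
    return std::abs(predictedY - lane.centerY) < lane.halfWidth;
}

int main() {
    MapLane lane{0.0, 1.8};               // static map knowledge
    Detection child{"child", 5.0, -2.5};  // running toward the road
    if (entersLane(lane, child, 2.0))
        std::cout << child.label << " may enter the lane; brake early\n";
}
```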

That’s not to say every component of an automated car offers significant opportunity for development. Evans believes sensors — Light Detection and Ranging (LiDAR) especially — will go the way of most hardware and become a commodity. And dashboard applications will mostly be dominated by Uber, Lyft, Netflix, Hulu, Spotify, Pandora, Waymo, and onboard navigation apps. Most remaining apps will make more sense used from a phone anyway.

Those incumbents will still offer jobs, but in terms of industry growth, “the place to look is not within the cars directly but still further up the stack,” Evans notes, “in the autonomous software that enables a car to move down a road without hitting anything, in the city-wide optimization and routing.” In other words, focus on the software that builds and operates the system, not the dashboard console. Two automated cars hit a stop sign at the same time — which goes first?
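
Evans’ stop-sign question is, at bottom, a coordination protocol. Here is a hedged sketch of one possible tie-break rule, with every name and rule invented for illustration: arrival order first, then the familiar yield-to-the-right rule, then a deterministic fallback so two cars running identical software never deadlock.

```cpp
#include <cstdint>
#include <iostream>

struct Vehicle {
    std::uint64_t id;     // unique fleet identifier
    double arrivalTime;   // seconds, from a shared synchronized clock
    int heading;          // 0 = N, 1 = E, 2 = S, 3 = W
};

// Returns true if vehicle a should proceed before vehicle b.
bool hasRightOfWay(const Vehicle& a, const Vehicle& b) {
    if (a.arrivalTime != b.arrivalTime)           // first to stop goes first
        return a.arrivalTime < b.arrivalTime;
    // Simultaneous arrival: yield to the car on your right, as humans do.
    if (b.heading == (a.heading + 3) % 4) return false;  // b is on a's right
    if (a.heading == (b.heading + 3) % 4) return true;   // a is on b's right
    return a.id < b.id;   // head-on tie: fall back to a deterministic rule
}

int main() {
    Vehicle northbound{42, 100.0, 0}, westbound{7, 100.0, 3};
    // Westbound is on the northbound car's right, so it proceeds first.
    std::cout << (hasRightOfWay(northbound, westbound) ? "northbound" : "westbound")
              << " proceeds first\n";
}
```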

“…focus on the software that builds and operates the system, not the dashboard console.”

David Silver leads a self-driving car engineering course at online learning hub Udacity and agrees with Evans. He notes most web programming today occurs in scripting languages like JavaScript, Ruby, and PHP, which are interpreted or compiled while the program runs — a process that costs precious milliseconds. He sees languages like C++, more often used in traditional desktop software development, as key for operating automated cars on the street, because the code is compiled ahead of time, cutting execution time and making the car more responsive. However, C++ does require more memory-management know-how, so engineers making the jump from web development will need to hone those skills, he says.
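
A brief sketch of what that memory-management know-how can look like in practice (the class and numbers are illustrative, not from any real vehicle codebase): allocate once at startup, reuse the buffer on every scan, and let RAII release it at a predictable moment.

```cpp
#include <cstddef>
#include <memory>
#include <vector>

struct LidarPoint { float x, y, z, intensity; };

class ScanBuffer {
public:
    explicit ScanBuffer(std::size_t maxPoints) {
        points_.reserve(maxPoints);               // one allocation, up front
    }
    void beginScan() { points_.clear(); }         // reuse memory; no free/alloc
    void push(const LidarPoint& p) {
        if (points_.size() < points_.capacity())  // never grow mid-scan
            points_.push_back(p);
    }
    const std::vector<LidarPoint>& points() const { return points_; }
private:
    std::vector<LidarPoint> points_;
};

int main() {
    // RAII: the unique_ptr frees the buffer automatically at scope exit,
    // at a deterministic moment rather than whenever a collector runs.
    auto buffer = std::make_unique<ScanBuffer>(200000);
    buffer->beginScan();
    buffer->push({1.0f, 2.0f, 0.5f, 0.9f});
}
```

Reserving capacity up front trades a little memory for the guarantee that the control loop never stalls inside an allocator, which is exactly the kind of latency web developers rarely have to think about.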

So far Silver sees most software developers gravitating toward the deep learning/machine learning side of automated cars — image recognition and processing data from LiDAR sensors. “That resembles traditional engineering because it has less of the mechanical and robotics background,” Silver says. The popular open source machine intelligence library TensorFlow, for example, uses Python for its primary API while running the heavy numerical work in pre-compiled native code underneath.

Bryan Salesky, CEO of Argo AI, was recently asked about the number one barrier for fully autonomous cars in cities. “For us, it’s about detecting, seeing, and understanding the world. And going one step beyond that, which is predicting what other actors are going to do.” Any driver knows the importance of eye contact. Now engineers have to train cars to work off signals with equal confidence. Does that jogger see you coming? Can the car tell that person is blind?

That raises the question of how cars — and engineers — see the data at all. Sensors are processing the world, but how do designers make that data useful? Ride-hailing giant Uber recently shared how the company is teaching cars to understand not only the unpredictable pieces of the road — kids, potholes, opening car doors — but how to visualize data drawn from maps, high-resolution scans of the ground surface, lane boundaries and types, turn signals, speed limits, crosswalks, and vehicle logs. Uber engineer Xiaoji Chen says one of the biggest challenges of bringing these data sources together into a unified view is reconciling the different geo-positioning data formats: each describes position differently, and even the slightest error can break a visualization. That kind of real-time conversion requires unusual rendering-visualization and GPU-programming skills. Only then can these cars start making sense of all that data, the way a teenager does after a year of his parents helping him watch and understand the road.
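
A hedged illustration of the reconciliation step Chen describes, using a deliberately crude equirectangular projection around a reference point. Real pipelines use proper projections (UTM, ECEF-to-ENU), precisely because even small errors visibly misalign the layers.

```cpp
#include <cmath>
#include <cstdio>

constexpr double kEarthRadiusM = 6371000.0;
constexpr double kPi = 3.14159265358979323846;
constexpr double kDegToRad = kPi / 180.0;

struct LocalXY { double x, y; };  // meters east/north of the reference point

// Project WGS-84 latitude/longitude into a shared local frame for rendering.
LocalXY toLocal(double latDeg, double lonDeg, double refLatDeg, double refLonDeg) {
    double x = (lonDeg - refLonDeg) * kDegToRad * kEarthRadiusM
               * std::cos(refLatDeg * kDegToRad);  // shrink longitude by latitude
    double y = (latDeg - refLatDeg) * kDegToRad * kEarthRadiusM;
    return {x, y};
}

int main() {
    // A lane-boundary vertex and a vehicle-log point, now in one frame.
    LocalXY lane = toLocal(37.77495, -122.41940, 37.77490, -122.41950);
    LocalXY car  = toLocal(37.77493, -122.41945, 37.77490, -122.41950);
    std::printf("lane (%.2f, %.2f) m  car (%.2f, %.2f) m\n",
                lane.x, lane.y, car.x, car.y);
}
```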

“…what if automated cars are running nine, fifteen, twenty hours each day? How does that change the vehicle?”

Yet engineers needn’t focus only on the variables on the road, but also on those under the hood. Sensors will also have to get better at monitoring a car’s moving parts, because with more automated vehicles Pratz thinks the industry will have to rethink parts replacement. Right now parts and checkups are designed around vehicles that drive a couple of hours each day — what if automated cars are running nine, fifteen, twenty hours each day? How does that change the vehicle? Hardware engineers have to rethink their materials and their designs. It will also mean that carmakers will need sensors in parts of the vehicle they hadn’t imagined. The car is evolving, and so too will its nervous system.
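
The arithmetic behind that worry is simple enough to sketch. Assuming a purely hypothetical 3,000-hour wear rating for a part, calendar life collapses as daily duty hours climb:

```cpp
#include <cstdio>
#include <initializer_list>

int main() {
    const double ratedHours = 3000.0;  // hypothetical wear rating for a part
    for (double hoursPerDay : {2.0, 9.0, 15.0, 20.0}) {
        double days = ratedHours / hoursPerDay;  // calendar life at this duty cycle
        std::printf("%4.1f h/day: replace every %4.0f days (%.1f years)\n",
                    hoursPerDay, days, days / 365.0);
    }
}
```

At two hours a day the part lasts about four years; at twenty, under five months. Every maintenance schedule built around the first number breaks.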

“We live in a world with friction,” Pratz says. “Engineers need to understand that intersection between software and the physical world.”


