Origin of Tesla Autopilot/FSD
Tesla was founded in 2003 by Martin Eberhard and Marc Tarpenning with an initial focus on high-performance electric vehicles. At its founding, the company had no explicit emphasis on self-driving technology. Elon Musk joined as chairman in 2004 and became CEO in 2008, and he has been the primary driving force behind Autopilot and Full Self-Driving (FSD), repeatedly emphasizing full autonomy (SAE Level 5: eyes off, hands off, take a nap) as a key goal since at least 2013. Musk's influence escalated in the mid-2010s, when he directed the shift toward vision-based systems and end-to-end neural networks trained on billions of frames of real-world driving data. While the idea of electric vehicles with advanced driver assistance predates Musk's full involvement, his aggressive timelines and decisions, such as removing radar in 2021 for cost reasons and vision purity, have shaped the program's development.
Key Personnel Involved in Hardware (HW) and Software (SW)
Tesla's Autopilot/FSD development has involved a mix of internal teams and high-profile hires. On the hardware side, the focus has been on custom silicon for processing power; on the software side, it has centered on AI, computer vision, and neural networks trained on vast fleet datasets.
- Jim Keller: A legendary chip architect, Keller joined Tesla in January 2016 as Vice President of Autopilot Hardware Engineering and left in April 2018. He led the design of Tesla's first custom FSD chip (used in HW3), emphasizing redundancy and high-performance computing for autonomy. Prior to Tesla, he worked on Apple's A4/A5 chips and AMD's Zen architecture.
- Andrej Karpathy: A deep learning expert and OpenAI co-founder, Karpathy joined Tesla in 2017 as Director of AI, leading the Autopilot vision team until his departure in July 2022 (after a 4-month sabbatical starting March 2022). He was instrumental in shifting to vision-only systems, developing end-to-end neural nets, and overseeing FSD Beta releases. He rejoined OpenAI in 2023.
Other notable contributors include Pete Bannon (ex-Apple, who worked under Keller and continued leading HW3 and subsequent chip efforts), along with the teams behind the Dojo supercomputer for training (though Dojo has since faced pivots). Elon Musk has remained hands-on, often overriding engineering decisions.
Timeline of Major Milestones
Below is a chronological timeline combining key Autopilot/FSD milestones, hardware introductions (with features added), and software revisions. Hardware revisions focus on enabling progressively advanced autonomy, while software builds on that with features like beta testing and vision improvements. Dates are approximate based on announcements and rollouts.
Year/Month | Milestone | Details |
---|---|---|
2013 | Early Vision for Autonomy | Elon Musk begins publicly predicting full autonomy within years; Tesla starts internal development. |
Sep 2014 | HW1 Introduced | First Autopilot hardware, built around Mobileye's EyeQ3. Added basic features: adaptive cruise control (ACC), lane-keeping assist (Autosteer), automatic emergency braking (AEB), and auto-parking. Used one forward camera, a forward radar, and ultrasonic sensors. Installed in Model S/X until late 2016. |
Oct 2015 | Initial SW Release for HW1 | Software v7.0 enables Autopilot features such as ACC and Autosteer on highways. Musk predicts a factor-of-10 safety improvement within six years. |
Oct 2016 | HW2 Introduced | NVIDIA Drive PX 2 platform. Added 8 cameras (360° coverage), 12 ultrasonic sensors, and an enhanced forward radar. Enabled "Enhanced Autopilot" with automatic lane changes, Summon (remote parking), and later Navigate on Autopilot (NoA). Installed until mid-2017. |
Mar 2017 | SW v8.1 for HW2 | Brought HW2 vehicles to rough feature parity with HW1, adding speed-limit recognition and improved Autosteer. |
Aug 2017 | HW2.5 Introduced | Updated NVIDIA hardware with redundant processing units for safety. Added cabin camera (initially dormant). Improved sensor fusion for better reliability in adverse conditions. Installed until early 2019. |
Apr 2019 | Autonomy Day; HW3 Introduced | Tesla's custom FSD computer (dual redundant SoCs, roughly 72 TOPS each for about 144 TOPS total). Designed for full redundancy and FSD compute needs. Added support for traffic light/stop sign recognition and automatic city driving. Retrofits began for older vehicles. Basic Autopilot made standard. |
Oct 2020 | FSD Beta Program Launch | Early-access FSD Beta rolled out to a small group of users, enabling navigation on city streets. FSD price rises to $10,000. |
May 2021 | Shift to Vision-Only | Radar removed from new Model 3/Y; perception relies on cameras and neural nets ("Tesla Vision"). Subsequent releases such as FSD Beta v9 are vision-only. |
Oct 2021 | FSD Beta v10.3 Issues | Brief rollback due to safety regressions; a fix ships within days. |
Sep 2022 | FSD Price Increase | To $15,000; wider Beta access. |
Mar 2023 | HW4 (AI4) Introduced | Second-generation Tesla FSD chip paired with higher-resolution cameras and substantially more compute (estimated ~500 TOPS). Improves handling of complex scenarios and rain/snow performance, and adds headroom for unsupervised FSD. First installed in new Model S/X, then 3/Y. |
2023-2024 | FSD v11/v12 Rollouts | v11 unifies the highway and city stacks. v12 replaces hand-coded rules with end-to-end neural nets (see the sketch after this table). The wide release of v12.4 in mid-2024 shifts from torque-based steering-wheel nags to camera-based attention monitoring, enabling hands-free operation. |
Jan 2025 | FSD v12.6 for HW3 | Improvements in smoothness and decision-making. |
May 2025 | FSD v13 for AI4 | v13.2.9: enhanced vision-based monitoring and better handling of edge cases. HW3 vehicles lag behind the AI4 releases. |
Aug 2025 | Dojo Wind-Down | Tesla reportedly winds down the Dojo training supercomputer after repeated delays, shifting its compute strategy toward the AI5/AI6 chips described below. |
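The jump from v11 to v12 in the table above is, at heart, a move from hand-written driving heuristics to a single learned policy. The sketch below is a toy illustration of that distinction in PyTorch-style Python; the names (`rule_based_planner`, `EndToEndPolicy`) and dimensions are hypothetical and heavily simplified for exposition, not Tesla's actual stack.

```python
# Toy contrast between a hand-coded planner (v11-era idea) and an
# end-to-end learned policy (v12-era idea). Illustrative only.
import torch
import torch.nn as nn

def rule_based_planner(lead_distance_m: float, speed_mps: float) -> float:
    """Hand-written heuristic: brake if the time gap to the lead car is small."""
    time_gap_s = lead_distance_m / max(speed_mps, 0.1)
    if time_gap_s < 1.5:        # explicit, human-chosen threshold
        return -3.0             # commanded deceleration in m/s^2
    return 0.0

class EndToEndPolicy(nn.Module):
    """A single network maps perception features straight to controls."""
    def __init__(self, feature_dim: int = 256):
        super().__init__()
        self.head = nn.Sequential(
            nn.Linear(feature_dim, 128),
            nn.ReLU(),
            nn.Linear(128, 2),  # outputs: [steering, acceleration]
        )

    def forward(self, visual_features: torch.Tensor) -> torch.Tensor:
        return self.head(visual_features)

# With the learned policy, the driving "rules" live in the training data and
# the network weights rather than in if/else branches.
policy = EndToEndPolicy()
controls = policy(torch.randn(1, 256))  # tensor of shape (1, 2)
```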
Looking Forward: Likely Features of AI5 and AI6
Tesla is pivoting away from Dojo to focus on in-house chips for both inference and training. AI5 (HW5) is expected in production by late 2026, with large performance gains (Musk has claimed up to 40x over AI4), optimized for sparse tensors, mixed-precision math (FP8/INT4), and dedicated AI blocks. Likely features include unsupervised FSD (no human supervision required), real-time "reality compression" for efficient processing, better handling of rare events, and integration with the Robotaxi/Cybercab program. It is expected to be manufactured by TSMC.
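As a rough illustration of why low-precision formats such as FP8 and INT4 matter for an inference chip, here is a minimal NumPy sketch of symmetric 4-bit weight quantization. The scheme shown (one scale per tensor, values clipped to -8..7) is a generic textbook approach assumed for illustration, not AI5's actual number format.

```python
# Generic symmetric 4-bit quantization sketch; not AI5-specific.
import numpy as np

def quantize_int4(weights: np.ndarray):
    """Map float weights onto 16 signed integer levels (-8..7)."""
    scale = max(np.max(np.abs(weights)) / 7.0, 1e-8)   # one scale per tensor
    codes = np.clip(np.round(weights / scale), -8, 7).astype(np.int8)
    return codes, scale

def dequantize(codes: np.ndarray, scale: float) -> np.ndarray:
    """Recover approximate float weights from the integer codes."""
    return codes.astype(np.float32) * scale

w = np.random.randn(4, 4).astype(np.float32)
codes, scale = quantize_int4(w)
w_hat = dequantize(codes, scale)

# 4-bit codes take 8x less memory than FP32 weights; the reconstruction
# error below is the accuracy cost paid for that compression.
print("max abs error:", float(np.max(np.abs(w - w_hat))))
```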
AI6, targeted for 2026-2027 production, is described by Musk as "the best AI chip by far," with even higher efficiency and performance. It will likely enable advanced multi-modal AI (e.g., integrating voice and gestures), fleet-scale learning, and energy-efficient autonomy for mass deployment, and is slated to be made at Samsung's new fabrication facilities. These chips aim to surpass competitors such as NVIDIA, but Tesla's timelines have slipped before.