Why Robotics Startups Don't Use Deep Learning
A lot of engineers, especially from my audience, are interested in joining a self-driving car company. Self-driving cars are fascinating, and we can't get enough of seeing cars driving without anyone behind the wheel. Yet, there aren't many self-driving car startups out there, and unless you live in the Bay Area, there might not be a lot in your country.
This is why, in this article, I'd like to focus on robotics startups. Robotics startups are companies that build autonomous technology such as delivery robots, drones, and home robots, and they use skills very similar to those used in self-driving cars.
It might seem like these startups implement the same thing, and that a self-driving car is really just an autonomous robot applied to a specific use case.
But it's not.
By nature, autonomous robots are designed for contact, while self-driving cars are designed to avoid contact. Autonomous robots operate in predictable, known environments, while self-driving cars operate out in the jungle.
This difference makes the technologies inside very different, as well as the skills engineers need to work at robotics startups.
To better understand the technical differences, I'll describe two robotics startups.
- The first one is a startup doing Autonomous Valet Parking in Paris. I almost joined them a couple of years ago, and I'll talk about the skills and interview process there.
- The second startup is a startup I recently discovered building Autonomous Excavators.
Stanley Robotics — Building an Autonomous Valet
The first company I'd like to talk about is called Stanley Robotics. This company has been building an Autonomous Valet in Paris since 2015! What does an Autonomous Valet look like? It looks like this:
Cool, isn't it? The first time I saw how they parked a car at the Charles de Gaulle Airport, I was flabbergasted. If there aren't many self-driving car startups where you live, know that robotics startups like this one are also all about autonomous tech, as we'll study in a minute.
Mixing the Car Industry and the Robotics Industry
As you've noticed, we really have a mix between cars and robots, and I'd like to take a second to talk about how the founders came up with this idea.
It all started in 2013, when the two founders were working on a feature to park cars autonomously and noticed how costs were too high for their project.
They realized that for a car that is parked 98% of the time, it doesn't make sense to spend this much on parking tech. As with every startup, it started with a problem to solve.
So, they met someone who introduced them to car movers. The idea was to design a robot that would spend 98% of its time parking cars. Just as a parking valet would park your car, this robot would. And this is how they came up with a valet that parks your car.

Of course, this wouldn't work in a world made 100% of self-driving cars. But we're not there yet, so you can see this company as a transition company.

As you can see here, we have a robot parking cars. We're mixing two different industries. And the robot will need to avoid cars while making contact with other cars.
Is An Autonomous Valet using Artificial Intelligence?
There are several points we should note when building an autonomous valet:
- How to physically lift and move a car
- How to navigate from A to B, avoid obstacles, localize, and implement the autonomous stack.
So, here's how it happens "physically".
If we pause on that video just one minute, there is a lot going on:
- The valet can lift any car and park it anywhere in the airport
- It's following a route to pick up a car and move it
- It's implementing Autonomous Tech (LiDARs, GPS, Navigation, ...) like any self-driving car would.
Today, they're running at the Charles de Gaulle Airport in Paris, the Lyon airport, and recently, the Gatwick Airport in London.
POP QUIZ!
Say you're the CEO of this company that just raised $3.6 million to build a valet.
9:06. Tech Meeting.
An engineer asks: "Hey Boss, which sensors, OS, and language should we use to code our robot?"
What is your answer, knowing that:
- This car is an outdoor autonomous robot.
- It must be able to navigate through an airport.
- It's always going to be the same known, predictable environment.
- It must avoid other cars and robots, but also make contact with vehicles.
- It's a small robot, with small computers, and batteries.
Here's Stanley Robotics' answer: they're using an RTK GPS (a GPS with 1 cm accuracy) to localize on maps of the airport. Since the environment is known and pre-defined, they map it before driving inside.
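To make that concrete, here's a minimal sketch of the first thing an RTK-GPS-based localizer needs: turning a GPS fix into local map coordinates. This is my own illustration, not Stanley Robotics' code — the reference point and coordinates are made up, and a flat-earth approximation is used, which is fine over an airport-sized area.

```python
import math

EARTH_RADIUS = 6_378_137.0  # metres (WGS-84 equatorial radius)

def gps_to_local(lat, lon, ref_lat, ref_lon):
    """Convert a GPS fix to local (east, north) metres relative to a
    reference point on the pre-built map, using a flat-earth
    (equirectangular) approximation."""
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    north = EARTH_RADIUS * d_lat
    east = EARTH_RADIUS * d_lon * math.cos(math.radians(ref_lat))
    return east, north

# Reference: a corner of the mapped parking area (illustrative coordinates)
e, n = gps_to_local(49.0040, 2.5710, 49.0039, 2.5709)
```

With centimetre-accurate RTK fixes, local coordinates like these can be matched directly against the pre-built map, which is why a known environment removes so much of the perception burden.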
Stanley Robotics is a great example of a company in "Robotics Mode". They don't need tons of Deep Learning because the environment is already known and predictable. Most of the stack is made of point cloud processing to estimate the distance of obstacles, Kalman filters to fuse all the sensors, and SLAM algorithms to keep an updated map of the airport and the parked cars. It's all coded in C++ with ROS.
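As an illustration of the sensor-fusion part, here's a toy 1D constant-velocity Kalman filter tracking a robot from precise position measurements. This is a minimal sketch of the idea, not Stanley Robotics' actual code: the noise values and the simulated trajectory are made up.

```python
import numpy as np

def kalman_step(x, P, z, dt, q=0.01, r=1e-4):
    """One predict/update cycle of a 1D constant-velocity Kalman filter.
    x: state [position, velocity], P: covariance,
    z: position measurement (r = 1e-4 ~ a 1 cm RTK-grade std dev)."""
    F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity motion model
    H = np.array([[1.0, 0.0]])              # sensor measures position only
    Q = q * np.eye(2)                       # process noise
    R = np.array([[r]])                     # measurement noise

    # Predict
    x = F @ x
    P = F @ P @ F.T + Q

    # Update with the measurement
    y = z - H @ x                           # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x + (K @ y).flatten()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Track a robot moving at 1 m/s, measured every 0.1 s
x, P = np.array([0.0, 0.0]), np.eye(2)
for k in range(1, 51):
    x, P = kalman_step(x, P, np.array([1.0 * k * 0.1]), dt=0.1)
```

In the real system the same predict/update structure fuses GPS, IMU, and odometry in several dimensions, but the logic is exactly this.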
Here's the view from the LiDAR when parking a car:
I don't know about you, but I love to watch this scene. It's like everything's normal, but: "Hey, there's a car parking another car!".
True Magic.
Now that we've listed their tech, we know a few skills we'll need: ROS, C++, Kalman Filters, LiDARs, SLAM.
Next, let me tell you about the time I was interviewed to work in this team.
The Interview
First, note that I didn't respond to any job posting. I sent an unsolicited application via their career page using my Portfolio approach. I went through the entire interview process... from the first call to salary negotiation.
Although the salary they offered didn't match my extravagant demands, I was accepted and will show you how. 🤫
The interview process was a mix between phone calls and technical interviews on a platform named Codingame. Both were equally important: they want interesting personalities, as well as good engineers.
Can you guess what happened? Well, I was an interesting personality, but I failed the technical questions and scored something like 50 or 60% on the technical interview.
Why? It was mostly Data Structures & Algorithms questions.
In lots of startups, these types of questions are how candidates are judged. This is something I had no clue about at the time, and it wasn't as common then as it is today.
But they still said yes: as I said, it's not just about the technical interview. I had an entire portfolio of projects to back my profile, and my resume and LinkedIn profile were optimized. I had written articles, made videos, and given interviews. Everything was "portfolio" optimized.
I was also running the School of AI movement in France, and I had real experience as a self-driving car engineer. Thanks to all of this, I was accepted anyway.
Now that we've seen one startup, let's see the second one.
Built Robotics — Building An Autonomous Excavator
The second company I'd like to talk about is called Built Robotics. Here's their product:
Impressive, isn't it? This is one of their products called an ATL: Autonomous Track Loader.
With that project, they aim to automate construction workers, or rather, to augment them. According to them, a construction worker costs $100k, which means that a team of 10 workers already costs a million dollars.
In the first part of the article, we saw how to build an autonomous valet. The construction world is different. It's a worldwide, multi-billion dollar industry that involves super-heavy, dangerous equipment, and that hasn't changed in a century.
The way we operate excavators today is exactly the same as it was 100 years ago. Let's talk about the tech needed to build autonomous excavators...
How to make an autonomous excavator?
I'm sure you've already thought about autonomous construction, autonomous farming, or some other specialized vehicle. But technically, how does it work?
The one thing to understand is that Built Robotics is here to help construction companies save costs. Their clients belong to the wind industry, the oil & gas industry, or even the solar industry. Every single project costs millions of dollars, and millions are lost in construction projects every year.
Therefore, selling a new, expensive excavator is a bad idea. In fact, it's the last thing they should do. Companies like this aren't building excavators; they're adapting existing excavators and turning them into autonomous machines.
Here's how:
On the top right, you can see a black box they call the Exosystem™️.
This is the computer that controls the 6 cameras, the RTK (Real Time Kinematic) GPS, the IMUs placed everywhere something moves, the RADARs, and more.
In this picture, there isn't any LiDAR, but I listened to some interviews where the founder explained that LiDARs were definitely part of the game. Otherwise, how would they be able to do that?
Just above, you can see a Simultaneous Localization And Mapping algorithm, used by most of the robotics industry in the localization step. In this case, they're using it to navigate in the geofenced areas and spot places to dig. Not to brag or anything, but I'm convinced I have one of the best SLAM courses on earth if you'd like to join robotics startups.
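To illustrate the geofencing side, here's a toy point-in-polygon check of the kind a geofenced robot needs before it acts. This is purely my own sketch, not Built Robotics' code; the coordinates are assumed to be local metres (e.g. derived from the RTK GPS).

```python
def inside_geofence(point, polygon):
    """Ray-casting point-in-polygon test: returns True if `point`
    (x, y) lies inside the closed `polygon` (list of vertices)."""
    x, y = point
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        # Does a horizontal ray from `point` cross edge (x1,y1)-(x2,y2)?
        if (y1 > y) != (y2 > y):
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# A 20 m x 20 m square dig site (illustrative)
site = [(0, 0), (20, 0), (20, 20), (0, 20)]
print(inside_geofence((5, 5), site))    # excavator inside the site
print(inside_geofence((25, 5), site))   # outside: stop and alert
```

A real system would also add a safety margin around the boundary, but the principle is the same: no autonomous motion is allowed outside the fence.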
That whole software suite is called Everest™️. Its goal is to remotely control and program the excavator. It allows for real-time visualization, mapping, remote control, etc. Using a tablet, you can control your machine 100% remotely.
Although it can be fully remote-controlled, it's also 100% autonomous.
- In a remote control system, you'd need to issue every single move: rotate the arm 30° to the right, lower the arm, etc.
- In an autonomous system, you indicate where to dig, and the robot follows a route to get there and starts digging.
It can also avoid obstacles like humans and is equipped with lots of Machine Learning systems.
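The gap between those two modes can be sketched in a few lines: an autonomous system takes a goal and closes the control loop itself. Below is a toy go-to-goal controller; the function, gains, and simulation are my own illustration, not Built Robotics' API.

```python
import math

def drive_to(pose, goal, v_max=1.0, k_heading=2.0, tol=0.5):
    """One control step toward `goal` (x, y).
    pose: (x, y, heading in radians). Returns (v, omega) commands,
    or (0, 0) once within `tol` metres, i.e. ready to dig."""
    x, y, th = pose
    dx, dy = goal[0] - x, goal[1] - y
    dist = math.hypot(dx, dy)
    if dist < tol:
        return 0.0, 0.0                  # arrived: hand over to the dig routine
    heading_err = math.atan2(dy, dx) - th
    heading_err = math.atan2(math.sin(heading_err), math.cos(heading_err))
    v = min(v_max, dist)                 # slow down near the goal
    omega = k_heading * heading_err      # turn toward the goal
    return v, omega

# Simulate a tracked vehicle homing in on a dig point
pose = [0.0, 0.0, 0.0]
goal = (10.0, 5.0)
dt = 0.1
for _ in range(500):
    v, w = drive_to(tuple(pose), goal)
    pose[0] += v * math.cos(pose[2]) * dt
    pose[1] += v * math.sin(pose[2]) * dt
    pose[2] += w * dt
```

The operator only supplies `goal`; everything else — heading, speed, when to stop — is decided on board, which is exactly what distinguishes autonomy from remote control.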
So here is the complete system you need to build:
- An external device that plugs into the excavator and connects to sensors, computers, etc.
- Robotic Algorithms. Similarly to the startup we discussed before, these companies use almost 0 Deep Learning. In the career pages, you can mostly see skills like SLAM, ROS, Kalman Filters, and Python.
- A monitoring solution — I didn't mention it, but it's definitely part of the equation.

And this was our second robotics startup. Built Robotics has incredible potential and could completely revolutionize the construction industry.
Notice how engineers with self-driving car skills can work at companies like this one, or the first one.
Why Robotics Startups Don't Use Deep Learning
Okay, we're now arriving at the interesting part.
I'm not sure if you've noticed, but there isn't any Deep Learning in either the first or the second startup. And this is normal: most robotics startups rely primarily on robotics algorithms, not Deep Learning.
Although Deep Learning is still used from time to time, this industry doesn't need it as much as the self-driving car industry does.
- They can always benefit from using segmentation models, but what is there to segment?
- They can use optical flow algorithms, but what if there aren't any moving targets?
- They can use Multi-Task Learning algorithms, but what if they only need to solve one task?
- They can use reinforcement learning, but what if they're always driving in the same map, with the same obstacles, and the same scenario?
Robotics is different from self-driving cars. Some companies will use Deep Learning, but a lot won't, and that's totally okay. It's all about their use case and what they need most.
As an engineer, you'll need to know which type of company you'd like to work for, and which type of company you'd like to avoid.
I would personally love to work in the Robotics Field as well as the Self-Driving Car Field. They're both solving important tasks, and they're both using equally interesting technology.