I’m back! No more issues of the newsletter scheduled three weeks in advance. This was the longest I’ve been ‘out’ since I started work on this newsletter, and boy, did I need this. I’m excited about some of the features today and have quite a bit of FOMO, and as I’m writing this, I’m searching for some PS3 cameras. As usual, the publication of the week section is manned by Rodrigo.
My $500M Mars Rover Mistake: A Failure Story
I recommend this write-up by Chris Lewicki, describing a mistake he made while working on the Spirit rover. I can imagine how stressful it must have been when Chris realized he had sent a surge of electricity through the whole system instead of just the motors. If you enjoy a good troubleshooting story, this is an excellent one.
Tiny Volumetric Display
I really liked this tiny volumetric display project. The author built the display by attaching an LED matrix to a CD drive motor: spinning the matrix and updating it in sync with the rotation produces a persistence-of-vision volumetric image. I recommend going through the website, as the author has many exciting projects, often with a link to GitHub.
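To give a flavor of how such persistence-of-vision displays work, here is my own minimal sketch (not the author's code, and all names are illustrative): the controller precomputes one 2D frame per angular position of the spinning matrix and flashes whichever slice matches the motor's current angle.

```python
import math

def slice_for_angle(frames, angle, n_slices):
    """Pick the vertical frame to flash at the current rotor angle.

    frames: list of n_slices 2D frames precomputed from the 3D model,
    one per angular position around the rotation axis.
    angle: current motor angle in radians, e.g. derived from a hall-effect
    sensor pulse plus elapsed time.
    """
    # Map [0, 2*pi) onto slice indices [0, n_slices).
    idx = int((angle % (2 * math.pi)) / (2 * math.pi) * n_slices) % n_slices
    return frames[idx]
```

Flash each slice briefly enough, and the eye integrates the spinning plane of LEDs into a solid 3D volume.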
Drone Motion Capture, The Open Source Way
Joshua Bird created a millimeter-accurate tracking system based on PS3 Eye cameras, which, according to the author, can be bought second-hand for as little as $5 apiece. Outstanding results! Make sure to check out the crashes at the end of the video.
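The geometric core of camera-based motion capture like this is multi-view triangulation: each camera turns a marker into a 2D observation, and the 3D position is recovered where the viewing rays meet. Here is a minimal sketch of classic linear (DLT) triangulation from two calibrated views; this is my own illustration, not Joshua's code.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one 3D point from two views.

    P1, P2: 3x4 camera projection matrices (intrinsics times pose).
    x1, x2: (u, v) pixel observations of the same marker in each view.
    Returns the 3D point in world coordinates.
    """
    # Each observation contributes two linear constraints on the
    # homogeneous point X: u*(P[2]@X) - P[0]@X = 0, and similarly for v.
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # Least-squares solution: right singular vector of the smallest
    # singular value, then de-homogenize.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]
```

With more than two cameras you simply stack more rows into `A`, which is how extra PS3 Eyes improve accuracy and robustness to occlusion.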
100 SLAM-related technical interview questions
Hyunggi published this list of interview questions that many job seekers could find helpful. Happy job hunting.
Dobb·E: On Bringing Robots Home
“Dobb·E is an open-source framework for teaching robots new household tasks in 20 minutes via imitation learning”. You will find videos, paper, dataset, code, and models on this website. What else would you need?
Publication of the Week - Visual Environment Assessment for Safe Autonomous Quadrotor Landing
Ensuring safety is mandatory for any unmanned system. This paper presents an autonomous quadrotor with limited computational power that can land safely using vision. The vehicle uses a stereo camera and an Inertial Measurement Unit (IMU) to assess its surroundings; data from these sensors are fused to compute visual-inertial odometry (VIO). The RGB images from the camera are semantically segmented and combined with the VIO output to create a 2D binary map that tells the quadrotor where it is safe to land. The method is lightweight and doesn't require elevation maps or external global information. In this video, you can see the quadrotor in action and get more details.
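The map-building step described above can be illustrated with a toy sketch. This is my own simplification with made-up parameter names, not the authors' implementation: a cell is marked landable when the segmentation labels it as safe terrain AND the depth image says it is locally flat.

```python
import numpy as np

def landing_map(safe_mask, depth, flatness_thresh=0.05, win=5):
    """Toy 2D binary landing map from segmentation + stereo depth.

    safe_mask: HxW bool, True where segmentation labels terrain as safe.
    depth: HxW float, per-pixel range/height from the stereo camera.
    flatness_thresh: max allowed height spread within a window (metres,
    an assumed unit for this sketch).
    win: side length of the local window used to judge flatness.
    """
    H, W = depth.shape
    flat = np.zeros((H, W), dtype=bool)
    r = win // 2
    for i in range(r, H - r):
        for j in range(r, W - r):
            # A cell is flat if the height spread in its neighborhood
            # stays below the threshold.
            patch = depth[i - r:i + r + 1, j - r:j + r + 1]
            flat[i, j] = (patch.max() - patch.min()) < flatness_thresh
    # Landable = semantically safe AND geometrically flat.
    return safe_mask & flat
```

A real onboard version would of course run this incrementally in the VIO frame and vectorize the window check, but the idea of fusing semantics with geometry into one binary map is the same.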
Robotics funding saw another dip in 2023 | TechCrunch
According to Crunchbase, robotics startups have raised $2.7B so far in 2023, compared to $5B in 2022 and $9.1B in 2021.