Droneblog talks with Yan, PJ, and Neil, the robotics nerds behind Perceptiv Labs, about their soon-to-be-launched project, the SHIFT, a tablet/phone app that allows your drone camera to visually track an object, achieving easier, smoother shots. The guys at Perceptiv Labs share how they got started, and how the SHIFT works.
Tell us about yourself and your background.
Hey guys, we’re Yan, PJ and Neil from Perceptiv Labs. We are a group of robotics nerds from Waterloo, Ontario, Canada. We studied robotics in grad school at the University of Waterloo, specializing in UAV autonomy in terms of control, perception and multi-agent coordination.
How did you get started in the drone industry?
It all started when we met our professor. At the end of our undergrad degree, we began looking for things to do after college, and we stumbled across Prof. Steven Waslander, a new professor in town with some flashy research videos. It turns out he was one of the first few to build an outdoor-capable autonomous quadrotor, through the Stanford STARMAC project. Fascinated by the opportunity to build autonomous flying machines, we all signed up and never looked back.
Tell us about Perceptiv Labs, what it is, and your role there. Where did the idea come from? Share the story on how the company started.
After grad school, PJ and Yan worked for a robotics company for two years, building autonomy software for ground vehicles, while Neil was still finishing up his thesis.
We were at a bar one night, and Neil brought up this conundrum: he tried filming his experiment using a drone to provide an aerial perspective, and found it to be the hardest thing. He had to fly the drone while controlling the camera gimbal, and simply couldn’t coordinate well enough to get any usable footage before the battery ran out.
Two months later, we had our first visual tracking gimbal prototype!
What is the SHIFT, and how does it work?
The SHIFT is an aerial cinematography toolkit for drones. Our tablet/phone app streams the live footage from the drone, and a user can select an object of interest through the touch screen. The SHIFT then uses computer vision tracking software to track the selected object, and takes over control of the gimbal in order to frame the shot smoothly. A user can select multiple objects, and also configure how the shot is framed (e.g. the rule of thirds). You can select any object you want, be it a giraffe, a cyclist or a building. Here’s a video with our tracking software in action!
It’s important to note that the SHIFT tracking software is not an autonomous follow-me feature. The users still retain control over the drone flight, and we believe that it’s important that they do in order to exercise creative control over their content.
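The idea described above — track an object in the video frame and steer the gimbal so the object sits at a chosen framing point — can be sketched with a simple proportional controller. This is a hypothetical illustration, not Perceptiv Labs' actual SHIFT software; the function name, gains, and sign conventions are assumptions for the sketch.

```python
def gimbal_rates(obj_x, obj_y, frame_w, frame_h,
                 target=(1 / 3, 1 / 3), gain=1.5):
    """Proportional framing controller (illustrative only).

    Given the tracked object's pixel position and the frame size,
    return pan/tilt rate commands that nudge the camera so the
    object settles on the target point -- here the upper-left
    rule-of-thirds intersection.
    """
    # Normalized error: how far the object is from the framing point,
    # as a fraction of the frame (positive = right of / below target).
    err_x = obj_x / frame_w - target[0]
    err_y = obj_y / frame_h - target[1]

    # Proportional control: command rates that shrink the error.
    pan_rate = gain * err_x    # pan right if the object is right of target
    tilt_rate = gain * err_y   # tilt down if the object is below target
    return pan_rate, tilt_rate


# Example: object detected at frame center of a 1280x720 stream.
pan, tilt = gimbal_rates(640, 360, 1280, 720)
```

Smoothing the shot would, in practice, involve more than a single proportional term (rate limits, damping, filtering of the tracker's bounding box), but the core loop is this: measure where the object is, compare against where you want it framed, and command the gimbal to close the gap.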
What types of drones is the SHIFT compatible with? Will this be expanding anytime soon?
Here is a list of drones we currently support! We are looking to support more gimbals and drones in the future, and we would love to hear from the drone community to guide our choices!
The SHIFT is currently aimed at mobile film-makers. What other applications do you foresee for this technology?
Off the top of my head, this technology also applies to inspection services. For example, cellphone tower inspection is a typical use case for UAVs. A service provider would typically fly the UAV close to the tower and visually inspect different regions of interest (ROI). The challenge is keeping the ROI in frame long enough for visual inspection, and our tracking software can make that workflow much easier. The same applies to other areas of infrastructure inspection as well.
When will the SHIFT be available? How much will it cost?
The SHIFT Beta kit is available for pre-order right now for $600, and will cost $750 after the Beta program. The expected ship date is in Fall 2015.
What other projects/technologies are in the works at Perceptiv Labs?
We believe that drones are a very powerful tool, but unfortunately they’re currently not smart enough to be usable at a large scale, and our objective is to tackle all of these technological hurdles in order to make them so. Some obvious examples of these enabling technologies include GPS-denied localization and control, mapping, and collision avoidance. We’re developing these technologies with our partners to ensure that our development is driven by customer feedback, and to make the human-autonomy interaction seamless.
Thank you so much for your insights. Is there anything else that we haven’t touched on that you would like to share?
Please feel free to reach out to us at email@example.com, happy to talk drones all day 🙂 We are planning to release blogs/videos to keep everyone updated on our progress, and our thoughts on drone technologies. Please keep an eye out and give us feedback!
Follow Perceptiv Labs, it’s the cool thing to do: