Volaly (pronounced ['volali]) is a Swiss startup that provides an intuitive way to control robots and other smart devices through pointing.

Our patent-pending technology allows users to specify locations in 3D space with simple pointing gestures, captured by commodity handheld or wearable devices such as fitness trackers, smartwatches, and smartphones.

The technology is well-suited for controlling ground and flying robots on complex trajectories in real time, as well as giving discrete point-to-target commands.

Combined with third-party speech-recognition and natural-language user interfaces such as Google Assistant, Amazon Alexa, or Apple Siri, our technology enables an intuitive and efficient way to query information about the surrounding world: "Ok Google, what is the name of this mountain?" or to control smart appliances at home: "Alexa, turn this light on!"

The company is a spin-off of the Dalle Molle Institute for Artificial Intelligence (IDSIA), a non-profit research institute specializing in Artificial Intelligence & Robotics.

Development of this technology was partially supported by the Swiss National Science Foundation (SNSF) through the National Centre of Competence in Research (NCCR) Robotics.


Pointing is a skill acquired by humans within the first year of life. It is a natural way to communicate directions, locations, and objects around us to other people.

We develop technology that allows users to control robots and other smart devices, and to query information about objects in the surrounding world, with the same ease as when communicating with other people.

The main advantage of pointing gestures over conventional interfaces (e.g., joysticks and touch screens) lies in the way they define targets: indicated locations and directions are always relative to the user, i.e., defined in the human egocentric frame. This avoids the mental rotation problem: the cognitive effort one needs to apply in order to understand where a point indicated on a screen is located in the real environment.

We capture human pointing gestures with an absolute orientation sensor, an inertial measurement unit (IMU), which is present in the vast majority of modern wearable and handheld devices such as fitness trackers, smartwatches, and smartphones. It is up to the end user to decide which device to use.

The acquired orientation is used to estimate the pointed direction and location with respect to the user. Our patent-pending technique then converts the indicated location to a target coordinate frame, for example, to control a ground or flying robot, or to query information from a Geographic Information System (GIS).
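To give a flavor of the geometry involved, the sketch below shows one standard way such a pipeline can be set up: rotate a forward axis by the device's orientation quaternion to get a pointing ray, then intersect that ray with the floor plane. This is an illustrative simplification, not Volaly's actual patent-pending technique; the function and parameter names are hypothetical.

```python
import math

def quat_rotate(q, v):
    """Rotate vector v = (x, y, z) by unit quaternion q = (w, x, y, z)."""
    w, qx, qy, qz = q
    # t = 2 * (q_vec x v)
    tx = 2.0 * (qy * v[2] - qz * v[1])
    ty = 2.0 * (qz * v[0] - qx * v[2])
    tz = 2.0 * (qx * v[1] - qy * v[0])
    # v' = v + w*t + q_vec x t
    return (v[0] + w * tx + qy * tz - qz * ty,
            v[1] + w * ty + qz * tx - qx * tz,
            v[2] + w * tz + qx * ty - qy * tx)

def pointed_floor_location(origin, q, floor_z=0.0):
    """Intersect the pointing ray with the horizontal plane z = floor_z.

    origin: approximate position of the pointing arm in the user's frame
    q:      device orientation as a unit quaternion (w, x, y, z)
    Returns the indicated floor point, or None if pointing at/above the horizon.
    """
    d = quat_rotate(q, (1.0, 0.0, 0.0))  # rotate the forward (+x) axis
    if d[2] >= 0.0:
        return None  # ray never reaches the floor
    t = (floor_z - origin[2]) / d[2]
    return (origin[0] + t * d[0], origin[1] + t * d[1], floor_z)

# Example: arm at 1.5 m height, wrist tilted 45 degrees downward (rotation
# about the y-axis) -> the indicated point lands 1.5 m in front of the user.
half = math.pi / 8.0
q_down45 = (math.cos(half), 0.0, math.sin(half), 0.0)
target = pointed_floor_location((0.0, 0.0, 1.5), q_down45)
```

In a full system, the resulting point would still have to be transformed from the user's egocentric frame into the robot's or GIS's coordinate frame, which is where the localization and frame-alignment problems the text alludes to actually live.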



Drive your robots around, label areas and objects, specify targets to reach or tools to pick up.


Drive toy robots through an obstacle course. Compete for the best time!

Smart Homes & IoT

Point at the light you want to switch, or adjust the level of the blinds on a window.

Navigation & GIS

Query information about landmarks and ask for directions: "Hey Siri, how can I get there?"


Spotted a misbehaving drone? Point at it and keep following it for a few seconds: the drone will be identified and reported to the authorities.

Have an idea?

Let us know!

Get In Touch

Have any questions? Send us an email and we will get back to you as soon as possible!

Follow us on social media to stay up to date with our technology.

Visit Us:

Galleria 2, Via Cantonale 2c
Manno, 6928


21–22 Mar 2019, Bucharest, Romania:
European Robotics Forum (ERF) 2019: We will be showcasing our technology at the NCCR Robotics stand (booth #24).

4–5 Apr 2019, Hannover, Germany:
Hannover Messe 2019: We will be showcasing our technology at the NCCR Robotics stand (Swiss Innovation Pavilion, Hall 2, Stand C39).