GyroPalm Gestures

GyroPalm allows users to perform various gestures for an interactive experience. Although the relevant gestures to be performed in each application may vary, developers should be mindful of common gesture workflows and conventions used in GyroPalm Studio.

Overview

From an end-user perspective, GyroPalm can detect a wide variety of gestures, though most applications require only 4-5 motions. From a technical standpoint, GyroPalm has defined a standard that comprises a 4-layer gesture intent methodology and a 3-level gesture abstraction technique.

The "gestures" are interpreted by a low-latency inertial measurement unit (IMU) inside the Encore wearable. However, gesture classification is not performed directly on the raw data stream, as that would degrade overall performance. GyroPalm uses its patented methods to combine low-level sensor interrupt data with a higher-level AI-engine onboard.

To make these concepts simple for developers to implement in code, the GyroPalmEngine object (mentioned in the Firmware Docs) contains the gesture callbacks and examples needed for implementing the aforementioned methodology and techniques.

The 4-layer Gesture Intent Methodology

This methodology models the human intent to communicate with a machine. The layers are explained as follows:

  1. Activation Gesture (e.g. single-snap, double-snap, or pullback)
  2. Navigation Gestures (e.g. swipe, tilt, or draw)
  3. Parameter Gesture (e.g. adjustment gesture or draw)
  4. Deactivation Technique (e.g. auto timeout, hand-drop, double-snap)
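
The four-layer lifecycle above can be sketched as a minimal state machine. The sketch below is illustrative only; GestureSession and its method names are hypothetical and not part of the GyroPalmEngine API, which manages this lifecycle for you.

```cpp
#include <cassert>
#include <cstdint>

// Hypothetical sketch of the 4-layer intent flow: an Activation Gesture
// opens a timeout window in which Navigation/Parameter Gestures are
// accepted; a Deactivation event or auto timeout returns the device to idle.
enum class GestureState { Idle, Active };

struct GestureSession {
    GestureState state = GestureState::Idle;
    uint32_t activatedAt = 0;                     // timestamp of activation (ms)
    static constexpr uint32_t kTimeoutMs = 3000;  // assumed auto-timeout window

    void onActivation(uint32_t nowMs) {           // layer 1: activation gesture
        state = GestureState::Active;
        activatedAt = nowMs;
    }

    bool acceptGesture(uint32_t nowMs) {          // layers 2-3: navigation/parameter
        if (state != GestureState::Active) return false;
        if (nowMs - activatedAt > kTimeoutMs) {   // layer 4: auto timeout
            state = GestureState::Idle;
            return false;
        }
        return true;
    }

    void deactivate() { state = GestureState::Idle; }  // layer 4: hand-drop, etc.
};
```

The point of the sketch is that every non-activation gesture is gated by the active state, and that the state expires on its own if the user walks away mid-interaction.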

Activation Gesture

The Activation Gesture is a "global gesture" and is the gatekeeper of all gesture interactions. This gesture is very important because it significantly narrows down any false positives which could otherwise be passed down the remaining levels of the methodology.

The gestures allowed here are limited because the actions to be performed must be of deliberate intent by the user and not easily triggered by natural everyday hand motions (i.e. brushing teeth, washing dishes, high five, typing, etc). It is acceptable to allow the end-user to choose between single-snap, double-snap, or pullback gesture.

To disambiguate an accidental clap as a single-snap or double-snap, the callback function run by setSnapCallback will not be called unless the user naturally raises his/her wrist to glance at the wearable prior to doing the snap or double-snap. See the section below for more about Activation Gestures.

Navigation Gestures

Navigation Gestures are a set of two or more gestures (e.g. swipe left, swipe right, etc) that are performed after the Activation Gesture. When an Activation Gesture is performed, a global boolean (e.g. isActive) can be raised and checked in an if (isActive) statement prior to the callback logic for one or more Navigation Gestures.

Simple Applications

In simple applications, the methodology is complete after a Navigation Gesture callback has been performed. The callback would also set the global boolean (e.g. isActive) to false, returning the application to the default state it was in prior to the Activation Gesture.

Simple applications would not require any Parameter Gestures as there is no need to specify further intent. For example, in a Presentation Remote, the user simply needs to perform Activation Gesture -> Swipe Left or Activation Gesture -> Swipe Right to accomplish the desired functionality.

Complex Applications

Some complex applications may involve pseudo-recursion. In other words, after the Activation Gesture, Navigation Gestures may need to be performed by the user more than once to achieve the desired control. For instance, a user may use Navigation Gestures to go through a main menu to select Option B and again use Navigation Gestures to go through a sub-menu to select Run Task 3.

In such cases, the developer would need to declare one or more global variables to note the selected index(es) of the menu(s). There would also need to be other global variables to hold the array of items as well as the current menu or screen that is being shown to the user.

To implement this, a switch-case statement may be used inside the callback of the Navigation Gestures to manipulate the index of the current menu.
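
The menu bookkeeping described above might be sketched as follows. The menu enum, item lists, and onNavGesture callback are hypothetical names for illustration, not part of GyroPalmEngine:

```cpp
#include <cassert>
#include <string>
#include <vector>

// Hypothetical sketch of pseudo-recursive menu navigation: global state
// tracks the current menu and selected index, and a Navigation Gesture
// callback manipulates the index with a switch-case.
enum Menu { MENU_MAIN, MENU_SUB };
enum NavGesture { SWIPE_LEFT, SWIPE_RIGHT, SELECT };

Menu curMenu = MENU_MAIN;
int selIndex = 0;
const std::vector<std::string> mainItems = {"Option A", "Option B"};
const std::vector<std::string> subItems  = {"Run Task 1", "Run Task 2", "Run Task 3"};

const std::vector<std::string>& currentItems() {
    return (curMenu == MENU_MAIN) ? mainItems : subItems;
}

void onNavGesture(NavGesture gesture) {
    switch (gesture) {
        case SWIPE_LEFT:                  // move selection left
            if (selIndex > 0) selIndex--;
            break;
        case SWIPE_RIGHT:                 // move selection right
            if (selIndex < (int)currentItems().size() - 1) selIndex++;
            break;
        case SELECT:                      // descend into the sub-menu
            if (curMenu == MENU_MAIN) {
                curMenu = MENU_SUB;
                selIndex = 0;
            }
            break;
    }
}
```

For example, Activation Gesture -> swipe right -> select -> swipe right twice walks from Option A in the main menu down to Run Task 3 in the sub-menu.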

Some applications may even require a Parameter Gesture value (e.g. a letter, number, brightness value, boolean, etc), which is explained below. Such applications include setting the brightness of a lamp or controlling the speed of a motor.

Parameter Gesture

The Parameter Gesture is a "realtime gesture" often performed subsequent to a Navigation Gesture to indicate a time-based or control-based intent to specify a data value (e.g. a letter, number, brightness value, boolean, etc) without involving a touchscreen. A Parameter Gesture can be used to mimic a potentiometer, slider, etc. It is optional and not required in most interactions. It is only used when other means of inputting data are not as accessible.

For a time-based intent such as adjusting the brightness of a lamp or the speed of a motor, there is an auto-timeout window. Once the current millis() timestamp passes the end of the timeout window, the developer should remove focus from the Parameter Gesture. In many cases, that simply means setting the isActive boolean to false.

For a control-based intent such as an air-mouse, controlling a robot, or any free-form control, allow the user to revoke focus from the Parameter Gesture using a Deactivation Gesture (explained below). For example, a user may be driving a robot wirelessly using the GyroPalm Encore. In this control-based intent, the Parameter Gesture holds constant focus until a Deactivation Gesture is performed (e.g. hand-drop). When that happens, the robot safely stops. The user can perform an Activation Gesture (and select any necessary menu item) to resume control. The user can also use Navigation Gestures to return to something like a main menu.

A Parameter Gesture is often the last part of user interaction before a task is performed. Unless the application requires 2D or 3D control, Parameter Gestures usually comprise the low-pass data coming from one axis (usually Y-axis) of the sensor. Be aware that the Y-axis values can be inverted depending on whether a user is a left-handed or right-handed wearer.
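
As an illustration of a single-axis Parameter Gesture, the sketch below runs raw Y-axis samples through an exponential low-pass filter and flips the sign for left-handed wearers. The filter coefficient, value range, and names are assumptions for illustration, not the GyroPalm implementation:

```cpp
#include <cassert>
#include <cmath>

// Hypothetical single-axis Parameter Gesture filter: smooths raw Y-axis
// readings and inverts them for left-handed wearers (axis inversion).
struct ParamFilter {
    double smoothed = 0.0;
    double alpha = 0.2;       // low-pass coefficient (illustrative value)
    bool leftHanded = false;  // flip the axis for left-handed wearers

    double update(double rawY) {
        double y = leftHanded ? -rawY : rawY;
        smoothed += alpha * (y - smoothed);   // exponential low-pass step
        return smoothed;
    }
};

// Map a filtered value in the assumed range -1..1 to a brightness of 0..255.
int toBrightness(double v) {
    if (v < -1.0) v = -1.0;
    if (v > 1.0) v = 1.0;
    return (int)std::lround((v + 1.0) * 127.5);
}
```
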

Deactivation Gesture

A Deactivation Gesture is a "global gesture" that allows a user to safely relinquish control from any application requiring continuous Parameter Gestures (realtime control) without using any touchscreen or buttons. A Deactivation Gesture (e.g. hand-drop) normally responds in 15 milliseconds or less when performed properly. This is very helpful when experimenting with robotic control, flying drones, or controlling moving actuators that may encounter undesired operation. When a user performs a Deactivation Gesture such as the hand-drop motion, the wearable must (1) send "STOP" command(s) to the relevant mechatronic application(s) and (2) cease its continuous communication with said applications. The user must perform at least an Activation Gesture to restore control.

The Deactivation Gesture is not required for simple applications that have an auto timeout window implemented (i.e. adjusting brightness, turning off a lamp, etc).

Developers should always implement a Deactivation Gesture when working with realtime-control applications. Think of this gesture as an "Emergency Shutoff" button.

The 3-level Gesture Abstraction Technique

Global Gestures should always respond to the user regardless of what state the wearable is in. These are Activation and Deactivation gestures. Step count, activity, and fall detection (when available) are also Global Gestures. Code written in the callbacks of these gestures must be non-blocking.
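
The non-blocking rule can be illustrated with a flag-and-drain pattern: the callback only records that the event happened, and the main loop performs the slow work. The names below (onFallDetected, loopIteration, alertsSent) are hypothetical:

```cpp
#include <cassert>

// Hypothetical sketch of a non-blocking Global Gesture callback: the
// callback raises a flag and returns immediately; the main loop drains
// the flag and performs the slow work (haptics, network I/O, UI redraw).
volatile bool fallPending = false;
int alertsSent = 0;

void onFallDetected() {    // Global Gesture callback: must return fast
    fallPending = true;    // no delay(), I/O, or drawing in the callback
}

void loopIteration() {     // main loop does the heavy lifting
    if (fallPending) {
        fallPending = false;
        alertsSent++;      // stand-in for the actual slow alert routine
    }
}
```
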

Custom Gestures are any gestures that can be customized by the end-user and processed on GyroPalm's AI-engine prior to callback. These may have a slightly higher latency compared to processing Global Gestures. However, these gestures enrich the GyroPalm experience and allow users and developers to customize how they want the application to respond.

Realtime Gestures are any motions that are not pre-configured into the system. These gestures are relayed as low-pass filtered values intended to be parsed and sent as continuous commands to a robot, computer, or moving application.

Best Practices for Gesture Coding

To help developers understand how the practices mentioned above work, some coding suggestions are provided below. It is also very important to test activation gestures for the best possible performance.

Simple Command

This is an excerpt from fully functional firmware. The example below demonstrates a double-snap activation, tilt-based navigation, and a 3-second auto timeout:

#include <GyroPalmEngine.h>
// Other includes...

// Object declarations
GyroPalm *device;
GyroPalmEngine gplm("gp123456");

// Global variables
bool isActive = false;  //whether we are active
unsigned long lastActivated = 0;  //timestamp of when we double snap (millis returns unsigned long)

lv_obj_t * btnNext;
lv_obj_t * btnPrevious;

// Event Callbacks
void onDeviceTilt(int direction)
{
    if (isActive != true) { // Ignore event if called without activation gesture
        return;
    } else {
        isActive = false;   // Deactivate after the following code is performed
    }
    Serial.print("Tilted in the ");
    switch(direction)
    {
        case LH_LEFT:
            Serial.println("Left direction.");
            lv_event_send(btnNext, LV_EVENT_CLICKED, NULL); //perform click on gesture
        break;

        case LH_RIGHT:
            Serial.println("Right direction.");
            lv_event_send(btnPrevious, LV_EVENT_CLICKED, NULL); //perform click on gesture
        break;

        case LH_UPSIDEDOWN:
            Serial.println("Upsidedown direction.");
        break;

        // Other cases as needed...
        default: break;
    }
}

void onSnap(int snapTimes)
{
    switch(snapTimes)
    {
        case SINGLE_SNAP:
            Serial.println("Performed Single Snap");
        break;

        case DOUBLE_SNAP:
            Serial.println("Performed Double Snap");
            isActive = true;    // set device to active
            lastActivated = millis();   // note the timestamp
        break;       
    }
}

void onGlance(bool isGlanced)
{
    if (isGlanced) {
        form[curScreen].showIcon(BAR_GLANCE);   // show visual hint
    } else {
        form[curScreen].hideIcon(BAR_GLANCE);   // hide visual hint
    }
}

// Other event callbacks...


void setup() {
    gplm.begin();
    delay(100);
    gplm.listenEvents(false);    //starts listening for events

    // Include only the callbacks you need:
    gplm.setTiltCallback(onDeviceTilt);
    gplm.setSnapCallback(onSnap);
    gplm.setGlanceCallback(onGlance);
    // Other callbacks...

    // Other setup functions...
}

void loop() {
    if (isActive) { //validate activation time
        if (millis() - lastActivated > 3000) {  //been more than 3 seconds
            isActive = false;   //deactivate
        }
    }

    lv_task_handler();
    delay(50);
}

GyroPalm Activation Gestures

As part of GyroPalm's patented gesture control technology, GyroPalm provides developers with access to official activation gestures to ensure an optimal experience and prevent false positives. An activation gesture is performed by an end-user prior to performing other gesture-based commands. At the time of writing, developers can choose one of three activation gestures. Typically, many users learn how to perform these gestures within 10 minutes or less.

This section assumes that you have already understood how to implement gesture callbacks. If you have not written gesture callbacks in GyroPalm before, please refer to the Gesture Callbacks section.

Purpose of Activation Gesture Utility

Note that GyroPalm's Activation system and gestures are carefully designed and provided with the following intentions:

  1. To ensure reliability of all possible gestures and prevent false positives
  2. To eliminate the tedious process of developers needing to manage the gesture lifecycle (autoTimeout, non-blocking, etc)
  3. To detect a quick deliberate motion that will not be confused with everyday activities

Performing an Activation Gesture

Most activation gestures require users to perform a motion within two seconds or less. In almost all cases, activation gestures start with the user glancing at the wearable; therefore, the onGlance callback is needed. To trigger the onGlance callback, users raise the device as if they are "looking at the time". When the glance is detected, the "glance icon" (a symbol that looks like an eye) will show in the top-right corner. At this point, the remaining parts of the gesture are physically performed.

Available Activation Gestures

Assuming the GyroPalmEngine object is instantiated as gplm, the activation gesture can be set by calling gplm.activationGesture = ACT_(gesture_name)

The following gesture options are provided:

Manual Activation

To use: gplm.activationGesture = ACT_MANUAL;

The developer implements their own gesture detection and activates by calling gplm.setActivation(true) whenever the user performs the activation successfully. The wearable will indicate its "active" status on the top-right corner using a cyan-colored "glance icon".

After the appropriate navigation and optional parameter gesture is performed, the developer is responsible for calling gplm.setActivation(false) to mark the activation as "disabled". In other words, a disabled activation can also be considered as an "expired" or "redeemed" activation.

The activation will expire in 3 seconds if gplm.autoTimeout = true. By default, gplm.deactivateTimeout is set to 3000ms meaning that the activation will be disabled after 3 seconds. Developers may change that value between 2000-6000ms. It is highly recommended to keep the value at 3000ms. A low timeout will make it more difficult to complete a gesture sequence while an excessively high timeout may increase the chance of false positives.

Should the developer neither call gplm.setActivation(false) after command completion nor set gplm.autoTimeout = true, there will be a significant chance of undesirable false positives, since the wearable will remain active. Similarly, if a smartphone's screen remains active in one's pocket, there is a high likelihood of "pocket-dialing" a wrong number. This is the nature of the technology, not a limitation.

If gplm.autoTimeout is set to false, then it is up to the developer to run gplm.setActivation(false) once the user is finished performing gesture-based activity. Sometimes autoTimeout may need to be false, such as when driving robots, flying drones, or performing any other continuous gesture control.

Note: At any point in your code, you may read gplm.isActive to retrieve a boolean indicating whether the device has been activated. The boolean is read-only, as its value is set by the device; use gplm.setActive(false) instead of gplm.isActive = false, as the setter is the only correct way to alter the active state.

Double Snap

To use: gplm.activationGesture = ACT_DOUBLE_SNAP;

The classic "double snap" gesture has been a fan favorite in many GyroPalm demonstrations. Essentially, the wearer glances at the watch, keeps the watch upright, and snaps his/her fingers twice. The advantage of this gesture is that it can be performed with minimal cognition.

Flip Snap

To use: gplm.activationGesture = ACT_FLIP_SNAP;

The flip snap gesture is also known as the "TAG v1" gesture, as it was developed for users who want an easier alternative to double snap. The flip snap gesture can be performed as follows: face the palm upwards, casually snap once, and swiftly rotate the wrist to the "glance" position.

Snap Shake

To use: gplm.activationGesture = ACT_SNAP_SHAKE;

The snap shake gesture is a recently developed motion that is one of the easiest to perform. It can be performed by glancing at the watch, snapping once, and shaking the wrist vigorously for 2 seconds.

Implementing an Activation Gesture

Previously, developers were responsible for implementing the gesture detection algorithms or callbacks needed for the activation gesture. The GyroPalmEngine framework now provides methods for developers to implement the 4-layer Gesture Intent Methodology for their use-cases without complex repetition.

To implement one of the activation gestures mentioned above, follow these steps:

  1. Write the onGlance gesture callback as follows:
void onGlance(bool isGlanced)
{
    if (gplm.isActive == false) {   //only takes effect if wearable not active
        if (isGlanced) {
            form[curScreen].showIcon(BAR_GLANCE);
        } else {
            form[curScreen].hideIcon(BAR_GLANCE);
        }
    }
}


2. Register the onGlance gesture callback in your void setup() after gplm.listenEvents(false):

gplm.setGlanceCallback(onGlance);   // use onGlance as a helper


3. In your void setup() after gplm.listenEvents(false), add the following lines for activation syntax:

    gplm.autoTimeout = true;    //tells the wearable to deactivate automatically
    gplm.deactivateTimeout = 3000;  // (optional) timeout in milliseconds (3 seconds default)
    gplm.activationGesture = ACT_DOUBLE_SNAP;   // (optional) ACT_DOUBLE_SNAP by default
    gplm.setActivationCallback(onActivation);   // register activation gesture callback


4. Write the onActivation gesture callback as follows:

void onActivation(bool isActive)
{
    if (isActive) {
        Serial.println("Activated!");
        form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_CYAN);
        // your code here, once wearable is activated
    } else {
        Serial.println("Deactivated!");
        form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_WHITE);
        form[curScreen].hideIcon(BAR_GLANCE);
        // your code here, once wearable is deactivated
    }
}

At this point, you have a basic working implementation of an activation gesture. For the remaining steps, we will demonstrate how to make the wearable observe left and right tilt gestures after it is activated. You can find out if the wearable is active by reading gplm.isActive, which returns true if it has been activated.

5. To detect the left and right tilt gestures, write the onDeviceTilt gesture callback as follows:

void onDeviceTilt(int direction)
{
    if (gplm.isActive != true) { // Ignore event if gesture performed without activation
        return;
    }

    switch(direction)
    {
        case LH_LEFT:
            showMsg("Left direction");
            gplm.vibratePress();    // Provide user haptic feedback
            gplm.setActive(false);  // Action completed, redeem activation
        break;

        case LH_RIGHT:
            showMsg("Right direction");
            gplm.vibratePress();    // Provide user haptic feedback
            gplm.setActive(false);  // Action completed, redeem activation
        break;

        default: break;
    }
}

Note that you are certainly not limited to onDeviceTilt for your project. You can choose from many of GyroPalm's gesture callbacks to use for your navigation/parameter gestures. See the Gesture Callbacks section for details.

6. Register the onDeviceTilt gesture callback in your void setup() after gplm.listenEvents(false):

gplm.setTiltCallback(onDeviceTilt); // use onDeviceTilt as a navigation gesture

Activation Gesture Example Code

To demonstrate the usage of the above callback functions and activation methods, here is some fully working example code that you can paste into GyroPalm Studio to evaluate:

// Begin AutoGenerated Includes - DO NOT EDIT BELOW
#include <GyroPalmEngine.h>
#include <GyroPalmLVGL.h>
// End AutoGenerated Includes - DO NOT EDIT ABOVE

// Begin AutoGenerated Globals - DO NOT EDIT BELOW
GyroPalm *device;
GyroPalmEngine gplm("gp123456");    //declares a GyroPalm Engine object with wearableID

AXP20X_Class *power;
lv_task_t *barTask;
void lv_update_task(struct _lv_task_t *);

enum Screen { SCR_HOME };   //Screen indexes
lv_obj_t *screen[1];    //screen pointers
GyroPalmLVGL form[1];   //screen helper methods
Screen curScreen = SCR_HOME;    //default screen
// End AutoGenerated Globals - DO NOT EDIT ABOVE

lv_obj_t *btnTest;
lv_obj_t *lvLED;
lv_obj_t * msgboxRead;

// Begin AutoGenerated Callbacks - DO NOT EDIT BELOW
void lv_update_task(struct _lv_task_t *data) {
    int battPercent = power->getBattPercentage();
    bool isCharging = power->isChargeing();
    form[curScreen].updateBar(battPercent, isCharging);
    form[curScreen].setTime(gplm.getTime());     //update Time View
}

void showMsg(String msg) {
    msgboxRead = form[curScreen].createMsgBox((char *)msg.c_str(), PROMPT_OK, msgbox_handler, true);
}

static void msgbox_handler(lv_obj_t *obj, String btnText)
{
    if (obj == msgboxRead) {
        Serial.println("Response from MsgBox A");
        msgboxRead = NULL;
    }
    Serial.print("User response: ");
    Serial.println(btnText);
}

static void roller_event_handler(lv_obj_t * roller, lv_event_t event)
{
    if(event == LV_EVENT_VALUE_CHANGED) {
        int rollerIndex = lv_roller_get_selected(roller);
        char buf[32];   //selected string
        lv_roller_get_selected_str(roller, buf, sizeof(buf));

        switch (curScreen)
        {
            case SCR_HOME:
                if (rollerIndex == 0) {
                    gplm.activationGesture = ACT_MANUAL;
                }
                else if (rollerIndex == 1) {
                    gplm.activationGesture = ACT_DOUBLE_SNAP;
                }
                else if (rollerIndex == 2) {
                    gplm.activationGesture = ACT_FLIP_SNAP;
                }
                else if (rollerIndex == 3) {
                    gplm.activationGesture = ACT_SNAP_SHAKE;
                }
            break;

            default: break;
        }
    }
}

static void btn_event_handler(lv_obj_t * obj, lv_event_t event)
{
    if (event == LV_EVENT_CLICKED) {
        String btnName = lv_list_get_btn_text(obj);
        Serial.printf("Clicked: %s\n", btnName.c_str());

        switch (curScreen)
        {
            case SCR_HOME:
                if (obj == btnTest) {
                    if (gplm.isActive == false) {   //check whether GyroPalm is active
                        gplm.setActive(true);
                    } else {
                        gplm.setActive(false);
                    }
                }
            break;

            default: break;
        }
    }
}

// End AutoGenerated Callbacks - DO NOT EDIT ABOVE

// Begin AutoGenerated Screens - DO NOT EDIT BELOW
void showApp(int page) {
    if ((Screen) page != curScreen) {
        form[curScreen].removeBar();    //remove old StatusBar before proceeding
    }

    switch (page)
    {
        case SCR_HOME:
        {
            //Draw screen UI
            curScreen = (Screen) page;
            form[curScreen].init(screen[curScreen]);  //now defining screen items
            form[curScreen].createBar(barTask, lv_update_task);
            form[curScreen].setTime(gplm.getTime());
            form[curScreen].createLabel(0, -72, "Activation Gesture Samples");    //show element
            form[curScreen].createRoller(0, 0, "Manual\nDouble Snap\nFlip Snap\nSnap Shake", 3, roller_event_handler, true, 200);    //show element
            btnTest = form[curScreen].createButton(-58, 88, "Test", btn_event_handler, true, 98);    //show element
            lvLED = form[curScreen].createLED(58, 88, false);    //show element

            form[curScreen].showScreen(ANIM_NONE);   //show the screen w/ no animation
        }
        break;


        default: break;
    }
    gplm.setScreen(&form[curScreen]);
}
// End AutoGenerated Screens - DO NOT EDIT ABOVE

void onPwrQuickPress()
{
    /*
    After the AXP202 interrupt is triggered, the interrupt status must be cleared,
    * otherwise the next interrupt will not be triggered
    */
    power->clearIRQ();

    // We are sleeping the device when power button pressed
    device->displaySleep();
    device->powerOff();
    esp_sleep_enable_ext1_wakeup(GPIO_SEL_35, ESP_EXT1_WAKEUP_ALL_LOW);
    esp_deep_sleep_start();
}

void onGlance(bool isGlanced)
{
    if (gplm.isActive == false) {   //only takes effect if wearable not active
        if (isGlanced) {
            form[curScreen].showIcon(BAR_GLANCE);
        } else {
            form[curScreen].hideIcon(BAR_GLANCE);
        }
    }
}

void onActivation(bool isActive)
{
    if (isActive) {
        Serial.println("Activated!");
        form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_CYAN);
        lv_led_on(lvLED);
    } else {
        Serial.println("Deactivated!");
        form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_WHITE);
        form[curScreen].hideIcon(BAR_GLANCE);
        lv_led_off(lvLED);
    }
}

void onDeviceTilt(int direction)
{
    if (gplm.isActive != true) { // Ignore event if called without activation
        return;
    }

    switch(direction)
    {
        case LH_LEFT:
            showMsg("Left direction");
            gplm.vibratePress();
            gplm.setActive(false);  // Action completed, redeem activation
        break;

        case LH_RIGHT:
            showMsg("Right direction");
            gplm.vibratePress();
            gplm.setActive(false);  // Action completed, redeem activation
        break;

        default: break;
    }
}

void setup() {

    // Begin AutoGenerated Setup - DO NOT EDIT BELOW
    gplm.begin();
    delay(100);
    gplm.listenEvents(false);    //starts listening for events

    gplm.autoTimeout = true;
    gplm.deactivateTimeout = 3000;  // (optional) timeout in milliseconds (3 seconds default)
    gplm.activationGesture = ACT_MANUAL;   // (optional) ACT_DOUBLE_SNAP by default
    gplm.setActivationCallback(onActivation);   // register activation gesture callback

    gplm.setGlanceCallback(onGlance);   // use onGlance as a helper
    gplm.setTiltCallback(onDeviceTilt); // use onDeviceTilt as a navigation gesture
    gplm.setPwrQuickPressCallback(onPwrQuickPress); // sleep and wake when power button pressed
    delay(100);

    device = gplm.wearable; //gives control to the developer to run device methods
    device->lvgl_begin();   //Initiate LVGL core
    device->bl->adjust(120);    //Lower the brightness
    power = gplm.power;     //gives control to the developer to access power methods
    power->setChargeControlCur(500);    //enable fast charging

    showApp(curScreen);
    // End AutoGenerated Setup - DO NOT EDIT ABOVE
}

void loop() {
    // Begin AutoGenerated Loop - DO NOT EDIT BELOW
    lv_task_handler();
    delay(50);
    // End AutoGenerated Loop - DO NOT EDIT ABOVE
}

GyroPalm FlexPoint Dynamic Interface

As part of GyroPalm's patented technology, GyroPalm provides developers with many different hands-free mechanisms for unparalleled gesture control. Elements on the screen of the GyroPalm Encore such as LVGL buttons, image buttons, checkboxes, and sliders can be manipulated with gestures, enabling any user to interact without specific gesture training or custom gesture programming.

A GyroPalm FlexPoint gesture is a form of navigation gesture, which can be performed after a successful activation gesture. FlexPoint gestures enrich any GyroPalmLVGL interface since such gestures are easy to perform and require little to no hand-eye coordination. Unlike a traditional air mouse, which requires users to precisely point to select an element such as a button, a user can perform a FlexPoint selection with higher accuracy and less cognitive load. It is indeed one of the many specialties of GyroPalm's intuitive gesture methodology. FlexPoint gestures can be performed quickly, even by inexperienced users.

A FlexPoint gesture involves a user subtly tilting their hand towards the general direction of a desired LVGL widget. This interface involves an "elastic line" that is drawn on the screen. The user tilts their wrist to pivot the line to approach or "cross out" the desired widget before performing a snap gesture to select it. Snapping over a button performs a click event on said button. Snapping over a variable element such as a slider opens an adjustment gesture interface that allows further gesture manipulation of the control. While FlexPoint is available to all GyroPalm developers, not all GyroPalm apps may use FlexPoint. Some developers may find more meaningful integration with GyroPalm's Customizable Gestures capability.

In the event that the user wants to select a button that is obstructed by another button in close proximity, FlexPoint ensures that the farther button is selected, as the user desires. This level of intelligent proportional control is made possible by the underlying algorithm observing all the widgets within the line of intersection and selecting the last widget under the final approach of the "elastic line".

FlexPoint Technique

FlexPoint applies our advanced edge-based techniques to process IMU sensor data from the wearable, including a combination of Euclidean Ray-Casting, Temporal Smoothing, and Pre-Disruption Locking. When FlexPoint is activated, the boundaries of all LVGL elements are acquired and their centroids are determined. The IMU data is normalized and stabilized to map to the dimensions of the screen. The Euclidean distances between the line of approach and the LVGL widgets are calculated and placed into an array for selection.

Aside from constant real-time temporal smoothing, the wearable awaits an onRawSnap gesture from the user. Once the user performs a finger snap, a custom algorithm finds the nearest LVGL element, with a slight bias toward the data from about 125 milliseconds before the snap. This is done as part of Pre-Disruption Locking, which ensures that the user's selection is not affected by the snap itself or by other noise after selection. A user does not have to "perfectly" align the line of inference across their desired widget, as the algorithm tolerates human error of a few degrees of offset. In tests of 150 gesture interactions across a variety of on-screen interfaces, this technique achieved 99% gesture accuracy.
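
The centroid and ray-distance selection described above can be illustrated with a minimal sketch. This is not the actual FlexPoint implementation; the geometry, the screen origin, and the function names are assumptions chosen only to show the idea of picking the widget nearest the "elastic line":

```cpp
#include <cassert>
#include <cmath>
#include <cstddef>
#include <vector>

struct Point { double x, y; };

// Perpendicular distance from a widget centroid to the ray that leaves
// `origin` at angle `theta` (the "elastic line" after temporal smoothing).
double distToRay(Point origin, double theta, Point c) {
    double dx = c.x - origin.x, dy = c.y - origin.y;
    double along = dx * std::cos(theta) + dy * std::sin(theta);
    if (along < 0) along = 0;  // centroids behind the origin get no credit
    double px = origin.x + along * std::cos(theta);
    double py = origin.y + along * std::sin(theta);
    return std::hypot(c.x - px, c.y - py);
}

// Select the widget whose centroid lies nearest the ray. In the real system,
// theta would be the angle buffered ~125 ms before the snap (pre-disruption
// locking) so the snap's own jolt cannot move the selection.
std::size_t selectWidget(Point origin, double theta,
                         const std::vector<Point>& centroids) {
    std::size_t best = 0;
    double bestDist = distToRay(origin, theta, centroids[0]);
    for (std::size_t i = 1; i < centroids.size(); i++) {
        double d = distToRay(origin, theta, centroids[i]);
        if (d < bestDist) { bestDist = d; best = i; }
    }
    return best;
}
```
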

FlexPoint Dependencies

FlexPoint requires the use of GyroPalmEngine and GyroPalmLVGL. If you do not have those implemented in your project, you may create a new project code using the GyroPalm UI Designer which will generate the bare minimum code you need to proceed. Ensure that you have already understood how to implement gesture callbacks. If you have not written gesture callbacks in GyroPalm before, please refer to the Gesture Callbacks section. You will also need to first implement an activation gesture, prior to adding FlexPoint to your project. Read more about GyroPalm Activation Gestures for details.

Implementing FlexPoint

Obtain the latest version of FlexPoint from here: GyroPalm FlexPoint GitHub Repo

  1. Create a new project using GyroPalm UI Designer first. You'll have the bare minimum code.
  2. Upload the header file in the Resources tab. Then put in main code: #include "GyroPalmFlexPoint.h"
  3. Add this line before the end of void setup(): flexPointSetup(&gplm);
  4. Add this line at the beginning of void loop(): flexPointLoop();
  5. Make sure you have the following declared in your void setup() below listenEvents():
    gplm.autoTimeout = true;    //tells the wearable to deactivate automatically
    gplm.deactivateTimeout = 4000;  // (optional) timeout in milliseconds (3 seconds default)
    gplm.activationGesture = ACT_DOUBLE_SNAP;   // (optional) ACT_DOUBLE_SNAP by default
    gplm.setActivationCallback(onActivation);   // register activation gesture callback

    gplm.setRawSnapCallback(onRawSnap);
    gplm.setGlanceCallback(onGlance);
    gplm.setPwrQuickPressCallback(onPwrQuickPress);
    delay(100);


6. Write the onRawSnap callback:

    void onRawSnap()
    {
        flexPointSnap();
    }


7. Write the onActivation callback:

    void onActivation(bool isActive)
    {
        if (isActive) {
            Serial.println("Activated!");
            form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_CYAN);
            // your code here, once wearable is activated
            lv_disp_trig_activity(NULL);    //trigger user activity
        } else {
            Serial.println("Deactivated!");
            form[curScreen].setIconColor(BAR_GLANCE, LV_COLOR_WHITE);
            form[curScreen].hideIcon(BAR_GLANCE);
            // your code here, once wearable is deactivated
        }

        flexPointShow(isActive);
    }


8. On every "case" in void showApp(int page), add this before calling the form[curScreen].showScreen function:

    flexPointInterface(&form[curScreen]);

You would treat the above like adding an LVGL widget. Implement your LVGL widget callbacks as usual.

9. (optional) Call flexPointRapid() in your onShake() gesture callback if you want to keep FlexPoint active on shake:

    // Add to your void setup() code...
    gplm.setShakeCallback(onShake);
    gplm.setMaxShakes(5); 

    // Add above void setup() function:
    void onShake(int numShakes)
    {
        if (numShakes >= 5) {
            flexPointRapid();
        }
    }

Using FlexPoint

  1. Compile and upload the GyroPalm firmware to the Encore wearable.
  2. Glance at the watch by raising your wrist towards your face (you should see the "eye icon").
  3. Perform the activation gesture by double-snapping your fingers.
  4. You should now see a blue line that can "sling" towards the widget of your choosing. Tilt your wrist until the line "crosses over" the widget you want to control.
  5. Snap your fingers once more. Snapping over a button or checkbox will perform a click. Snapping over a slider will allow you to perform the gesture adjustment for up to 5 seconds. You can snap once to lock the slider in place.

To keep the FlexPoint interface active for multiple selections, perform an activation gesture and then shake your wrist prior to selecting the upcoming elements. With the onShake gesture, FlexPoint will remain active even after a selection, until the activation gesture times out.