
Human machine interfaces (HMI) are inherently interactive and dynamic. An important goal of HMI engineering is to optimize the interaction between the human and the machine. The ergonomics and efficiency of an HMI are mainly driven by the quality of its design. This makes HMI design a central part of HMI specifications and implementations. In this series of articles we discuss a model-based approach to developing human machine interfaces using state machines.

Anatomy of an HMI

Modern HMIs are built as graphical user interfaces (GUI) with touch-based user interaction. Due to an HMI’s dynamic nature, the design must cover the interface’s behaviour as well as its form and appearance. Altogether, there are three major aspects that the specification and implementation of a human machine interface have to cover.

[Figure: Anatomy of an HMI]

The visual design together with the interactive behaviour form the user interface. The latter displays system data and makes the system’s functionality available to the user. The third aspect relevant for an HMI is therefore the integration of the system’s data and functions. During the development of complex applications such as car infotainment systems we can observe that different people and teams care about these different aspects. This often leads to separate development life cycles that need synchronization.


Make behaviour explicit

Many HMI specifications focus solely on the visual aspects by providing exact style guides and visual resources, such as images, that have to be applied by the implementation. The user interaction is mostly addressed as screen flows in the form of prose, PowerPoint slides, Visio diagrams, or other non-executable formats. To validate HMI concepts, prototypes are typically used. They are usually built with technologies that are easier to handle than those required for implementing the target system.

However, prototypical as well as final implementations usually suffer from the fact that the behaviour of the HMI is scattered across different parts of the implementation code. The interactive behaviour does not exist explicitly on its own but is mixed up with technical and implementation aspects of the target technology. In these scenarios, the specification of the interactive behaviour is very different from its prototypical or final implementation, so no reuse is possible across these development phases. This has severe implications for the effort required when specifications change, with a major impact on the pace of iterative HMI development.

These problems can be addressed with a model-based approach using state machines, especially in the form of the graphical statechart notation. Statecharts are an executable formalism that can be used to specify the reactive behaviour of HMIs at an adequate level of abstraction. The following example of an infotainment HMI illustrates how behaviour is modelled and executed with a state machine and how it relates to the UI’s visualization.
 

[Animation: A statechart driving the infotainment HMI]

The statechart shown above defines a set of states (the blue boxes) along with a set of transitions (the lines), conditions, and actions (the textual parts). Together, these elements define the behaviour of the HMI. The statechart specifies how to switch between the menu and the other scenes on the right-hand side of the HMI. To achieve this it defines a state for each visible scene: menu, weather, music, and phone. In this sample statechart only one state can be active at a time; the active state is highlighted in red. On activation of a state the corresponding scene becomes visible, so when the menu state is active, the menu is displayed. While the user interacts with the UI, the statechart decides which state has to be activated next. The possible changes from a source state to a target state are defined by the lines between them, which is why these lines are called transitions. Typically, transitions are triggered by user interactions or system events.
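To make this concrete, here is a minimal sketch of such a state machine in plain TypeScript. The state and event names follow the example above; the class and method names are made up for illustration and are not tied to any particular tool.

```typescript
// A minimal sketch of the infotainment statechart in plain TypeScript.
// The state and event names mirror the example above; class and method
// names are illustrative only.

type Scene = "menu" | "weather" | "music" | "phone";

type HmiEvent =
  | { type: "selectWeather" }
  | { type: "selectMusic" }
  | { type: "selectPhone" }
  | { type: "back" };

class InfotainmentStateMachine {
  // Exactly one state is active at a time, as in the statechart.
  private active: Scene = "menu";

  get activeScene(): Scene {
    return this.active;
  }

  // Each branch corresponds to a transition (a line in the statechart):
  // it is triggered by an event and only fires in its source state.
  dispatch(event: HmiEvent): void {
    switch (this.active) {
      case "menu":
        if (event.type === "selectWeather") this.active = "weather";
        else if (event.type === "selectMusic") this.active = "music";
        else if (event.type === "selectPhone") this.active = "phone";
        break;
      case "weather":
      case "music":
      case "phone":
        if (event.type === "back") this.active = "menu";
        break;
    }
  }
}
```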

On this level of abstraction the state machine captures the essence of the HMI’s behaviour. It does not interfere with the HMI’s visualization, which can evolve independently of the state machine. Conversely, introducing additional kinds of interaction, like swipe gestures, is feasible without any impact on the visual elements. This separation allows both parts to be developed independently of each other and thus greatly increases the HMI’s maintainability.
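One possible way to keep this separation in code, continuing the sketch above (again with illustrative names only), is to let the state machine merely announce which state became active and let any view technology subscribe to that notification:

```typescript
// Decoupling behaviour from visualization: the state machine only reports
// which state became active, and any view technology can register a
// listener for that notification.

type StateListener = (active: Scene) => void;

class ObservableStateMachine extends InfotainmentStateMachine {
  private listeners: StateListener[] = [];

  onStateChanged(listener: StateListener): void {
    this.listeners.push(listener);
  }

  dispatch(event: HmiEvent): void {
    const before = this.activeScene;
    super.dispatch(event);
    if (this.activeScene !== before) {
      // The behaviour layer knows nothing about widgets or rendering;
      // it merely publishes the new active state.
      this.listeners.forEach((listener) => listener(this.activeScene));
    }
  }
}

// A view layer (HTML5, a Qt bridge, ...) reacts to state changes:
const machine = new ObservableStateMachine();
machine.onStateChanged((scene) => console.log(`show scene: ${scene}`));
machine.dispatch({ type: "selectMusic" }); // logs "show scene: music"
```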

Executable HMI specifications

State machines are executable, and making specifications executable can have a major positive impact on specification work. By simulating a statechart, specification engineers can check and validate the behaviour interactively while working on it. Evaluating the correctness and consistency of complex behaviour, like that of modern HMIs, based on plain text and graphics alone is difficult or simply impossible, and the impact of changes is hard to assess. However, if a state machine simulation is integrated with the visualization, as in our example above, the difference between specification and prototype begins to blur.

In any case, executable specifications provide rapid feedback through fast turnaround cycles during specification. This can massively reduce time and costs.
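As a crude stand-in for such a simulation, the state machine sketched above can simply be driven with a sequence of user events and its reactions checked against the intended behaviour:

```typescript
// A hand-rolled "simulation run": feed the state machine a sequence of user
// events and check that the behaviour matches the specification.
// console.assert stands in for whatever test framework a project uses.

const sim = new InfotainmentStateMachine();
console.assert(sim.activeScene === "menu", "HMI starts in the menu");

sim.dispatch({ type: "selectWeather" });
console.assert(sim.activeScene === "weather", "weather scene is shown");

// An event without a matching transition in the current state is ignored,
// which is easy to overlook in a prose specification.
sim.dispatch({ type: "selectMusic" });
console.assert(sim.activeScene === "weather", "no direct switch from weather to music");

sim.dispatch({ type: "back" });
console.assert(sim.activeScene === "menu", "back returns to the menu");
```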

Targeting the implementation

Even though specification and prototype technologies typically differ from the implementation technologies, all the advantages discussed also apply to HMI implementation. The main tool for supporting HMI implementation is code generation. The main task of a code generator is to transform the graphical statechart into program code, i.e. an equivalent executable state machine. This state machine code is then integrated with the other – often handwritten – implementation artefacts.
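To give an impression of how generated and handwritten code can fit together, here is a rough sketch of the typical shape of generated state machine code. This is not actual generator output of any particular tool; the names are illustrative, and the Scene type is reused from the earlier sketch.

```typescript
// Rough sketch of generated state machine code (NOT actual generator
// output): the model declares operations such as showScene, and handwritten
// integration code implements them for the target platform.

// Declared in the model, implemented by hand:
interface OperationCallback {
  showScene(scene: Scene): void;
}

class GeneratedStateMachine {
  private active: Scene = "menu";

  constructor(private readonly callback: OperationCallback) {}

  // One raise method per event; the action on the transition calls back
  // into handwritten code.
  raiseSelectMusic(): void {
    if (this.active === "menu") {
      this.active = "music";
      this.callback.showScene("music");
    }
  }

  raiseBack(): void {
    if (this.active !== "menu") {
      this.active = "menu";
      this.callback.showScene("menu");
    }
  }
}

// Handwritten integration code supplies the target-specific rendering:
const hmi = new GeneratedStateMachine({
  showScene: (scene) => console.log(`render ${scene} with the UI toolkit`),
});
hmi.raiseSelectMusic(); // renders the music scene
```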

[Figure: Targeting the implementation with statecharts and generated HMI code]

Code generation is not restricted to specific programming languages or target systems. We are using this concept for modelling Qt, Android, HTML5, and Flash applications, and it is applicable to nearly any other target technology, including proprietary ones.

Summary

A model-based approach to HMI development using state machines is applicable throughout the whole development process, from specification and prototyping to implementation. This helps make the development process more fluent by reducing the gaps between specification, prototyping, and implementation. Making HMI behaviour explicit improves understandability and maintainability and ultimately helps to reduce effort and costs.

Nevertheless, there are some questions that we have not discussed yet and which we will address in upcoming articles:


  1. Interactive behaviour can become very complex. This leads to the need for modularization. How can interactive behaviour be modularized?
  2. There are several different UI technologies suitable for developing HMI prototypes and implementations, like Qt/C++, Android/Java, HTML5/Angular, and others. How does the discussed approach integrate smoothly with these technologies?

Do you want to try YAKINDU Statechart Tools yourself? Then download our free 30-day trial version!

Download and try YAKINDU Statechart Tools
