As a result of the emergence of new (mobile) devices for both visualization and control, new operating concepts, and thus also new visualizations, have become necessary. In their private lives, users experience completely new interaction concepts thanks to tablets and smartphones, and increasingly expect these in the professional field as well.
It is necessary to deliver the same information to any type of device on the basis of a single central project. To achieve this, the trend is moving towards “responsive design”: design that reacts to the device and the user and adapts optimally to the situation.
However, it is not only the properties of the device that should be taken into consideration here. A project that has to be specially adapted to each possible device in the production environment can no longer be implemented and maintained adequately. One example is the fragmentation of screen resolutions on Android devices (from 240 x 180 up to 2040 x 1152 – see figure below). Likewise, current “special solutions” such as WPF controls run only on Microsoft systems and cannot be adapted for different devices in any way.
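The alternative to one project per device is a single project that selects a layout variant from the device's properties at runtime. The following is a minimal sketch of that idea; the breakpoint values and names are illustrative assumptions, not part of zenon.

```python
# Hypothetical sketch: picking a layout variant from the device resolution,
# instead of maintaining a separate project per device.
# Breakpoints and layout names are illustrative, not taken from zenon.

from dataclasses import dataclass


@dataclass
class Device:
    width: int
    height: int


# Ordered (minimum width, layout) breakpoints, smallest first.
BREAKPOINTS = [
    (0, "compact"),     # small phones, e.g. 240 x 180
    (600, "medium"),    # tablets
    (1280, "full"),     # desktop / control-room screens
]


def pick_layout(device: Device) -> str:
    """Return the largest layout whose minimum width the device meets."""
    layout = BREAKPOINTS[0][1]
    for min_width, name in BREAKPOINTS:
        if device.width >= min_width:
            layout = name
    return layout


print(pick_layout(Device(240, 180)))    # compact
print(pick_layout(Device(2040, 1152)))  # full
```

The point of the sketch is that the fragmentation problem moves from project maintenance into a single, central rule that is easy to extend with further device or user properties.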
Integration relates to more than automatic adaptation, such as for the screen resolution of the device. The goal is seamless design – the integration and automatic adaptation to different hardware and software platforms, users and tasks.
It is not just the visual part of the human-machine interface that plays a role here, but also the behavior of an application in the background. Examples are context-sensitive navigation, automatic tools that support the task at hand, or the provision of information related to a currently selected piece of data.
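To make the idea of context-sensitive behavior concrete, here is a minimal sketch: when the user selects an element, the application offers views related to it instead of a static navigation tree. The mapping and view names are illustrative assumptions, not a zenon API.

```python
# Hypothetical sketch of context-sensitive support: the HMI offers views
# related to the currently selected element. Mapping and names are
# illustrative only.

RELATED_VIEWS = {
    "alarm": ["alarm cause analysis", "trend of source variable", "machine manual"],
    "variable": ["trend", "faceplate", "recipe values"],
}


def related_views(context_type: str) -> list[str]:
    """Return the views relevant to the currently selected element type."""
    return RELATED_VIEWS.get(context_type, ["overview"])


print(related_views("alarm"))  # context-specific suggestions for an alarm
```

In a real application the lookup would of course take more context into account, such as the user's role and current task, but the principle of deriving navigation from the selected data remains the same.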
SUNA is what sometimes gets us into a sweat when faced with this problem. This is not about the sometimes heated discussions during the planning and optimization of a new version of zenon, but about the four types of technologies that constitute a challenge for the future of human-machine interfaces:
Social, in the sense of HMI and SCADA applications, does not refer to automatically posting the last alarm message on Facebook. The classic scenario of a single user operating a single machine to carry out a task is increasingly a thing of the past. More and more often, several users work together and switch between different stations and machines in order to achieve a common objective. In terms of design, this is a difficult task for the integrator.
Agentive algorithms (machine-based artificial intelligence) recognize not just our identity, but also our intentions and possibly even our current emotional state, and can react accordingly.
Information on the status of a machine, a production line or an entire factory is part of our daily business, regardless of whether it is displayed directly on the machine, on the desktop in the control room or on the cell phone in the hand of the service technician. Nowadays, users constantly switch between these technological points of contact. Our objective must be to make this as seamless as possible for the user.
Natural User Interactions
NUIs describe a range of technologies that draw on more of our body and its abilities than the classic WIMP interface (windows, icons, menus and pointer). Examples are haptic user interfaces, gesture-based user interfaces and voice-based user interfaces.
Each of these technological aspects is already a powerful tool when used on its own. However, the real added value is created when they are combined in a strong, comprehensive platform such as zenon.
Another challenge is working with several users simultaneously, for example during a joint alarm cause analysis on a Microsoft PixelSense multi-touch system (e.g. Samsung SUR40). This is no longer a sole operator in front of a monitor, but smooth cooperation between all those involved, such as the automation technician, the production planner, the controller and the process technician. In addition to allowing several users to work simultaneously on the same application (something that conventional Windows applications do not currently support), these systems offer pattern recognition. This allows any desired object to be used, for example as a physical rotary knob, and thus enables a new type of two-hand operation. The physical object must first be consciously placed on the interface before any action is executed. At the same time, no dedicated login is needed, because an action is only possible with the user's personal token.
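The token-gated interaction described above can be sketched as follows. This is a simplified illustration under stated assumptions: the tag identifiers, the owner registry and the function names are invented for the example; on a real PixelSense system, tagged objects are detected by the hardware and reported to the application.

```python
# Hypothetical sketch of token-gated operation on a multi-user surface:
# an action executes only while the issuing user's personal token is
# physically recognized on the interface. All names here are illustrative.

recognized_tokens = set()  # tokens currently placed on the surface

# Illustrative mapping of token tags to their owners.
TOKEN_OWNERS = {
    "tag-4711": "process technician",
    "tag-0815": "automation technician",
}


def place_token(tag: str) -> None:
    """Called when the hardware detects a tagged object on the surface."""
    recognized_tokens.add(tag)


def remove_token(tag: str) -> None:
    """Called when the tagged object is lifted off the surface."""
    recognized_tokens.discard(tag)


def execute_action(action: str, tag: str) -> str:
    """Run an action only if the user's token lies on the surface."""
    if tag not in recognized_tokens:
        return f"denied: token {tag} not on surface"
    owner = TOKEN_OWNERS.get(tag, "unknown user")
    return f"{action} executed by {owner}"
```

Because the token itself acts as the credential, no separate login dialog is needed, and several users can act in parallel, each gated by their own physical object.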
We at COPA-DATA are already facing the future challenges of stationary and mobile HMI and SCADA applications, and are continually working to equip zenon, both as an engineering tool and as a (mobile) device application, for the future.