Posts Tagged ‘Multi-Touch’

Multi-Touch features in zenon 7.20

Friday, June 26th, 2015

The way we interact with computers has changed in recent years, and most people are now used to directly manipulating interfaces with gestures. With zenon's Multi-Touch features, the user literally stays in touch with what is going on in the work process. In zenon 7.20 we equipped the operator with new Multi-Touch features, which can easily be adapted.


In zenon 7.20 you can use freely definable frame shapes for movable screens, as the Demo login screen shows.

Frames

Frames can be moved without having a title, a system menu or a min/max button. Also freely definable frame shapes can be used for native interaction. To illustrate how to create such frames, let’s have a look at the login screen of the zenon Supervisor demo project as an example:

Create a new frame with a freely definable frame shape and give it any shape you like. In the demo, a rectangle and a polygon with rounded corners were used. After creating a freely definable shape, position the frame where you initially want it to appear in the Runtime (RT).


In the frame’s property settings you can individually define position settings and movability.

Now, to make the login frame movable, go to the frame’s property settings, activate “always in the foreground” and set the following positioning options: opening size as “frame size”, no minimum limitation, and movability restricted to the display borders. The property “minimum frame margin” defines how many pixels of a frame have to remain on the RT monitor and cannot be pushed outside the screen (note that in the case of free shapes only the bounding rectangle is considered). In the demo project, a margin of 70 px was defined.
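The effect of the “minimum frame margin” property can be sketched as a simple clamping rule. The following is a purely illustrative sketch (plain Python, not zenon code; the function name and signature are hypothetical):

```python
# Illustrative sketch (not zenon code): clamping a movable frame so that
# at least `min_margin` pixels of its bounding rectangle stay on screen.

def clamp_frame_position(x, y, frame_w, frame_h,
                         screen_w, screen_h, min_margin=70):
    """Return (x, y) adjusted so at least min_margin px remain visible.

    For freely definable shapes only the bounding rectangle is
    considered, mirroring zenon's "minimum frame margin" behavior.
    """
    # The frame may leave the screen, but min_margin px must stay visible.
    min_x = min_margin - frame_w          # pushed out to the left
    max_x = screen_w - min_margin         # pushed out to the right
    min_y = min_margin - frame_h
    max_y = screen_h - min_margin
    return (max(min_x, min(x, max_x)),
            max(min_y, min(y, max_y)))

# A 400x300 frame dragged far off a 1920x1080 screen snaps back so
# 70 px of it remain visible.
print(clamp_frame_position(-1000, 500, 400, 300, 1920, 1080))  # (-330, 500)
```

With a margin of 70 px, the frame can be pushed almost entirely off screen in any direction, but never so far that the user loses the grip area needed to drag it back.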

Afterwards, create a screen with this frame. Now comes the “creative” part: fill the screen with elements (rectangles, templates, adapted templates, symbols…), align them and link functions and variables as usual, until you have your desired screen result.

Faceplates

When creating a faceplate screen with different screen containers, such as an Alarm Message List, a Chronological Event List or trend elements, you can now use it as a worldview as well. For instance, this feature becomes interesting if you want an overview of a whole site, with several alarm or trend screen containers within one (touch) screen. The other way round, you can also use a screen container for displaying a worldview, e.g. when you want to display a touchable menu slider.


The configuration of worldviews can also be used for faceplates and screen containers.

 

Graphics Technology Update in zenon

Friday, April 17th, 2015

Ergonomics in HMI applications depends to a considerable extent on the capability of the runtime system to conduct graphical operations in a high-performance manner. The evolution of Multi-Touch functionality and the contribution of natural user interactions particularly require a certain level of graphics performance in order to achieve smooth and appealing behavior. One key parameter here is how fast the system can react to user interaction and, in turn, deliver the necessary responses on the graphical level. A screen refresh rate of around 25-30 frames per second is generally considered sufficient to make a dynamic display appear “fluent”. This will – amongst other parameters – determine the level of user experience you’re able to reach in your application.
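The 25-30 fps guideline translates directly into a per-frame time budget that all graphics work has to fit into. A quick back-of-the-envelope check (plain Python, purely illustrative):

```python
# Illustrative: the per-frame time budget implied by a target refresh rate.
def frame_budget_ms(fps: float) -> float:
    """Milliseconds available per frame at the given refresh rate."""
    return 1000.0 / fps

for fps in (25, 30, 60):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```

At 25-30 fps, roughly 33-40 ms are available per frame; everything the runtime does per refresh, from event handling to rendering, has to fit into that window for the display to appear fluent.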

The use of native means of graphics processing prepares the foundation to handle high-performance HMI requirements with a respective level of responsiveness. The term “native” in that context relates to techniques which are designed for a close interaction with the operating system and which are optimized for use on a respective hardware platform.

Since version 7.00, zenon has provided graphics rendering modes based on DirectX 11 for Windows server and PC operating systems. Besides the option to run graphics processing on the CPU core(s) of the local system – which is available for any setup – it can be offloaded to a dedicated DirectX graphics card. In this way, the system profits from optimized computation of graphics-related operations and frees up local computing resources.

The current zenon version adds support for DirectX 11.1. What appears to be a minor step in the graphics engine’s version offers considerable performance advantages due to optimized utilization in zenon. Specific interaction features, such as movable or resizable frames or more complex worldview display constellations, potentially benefit from this technological update. Hence, against the background of rapidly evolving HMI demands, another step is made to keep things running smoothly.

Smart Interfaces – The future of industrial user interfaces

Friday, January 31st, 2014

As a result of the emergence of new technical (mobile) devices for both visualization and control usage, new operating concepts and thus also visualizations have become necessary. In their private lives, users experience completely new interaction concepts thanks to tablets and smartphones and increasingly expect these in the professional field.

It is necessary to distribute the same information, regardless of device type, on the basis of a uniform central project. In order to achieve this, the trend is moving towards “responsive design”; design that reacts to the device and the user and adapts optimally to the situation.

However, not just the aspects of the device should be taken into consideration here. A project that needs to be specially adapted to each possible device in the production environment can no longer be implemented and maintained adequately. An example is the fragmentation of screen resolutions on Android devices (from a resolution of 240 x 180 to a resolution of 2040 x 1152 – see figure below). Also, current “special solutions”, such as WPF controls, can only be used on Microsoft systems and cannot be adapted for different devices in any way.

[Figure: fragmentation of screen resolutions on Android devices]

Integration relates to more than automatic adaptation, such as for the screen resolution of the device. The goal is seamless design – the integration and automatic adaptation to different hardware and software platforms, users and tasks.

It is not just the visual part of the human-machine interface that plays a specific role, but also the behavior of an application in the background. Examples are context-sensitive navigation, automatic support tools to implement the current task at hand, or the provision of information related to a current piece of data.

“SUNA” is what sometimes gets us into a sweat when faced with this problem. Here, it’s not about the sometimes heated discussions during the planning and optimization of a new version of zenon, but about the four types of technologies that constitute a challenge for the future of human-machine interfaces: Social, Ubiquitous, Natural and Agentive.

Social

Social in the sense of HMI and SCADA applications does not refer to the automatic posting of the last alarm message on Facebook. The ordinary example of a user who uses only one single machine to carry out a task is often a thing of the past. More and more often it is about several users, working together and switching between different stations and machines in order to achieve a common objective. In terms of design, this is a difficult task for the integrator.

Agentive

Agentive algorithms – machine-based artificial intelligence – recognize not just our identity, but also our intentions and possibly even our current emotional state, and can react accordingly.

Ubiquitous

Information on the status of a machine, a production line or an entire factory is part of our daily business, regardless of whether it is accessed directly on the machine, in the control room on the desktop, or on the cell phone in the hand of the service technician. Nowadays users constantly switch between these technological points of contact. Our objective must be to make this as seamless as possible for the user.

Natural User Interactions

NUIs describe a selection of technologies that take more from our body and its abilities than the classic WIMP interface (windows, icons, menus and pointer). Examples of this are haptic user interfaces, gesture-based user interfaces and language-based user interfaces.

Each of these technological aspects is already a powerful tool when used on its own. However, the real added value is created when they are combined in a strong, comprehensive platform such as zenon.

Another challenge is working with several users simultaneously, such as for a joint alarm cause analysis on a Microsoft PixelSense Multi-Touch system (e.g. Samsung SUR40). It is no longer the case of a sole operator in front of a monitor, but a smooth cooperation between all those involved, such as the automation technician, the production planner, the controller and the process technician. In addition to letting several users work simultaneously on the same application – something that conventional Windows applications do not currently support – these systems offer pattern recognition. This allows any desired object to be used, for example as a physical rotary knob, enabling a new type of two-handed operation: the physical object must be consciously placed on the interface first, before any action is executed. At the same time no dedicated login is needed, because an action is only possible with the personal token of the user.

We at COPA-DATA are already facing the future challenges of stationary and mobile HMI and SCADA applications and are continually working to equip zenon as both an engineering tool as well as a (mobile) device application for the future.

zenon 7.10 and Windows 8 Multi-Touch – Programming in VSTA – Part 2

Thursday, July 18th, 2013

In our previous blog entry we talked about the possibilities of custom Multi-Touch programming in VSTA with the help of the zenon API. We focused on the raw WM_POINTER messages. This time we will have a look at gestures in VSTA.

Gestures in VSTA

There is the same pre-filtering concept for gesture events as for the raw touch events. At each dynamic zenon element (e.g. a button) and on each screen, you can select the gestures you are interested in (more specifically, you configure an Interaction Context, which processes the raw touch events in the background and searches for the gestures you selected).

In VSTA you can react to these events by using:

  • ElementGesture
  • PictureGesture

For example:

void DynPics_PictureGesture(zenOn.IDynPicture obDynPicture, object interactionContextOutput)
{
    CELMakeSeperator();
    string CelString = string.Format("VSTA: {0} on {1} ",
        System.Reflection.MethodBase.GetCurrentMethod().Name, obDynPicture.Name);
    Cel().WriteCelString(CelString);
    WriteInteractionContextOutputToCEL(interactionContextOutput);
}

More detailed information about the contents of the interactionContextOutput object can be found in the MSDN documentation.

zenon 7.10 and Windows 8 Multi-Touch – Programming in VSTA – Part 1

Thursday, July 11th, 2013

In the previous blog entries regarding Multi-Touch we talked about the natively integrated features in zenon 7.10. However, the creativity of our customers has no limit and zenon, being an open system, supports the integration of custom solutions e.g. via VSTA.

Fundamentals

The new features Windows 8 brings to Multi-Touch can be used directly via the zenon API (VSTA only, because of 64-bit support). If you take a closer look at what happens in the background when, for example, moving your finger around the screen, you will find that a lot of single events are generated. On the one hand this means that you get a lot of data you need to sort through yourself; on the other hand, a great deal of performance is wasted. When talking about Multi-Touch events we need to distinguish between two types:

  • Raw touch points (WM_POINTER messages)
  • Preprocessed, recognized gestures, e.g. a manipulation
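The value of pre-filtering becomes obvious when you consider how many events a single drag generates. The following is a rough illustration of the idea (plain Python, not zenon code; the message constants mirror the Win32 WM_POINTER* values):

```python
# Illustrative sketch of raw-message pre-filtering (not zenon code).
# The constants mirror the Win32 WM_POINTER* message values.
WM_POINTERUPDATE = 0x0245
WM_POINTERDOWN   = 0x0246
WM_POINTERUP     = 0x0247

def filter_messages(messages, wanted):
    """Keep only the raw touch messages the application subscribed to,
    discarding the (typically very frequent) rest, e.g. move updates."""
    return [m for m in messages if m[0] in wanted]

# A single drag gesture generates one DOWN, a flood of UPDATEs and one UP;
# subscribing only to DOWN/UP leaves just two events to handle.
stream = ([(WM_POINTERDOWN, 1)]
          + [(WM_POINTERUPDATE, 1)] * 500
          + [(WM_POINTERUP, 1)])
print(len(filter_messages(stream, {WM_POINTERDOWN, WM_POINTERUP})))  # 2
```

This is precisely what the per-screen filtering in zenon 7.10 does for you: uninteresting messages are dropped before your VSTA code ever sees them.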

Raw touch points in VSTA

This time we are focusing on the raw touch points. zenon 7.10 now allows you to pre-filter these messages at each screen; so, if you are only interested in receiving PointerDown and PointerUp messages, you don’t have to handle everything else and analyze thousands of events to find the right one. In VSTA you can react to events like:

  • PointerActivate
  • PointerCaptureChanged
  • PointerDeviceChange
  • PointerDeviceInRange
  • PointerDeviceOutOfRange
  • PointerDown
  • PointerEnter
  • PointerHWheel
  • PointerLeave
  • PointerUp
  • PointerUpdate
  • PointerWheel

For example:

void DynPics_PointerDown(zenOn.IDynPicture obDynPicture, zenOn.IElement obElement,
    object vPointerId, bool bNew, bool bInRange, bool bInContact, bool bPrimary,
    bool bFirstButton, bool bSecondButton, bool bThirdButton, bool bFourthButton,
    bool bFifthButton, int lX, int lY)
{
    string CelString = string.Format("VSTA: {0} on {1} ",
        System.Reflection.MethodBase.GetCurrentMethod().Name, obDynPicture.Name);
    string ButtonMatrix = "";
    ButtonMatrix += bFirstButton  ? "[1]" : "[x]";
    ButtonMatrix += bSecondButton ? "[2]" : "[x]";
    ButtonMatrix += bThirdButton  ? "[3]" : "[x]";
    ButtonMatrix += bFourthButton ? "[4]" : "[x]";
    ButtonMatrix += bFifthButton  ? "[5]" : "[x]";

    CelString += string.Format("pointer-id: {0}, is-new: {1}, is-in-range: {2}, " +
        "is-in-contact: {3}, is-primary: {4}, buttons: {5}, [X, Y]: {6},{7}",
        Convert.ToString(vPointerId), bNew, bInRange, bInContact, bPrimary,
        ButtonMatrix, lX, lY);
    Cel().WriteCelString(CelString);
}

For further information on this pointer, you can call GetPointerInfo(vPointerId), which returns a VARIANT.

zenon 7.10 and native Windows 8 Multi-Touch features – Part 3

Wednesday, April 24th, 2013

Last time we went through the native Multi-Touch capabilities in zenon 7.10 for dynamic elements and screens. This time we will focus on the new Multi-Touch features in the Alarm Message List, the Chronological Event List, the Extended Trend Module and the new touch-optimized time filter controls.

Alarm Message List / Chronological Event List

One main goal was to make our most important lists – the Alarm Message List and the Chronological Event List – Multi-Touch capable. You can now navigate and interact in the list directly with your fingers, making extra buttons for navigation etc. unnecessary.

The reactions for manipulation (allowing horizontal and/or vertical scrolling) are configurable, as are the reactions for “tap” and “double tap”:

  • no reaction
  • selection (tap only)
  • execute zenon function
  • acknowledge alarm (double tap only)
  • execute alarm function (double tap only)
  • open help for alarm (double tap only)
  • start/stop list (double tap only)

For the Chronological Event List, the same options for navigating the list and for “tapping” are available as in the Alarm Message List.

Extended Trend Module

The Extended Trend Module now natively supports zooming and scrolling, and lets you set the reactions for “double tap” and “tap and hold”:

  • no reaction
  • execute zenon function
  • zoom to 100% (double tap only; also available as button now)
  • step back (double tap only)

Time Filter

Another goal was to make our time filters more touch-friendly. New controls are now available for the Time Filter, Alarm Message List Filter and Chronological Event List Filter screens, which are highly configurable in terms of Multi-Touch as well as graphically, e.g. with line height.

My next entry will include the options for customized zenon Multi-Touch implementations in VSTA.

 

zenon 7.10 and native Windows 8 Multi-Touch features – Part 2

Monday, April 8th, 2013

Last time we focused on the key concepts behind Windows 8 Multi-Touch – direct manipulation and gestures. In the next two parts of this blog series, we will address the new Multi-Touch features integrated natively into zenon 7.10.

Multi-Touch in zenon 7.10

With the implementation of Windows 8 Multi-Touch into zenon 7.10 we focused on facilitating an out-of-the-box experience without the need for programming. Reactions to gestures can be parameterized directly in the zenon Editor – the integrator can focus on defining interactions instead of programming gesture recognition.

Dynamic Elements

At each zenon dynamic element (e.g. a button) you can now configure the reaction for “tap and hold”:

  • no reaction
  • execute a standard zenon function, e.g. for executing a VSTA macro
  • open the context menu (where available)

Screens

As with the dynamic elements you can also define the reaction for “tap and hold” and for “double tap” at each zenon screen:

  • no reaction
  • execute a standard zenon function, e.g. for executing a VSTA macro
  • show status window

Another new feature is the more detailed configuration of zooming and scrolling in a worldview screen. You can now define the reactions for horizontal and vertical scrolling and zooming separately, giving you the opportunity to easily create custom sliding menu bars.

Next time, we’ll continue with the new Multi-Touch features in the Alarm Message List, the Chronological Event List, the Extended Trend Module and the new touch-optimized time filter controls.

zenon 7.10 and Windows 8 Multi-Touch features

Monday, March 25th, 2013

Introduction

Multi-Touch integration in zenon has already been in use since version 7.00. However, other than some native features like zooming and scrolling in a worldview picture, doing anything with Multi-Touch has been pretty cumbersome for engineers. In this series we focus on the new Multi-Touch concepts of Windows 8 and how these were integrated into our latest product – zenon 7.10. We will start with a quick overview of Microsoft’s concepts for Windows 8, followed by an overview of the Multi-Touch features integrated natively into zenon 7.10. The last blog entry will focus on custom Multi-Touch implementations with VSTA.

Natural User Interfaces

Multi-Touch and Natural User Interfaces follow the key concept of directly manipulating an object. Windows 8 defines manipulation as scrolling or zooming a part of the application, either by dragging the fingers across the screen or by pinching or stretching the fingers to zoom in and out. A gesture is defined as an interaction which causes some reaction from a UI element. During the implementation of Windows 8 Multi-Touch for zenon 7.10 we focused on easing the configuration of Multi-Touch. The engineer no longer has to worry about how to recognize a gesture; he just configures the desired reaction, e.g. executing any zenon function by double-tapping a screen – making it easy to set parameters instead of programming. Next time we will have a look at our new native Multi-Touch features in zenon 7.10.