admin 26 November, 2018

Touch screen applications

Touch Screen and the meaning of Multi-Touch

Nowadays, we frequently see touch screen applications all around us. From pocket games to ATMs, and from service counter applications to information displays, touch screen technology has been widely used and applied. So why is it called a touch screen? The name clearly refers to touching, or making contact with, the display of a device with a finger, hand, or stylus. In theory, a touch screen has two main attributes [1]. First, it enables one to interact with what is displayed directly on the screen, where it is displayed, rather than indirectly with a mouse or touchpad. Second, it lets one do so without requiring any intermediate device, such as a stylus, that needs to be held in the hand. Such displays can be attached to computers or to networked terminals, and they are also used in devices such as personal digital assistants (PDAs), satellite navigation devices, mobile phones, and video game systems. The successful debut of the first commercial touch screen computer, the HP-150 [1], inspired further development of touch screen technology and its applications. The main types of touch screen technology are:

Resistive – using electrically conductive layers
Surface acoustic wave – using ultrasonic waves that pass over the touch screen panel
Capacitive – classified into two types, surface capacitive and projected capacitive
Optical imaging – used for large touch screen installations

There are many ways to build a touch screen, but the key goal is always to recognize one or more fingers touching the screen so that the user can interact effectively with the application. Although touch screen technology patents were filed during the 1970s and 1980s, they have since expired [1]. Touch screen component manufacturing and product design are therefore no longer encumbered by patents, and the manufacture of touch screen-enabled displays has become widespread. In the beginning, touch screen technology started with single-touch; it was later developed into dual-touch and has now evolved into the popular “Multi-Touch”.

The development of Multi-Touch screens made it possible to track more than one finger on the screen, so operations that require more than one finger became possible. These devices also allow multiple users to interact with the touch screen at the same time. Multi-Touch can be described as a set of interaction techniques that allow users to control a graphical interface with more than one finger, at either the application or the system level, on computers, touch screen displays, or mobile phones [2]. A Multi-Touch system typically consists of a touch surface (a wall, overlay, table, etc.) and application software that recognizes multiple simultaneous touch points, as opposed to a single-touch screen, which recognizes only one touch point.
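As a rough illustration of what “recognizing multiple simultaneous touch points” means on the software side, the sketch below tracks every active finger on a touch screen using the standard web TouchEvent API. It is purely illustrative and is not part of the systems described in this article; the element id "touch-surface" is a made-up example.

    // Minimal sketch: tracking multiple simultaneous touch points with the
    // standard web TouchEvent API. Each finger keeps a stable identifier
    // for as long as it stays in contact with the screen.
    const activeTouches = new Map<number, { x: number; y: number }>();
    const surface = document.getElementById("touch-surface")!; // hypothetical element

    surface.addEventListener("touchstart", (e: TouchEvent) => {
      for (const t of Array.from(e.changedTouches)) {
        activeTouches.set(t.identifier, { x: t.clientX, y: t.clientY });
      }
      console.log(`${activeTouches.size} finger(s) on the surface`);
    });

    surface.addEventListener("touchmove", (e: TouchEvent) => {
      e.preventDefault(); // keep the browser from scrolling while tracking
      for (const t of Array.from(e.changedTouches)) {
        activeTouches.set(t.identifier, { x: t.clientX, y: t.clientY });
      }
    });

    surface.addEventListener("touchend", (e: TouchEvent) => {
      for (const t of Array.from(e.changedTouches)) {
        activeTouches.delete(t.identifier);
      }
    });

A single-touch screen, by contrast, would only ever report one such point at a time.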

Research and development of Multi-Touch actually started in 1982, when the University of Toronto developed the first finger-pressure Multi-Touch display [2]. A year later, in 1983, Bell Labs at Murray Hill published a comprehensive discussion of touch screen-based interfaces. In 1984, Bell Labs created a touch screen that could change images with more than one hand. The University of Toronto group therefore stopped its hardware research and specialized in software and interface development, expecting that it would have access to Bell Labs' work. A breakthrough occurred in 1991, when Pierre Wellner published a paper on his multi-touch “Digital Desk”, which supported multi-finger and pinching motions [2]. After that, however, the field gained little wider acceptance or popularity outside of special interest groups and research labs. With the arrival of Apple's groundbreaking product, the iPhone, interest in Multi-Touch technology returned to the stage. The iPhone in particular spawned a wave of interest in multi-touch computing, since it permits greatly increased user interaction on a small scale. The introduction of the Microsoft Surface by Microsoft Corporation in 2007 also attracted a great deal of attention and interest from the public. In recent years, the use of Multi-Touch technology has been expected to become commonplace rapidly and to stand as one of the most innovative interaction techniques.

The evolution of “touch” as human input to computers and other devices

The most fundamental concepts of Multi-Touch technology branch out from the concepts of Human-Computer Interaction (HCI). Controlling everything with your hand or fingers is not as easy as one might expect. A good user interface implementation and the processing time consumed by the application software are the most critical factors and need to be considered as first priorities. Throughout history, people have worked to make HCI techniques, in both hardware and software, more useful and friendly than earlier approaches. So nowadays we see computer monitors, mice, game joysticks, and application software that are all more advanced and better suited to users' requirements and flexibility. Multi-Touch, likewise, has come a long way in research and development in terms of HCI, product design, and technical improvement. Here are some facts and timelines, roughly annotated as a chronology of Multi-Touch and related work.

The beginning: Typing & N-Key Rollover (IBM and other researchers)

It may seem a long way from this to a Multi-Touch screen, because the story of Multi-Touch began with keyboards. They were mechanical devices, hard rather than soft, but they did involve a sort of Multi-Touch. First, there are key combinations such as SHIFT, Control, Fn, or ALT pressed together with other keys; these are the cases where we want Multi-Touch. Second, there are also the cases of unintentional, but inevitable, multiple simultaneous key presses which we want to make proper sense of, the so-called question of n-key rollover (where you can push the next key before releasing the previous one) [3].
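N-key rollover itself is a property of the keyboard hardware, but the software-visible effect, overlapping key presses, can be sketched with the standard browser keyboard events below. This is only an illustrative analogy, not how rollover is actually implemented in a keyboard controller.

    // Illustrative sketch: observing overlapping (simultaneously held) keys.
    const heldKeys = new Set<string>();

    window.addEventListener("keydown", (e: KeyboardEvent) => {
      heldKeys.add(e.code);
      // More than one entry means the next key was pressed before the
      // previous one was released, i.e. a rollover situation.
      if (heldKeys.size > 1) {
        console.log("Held together:", Array.from(heldKeys).join(" + "));
      }
    });

    window.addEventListener("keyup", (e: KeyboardEvent) => {
      heldKeys.delete(e.code);
    });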

Electroacoustic Music: The Early Days of Electronic Touch Sensors (Hugh LeCaine, Don Buchla & Bob Moog)

These were early touch-sensitive control devices that used capacitance sensors to control the sound and music being made. They are better described as touch pads than as touch screens.

1972: PLATO IV Touch Screen Terminal (Computer-based Education Research Laboratory, University of Illinois, Urbana-Champaign)

This was early work done by IBM, by the University of Illinois, and in Ottawa, Canada [4]. All of these systems were single-touch, and none were pressure-sensitive. As well as its use of touch, the PLATO IV terminal was remarkable for its use of real-time random-access audio playback and for the invention of the flat panel plasma display.

1981: Tactile Array Sensor for Robotics (Jack Rebman, Lord Corporation)

A multi-touch sensor designed for robotics to enable sensing of shape, orientation, etc [5].

1982: Flexible Machine Interface (Nimish Mehta, University of Toronto)

Likely the first multi-touch system designed for human input to a computer system [6]. It consisted of a frosted-glass panel whose local optical properties were such that, when the panel was viewed from behind with a camera, a finger touching it appeared as a black spot on a white background, with the size of the spot depending on finger pressure. This, with simple image processing, allowed multi-touch input for picture drawing and similar tasks. At the time, the researchers also discussed the notion of using a projector to define the context both for the camera and for the human viewer.
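To give a feel for the kind of “simple image processing” involved, the sketch below thresholds a grayscale camera frame and counts connected dark regions, using region size as a stand-in for finger pressure. It is an illustrative reconstruction under assumed inputs (a 2D array of brightness values), not the original system's code.

    // Illustrative sketch: counting dark "touch" blobs in a grayscale frame.
    // frame[y][x] holds brightness 0..255; values below `threshold` count as touch.
    type TouchBlob = { pixels: number }; // blob area roughly tracks finger pressure

    function findTouchBlobs(frame: number[][], threshold = 80): TouchBlob[] {
      const h = frame.length;
      const w = frame[0].length;
      const visited = Array.from({ length: h }, () => new Array<boolean>(w).fill(false));
      const blobs: TouchBlob[] = [];

      for (let y = 0; y < h; y++) {
        for (let x = 0; x < w; x++) {
          if (visited[y][x] || frame[y][x] >= threshold) continue;
          // Flood-fill the connected dark region starting at (x, y).
          let pixels = 0;
          const stack: [number, number][] = [[y, x]];
          visited[y][x] = true;
          while (stack.length > 0) {
            const [cy, cx] = stack.pop()!;
            pixels++;
            for (const [dy, dx] of [[1, 0], [-1, 0], [0, 1], [0, -1]]) {
              const ny = cy + dy;
              const nx = cx + dx;
              if (ny >= 0 && ny < h && nx >= 0 && nx < w &&
                  !visited[ny][nx] && frame[ny][nx] < threshold) {
                visited[ny][nx] = true;
                stack.push([ny, nx]);
              }
            }
          }
          blobs.push({ pixels });
        }
      }
      return blobs; // one entry per detected finger; a larger area means a harder press
    }

Each blob corresponds to one finger in contact with the panel, which is all that is needed to drive a simple multi-touch drawing application.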

1983: Video Place / Video Desk (Myron Krueger)

A vision-based system that tracked the hands and allowed multiple fingers, hands, and people to interact using a rich set of gestures. It was implemented in a number of configurations, including table and wall formats.

1985: Multi-Touch Tablet (Input Research Group, University of Toronto)
