Virtual Reality in Today’s Society

Virtual reality is a computer-generated simulation of the real world. This simulation is not static; it responds to the user’s input, whether vocal or tactile, in real time. To achieve this interactivity, the computer must constantly monitor the user’s movements and verbal commands, reacting instantaneously to change the synthetic world the user experiences. [1] By engaging the full range of human sensory experience in this way, virtual reality takes the interactivity achieved in, say, a computer game one stage further. Users of virtual reality can not only see and move objects; they can also touch and feel them. [2] This essay explores the evolution of virtual reality and its many uses in society today, as well as considering its ethical implications.

Burdea and Coiffet comment that the history of virtual reality dates back more than forty years. The Sensorama Simulator, a virtual reality video arcade game, was invented by Morton Heilig in 1962. It could simulate a motorcycle ride through a city, using 3-D effects, seat vibrations, appropriate smells, sounds and wind effects generated by fans. [3] Head-mounted displays were introduced in 1966 by Ivan Sutherland, but were heavy and uncomfortable. In 1985, Michael McGreevy of NASA developed a cheaper and lighter version of the helmet, fitted with mini display screens and sensors to track movement. The sensory glove had been designed in the early 1980s, but it was in 1986 that Jaron Lanier designed a new glove to work with the helmet and create a full virtual reality system. [4] Advancements continued in graphics, and in 1993 virtual reality became the theme of a major Institute of Electrical and Electronics Engineers (IEEE) conference in Seattle, making it clear that virtual reality had entered the mainstream scientific community. [5]

Since the end of the 1980s, new interfaces have communicated three-dimensional images using the head-mounted display (HMD), with video cameras tracking the image of the user in a virtual world where he or she can manipulate objects. More recently there has been a development called the CAVE (Cave Automatic Virtual Environment), in which the user is enclosed in a six-sided environment surrounded by projection screens, viewed through lightweight stereo glasses that give the impression of 3-D. [6] The impression created is one of ‘immersing oneself in the image space, moving and interacting there in “real time”, and intervening creatively’. [7] However, Burdea and Coiffet point out that with the swift advancement of technology, ‘virtual reality today is done mostly without head-mounted displays, by using large projection screens or desktop PCs’, and sensing gloves are now regularly replaced with joysticks. [8]

The world of computer games has become a major area of importance for virtual reality, where the sense of immersion is central to gaming excitement. The creation of interactive virtual worlds has drawn on grand, sweeping cinematic sequences and other techniques from traditional cinema, such as ‘the expressive use of camera angles and depth of field, and dramatic lighting of 3-D computer generated sets to create mood and atmosphere’. [9] Actors could be superimposed over 3-D backgrounds, or, as games became more advanced, synthetic characters were created that moved in real time. [10] This means that the space in which the characters move can now change over time, so that the same space appears different when visited later in the game. These changes enabled designers to integrate the player more deeply into the gaming world cinematically and to create a sense of visual reality.

The immersion experienced when playing a computer game becomes a far more intense and total experience when the player becomes part of the game, that is, physically enters a virtual world. Virtual reality ‘provides the subject with the illusion of being present in a simulated world.’ [11] This virtual world, unlike the purely visual engagement of a computer game, allows for bodily engagement with the synthetic environment. Virtual reality also allows the user to change elements of this simulated world: it gives an added feeling of control. It allows people to experience elements of life without the physical commitments, possible dangers or general inconveniences of a real experience.

Lev Manovich comments that virtual worlds are sometimes put forward as the logical successors of cinema, that they are ‘the key cultural form of the twenty-first century just as cinema was the key cultural form of the twentieth century’. [12] Indeed, Grau and Custance compare virtual reality with film, saying: ‘virtual reality now makes it possible to represent space as dependent on the direction of the observer’s gaze: the viewpoint is no longer static or dynamically linear, as in the film, but theoretically includes an infinite number of possible perspectives.’ [13]

Technically, virtual reality ‘utilises the same framing’ as cinema’s rectangular frame. This kind of frame allows only a partial view of a wider space. The virtual camera, as with a cinema screen, moves around in relation to the viewer in order to reveal different parts of the shot. [14] This framing device is vital to the virtual reality world in that it gives a small shot of a larger world, thereby providing a wholly subjective and personal viewing experience.

While Manovich looks to cinema as a basis for virtual technology, Grau and Custance look to art. They argue that the idea of virtual reality ‘rests firmly on historical art traditions, which belong to a discontinuous movement of seeking illusionary image spaces’. [15] Taking into account the lack of technology further back in history, Grau and Custance believe that ‘the idea stretches back at least as far as classical antiquity and is alive again today in the immersive visualization strategies of virtual reality art.’ [16] Indeed, for Grau and Custance, this basic idea of finding these ‘immersive spaces of illusion’ is threaded through the history of art.

Grau and Custance also point out the lack of natural involvement with the world behind the technological illusion of power and control. They note, with irony, that ‘the adherents of virtual reality … have often reiterated their claim that immersion in virtual reality intensifies their relationship with nature’. [17] Indeed, an experience so reliant on technology and devoid of anything natural can nevertheless bring about this feeling of connection to nature because of its resemblance to the real world.

Manovich too comments on the illusory quality of any ‘natural’ involvement or control. He says that the user is only altering things that are already inside the computer: the data and memory of the virtual world. [18] The realm of virtual reality is driven by the desire for a perfect recreation of the real world, a perfect illusion. The ideal interface seems to be one in which the interface or computer itself is entirely invisible; it seeks to block out the very means by which the virtual world is created, making the user’s existence in the virtual world seem totally ‘natural’. [19]

The experience means that the user is totally isolated from the actual world whilst at the same time given this feeling of total ‘natural’ immersion in a new world, as well as a sense of omnipotence. The user in effect becomes a fictional character of their own creation, doing whatever they like, whenever they like, always with a sense of immortality. There are ethical problems relating to the potential decrease in real physical interaction and normal human relationships, as people may come to prefer their virtual world to their real life. Indeed, in virtual reality, the physical world no longer exists at all, as all ‘real’ action takes place in virtual space. [20] There is another ethical concern: the possibility of children accessing unsuitable experiences in a virtual world, as censorship would be difficult. This is similar to the problem of violence and adult themes in films and on the internet being available to children today. Virtual reality is an area of even greater concern, however, as children would have the opportunity to take part in the action themselves. A further concern is that criminals could practise their crimes in a virtual world before acting in reality.

There are many positive uses for virtual reality today in areas such as medicine, education, entertainment and psychology. For example, virtual reality can provide flight and driving simulation, surgical simulation, architectural design, and treatment of phobias. Flying, driving and surgery can all be practised realistically without the fear of anything going wrong. Virtual reality can also potentially be used in medicine to evaluate a patient and diagnose problems, as well as to aid in operations. Disabled people have the opportunity to join in activities not usually available to them. An architect can use the method to plan a building before starting construction: using virtual reality avoids the need to build several different prototypes. Someone afraid of spiders can meet one in a carefully programmed virtual world to reduce sensitivity over a period of time; indeed, any phobia could be treated using this kind of virtual reality exposure therapy. The field of education is a huge potential area of use for virtual reality; it can even be used to practise sport.

There is another important use for virtual reality that is not related to entertainment or education. Telepresence is an ever-increasing part of the digital and virtual world. It combines three kinds of technology: robotics, telecommunications and virtual reality. With telepresence, ‘the user of a virtual environment, for example, can intervene in the environment via telecommunication and a remote robot and, in the opposite direction, to receive sensory feedback, a sensory experience of a remote event.’ [21]

Manovich calls telepresence a ‘much more radical technology than virtual reality, or computer simulations in general’. [22] Indeed, Manovich explains that with virtual reality, the user controls a simulated world, that is, computer data. In contrast, ‘telepresence allows the subject to control not just the simulation but reality itself’, because it allows the user to ‘manipulate remotely physical reality in real time through its image’. [23] That is, the user’s actions affect what happens at that moment in a separate place, useful for tasks such as, Manovich suggests, ‘repairing a space station’; [24] the technique can also be used in battle to direct missiles. [25]

So, virtual reality operates on two opposing grounds. On the one hand it allows great freedom, as the user feels able to move anywhere through space with the camera; at the same time, virtual reality totally confines the body in its simulated world. Manovich recognises that the physical world is subordinated in this way, saying that virtual reality renders ‘physical space … totally disregarded’. [26] With telepresence, however, the physical world is very much regarded. Indeed, Mark Hansen thinks Manovich’s comment on the lack of physicality overlooks the experience of space in the potential of virtual reality, even if the body is actually confined. [27] Hansen uses the example of telepresence to explain how simulation and space can coincide effectively. With telepresence, physical actions, although limited in the space where the user resides, do have an effect at another location. In this way space has been found and used: if not in the same location as the user, the user’s movements have still had a physical effect somewhere else. [28]

It seems that virtual reality has many uses in society today, from entertainment to medicine, from psychology to architecture. Telepresence is now a powerful and extremely useful part of the virtual and digital world. With the continuing advancement of technology and the many uses virtual reality can have in society, it is important to bear in mind the negative consequences if virtual reality techniques are not closely monitored, especially as they become more widely available. A society always plugged into its private, virtual worlds would not be a positive development for human relationships; children also need to be protected from an environment where anything and everything can appear real and personal to the user. However, as long as we are aware of the potential negative implications, the development of advanced virtual reality has great potential benefits for society.

Sources Used

Burdea, G. C. and Coiffet, P. (2003). Virtual Reality Technology. Chichester: Wiley-IEEE

Grau, O. and Custance, G. (2004). Virtual Art: From Illusion to Immersion. Cambridge: MIT Press

Hansen, M. B. N. (2004). New Philosophy for New Media. Cambridge: MIT Press

Heim, M. (1994). The Metaphysics of Virtual Reality. Oxford: Oxford University Press

Manovich, L. (2002). The Language of New Media. Cambridge: MIT Press

Sherman, W. R. and Craig, A. B. (2003). Understanding Virtual Reality: Interface, Application, and Design. San Francisco: Morgan Kaufmann

http://library.thinkquest.org/26890/virtualrealityt.htm

Video Conferencing: Advantages and Disadvantages

The exponential growth of the knowledge-based society, driven by the equally strong impact of information technology and its various tools, has expanded human intellectual creativity. Information technology has enabled both the analysis and the development of ideas and concepts between individuals with access to a simple computer and a telephone connection. The combination of a computer, a telephone and the services of an Internet Service Provider has allowed users to accomplish targets previously deemed impossible. The synergy of information technology and the people behind the computer has resulted in the accomplishment of goals, in turn providing excellent results for their respective organizations. One such mode of exchanging information is video-conferencing, a development which has further reduced the cost and time needed to take decisions, meet people, interact, learn and teach, even from the comfort of the living room or boardroom. Certainly one of the most informative modes of telecommuting, video-conferencing has emerged as a strong tool for exchanging information, imparting training, and teaching varied courses in both business and academic environments. The following paper will strive to present some of the salient aspects and characteristics of video-conferencing, its uses, advantages and disadvantages, and to analyse it from the perspective of business organizations, with a particular focus on video-conferencing as a means of communication for venue providers and event management organizations.

Our present-day environment is evidence of an era in which time is of the essence, and in the majority of instances of crucial importance. This is true both for the fiercely competitive business environment and for the ever faster pace of the knowledge-based industries. A brief overview of the developments of the last two decades reveals that the global economy has shown a set of trends similar to those witnessed during the era of industrialization some three centuries ago. One can easily observe the gradual transition from industrial economies to the present-day knowledge-based economy, evidenced in practically every sphere of life, including but not limited to business, private and social life. The onset and spread of information technology in its various modes are largely responsible for this significant transition. Today, access to information is no longer the domain of a few groups, regions or individuals, nor can it easily be manipulated; instead, access is now possible through a personal computer, a telephone connection, and the services of an Internet Service Provider. This has transformed information into one of the central assets of a fully developed knowledge-based economy. Those with the latest information in their respective disciplines are assumed to be successful, and this is only possible through the appropriate use of the modern tools of information technology, video-conferencing being one such tool. Such is the need to acquire knowledge that one has to stay several steps ahead of one's nearest competitor simply to survive in the present-day competitive environment. The market dynamics and realities of respective industries practically force individuals and organizations alike to stay abreast and to compete successfully in the face of the allied challenges.
This is only possible by accepting challenges, however intricate and large they may be, and converting them into effective sources of knowledge. Using technology as a conduit for access to this knowledge not only saves significant resources but also exploits time, that crucial factor, to the full. It is this saving of time and resources that has given rise to tools such as video-conferencing, providing an edge to the patterns of doing business and living a successful life. Though marred by a number of drawbacks and disadvantages, video-conferencing has nevertheless emerged as one of the most effective communication tools in the present-day business environment, and it is this mode of modern communication which will comprise the larger part of the following paper.

According to information accessed from the web pages of www.whatis.com, a videoconference is a means of communication between two or more groups of people in separate locations. Generally, a videoconference involves the use of audio, video and ancillary equipment enabling the groups to see, hear and converse with each other from multiple locations. Whether conducted from a boardroom, a classroom or a manufacturing site, video-conferencing enables each party to interact as if they were sitting in the same room. The single most important advantage of video-conferencing has been the enhancement of speed for business processes and operations, just as the use of e-mail and facsimile has speeded up access to information. Some of the major benefits derived from video-conferencing include, but are not limited to, cost savings in travel, accommodation and staff time, and greater and enhanced communication amongst employees at distant locations, and between suppliers and customers. (Video Conferencing UK, 2005)

As briefly outlined in the opening paragraphs, it is access to information and knowledge that has enabled individuals and organizations to stay abreast of their nearest competitors, an aspect that is true for businesses and academia alike. Simply put, a business organization cannot remain competitive if it does not have access to advanced information in its industry; similarly, a teacher cannot impart education or training effectively if he or she falls behind the latest research in the subject. Acknowledging that the present-day era comprises a networked environment, the importance of video-conferencing takes on truly dynamic dimensions. This is all the more true in the face of global events which can have a devastating effect on local and international economies, and over which no individual, organization or country can command any measure of control.

Examples of such global events that have shattered economies, devastated entire countrysides, and left a trail of human misery and loss of property include the tragic events of September 11, the SARS (Severe Acute Respiratory Syndrome) outbreak in South East Asia, and the devastating tsunami whose tidal waves destroyed precious life and property from the islands of the Maldives to the shores of Dar-es-Salaam in the East African country of Tanzania. It is events such as these which make the communication tool of video-conferencing ever more critical in the present-day environment.

The need for information technology tools such as video-conferencing is further precipitated by the diverse nature of societies across the globe, which gives rise to political, economic and social risks, the threat of global disease, and terrorism, including bio-terrorism, any of which poses a significant challenge not only to the productivity and economy of a nation, but to individuals and organizations across the globe as well. Just as advances in medical research have triggered a revolution in the treatment and care of a variety of diseases, the revolution in information technology has accomplished similar results, collecting and providing crucial data and information from every corner of the globe for the general benefit of its populations. Information technology tools such as video-conferencing have thus made possible better productivity and enhanced performance in our organizations, allowing populations to take preventive and corrective action in the face of emergencies and crisis situations, or to raise production levels and launch new and better products in the face of severe competition. Video-conferencing thus aids the accomplishment of performance excellence and provides an advance information portal to ward off threats of disease, the spread of viruses, and incoming natural calamities, including storms and cyclones such as those witnessed in the tsunami of December 2004. It is thus essential for practically all businesses, academic institutions, government agencies and general populations to develop multi-cultural, technology-supported communication systems so that they are better able to address such contingencies, engaging information technology tools, including video-conferencing, to accomplish this. (Andersen, 2004)

Though the above sections have briefly outlined the growing importance of video-conferencing as a tool of information technology, the following review of articles is a further attempt to provide evidence in this respect. The first article is titled “Online In the Outback: The Use of Videoconferencing by Australian Aborigines”, authored by Mark Hodges and published in the April 1996 issue of “Technology Review”.

Upon reading the article by Mark Hodges, it is evident that while the use of video-conferencing remained a remote idea and its application under-utilized in countries such as the United States of America and in Europe, the Warlpiri aborigines of the Tanami region of Australia’s Northern Territory have been using this technology effectively since 1993. The video-conferencing network, named the ‘Tanami Network’ after the region, links four Warlpiri settlements with each other and with the major Australian cities of Sydney, Darwin, and Alice Springs.

The use of video-conferencing has proved such a successful venture that these aborigines are able to communicate with, and gain vital information from, a number of government service providers located in those cities. At the same time, video-conferencing has given the Warlpiri access to customers and business organizations for Warlpiri arts and crafts, and established links with other Australian aborigines and with indigenous populations in other countries of the world.

Also used for consultations amongst aborigine leaders to arrive at important decisions concerning traditional ceremonies and community issues, video-conferencing has successfully been expanded to such applications as educational programmes, including adult and secondary education, teacher training, legal assistance, social security, and remote health care.

In essence, the Tanami Network has provided these Australian aborigines with an excellent portal for enhancing the quality of their family and community life. Perhaps the single most important advantage gained from video-conferencing technology by the Australian aborigines has been overcoming the lack of communication within the close circle of family and friends, which even today is threatened by the alarming influence of western culture in Australia as well as by the geographic isolation of these fragile communities across the Australian continent.

Thus, video-conferencing has been successfully used in education, ceremonial functions, decision-making, access to health care, the promotion of aborigine artifacts, arts and culture, and access to businesses located in urban areas of Australia as well as in places as far away as London and the United States of America. The link created by video-conferencing with aborigines living in other parts of the world is yet another major accomplishment of this technology: it has resulted in the creation of a close network with the Saami of Scandinavia, the Inupiat of Alaska, the Inuit of Canada, and the Little Red Cree Nation of the province of Alberta in Canada.

A similar video-conferencing network, also in Australia, provided aborigine students of New South Wales with the opportunity to continue secondary education. Linking four schools situated in remote locations, the network allows students to finish the final two years of their education, the alternatives being to drop out of school or to take the more expensive option of joining a boarding school located between 200 and 400 kilometres away. In addition to this crucial opportunity to continue education, the video-conferencing technology also provides these students with topics and subjects otherwise not available within the confines of the aborigine community. (Hodges, 1996; Fischer, 1992; Munn, 1973; Young, 1995)

The above sections have outlined some of the salient features and uses of video-conferencing in the present-day environment, and touched upon some of the situations in which video-conferencing, as a tool of information technology, can save precious lives and property. The following section comprises a brief overview of the development of video-conferencing, particularly over the last five years, and its introduction as an important tool for exchanging information over the last few decades.

A brief look at the development of information technology over the last three decades shows that video-conferencing emerged as one of the most viable forms of communication compared with the standard telephone originally created by Alexander Graham Bell. Early impressions of video-conferencing were that it was expensive; that it did not portray images as required; that it might not work due to inadequate bandwidth or the unavailability of a suitable phone connection; that its ancillary equipment, such as monitors and the network of cords and wires, was difficult to set up; or even, as a simple excuse, that people did not like the way they appeared on a monitor screen, and the list goes on.

Yet all these and other excuses are now history, as the last five years have witnessed tremendous growth and the development of an entirely new set of equipment, together with relevant advances in telecommunication technology. This has made video-conferencing not only cost-effective; the hardware and software now in use are also fairly easy to operate, with a minimum of training required. This has fulfilled the two most important demands of business circles across the globe: first, video-conferencing has brought a significant reduction in travel expenses, and secondly, it has made communication between people scattered across continents fairly simple and within the grasp of general populations and communities.

In fact, studies carried out by Wainhouse Research have noted that since the arrival of easy-to-use software, cost-effective hardware and ready access to telephone lines in the last two years, there has been steady growth of approximately 30 percent in annual revenues across the video-conferencing industry.

The availability of equipment such as the webcam is yet another evolution, turning a simple desktop computer into a ‘digital-media’ hub, moving traditional video-conferencing technology into a new spectrum, and providing practically everyone who has a desktop computer, a telephone line and a good Internet connection with modern video-conferencing technology.

The last five years have also witnessed the introduction of Internet Protocol (IP) systems alongside networks based on the Integrated Services Digital Network (ISDN), even though the latter still dominate the majority of the videoconference industry across the globe. Studies carried out by Frost & Sullivan on the use of the Internet noted that more than 95 percent of videoconferences used ISDN networks; the same study also noted that 20 percent of all video-conferencing by groups and organizations was done over Internet Protocol, and that more than 92 percent of personal video-conferencing was IP-based.

A brief comparison between IP-based and ISDN-based networks for video-conferencing shows that IP-based networking is more economical, provides for a better exchange of information and data, offers easy integration of video-conferencing with desktop computers, and allows the video-conferencing network to be better managed. The same study also shows that by next year, the differences between ISDN-based and IP-based networks for video-conferencing will be practically eliminated.

Another major development in the video-conferencing industry is the growing demand among organizations to manage video-conferencing at their own premises using their own staff. Employees in information technology departments now handle this in addition to responsibilities they already hold, such as data storage and e-mail management. The addition of video-conferencing management to traditional networking functions is indeed a major shift in the industry. The new trend of using desktop computers as hubs for video-conferencing is also a source of worry for companies providing specialist software and equipment for the video-conferencing industry. Organizations worth mentioning that are involved in products and services for the video-conferencing industry include Avaya, Cisco, Microsoft, and Nortel Networks.

With the desktop computer already in use as a hub for video-conferencing, the industry is constantly producing new developments and technologies in search of ever better quality in both the audio and the video images transmitted over the network.

Among the modern tools introduced is the videophone, a product launched by Motorola/WorldGate Communications, which transmits full-motion video images with excellent audio. It requires a high-speed Internet connection, yet in appearance it is simply a cellular (mobile) phone.

The LCD-integrated display is another modern communication tool: an advanced combination of integrated video-conferencing codecs, cameras, microphones and speakers, all installed within the desktop display. Three major manufacturers, namely Polycom, Sony and Tandberg, have each launched products with these characteristics. Sony's model PCS-TL50 perhaps stands out as the most advanced, as it can serve as an ordinary desktop computer display and be easily switched over to a video-conference monitor.

Another development is software-based video-conferencing. Polycom's desktop model PVX is one example of this new technology, which requires only a USB webcam, a desktop computer, and software from one of the vendors in the video-conferencing industry. The significant feature of software-based video-conferencing is that it offers high-resolution pictures and high-quality audio. Polycom's PVX offers video at 30 frames per second and sound at 14 kHz, making it one of the best-performing information technology tools in video-conferencing. (Regenold, 2005)

As reiterated in the sections above, video-conferencing has proved its worth through its tremendous potential to reach anywhere, at any time. It eliminates the need for physical presence when imparting training or education, or simply exchanging information with employees of the same organization. Video-conferencing is widely applied in education and professional training, and it is also used in vital meetings among board members of organizations located far apart across the globe.

Though professional training and corporate use in business organizations are said to be the most important applications of video-conferencing, it is in education that its application has proved most beneficial. As the earlier case studies describe, whether Aboriginal communities in Australia receiving feedback and information from locations as distant as London and the United States of America, or students receiving education across the vast territories of the Australian continent, video-conferencing has truly added new dimensions to the discipline of education.

One may note that although video-conferencing has been used in education for a number of years, its combination with online education has added significant value to the discipline. Together, the two technologies have not only improved the quality of education, since visual cues and body language are conveyed through video-conferencing, but have also made expert teachers available without the need to bring them in physically. Both time and place have thus been made independent, with a significant reduction in the travel costs that would otherwise be required to move experts from one location to another. (Reed & Woodruff, 1995; Willis, 1996)

From the above it is evident that video-conferencing and online education, when combined, offer an excellent means of imparting education without many of the obstacles that arise in the absence of these technologies. However, numerous studies provide significant evidence that video-conferencing, even when combined with online education, has its own set of limitations, and these limitations are perhaps the reason it has not been a universal success.

One such limitation, and perhaps the greatest obstacle, is the lack of interaction among the participants of a video-conference. Sometimes described as a "talking heads" format, this way of imparting education and training loses its viability in the absence of true interaction, or when participants are not encouraged to take an active part in the programme. In this context, one may observe that even a face-to-face presentation of 50 minutes is a tiring experience for participants, and to sit through such a lecture over video-conference is practically impossible.

As a number of studies show, a one-sided lecture remains productive, with the majority of participants listening actively, for a maximum of only about 20 minutes. Beyond that point, an atmosphere of drowsiness can be observed among the participants. It is for this reason that video-conferencing, even with the assistance of online technology, has not been a favourite means of imparting education or training.

There are, however, two methods for addressing dilemmas such as the lack of interaction among participants. The first is the pedagogical approach; the second is the effective use of technological aids.

The pedagogical approach to the lack of interaction rests on three basic principles, each of which provides avenues for active participation.

The first is breaking the ice: creating an atmosphere that motivates participants to take an active part in the ongoing lecture while they are in a video-conference, and that helps them overcome feelings of self-consciousness.

Second, the shorter and more focused a lecture is, the better the outcome, both in participant interaction and in the transfer of knowledge. One way to keep presentations short is to provide a break every 20 minutes and engage the participants in some form of activity.

The third principle, and perhaps the most important, is obliging participants to get involved in the interaction rather than leaving it to them to decide whether or not to participate. This also serves to break the ice, and to break the lecture or training session into a number of segments, each supported by a separate participant activity. Participants can be engaged by involving them in debates between experts of the same discipline, through role-playing, or by putting controversial questions to them so that they can offer a variety of answers, instead of asking questions that have only one answer. This third principle also implies that interaction has to be planned before the actual video-conference session; it cannot simply be improvised during the session. Though this way of inviting and engaging participants is truly effective in delivering a successful lecture or training programme, whether professional or educational, its single largest drawback is that it can only be practised in a live presentation or video-conference.

The technological approach to the failure to participate actively in a video-conference is the use of recorded messages or training programmes. In this way, participants can access the educational or training material at their own convenience, normally over the Internet. (Shearer, 2003; Kunz, 2000)

Among the advantages of video-conferencing, it allows for the utilization of existing and proven technologies.

It requires comparatively little training.

Video-conferencing can be used in a number of settings, environments, and configurations.

It is one of the most practical tools for creating a direct link, both audio and visual, among the participants.

The operating costs are comparatively low, though they depend on the distance and the number of sites.

Taking the case of an interview of a candidate by a committee of officials within an organization (such as interviewing a candidate for a faculty position in an academic institution) shows that the advantages of video-conferencing far outweigh the disadvantages. The convenience of the applicant comes first, followed by a significant reduction in travel costs and in the time taken away from primary responsibilities. There is also the additional advantage of videotaping the entire interview, for later screening and for those officials who could not be present.

One of the most profound and proven advantages of video-conferencing has been observed in the teaching and learning environment of academic institutions. With the exponential growth of online forms of education, video-conferencing has given new dimensions to teaching and learning. Although specialized equipment and personnel can be helpful, the basic requirements are only an Internet service provider, a laptop or desktop computer, and a web camera.

Video-conferencing has also found tremendous favour among teachers and pupils for one-to-one teaching, and for communication with small groups of students in distant locations. This has been particularly true since the advent of the Internet as a means of direct communication. The same application has proved equally advantageous for business communications, both for long-distance meetings and for one-to-one contact with employees in distant branches of an organization.

Though less widely used, ISDN conferencing is an advanced form of video-conferencing that provides significantly better quality in both audio and video. Its principal use is in learning and teaching environments where there is a need to "ask the expert"; it is in calling upon external experts in far-off locations that ISDN video-conferencing is best applied. Another advantage of this form of video-conferencing is its ability to support an entire group of professionals or students and involve them in the teaching and learning environment through direct interaction.

One disadvantage of video-conferencing is the initial establishment cost, which can be high compared with traditional modes of meeting.

Video-conferencing is still considered an evolving technology, so standards for its usage are yet to be fully developed.

One of the major restraining factors and a disadvantage of video-conferencing is the inadequate infrastructure of local telephone networks, which are among its prime requisites.

Expanding video-conferencing facilities and locations requires substantial financing, so its utility remains limited.

The operational costs of video-conferencing also serve as an impediment.

Taking the same example of a candidate interviewed by a team of officials, there are also disadvantages: potential technical difficulties with the software, the hardware, or the network itself. Though these can be tested before the actual event, there is always the possibility of an unexpected technical problem emerging before or even during the video-conference.

A major impediment in video-conferencing is the lack of personal interaction, a factor often regarded as an important feature of any meeting, interview or feedback session. A prime example is the handshake that traditionally concludes a business meeting or a successful interview.

There is also the matter of eye contact, which remains absent during a video-conference, even though it serves as an important cue in assessing an individual, such as an applicant during an interview.

Another disadvantage is the absence of trained support personnel, which creates a host of problems for participants unfamiliar with the video-conferencing equipment and environment; in such cases the video-conference makes matters worse instead of facilitating participation.

The disadvantages of the ISDN form of video-conferencing are the relatively high installation, rental and call charges. In addition, the specific equipment required to support ISDN video-conferencing is also costly, and data collaboration over ISDN is difficult to understand and to use, which is a further disadvantage.

Conclusion

The above paper strives to present the topic of video-conferencing from a number of perspectives, and provides evidence of the popularity of one of the most advanced forms of communication prevalent today across various industries. Whether in academia, business organizations, professional training, or government offices, video-conferencing has proved its worth.

Tree Topology: Advantages and Disadvantages

INTRODUCTION

Network Topology is a systematic layout of nodes over a network. This layout also determines the manner in which information is exchanged within the network.

Tree topology is a combination of the bus and the star topologies. It allows users to have many servers on the network. Tree topology follows a hierarchical pattern in which each level is connected to the next higher level in a symmetrical pattern, linking multiple star networks together over a bus backbone. Tree topology is best suited to large networks; on a small network it is a waste of cable.
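As a rough sketch (the node names and layout here are illustrative, not taken from any cited source), a tree topology of this kind can be modelled as a bus backbone of hubs, each hub carrying a star of attached computers:

```python
from collections import deque

def build_tree_topology(num_hubs, computers_per_hub):
    """Model a tree topology: a bus backbone of hubs, each hub
    the centre of a star of attached computers."""
    adj = {}  # node -> set of directly connected nodes

    def link(a, b):
        adj.setdefault(a, set()).add(b)
        adj.setdefault(b, set()).add(a)

    hubs = [f"hub{i}" for i in range(num_hubs)]
    for a, b in zip(hubs, hubs[1:]):      # bus backbone between hubs
        link(a, b)
    for h in hubs:                        # star around each hub
        for c in range(computers_per_hub):
            link(h, f"{h}-pc{c}")
    return adj

def reachable(adj, start):
    """Breadth-first search: every node the start node can talk to."""
    seen, queue = {start}, deque([start])
    while queue:
        node = queue.popleft()
        for nbr in adj[node]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

net = build_tree_topology(num_hubs=3, computers_per_hub=2)
# 3 hubs + 6 computers: every node can reach every other
# through the hierarchy.
print(len(reachable(net, "hub0-pc0")))  # -> 9
```

The hierarchy is visible in the model: a leaf computer reaches the rest of the network only through its hub and the backbone.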

Tree topology has some features such as:

Three levels of hierarchy: a tree topology has three levels of hierarchy, which work together based on the root of the network.

Two types of topology combined: tree topology is the combination of the star and the bus topology.

There are some considerations when choosing a topology:

Money: the user should consider whether the topology is costly or not.

Length of the cable needed.

Type of cable to be used in the topology.

BACKGROUND STUDY

Merits are the advantages or benefits of a topology; demerits are its disadvantages or limitations.

MERITS OF TREE TOPOLOGY

HIGHLY FLEXIBLE: In a tree topology, computers can be added simply by adding a hub to the network.

CENTRALISED MONITORING: It allows users to control and manage a large network easily, and the tree topology is also easy to reconfigure.

COMPUTERS HAVE ACCESS: Because a tree topology scales to a large network, all computers retain good access to the network.

POINT-TO-POINT CONNECTION: In a tree topology, each computer is connected to a hub, and each part of the network is connected to the main cable.

Tree topology is supported by many hardware and software vendors.

In a tree topology it is easy to add a computer, simply by extending the cabling.

DEMERITS OF TREE TOPOLOGY

3.2.1 SINGLE POINT OF FAILURE: In a tree topology, if the backbone of the network breaks, the two parts of the network can no longer communicate with each other, although each part continues to communicate within itself.
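This single point of failure can be illustrated with a small sketch (the node names are hypothetical): cutting the one backbone link partitions the network, yet each star segment keeps working internally:

```python
from collections import deque

# A small tree topology: two star segments (hubs A and B with their
# computers) joined by a single backbone link A-B.
adj = {
    "A": {"B", "pc1", "pc2"}, "B": {"A", "pc3", "pc4"},
    "pc1": {"A"}, "pc2": {"A"}, "pc3": {"B"}, "pc4": {"B"},
}

def reachable(adj, start):
    """Breadth-first search from one node."""
    seen, queue = {start}, deque([start])
    while queue:
        for nbr in adj[queue.popleft()]:
            if nbr not in seen:
                seen.add(nbr)
                queue.append(nbr)
    return seen

print(len(reachable(adj, "pc1")))  # whole network: 6 nodes

# Cut the backbone link: the network partitions...
adj["A"].discard("B")
adj["B"].discard("A")
# ...but each star segment keeps working on its own.
print(sorted(reachable(adj, "pc1")))  # -> ['A', 'pc1', 'pc2']
print(sorted(reachable(adj, "pc3")))  # -> ['B', 'pc3', 'pc4']
```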

3.2.2 DIFFICULT TO CONFIGURE: Tree topology is difficult to configure because it is a large topology, and wiring the network is also difficult.

3.2.3 In a tree topology, the length of the network is limited by the type of cable used.

3.3 USAGE OF TREE TOPOLOGY

According to www.google.com, tree topology has several uses:

3.3.1 It makes it easy to identify a system on the network, and to connect to a larger network.

3.3.2 To share information across a larger network.

3.3.3 Tree topology allows the users to have many servers on the network.

3.3.4 Tree topology reduces network traffic.

CONCLUSION

Tree topology is the combination of the star and the bus topology, and is best used on larger networks. Its advantages include high flexibility, centralized monitoring and point-to-point connection; its disadvantages are that it is difficult to configure and has a single point of failure. Tree topology is used to identify systems on the network, to share information across the network, and to allow users to have many servers on the network. It is considered a strong choice because signals transmitted by the root node are received by all the computers on the network at the same time.

APPENDIX

The figure below shows a structure of a tree topology.

The figure indicates that if the main cable breaks, the two halves of the network can no longer communicate with each other, although each part continues to communicate within itself; and if a hub breaks, only the part of the network behind that hub is affected.

REFERENCE

Comer, D. E. (2006). Computer Networks and Internets. Department of Computer Science, Purdue University, West Lafayette.

Sheldon, T. (ed.) (2001). Encyclopedia of Networking and Telecommunications. Tata McGraw-Hill.

http://www.computerhope.com [accessed on the 10 July 2009 at 1030hrs].

http://www.google.com [accessed on the 13 July 2009 at 1234hrs]

http://www.novell.com [accessed on the 10 July 2009 at 1200hrs]

Touch screen applications

Touch Screen and the meaning of Multi-Touch

Nowadays we frequently see touch screen applications all around us. From pocket games to ATMs, and from service-counter applications to information displays, touch screen technology has been widely adopted. Why is it called a touch screen? Because the user touches the display of the device directly, with a finger, a hand or a stylus. In theory, a touch screen has two main attributes [1]. First, it enables one to interact with what is displayed directly on the screen, where it is displayed, rather than indirectly with a mouse or touchpad. Second, it lets one do so without requiring any intermediate device, such as a stylus that needs to be held in the hand. Such displays can be attached to computers or to network terminals, and are also used in devices such as the personal digital assistant (PDA), satellite navigation units, mobile phones and video games. The promising debut of the first commercial touch screen computer, the HP-150 [1], inspired further development of touch screen technology and its applications. The main types of touch screen technology are:

Resistive – using electrically conductive layers;
Surface acoustic wave – using ultrasonic waves that pass over the touch screen panel;
Capacitive – classified into two types, surface capacitive and projected capacitive; and
Optical imaging – for large touch screen installations.

There are many ways to build a touch screen. The key goal is to recognize one or more fingers touching the screen in order to interact effectively with the commands of the appropriate application. Although touch screen patents were filed during the 1970s and 1980s, they expired within a relatively short time [1]. Touch screen component manufacturing and product design are therefore no longer encumbered by patents, and the manufacture of touch-enabled displays has become widespread. Touch screen technology started with single-touch; it was later developed into dual-touch, and then into what is now popularly called "Multi-Touch".

The development of Multi-Touch screens made it possible to track more than one finger on the screen, so that operations requiring more than one finger became possible. These devices also allow multiple users to interact with the touch screen simultaneously. Multi-Touch can be explained as a set of interaction techniques that allow users to control the graphical interface with more than one finger, at either the application or the system level, on computers, touch screen displays or mobile phones [2]. It consists of a touch surface (a wall, overlay, table, etc.) and application software that recognizes multiple simultaneous touch points, as opposed to a single-touch screen, which recognizes only one touch point.
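As an illustrative sketch only (the class and method names here are invented for this example, not drawn from any real touch API), the essential bookkeeping behind multi-touch is a table of simultaneously tracked touch points, each with its own identifier:

```python
class MultiTouchTracker:
    """Minimal sketch of multi-touch state: each finger gets an id on
    touch-down, is tracked while it moves, and is dropped on touch-up.
    (Event names and fields are illustrative, not a real API.)"""

    def __init__(self):
        self.points = {}        # touch id -> (x, y)
        self.next_id = 0

    def touch_down(self, x, y):
        tid = self.next_id      # assign a fresh id to the new finger
        self.next_id += 1
        self.points[tid] = (x, y)
        return tid              # caller uses the id for later events

    def touch_move(self, tid, x, y):
        self.points[tid] = (x, y)

    def touch_up(self, tid):
        del self.points[tid]

tracker = MultiTouchTracker()
a = tracker.touch_down(10, 10)   # first finger
b = tracker.touch_down(50, 50)   # second finger, tracked simultaneously
tracker.touch_move(b, 55, 40)
print(len(tracker.points))       # -> 2  (both fingers down at once)
tracker.touch_up(a)
print(tracker.points)            # -> {1: (55, 40)}
```

A single-touch screen, by contrast, keeps at most one entry in this table, which is why gestures such as pinching are impossible on it.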

Research and development of Multi-Touch began in 1982, when the University of Toronto built the first finger-pressure Multi-Touch display [2]. In 1983, Bell Labs at Murray Hill published a comprehensive discussion of touch-screen-based interfaces. In 1984, Bell Labs created a touch screen that could change images with more than one hand. The University of Toronto then stopped its hardware research and specialized in software and interface development, expecting that it would have access to Bell Labs' work. A breakthrough occurred in 1991, when Pierre Wellner published a paper on his multi-touch "Digital Desk", which supported multi-finger and pinching motions [2]. After that, however, the field found little wider acceptance or popularity outside special-interest groups and research labs. With the arrival of Apple's revolutionary product, the iPhone, interest in Multi-Touch technology returned to the stage; the iPhone in particular has spawned a wave of interest in multi-touch computing, since it permits greatly increased user interaction at a small scale. The introduction of Microsoft Surface by Microsoft Corporation in 2007 also attracted much public attention. In recent years the use of Multi-Touch technology has been expected to become commonplace rapidly, and it stands as one of the most innovative techniques.

The evolution of human input “touch” to computer and other devices

The fundamental concepts of Multi-Touch technology branch out from the concepts of Human-Computer Interaction (HCI). Controlling everything with one's hand or fingers is not as easy as one might expect: good user-interface design, and the processing time consumed by the application software, are the most critical considerations and need to be treated as first priorities. Throughout its history, HCI research has sought techniques, in both hardware and software, that are more useful and friendly than previous ones. This is why we now see successive versions of computer monitors, mice, game joysticks and application software, all more advanced and better suited to users' requirements. Multi-Touch, likewise, has come a long way in research and development with respect to HCI, product design and technical improvement. Here are some facts and milestones, roughly annotated as a chronology of Multi-Touch and related work.

The beginning: Typing & N-Key Rollover (IBM and other researchers)

It may seem a long way from keyboards to Multi-Touch screens, but the story of Multi-Touch begins there. Keyboards were mechanical devices, with hard rather than soft keys, yet they did involve a sort of Multi-Touch. First, there are combinations such as SHIFT, Control, Fn or ALT pressed together with other keys: these are cases where we want multi-touch. Second, there are the unintentional but inevitable multiple simultaneous key presses that must be made proper sense of, the so-called question of n-key rollover (where you can push the next key before releasing the previous one) [3].
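The n-key rollover idea can be sketched in a few lines (a simplified model, not real keyboard firmware): the state is simply the set of keys currently held down, updated on every press and release:

```python
# Sketch of n-key rollover: the keyboard must report every key that is
# currently held down, so pressing the next key before releasing the
# previous one is still handled correctly.
pressed = set()

def key_event(key, is_down):
    """Update the set of currently held keys and report the chord."""
    if is_down:
        pressed.add(key)
    else:
        pressed.discard(key)
    return sorted(pressed)

print(key_event("SHIFT", True))   # -> ['SHIFT']
print(key_event("a", True))       # -> ['SHIFT', 'a']  two keys at once
print(key_event("b", True))       # rollover: 'b' down before 'a' is up
print(key_event("a", False))      # -> ['SHIFT', 'b']
```

Because the state is a set rather than a single "current key", any number of simultaneous presses is representable, which is exactly the multi-touch-like property the keyboard needs.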

Electro acoustic Music: The Early Days of Electronic Touch Sensors (Hugh LeCaine, Don Buchla & Bob Moog)

These were early touch-sensitive control devices that used capacitance sensors to control the sound and music being made. They are better described as touch pads than as touch screens.

1972: PLATO IV Touch Screen Terminal (Computer-based Education Research Laboratory, University of Illinois, Urbana-Champaign)

This was early work done by IBM, the University of Illinois, and Ottawa, Canada [4]. All of it was single-touch, and none of it was pressure-sensitive. As well as its use of touch, the terminal was remarkable for its real-time random-access audio playback and for the invention of the flat-panel plasma display.

1981: Tactile Array Sensor for Robotics (Jack Rebman, Lord Corporation)

A multi-touch sensor designed for robotics to enable sensing of shape, orientation, etc [5].

1982: Flexible Machine Interface (Nimish Mehta, University of Toronto)

This was the first multi-touch system known to have been designed for human input to a computer [6]. It consisted of a frosted-glass panel whose local optical properties were such that, when it was viewed from behind with a camera, a black spot whose size depended on finger pressure appeared on a white background. Combined with simple image processing, this allowed multi-touch input for picture drawing and similar tasks. At the time, the researchers also discussed the notion of a projector for defining the context, both for the camera and for the human viewer.
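The image-processing step described above can be sketched roughly (a toy model, not the original system): threshold the camera frame and flood-fill the connected spots, with each blob standing for one finger and the blob's size standing in for pressure:

```python
def find_blobs(image, threshold):
    """Find connected 'finger' spots in a grey-level frame: a sketch of
    the image processing behind camera-based multi-touch. Each blob is
    one finger; blob size stands in for pressure."""
    rows, cols = len(image), len(image[0])
    seen, blobs = set(), []
    for r in range(rows):
        for c in range(cols):
            if (r, c) in seen or image[r][c] < threshold:
                continue
            # flood-fill one blob of above-threshold pixels
            stack, size = [(r, c)], 0
            seen.add((r, c))
            while stack:
                y, x = stack.pop()
                size += 1
                for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ny, nx = y + dy, x + dx
                    if (0 <= ny < rows and 0 <= nx < cols
                            and (ny, nx) not in seen
                            and image[ny][nx] >= threshold):
                        seen.add((ny, nx))
                        stack.append((ny, nx))
            blobs.append(size)
    return blobs

# Two touches: a light press (1 pixel) and a firmer one (4 pixels).
frame = [
    [0, 9, 0, 0, 0],
    [0, 0, 0, 9, 9],
    [0, 0, 0, 9, 9],
]
print(find_blobs(frame, threshold=5))  # -> [1, 4]
```

Counting the blobs gives the number of fingers, which is all that is needed to turn a single camera image into multi-touch input.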

1983: Video Place / Video Desk (Myron Krueger)

A vision-based system that tracked the hands and enabled multiple fingers, hands and people to interact using a rich set of gestures. It could be implemented in a number of configurations, including table and wall.

1985: Multi-Touch Tablet (Input Research Group, University of Toronto)

The negative effects of technology

Technology is everywhere. It is a tool that has certainly changed the world and how it operates. Many people today are familiar with technology and its use; it has become extremely important in many aspects of our lives, has evolved over the past decades, and has made our lives simpler, easier, more convenient and more comfortable. This technological development, and the human capability behind it, can have a massive impact on how the world operates. Technology can play both positive and negative roles, depending on how we employ it: if it is invested in and used in useful, positive ways, it may be a good influence, whereas if it is used in negative ways it will probably be a harmful one. In my opinion, technology has had a great effect on today's lifestyle; on the other hand, many people fail to consider the negative effects associated with its use. This essay will focus on the extent of the negative and positive influence of technology on some areas of human life.

HT Media Ltd (2014) argues that while technology certainly provides almost unlimited benefits, its negative influence on personal relationships should be examined methodically, since it can take a long time to recognize the problem; we should therefore re-examine how technology affects our lifestyle. Apart from this, it sometimes makes us restless and confused. These harmful impacts can produce serious problems that we must deal with. Thus it may be observed that, although the accessibility of technology may reduce the distance between us and support the growth of social relationships, we may come to need ever more technology simply to keep and strengthen our personal relationships.

It could also be said that personal relationships have changed significantly. Many people have become used to intelligent, modern devices such as computers, laptops and phones. They browse the Internet and use the most common social networking applications, such as Twitter, Skype, Facebook, WeChat and WhatsApp. Communication technology and social sites have certainly made interaction with one another smoother and easier; but they have also separated people from one another, because they reduce the need for face-to-face conversation. As a result, it is fair to say that the impacts of social networking sites are very obvious, not only on our personal relationships but also on many other parts of our routine, such as privacy, freedom, personal independence and education (HT Media Ltd, 2014).

The relationship between students and their families and friends can have a massive effect not just on health but also on education. In recent years many people have been spending a large part of their day in front of screens and electronic devices, which can lead to obesity and ultimately poses a great threat to health, both mental and physical. Families can support students in achieving their study goals and help them cope with the harmful impacts of stressful life events; yet some students live alone, which can make them more socially isolated and more reliant on modern electronic devices for engagement and social support. Students therefore use the Internet to communicate frequently with friends and family, with e-mails and text messages being their favourite instruments (Weber, 2003).

Certainly, some technological developments can cause people to become distracted, over-anxious and gradually withdrawn. Many people may be entangled in numerous online communities through today's technologies, yet the quality of these relationships can leave them feeling qualitatively empty. Clearly, technology has had a profound impact on what it means to be social (Robert, 2014). Figure 1 shows the communities, social networking sites and communication tools used by students. This study of students and technology observed that 97 percent of the students surveyed used social networking sites to stay in touch with their friends. Technology may therefore have strongly affected students' personal relationships.

Source: Robert (2014).

The Pew Internet & American Life Project (2002) suggests that the current generation of learners attending institutions of study, such as universities, colleges and schools, has been exposed to technology very early and has become thoroughly familiar with it. In the same way, Anderson (2001) states that about twenty percent of learners attending school first used modern electronic devices when they were between 5 and 8 years old, and these children's use of computers has only increased since. There are many reasons for heavy use of the Internet and technology, one of which is connecting with family, classmates and friends. Through communication applications and sites such as E-mail, MySpace, Twitter, Facebook, YouTube, WeChat, LinkedIn, Tango and scores of others, we can contact one another anywhere in the world, easily and smoothly. Lickerman (2010) mentions that these fantastic forms of interconnectedness might appear to be the solution to our problems; in fact, they may come at too high a price.

Furthermore, there is a strong connection between technology and education. Technology has blossomed rapidly over the last twenty years. It has become not just a familiar tool but one that has certainly improved our knowledge and our research skills as educators. It engages us in study, since learning materials are easy to access and to share. Educational technology can be described as the study and ethical practice of facilitating learning and improving performance; in education it offers an additional chance to obtain an education when we lack the time or opportunity to do so in another way. It can thus result in positive changes in pedagogy and teaching methods all over the world, and create educational opportunities for students and teachers alike. On the other hand, it can have negative effects on students in the classroom: uncontrolled device use, distractions, the risk of cyber-bullying, and the limiting of face-to-face communication. The gap between having wonderful technology available and the preparation needed to operate it in teaching also reveals various disadvantages (Sosnowski, 2014).

Above all, there is no universal agreement in the legal community about the definition of computer crime; one probable reason is the rapidly developing state of computer technology. In 1979, a U.S. Department of Justice publication divided computer crime into three main parts. First, computer abuse: “the broad range of intentional acts involving a computer where one or more perpetrators made or could have made gain and one or more victims suffered or could have suffered a loss”. Second, computer crime: “illegal computer abuse [that] implies direct involvement of computers in committing a crime”. Third, computer-related crime: “any illegal act for which a knowledge of computer technology is essential for successful prosecution”.

In short, these definitions have arguably been made clearer by the massive production of computers and related electronic devices over the past few years. Even so, the development of effective computer network security law and public policy cannot be accomplished without co-operation between the technical and legal communities. Unfortunately, in many countries there is no body of law that protects a person's privacy when they browse the Internet. The rules that attempt to set a standard of privacy are scattered across bodies of law, beginning with the constitution and descending to local laws. These laws were not designed for the Internet, although they may serve to protect a person's informational privacy.

In another respect, privacy may be one of the areas where technology most significantly reshapes both the real and the practical landscape. Many services that promise to increase personal convenience, especially in the private sphere, demand a great deal of personal information. In the right hands, this information creates opportunities for enormous convenience, allowing people to access and share information relevant to them more methodically. In the wrong hands, by contrast, this information can harm individuals in the form of financial damage or identity theft. Some agencies may go as far as securing this information for law enforcement purposes (Hale, 2005).

Hale (2005) states in a report on identity hazards that face recognition technology (FRT) may lead to full surveillance of personal life and the prohibition of privacy, as organizations might use it to detect people at any time and location. Clearly, it could erode not just people's freedom but also their independence.

To sum up, the use of technology in various areas of human life can have both positive and negative consequences. On the positive side, connecting technology with the educational process can make education enjoyable and more comfortable for educators and learners alike, and helps to combine education and employment. Technology should be used when it genuinely benefits us and when it is needed, and people should still try to communicate with each other in person in order to improve their society. Since it is not possible to reverse the negative effects of technology, people should try to avoid them in order to reap the benefits, saving money and using technology more comfortably and securely. The greater our sense of freedom and independence as humans, the more we try to free ourselves from the limits imposed by nature, society, and new technologies that may exert more control over our lifestyle. The use of these technologies appears to be increasing annually, which allows us nowadays to look at them more critically.

References:

1. Anderson, K. J. (2001). Internet use among college students: An exploratory study. Journal of American College Health, 50 (1), 21-26.

2. HT Media Ltd (2014). Technology and Social Relationship. The Financial Express, 1 Mar. Available at: http://ezproxy.bcu.ac.uk:2073/docview/1503206602?accountid=10749.

3. Hale, B. (2005). Ethics, Place, and Environment. Routledge. 8 (2).

4. Robert, J. (2014). Dimensions of Leisure for Life. Human Kinetics. Available at: http://www.humankinetics.com/excerpts/excerpts/technology-can-have-positive-and-negative-impact-on-social-interactions [Accessed 24 Aug 2014].

5. LaRose, R., Eastin, M. S. & Gregg, J. (2001). Reformulating the Internet paradox: Social cognitive explanations of Internet use and depression. Retrieved April 2, 2005, from: http://www.behavior.net/JOB/v1n2/paradox.html.

6. Lickerman, A. (2010). The Effect of Technology on Relationships. Happiness in This World. Available at: http://www.psychologytoday.com/blog/happiness-in-world/201006/the-effect-technology-relationships [Accessed 20 Aug 2014].

7. Pew Internet & American Life Project (2002). The Internet Goes to College. Washington, DC: Author.

8. Sosnowski, J. (2014). Advantages and Disadvantages of Technology in Education. eHow contributor. Available at: http://www.ehow.com/about_4815039_advantages-disadvantages-technology-education.html#ixzz1DSB9fPaG [Accessed 16 Aug 2014].

9. United States Department of Justice (1979). Computer Crime: Criminal Justice Resource Manual. [online]. Retrieved October 1999, from: http://www.studymode.com/essays/Computer-Security-And-The-Law-804.html [Accessed 19 Aug 2014].

10. Weber, L. (2003). Relationships Among Spirituality, Social Support and Childhood Maltreatment in University Students. Counselling and Values, 47 (2), pp. 82-9.


Risks and Benefits of Modern Technology

The Internet and other emerging technologies have been a source of help to organisations in modern times. It could be argued that adopting some form of modern technology is inevitable in the bid to remain competitive. The Information Technology industry and media commentators claim that businesses stand to gain many benefits from use of the Internet. There is no doubt that the Internet has transformed the way organisations operate at present: not only are processes improved, but financial benefits are also increased. While successful examples of companies using the net have been widely reported, few studies have seriously examined the issues involved.

However, there is always a flip side to the coin: while the Internet and other emerging technologies bring benefits to organisations, they also carry disadvantages. This paper will analyse both sides in detail and put forward possible solutions, backed by research, to reduce the risks of digital business.

George (1988) defined emerging technology as science-based innovation that has the potential to create a new industry or transform an existing one, including discontinuous technologies derived from radical innovations. Examples are biotherapeutics, high-temperature superconductors, magnetic resonance imaging (MRI), touch-screen kiosks and the Internet. The touch-screen kiosk is a newer Java-enabled technology. Touch-screen kiosks go beyond the capabilities of a traditional time clock in that they are true information appliances. They provide a bridge between the organization and the workforce, both capturing workforce information and extending organizational information back to the employee. A kiosk-based solution for time, attendance and labour tracking also offers organizations a tool for empowering employees with self-service. With the touch of a finger, employees can view Internet-enabled applications that display their schedules, vacation balances and corporate messages, or allow time and labour data entry. Workforce management can now be a two-way street (Rizzo, 2001).

The Internet started in the 1980s and has since had a drastic impact on culture, commerce and business. Owing to the speed, flexibility and effectiveness it offers, it has become the means for accomplishing a growing amount of business between suppliers and large international companies. The Internet has brought new business to developing countries and has hastened the distribution of knowledge throughout the business world. It thus creates unprecedented opportunities for developing countries, because it can remove barriers and thereby foster full participation in the new global economy. Some organisations use the Internet for almost every part of their operations, such as buying and delivery of goods, stock control, manufacturing schedules, communications plans, sales programmes, service departments and support programmes. “The change from traditional communication channels such as salespeople, telephone and snail mail to the internet and e-mail happened quickly in some companies. However, this transition took place slowly in others” (McKenna and Bargh, 2000).

Emerging technologies have instigated new transaction methods that decrease costs, hasten the pace at which transactions are carried out, and supply access to new markets, new customers, and new business relationships of all kinds. Yet even with all the benefits the Internet has brought to the business world, there are still many risks involved, such as fraudulent transactions and scams. This research explains these risks and benefits in detail.

The Internet has enabled business to go far beyond previously set or imagined boundaries. Today, change is constant, its speed is accelerating, and its global impact is felt everywhere.

While the Internet offers many potential benefits, there are a number of unresolved issues in conducting business on it. Despite recent advances, security remains the most fundamental concern and the main reason why companies hold back from full use of the net (van Kirk, 1994; Baron, 1995).

Iver (2003) explains that online business is an umbrella term for the different business procedures that aim to integrate merchants with consumers and suppliers using the Internet. The whole procedure of putting up a website, helping the potential customer navigate the site, showing them the available products, offering discounts and coupons, and doing everything possible to persuade potential customers and convert them into customers comes under the process of e-business. He further discusses electronic commerce, which is a subset of e-business and is defined as online business that can be accounted for in financial terms. For instance, consumers paying for goods with a credit card, or shopping and paying online, are examples of e-commerce. E-commerce can thus be seen as the final phase of e-business, the one that involves payment for the products the organisation sells.

The use of the Internet has several disadvantages when conducting business transactions. According to Iver (2003), the main disadvantage of online business is the poor growth rate in certain segments of goods. For example, the food sector has not profited in terms of expanded sales and the resulting profit generation, for reasons such as food being a perishable item. Customers hardly search for food products on the Internet; they would rather go to the store to purchase the required product when it is needed.

It is also very easy to enter online business, as almost everyone has a laptop and is connected to the Internet. This leaves room for fraudulent activities, since there is no proper security in place to monitor the number of people who develop shopping websites.

Trust is also a major issue in online business, because any problem with a business website will be immediately obvious to the world, and customers typically have little loyalty. Due to competition, once a website is unavailable, customers will simply move on to a competitor. In addition, technical failure can have a significant impact on a company's key trading partners. Sid (2007) also discusses the disadvantages of online business in terms of Internet services and trust.

Sid (2007) discusses the disadvantages of practising online business. The first point is that the business's website must keep functioning at all times; this is the equivalent of a physical store staying open. If an Internet business goes offline due to technical issues, it can cost the company profit; therefore, it is essential either to have the technical skills oneself or to have someone in the company who possesses the technical skills to keep it functional at all times.

Sid (2007) also examines the behaviour of people shopping online. In many people's minds, purchasing products over the Internet is still not as safe as purchasing products in stores. Since people cannot see the person on the other end of the computer, they may be reluctant to purchase products; this also depends on the product and the sector. Lacking consumers' trust can significantly affect sales and overall success.

According to Lewis (2002), internal network protection will be a big issue for organizations that wish to offer their services to Internet-using customers around the world. Hackers and other Internet criminals can infiltrate company files and infect them with a virus, which is then sent on to infect other computers. If the networks are not secured, such people can also find important information about the company. Newly publicized weaknesses in the basic structure of the Internet indicate that the worldwide computer network may need a time-consuming redesign before it can be safely used as a commercial medium. Lewis (2002) further explains how hackers can easily gain access to different companies and cover their traces without major concern. They like to trade data illegally and sell it to the hacker community: calling-card numbers from long-distance telephone service providers, cellular service activation codes, stolen credit-card numbers, security-penetrating algorithms and pirated software codes are among the data most frequently traded on the Internet. Hackers might also bombard a company with thousands of mail messages using automatic remailer tools. The posting of such messages can knock out communications at a critical time in a competitive situation, and even firewalls cannot protect a company well against these attacks.

Jenkins (1995) explained that information security rests on three foundations:

Data integrity

A company must be sure that its data have not been changed.

Confidentiality of data

Companies have to be able to keep to themselves what they do not want others to know, such as their customer database, credit card numbers, etc.

Authenticity

Companies need to be sure that messages they receive from the Net are from the people they claim to be.

If any of these factors can be infiltrated by hackers, the company is no longer secure. Another risk is the increasing number of information brokers who use online communications to match buyers and sellers; criminals always seem to have an edge over law enforcement agencies.

An online Internet organization also faces other challenging risks. Cafasso (1996) describes the case of staff members who download adult material from the Internet and then show it around the work environment. This can create a hostile working environment, and employees exposed to the adult material might feel sexually harassed within the company and could press charges. The company will have to ensure that employees cannot download such media files or other offensive materials from the Internet, and will have to develop strict guidelines to protect its own interests against the uncontrolled surfing habits of its employees.

To do business on the Internet successfully, companies will have to ensure that the customer is indeed the person he or she claims to be. Verification and confirmation, handled through an e-mail system, will work well enough for the time being. The advantages of using the Internet and emerging technologies were explained by Iver (2003).

According to Iver (2003), global presence is the major benefit of transacting business online. An organization using e-business can have a countrywide or a global presence. For example, IBM was one of the first firms to practise e-business to service customers and collaborate with business associates all over the globe. Dell Inc. also had a prosperous business selling computers all over the US through the telephone and the Internet until 2007, and Amazon.com is a success story in helping customers buy globally from third parties. Hence, global presence is assured if organizations reorganize their business around the Internet.

The use of the web to advertise products ensures global reach at a small price. Advertising techniques like pay-per-click guarantee that the promoter only pays for the advertisements that are actually viewed. “Affiliate marketing, where customers are directed to a business portal through the efforts of an affiliate who in turn receives compensation when those efforts meet with success, has emerged on account of e-business”. Affiliate marketing has assisted both the businesses and the affiliates. Organizations making use of e-business have managed to turn cost-effective online advertising strategies to their advantage.

Organizations need a competitive strategy against their rivals, because without an effective strategy it will be difficult to sustain an advantage and make profits. The strategy an organization pursues can be a price strategy or a differentiation strategy. For instance, until 2007 Dell Inc. sold PCs only through the Internet and the phone. It took up a differentiation strategy by selling its PCs online and personalizing its laptops to meet customers' requirements, which genuinely helped its profits. Thus, doing business online enabled Dell Inc. to acquire a large share of the market by means of the differentiation strategy.

Iver (2003) explains that doing business online has produced an improvement in customer service. Occasionally, when going through a website, the customer is met by a pop-up chat window. Readily available customer service may help encourage the customer to learn more about the product or service. Furthermore, products can be paid for online and shipped to the customer's home. Internet services allow for asynchronous interaction and provide convenience for the client. The Web also makes access to people in remote areas feasible. This assistance, says Snow (2001), is able to “bridge distances and help overcome a wide assortment of isolation – economic, physical, emotional, geographical.”

Doing business on the Internet will become even more essential in the future. More organizations will gain access to the Internet and, with the price of communication falling, more customers will come to the Internet as well. The Internet offers remarkable possibilities and could, in the long run, surpass conventional distribution channels. It is very simple to get connected, and people can browse and shop on the Internet 24 hours a day if they want to. The Internet offers an exceptional way to reach customers on a one-to-one basis, and better emerging technologies will help organizations represent themselves better and sell more products. For people to keep enjoying the benefits of the Internet, however, the security issues must be addressed.

Security on the Internet is a vital issue, and some good solutions are beginning to form. Organizations that use the Internet need to protect themselves in three major areas: data integrity, confidentiality of data and authenticity. It becomes increasingly important to prevent hackers from stealing or tampering with data stored in an organisation's systems; this can be done by installing firewalls or routers. When data travels on the Net, it is normally intended to be read only by sender and recipient, which can be ensured with encryption systems. Finally, an organization wants to be sure that the parties it is communicating with are really who they claim to be; authenticity can be obtained with digital signatures. Doing business online involves some risks, like any other business transaction, but if attention is devoted to installing secure procedures, it is no riskier than other business practices.
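The integrity and authenticity foundations described above can be illustrated with a minimal sketch using Python's standard `hmac` module. This is a shared-key construction chosen for brevity; real deployments would use public-key digital signatures and TLS, and the key and message below are invented for the example:

```python
import hmac
import hashlib

SECRET_KEY = b"shared-secret"  # hypothetical key known only to sender and recipient

def sign(message: bytes) -> str:
    """Produce a tag the recipient can use to check integrity and authenticity."""
    return hmac.new(SECRET_KEY, message, hashlib.sha256).hexdigest()

def verify(message: bytes, tag: str) -> bool:
    """Recompute the tag; compare_digest avoids timing side channels."""
    return hmac.compare_digest(sign(message), tag)

order = b"pay 100 GBP to account 42"
tag = sign(order)
assert verify(order, tag)                              # untampered message verifies
assert not verify(b"pay 900 GBP to account 42", tag)   # tampered data is rejected
```

Any party lacking the key can neither forge a valid tag (authenticity) nor alter the message undetected (integrity); confidentiality would additionally require encrypting the message itself.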

Since security is essential in online business because of all the risks involved, one of the major organizations providing such security for banks and other large organizations dealing in online business is Panda Security. It is one of the leading IT security providers and offers an anti-fraud service for online business that guards against identity theft through malware attacks targeting online banks, payment platforms and e-commerce. The service advises companies when there is a targeted attack and provides the tools to detect and block affected users, reducing the possibility of online fraud. It allows banks and organizations providing online services to ensure that customers initiating business on their websites are not infected by any malicious code that affects the service. They can see the security status of their customers and efficiently supervise the risk involved in online business, which dramatically reduces the possibility of fraudulent actions.

The benefits that this service offers organizations include:

Reduction of online fraud

Panda Security for Internet Transactions provides tools and information to stop fraudulent activity. Organizations are instantly notified whenever there is a new malware attack targeting their customers and are supplied with the information needed to respond in time. Organizations can see how the malware operates and how they can protect themselves, and they can prevent infected customers from accessing their website, reducing the effects of online fraud.

Control and management of threats to online transactions

This is the only service on the market offering information about the security status of clients. This can be used by an organization to create risk profiles, meaning the service can be configured to restrict the permission to carry out transactions to those clients that meet pre-determined security requirements.

Expanding business thanks to improved security

Users’ lack of confidence in online security is one of the prime reasons for their reticence to use Web services. This tool reduces online fraud and allows companies to offer clients a secure environment for online transactions.
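The risk-profile idea above — permitting transactions only for clients that meet pre-determined security requirements — can be sketched as a simple rule check. This is illustrative only: the field names and the rule are assumptions for the example, not Panda Security's actual API:

```python
from dataclasses import dataclass

@dataclass
class ClientStatus:
    """Hypothetical security report for a client machine."""
    malware_detected: bool
    antivirus_up_to_date: bool
    browser_patched: bool

def may_transact(status: ClientStatus) -> bool:
    """Permit a transaction only for clients meeting all security requirements."""
    return (not status.malware_detected
            and status.antivirus_up_to_date
            and status.browser_patched)

clean = ClientStatus(malware_detected=False, antivirus_up_to_date=True, browser_patched=True)
infected = ClientStatus(malware_detected=True, antivirus_up_to_date=True, browser_patched=True)
assert may_transact(clean)
assert not may_transact(infected)
```

A real service would feed such a check with live telemetry; the point is simply that the security status of each client becomes a gate on the transaction, which is how a configurable risk profile restricts fraud exposure.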

For the business community, the Internet is a new frontier, offering unmatched prospects for development and growth. Organizations can provide their services throughout the world, with the variety of services multiplying daily. It is also clear that the advantages of using the Internet and other emerging technologies for business far outweigh the disadvantages, so organizations will continue to use them while trying to find ways to reduce online fraud and scams. Practising online business involves some threats, just like any other business deal, but if care is dedicated to installing secure measures, it is no more dangerous than other business practices. As a matter of fact, it is not safe for an organization not to be represented on the Web if it is related to the technology industry in any way: the Web will be the first place prospective customers look, expecting to find it there. There are unique opportunities on the Web for marketing a company's services, selling products and gathering information.

References

BBC (1999), “Internet scam file”, BBC On-line Network, 7 April. Available at: news.bbc.co.uk/hi/english/business/your_money/newsid_313000/313051.stm.

Brewer, E. (15), “Internet flaws a setback for commerce”, New York Times – Personal Technology.

Cafasso, R. (1996), “When security isn't what it seems”, Enterprise Systems & Network Management, March, pp. 72-84.

Kaplan, R. D. (1998), “Cyber-Smut: Regulating Obscenity on the Internet”, Stanford Law & Policy Review, Vol. 9, p. 189.

Rizzo, E. (2001), “Emerging Technologies and the Internet Enable Today's E-Workforce”.

http://community.19actionnews.com/_zhangyan-has-added-Advantages-of-E-Business/BLOG/2275620/2165.html

http://www.abestweb.com/forums/showthread.php?goto=lastpost&t=103922

http://www.buzzle.com/articles/advantages-and-disadvantages-of-e-business.html

http://www.ehow.com/list_6023059_disadvantages-e_business.html

http://www.marketingservicestalk.com/news/pan/pan113.html

http://www.pandasecurity.com/virus_info/exports/rss/pandaes.xml

Jenkins, L. (1995), “Doing business in a global marketplace: secure electronic commerce”, Ciphertext: The RSA newsletter, Vol. 3 No. 1, Winter, pp. 1, 8.

Drinkhill, J. (2001), “Computer Fraud”, Journal of Financial Crime, Vol. 4 No. 3.

Lewis, P. (2002), “Security of personal data is lost in cyberspace”, Computer Information Systems.

McKenna, K. Y. A. and Bargh, J. A. (2000), “Plan 9 from cyberspace: The implications of the Internet for personality and social psychology”, Personality and Social Psychology Review, Vol. 4, pp. 57-75.

SEC (1998), Internet Fraud: How to Avoid Internet Investment Scams, US Securities and Exchange Commission, Washington, DC, October. Available at: www.sec.gov/consumer/cyberfr.htm.

Sid, J. (2007), World Wide Web Marketing – Integrating the Internet into Your Marketing Strategy, John Wiley, New York, NY.

Snow, S. (2001), Is Online Counseling Ethical? Available at: http://www.commcure.ethicsonline.html [Accessed 26 Aug 2003].

Implications of Internet Piracy

Internet Piracy

Internet piracy in the digital age has put great pressure on both the individual and the organization within the modern business world. Ethically and morally, Internet piracy is regarded as a negative force on business and the way in which companies do business. Moreover, there is increasing pressure on governments and world leaders to set up, administer and enforce laws that minimize the use of Internet piracy for illegal and destructive behaviour. As the Internet expands and opens up new markets, enabling faster live online connection throughout the world, it increases accessibility to software and information (Balkin, 2008; De Castro & Shepherd, 2008). In turn, this accessibility is an issue that many businesses may fail to address, leading to insufficiently protected and encrypted software. Internet piracy has arguably paved the path for software development demand, making it a very lucrative business (Balkin, 2008).

Internet piracy has developed into a phenomenon through the creation of Web systems and file-sharing programs. With the expansion of the Web and the increased number of Internet users, the world has become heavily digitalized. Customers expect digital content to be available at the touch of a key. This has created an astounding demand for digital merchandise, with piracy at the forefront of the file-sharing phenomenon. Napster, KaZaA and Microsoft have all been negatively affected by Internet piracy. Even so, from the ashes of failed business ventures there has remained a demand from Internet users for open, inexpensive legal solutions in the realm of digital media. All of this has stimulated the development of new technologies, spurred entrepreneurship, and produced organizations that now reap the benefits of learning from others' mistakes, reorganizing business models and adjusting the way business is conducted in the modern world, even though Internet piracy remains rampant. This thesis proposes that Internet piracy is a primary driver of entrepreneurship in several ways: as a source of new ideas, as a springboard for new organizations, and as a forerunner of technology.

Dahlstrom et al. (2006) discuss the technological Internet piracy phenomenon from the beginning of its presence on the Internet. Choi and Perez (2007) go a step further and take into account the fact that Internet piracy has existed since the Internet was chiefly used as a distribution tool for researchers at universities and government institutions. It is important to note, however, that this copying and sharing of information was not originally referred to as ‘Internet piracy’; it was an important way for academics and government officials to share information. Choi and Perez (2007) state that because software was mainly open source code, it was free and easily distributed, and only when software companies started putting a price tag on their products did Internet piracy become a regularly used word in the IT vernacular. This in turn has made Internet piracy a large and worldwide phenomenon, which greatly affects us all and has greatly influenced the development of this thesis.

Napster's success began with Fanning's vision of closing the gap between supply and demand in the music industry. Napster ushered in a new era of file-sharing with modern technology, using the Internet to solve the indexing problem of searching for songs through normal search engines (Oram, 2001). Napster arguably even created a new demand for the supply of shared digital media and for its availability. The problem with this was the violation of copyrighted material. Although Napster created a new form of file-sharing technology, it did not come without problems: Internet piracy has a negative significance for the music business, because it loses the industry customers and revenue and probably also damages its reputation and brands (Gupta, Kamala & Srinivasan, 2005). As P4 mentions, the change that Napster produced made the record industry think, and “It can force the record labels to conform to and match their buyers.” The fight against Internet piracy is ongoing, and it does not look as though Internet piracy sites and related applications will vanish any time soon, given the sheer number of file-sharers and the demand for readily available online content.

As long as there is a demand and a supply that can be met through P2P and BitTorrent technologies and applications, there will be an opportunity for Internet piracy and its users to make media available (Gibert, 2010). Napster saw the opportunity to lower the demand by increasing the supply; sadly, it did not consider the legal implications that would follow. Even so, a handful of software companies have revolutionized their markets with legal alternatives, Apple Inc. and Spotify for instance. As P6 expresses:

“Would we have designed solutions like iTunes and Spotify without the behaviour of pirates and the legal actions attempting to stop piracy? We could argue that we have seen the development of software systems and economic models shaped by the social-technical-legal-political situation, and consequently we could argue that piracy drives some forms of technological progress.”

Warner (2002), Picard (2005) and Roth (2004) all focus on the implications of new technology and the widespread distribution of software, music and videos on the Internet. In particular, they address BitTorrent and P2P technology, which have been of considerable importance to the development of file-sharing technology. Honigsberg (2002) examines these technologies in depth and explains the value that they (and the source code of the applications) have had for the emergence of file-sharing software and the Internet. Today, several important actors in the Internet file-sharing sector have emerged: Rimmer (2005), for instance, examines the implications that the Napster application had for the media sector and the way these corporations dealt with Internet piracy. Honigsberg (2002) additionally examines the lawsuits and remedies brought by the media businesses who sued and won legal battles against Napster, KaZaA and other file-sharing agents.

“Interestingly, Internet piracy does help the development of new technologies and aids entrepreneurial growth, supporting organizational growth while at the same time placing major obstacles in its way. The world needs to find a balance in which pirates are not hunted as witches, but some common ground ought to be established, especially by governments and lawmakers, in order to address the Internet piracy phenomenon. From the analysis of the case studies, it is clear that even when a technology is not deliberately designed for piracy, it can and wherever possible will be used for that purpose. Internet piracy, however, has also enabled technological advances that we might otherwise not have seen. Finally, piracy has in many cases opened new opportunities for entrepreneurs who have been able to use the new technologies for legal and productive business.”

References and Bibliography:

Balkin, D. B., De Castro, J. O. and Shepherd, D. A., (2008): Can entrepreneurial firms benefit from product piracy? Journal of Business Venturing, Vol. 23, No. 1, pp. 75-90.

Beckman, E. (Responsible publisher), Pettersson, B. (Broadcast producer) (2012).

Chin, W. W., Khalifa, M. and Limayem, M., (2004): Factors motivating software piracy: a longitudinal study. IEEE Transactions on Engineering Management, Vol. 51, No. 4.

Choi, D. Y. and Perez, A., (2007): Online piracy, innovation, and legitimate business models. Technovation, Vol. 27, No. 4, pp. 168-178.

Darity, W. A., Jr., (2008): Demand. International Encyclopedia of the Social Sciences, 2nd ed., Vol. 2. Detroit: Macmillan Reference USA, pp. 268-271. Gale Virtual Reference Library. Web. 26 Mar. 2012.

McDonald, V. L., (2009): Before-and-After Case Study Design, in A. J. Mills, G. Durepos & E. Wiebe (eds), Encyclopedia of Case Study Research, SAGE, Thousand Oaks, CA, pp. 52-55.

Oram, A., (2001): Peer-to-Peer: Harnessing the Power of Disruptive Technologies. O’Reilly Media, p. 448.

Porter, T., (2006): Practical VoIP Security. Rockland, MA: Syngress.

Rao, L., (2011): Skype Revenue Up 20 Percent To $860M In 2010; Paid Users Up 19 Percent. TechCrunch, available at: http://techcrunch.com/2011/03/07/skype-revenue-up-20-percent-to-860m-in-2010-paid-users-up-19-percent/, viewed 25 April 2012.

Rimmer, M., (2005): Hail to the Thief: A Tribute to KaZaA. University of Ottawa Law and Technology Journal, Vol. 2, No. 1, pp. 173-218.

(Internet Piracy and Entrepreneurial Growth, Andersson, Eventorn, Nilsson)

Warner, M., (2002): The New Napsters. Fortune, Vol. 146, No. 3, pp. 115-116.

The Bluetooth Technology

Abstract:

Bluetooth technology has been spreading more widely every single day due to its availability in most of the electronic devices that are dominant nowadays. Like any other technology, once widespread it has a huge impact on users and societies. In our study, we investigate the impact of Bluetooth technology on society. To obtain realistic knowledge, a survey containing 10 questions on our topic was conducted. 100 people from Multimedia University and Limkokwing University were randomly chosen to participate in our survey. In addition, we relied on other sources, such as the internet, to collect information regarding our topic. From our study we have found that the spread of Bluetooth technology has made life easier. Like any other technology, Bluetooth is sometimes used in a negative way and consequently has a negative influence.

Acknowledgement:

We would, firstly, like to thank our God, who gave us the ability to accomplish this research. Secondly, many thanks to our EHM 3066 (Engineers and Society) lecturers, who have been giving us a very good example of diligence and hard work. We send our gratitude to the participants who took part in our questionnaire and gave us some of their precious time. Thanks to our families, friends and all our beloved people, who have always given us the inspiration to be successful. We would finally like to thank our institution, the Faculty of Engineering (FOE), Multimedia University.

1.0 Introduction:

Technology is considered to be the mountain a society climbs to attain its desired degree of development. Man has been trying to facilitate his life through invention and innovation. He first invented the wheel and has not stopped since; he has lately even explored other planets. Telecommunication, once computerized, played a very important role in spreading data, which enabled scientists to utilize the outcomes of other scientists' research; that led to more and more advancement in various fields of science, and thus to more technical devices. These devices did not always have their present shape: a lot of effort was exerted in the process of improvement with regard to performance, size and ease of use. This study aims to give a clear idea of how Bluetooth technology is affecting society and our lives.

1.1 Overview of Bluetooth:

In striving to reduce the cables between computers and their connected units, Ericsson Mobile Communications started a project in 1994 and named it Bluetooth.

What is Bluetooth?

Bluetooth is the name of a new technology that is now becoming widespread on a commercial basis. It promises to change significantly the way we use machines. Instead of using cables to transmit data between the components of a PC (for example, the printer and the mouse), a small and cheap radio chip plugged into each component will do the job. In short, it is a cable-replacement technology.

The name was first used as a code name, but it stuck as time passed. It is named after the 10th-century Danish king Harald Bluetooth, who united Scandinavia at a time when it was severely divided. The founders of the Bluetooth technology found the name fitting, as Bluetooth is able to unite various industries such as the cell phone, computing and automotive markets. With Bluetooth technology one is able to simplify and combine several forms of wireless communication into a single, safe, power-saving, inexpensive, globally available radio frequency.

2.0 Bluetooth Mechanism:

Bluetooth achieves its goal by embedding inexpensive, tiny short-range transceivers into the electronic devices that are available nowadays. In addition to three voice channels, Bluetooth can transfer data at speeds of up to 721 Kbps. Because Bluetooth operates in an unlicensed, globally available radio band at a frequency of 2.45 GHz, people, including international travellers, face no obstacles in using Bluetooth-enabled equipment. Moreover, Bluetooth units may be either externally adapted or built into electronic devices. In a personal computer, for example, Bluetooth can be built into the motherboard, added as a PC card, or used as an external Bluetooth adaptor connected to a USB port. Following the IEEE 802 standard, each Bluetooth device has its own 48-bit address. One feature is that connections are not only point-to-point; they can also be multipoint. Bluetooth devices usually have a maximum connection range of up to 10 meters; by increasing the transmission power, the range may be extended to 100 meters. Bluetooth devices use a technique called frequency hopping, whose main purpose is to protect the devices from radio interference: Bluetooth devices change their frequency randomly, up to 1600 times per second. When an error occurs, it is immediately corrected by the three complementary error-correction schemes that every Bluetooth device has. Bluetooth devices are also provided with built-in authentication and encryption.
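The frequency-hopping idea can be sketched in a few lines of Python. This is an illustrative model only: the real Bluetooth hop-selection algorithm derives the sequence from the master's 48-bit device address and clock rather than from a generic seeded random generator, and the seed value below is made up. The 79 channels and 625-microsecond slots are the classic Bluetooth values.

```python
import random

CHANNELS = 79          # 1-MHz channels in the 2.4 GHz band
SLOT_US = 625          # slot length in microseconds -> 1600 slots per second

def hop_sequence(shared_seed, n_slots):
    """Return the channel used in each of n_slots consecutive slots."""
    rng = random.Random(shared_seed)
    return [rng.randrange(CHANNELS) for _ in range(n_slots)]

# Both ends derive the same pseudo-random sequence from shared state,
# so they meet on the same channel in every slot while an outside
# interferer sees only rapid, hard-to-follow channel changes.
master = hop_sequence(0xC0FFEE, 1600)   # one second of hopping
slave = hop_sequence(0xC0FFEE, 1600)
assert master == slave
```

Because the two devices stay synchronized, a narrowband interferer corrupts at most the occasional slot, which the retransmission and error-correction schemes described above then repair.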

When Bluetooth devices are in "hold" mode they consume approximately 30 microamperes from the battery of the host device, such as a cell phone or laptop, while they consume 8 to 30 milliamperes (less than one-tenth of a watt) in active transmission mode. Moreover, only 0.3 mA is consumed by the radio chip in standby mode, which means it uses less than 3% of the power used by a mobile phone. The radio chip also has an excellent power-saving feature: as soon as the traffic volume lessens, the chip automatically shifts to a low-power mode. All of this indicates that Bluetooth devices do not drain precious battery life.
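Using the current-draw figures above, a rough battery-impact estimate can be sketched. The 1000 mAh battery capacity and the 5% active duty cycle in this snippet are illustrative assumptions, not specification values:

```python
# Current-draw figures quoted in the text: ~30 uA in hold mode and up to
# ~30 mA in active transmission. Battery capacity and duty cycle are
# assumed for illustration.
HOLD_MA = 0.030      # 30 microamperes expressed in milliamperes
ACTIVE_MA = 30.0     # upper end of the active-mode range
BATTERY_MAH = 1000.0 # assumed phone battery capacity
DUTY_CYCLE = 0.05    # assumed fraction of time spent actively transmitting

def average_current_ma(duty_cycle):
    """Time-weighted average draw for a duty-cycled Bluetooth link."""
    return duty_cycle * ACTIVE_MA + (1 - duty_cycle) * HOLD_MA

avg = average_current_ma(DUTY_CYCLE)
hours = BATTERY_MAH / avg   # hours for Bluetooth alone to drain the battery
print(f"average draw {avg:.2f} mA, ~{hours:.0f} h to drain the battery")
```

Even with these pessimistic assumptions the radio alone would take hundreds of hours to empty the battery, which supports the claim that Bluetooth does not drain precious battery life.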

But beyond untethering devices by replacing cables, Bluetooth devices can form small, private ad hoc groupings of devices away from fixed network infrastructures by providing universal bridges. These bridges connect a device to data networks and peripheral interfaces. Furthermore, a noisy radio-frequency environment does not affect Bluetooth devices, since they are designed to use a frequency-hopping scheme together with fast acknowledgement in order to keep the link active and robust. After sending or receiving a packet, Bluetooth radio modules avoid interference from other radio signals by hopping to a new frequency. The Bluetooth radio uses shorter packets and hops faster than other systems operating in the same frequency band, which makes it more robust than those systems. In addition, the fast hopping from one frequency to another and the short packets reduce the impact of domestic microwave ovens. Random noise may affect long-distance links; however, Forward Error Correction (FEC) is used to mitigate this impact. The encoding is therefore ideal for an uncoordinated environment.
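The simplest of Bluetooth's error-correction schemes, the 1/3-rate FEC, just transmits every bit three times and lets the receiver take a majority vote (Bluetooth also defines a 2/3-rate shortened Hamming code, not shown here). A minimal sketch:

```python
def fec_encode(bits):
    """1/3-rate FEC: repeat each payload bit three times."""
    return [b for b in bits for _ in range(3)]

def fec_decode(coded):
    """Majority-vote each group of three received copies."""
    return [1 if sum(coded[i:i + 3]) >= 2 else 0
            for i in range(0, len(coded), 3)]

payload = [1, 0, 1, 1, 0]
coded = fec_encode(payload)
coded[4] ^= 1                        # channel noise flips one received copy
assert fec_decode(coded) == payload  # a single error per triplet is corrected
```

The cost is tripling the airtime per bit, which is why this strongest protection is reserved for the most critical fields rather than applied to every packet.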

Bluetooth security is guaranteed at the bit level. Users can control authentication using a 128-bit key, and radio signals are encrypted with keys of 8 to 128 bits. Bluetooth radio transmissions meet the safety standards required by the countries where the technology is used with respect to the effects of radio transmissions on the human body; the emissions of Bluetooth-enabled devices are lower than those of industry-standard cordless phones. Bluetooth modules also do not interfere with, harm or otherwise affect public or private telecommunications networks.
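The 128-bit-key authentication can be illustrated with a generic challenge-response sketch. Note the stand-in: classic Bluetooth's actual authentication uses its own E1 function (based on the SAFER+ cipher), so HMAC-SHA256 here only illustrates the shared-link-key, challenge-response idea, not the real algorithm.

```python
import hashlib
import hmac
import secrets

link_key = secrets.token_bytes(16)       # 128-bit key shared at pairing time

def respond(key, challenge):
    """Prove knowledge of the key without ever transmitting the key."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

challenge = secrets.token_bytes(16)      # verifier sends a random challenge
response = respond(link_key, challenge)  # claimant answers with a keyed MAC

# The verifier computes the same value locally and compares.
assert hmac.compare_digest(response, respond(link_key, challenge))
# A device holding the wrong key produces a different response.
assert not hmac.compare_digest(response, respond(secrets.token_bytes(16), challenge))
```

Because a fresh random challenge is used each time, an eavesdropper cannot replay an old response, and the 128-bit key itself never crosses the air.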

2.1 Bluetooth Operation Modes:

A feature of Bluetooth technology is that, once Bluetooth devices come within range of each other, they instantly form networks with each other. Another feature is that a number of devices can be connected together via Bluetooth in an ad hoc fashion; such a network is technically known as a "piconet".

In a piconet, two or more devices are connected together. A scatternet can be formed from multiple independent, non-synchronized piconets. Moreover, any device in a piconet can be a member of another piconet through a technique called time multiplexing, which shares time appropriately so that a device can belong to two or more piconets. As the Bluetooth system supports multipoint as well as point-to-point connections, a Bluetooth device can be connected to up to 7 other devices in a point-to-multipoint connection. Every piconet has a different frequency-hopping sequence, and hence a number of piconets may be created and linked together. All units sharing one piconet are synchronized to that piconet's hopping sequence, and a Bluetooth device uses a different hopping sequence for each piconet it is connected to. A piconet starts with 2 connected devices, for example a laptop and a mobile phone, and may grow to 8 connected devices. All Bluetooth devices have the same implementation; however, there are two types of units in a piconet: master units and slave units. A master unit is the unit that synchronizes the other devices with its clock and hopping sequence; the other devices in the piconet are called slave units. A 3-bit MAC address is used to distinguish the units participating in the piconet. A unit that does not have a MAC address is called a parked unit; parked units remain synchronized. Since parked units have an 8-bit address, a maximum of 256 parked units may exist.
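The addressing limits described above can be captured in a toy model: one master, up to 7 active slaves (from the 3-bit member address) and an 8-bit parked-unit address space. This is a bookkeeping sketch only, not a protocol implementation, and the device names are made up.

```python
MAX_ACTIVE_SLAVES = 7   # 3-bit member address (one value reserved)
MAX_PARKED = 256        # 8-bit parked-unit address space

class Piconet:
    def __init__(self, master):
        self.master = master   # supplies the clock and hopping sequence
        self.active = []       # slave units holding a 3-bit member address
        self.parked = []       # synchronized but currently addressless units

    def join(self, device):
        """Admit a device as an active slave, or park it if the net is full."""
        if len(self.active) < MAX_ACTIVE_SLAVES:
            self.active.append(device)
        elif len(self.parked) < MAX_PARKED:
            self.parked.append(device)
        else:
            raise RuntimeError("piconet full")

net = Piconet("laptop")
for n in range(9):                  # a 10th device beyond the active limit
    net.join(f"phone-{n}")          # spills into the parked list
assert len(net.active) == 7 and len(net.parked) == 2
```

A scatternet would then be modelled as several such piconets sharing some devices, each device keeping one role (and hopping sequence) per piconet.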

3.0 Bluetooth’s positive impact:
3.1 Huge Impact:

What could the practical use of Bluetooth be in society? Well, it is unlimited and depends on the way it is used. From a practical viewpoint, we can adapt almost all computerized devices to work with it.

Have a look at the list below!

Printers
Desktop and laptop computers
Modems
LAN access units
Fax devices
Phones and pagers
Headsets
Keyboards
Joysticks
Notebook computers

Practically, most digital devices can be part of a Bluetooth system. The dynamic nature of Bluetooth's connectivity has the potential to replace USB (Universal Serial Bus) cables. A Bluetooth mouse, for example, can be set up using improved plug-and-play systems, with the installation taking effect after the operating system is rebooted.

3.2 Bluetooth’s Applications
3.2.1 Bluetooth and the Internet

One of the most important advantages of Bluetooth is that it enables you to connect a device that has internet connectivity to another device that does not. For example, you might connect your hand phone, which has built-in Bluetooth, to your laptop via a Bluetooth connection, and then, through your laptop's Wi-Fi (if it has one), to a Wi-Fi router. Once your laptop is connected to the internet, you can enable your hand phone to connect to the internet as well. Moreover, this example applies to most devices that have Bluetooth technology, not only hand phones.

Buying a device that has Wi-Fi or one that has Bluetooth technology is sometimes a confusing choice. One may think that by choosing Bluetooth he or she will not be able to connect to the internet, and that in this case choosing Wi-Fi would be better.

However, this is not really the correct conclusion. Since, as the example above shows, you can set your device up to connect to the internet via Bluetooth, purchasing a device with Bluetooth technology can be the better choice, as you get two capabilities in one: internet connectivity (the function of Wi-Fi is achieved) and Bluetooth technology.

3.2.2 Some Other Applications:
A Bluetooth mouse could be used at a greater distance from the monitor, even while moving about the room.
A Bluetooth keyboard could be used further away from the monitor. This would reduce eye strain for people who are long-sighted; increasing the distance would also reduce exposure to electromagnetic radiation from the monitor.
A Bluetooth keyboard could also be used to address more than one computer, in a dynamic, switchless manner.
You can use your e-mail while your laptop lies in a briefcase: when your laptop receives an e-mail, your mobile phone will immediately alert you, and you can then read the received e-mails on your mobile phone's display.
A businessman may let his laptop find a suitable printer as soon as he enters a company. Once a suitable printer is found, data is sent from the laptop to that printer via the Bluetooth connection to be printed out.
Connections to printers and faxes without messy cables.
Wireless connections to video projectors and digital cameras.
An easy and elegant connection from a cell phone to a hands-free headset.
A useful connection from a Bluetooth interface to an office private branch exchange (PBX).
Smooth creation of dial-up networks and automatic e-mail.
Use of mobile phones as office wireless phones.
Use of personal computers or PDAs (Personal Digital Assistants) as hands-free phones.
Automatic transfer and exchange of files, software, electronic business cards, calendars, etc.
Dancing couples at a dance hall could receive the music through their headsets and pick the dance of their choice.

Not to mention many more to come.

3.3 The influence of the Bluetooth technology on society:

Thanks to Bluetooth technology, a wireless LAN (Local Area Network) can be implemented without cabling. This means that all the functions of a conventional fixed LAN are available in a WLAN, including file sharing, peripheral sharing, internet access and many more.

3.3.1 Mobility and low cost:

Mobility and cost-saving installation are the main advantages of wireless networks in society, and most wireless application scenarios are related to these two features. Mobility enables users to roam while remaining connected to backbone networks. Many jobs require mobile workers: portable computers are indispensable for people like inventory clerks, healthcare workers, police officers and emergency-care specialists. Wireless networking also provides important cost savings in areas where cables cannot easily be installed, such as historical buildings and residential houses. At distant sites, branch offices and in other situations where on-site networking expertise may not be available or fast networking is needed, computers equipped with wireless LANs can be pre-configured and shipped ready to use.

3.3.2 Circulation of Information:

The wireless local area network business has focused on offices since the industry began, but recently home networking has been seen as a fast-growing market. The personal computer has become a powerful platform for education, entertainment, information access and personal finance applications. At home, with the wide use of PCs and the internet becoming the main way to access information, the role of the PC has expanded and will continue to expand, especially in the area of education. On the social side, this means that wireless networks are an easy way to access the internet and find information, and at the same time users gain more knowledge through browsing the internet.

3.3.3 Avoiding wire tangle

Bluetooth is also an example of short-range wireless use beyond PDAs and similar devices. The objective of Bluetooth technology is to replace the cables and infrared links used to connect otherwise unrelated electronic devices with one universal short-range radio link.

3.4 Industrial boom

Bluetooth applications reflect the mobile phone industry background of its inventors, seen in famous phone makers such as Nokia, Sony Ericsson and Motorola. There are many useful things that Bluetooth has given to our society.

3.5 Information Interchange

In meetings and conferences, users can share information instantly with all participants without any wired connections.

3.6 Convenience:

A user can also cordlessly operate and control, for instance, a projector, or connect a headset to a laptop or any wired connection to keep the hands free for more important tasks in the office, at home or in the car. When a laptop receives an e-mail, the user gets an alert on the mobile phone; users can also browse all incoming e-mails and read selected ones on the mobile phone's display. Public Bluetooth wireless access points could enable free or paid access to information and services through laptop computers and PDAs. For example, imagine being able to browse the catalogue of a public library on your handheld PC as soon as you enter the building. Or imagine how helpful it would be to have instant PDA access to a building map and to customized real-time flight arrival and departure data as you make your way through a busy airport. Consider the convenience of having automatic, wireless access to a shared hotel printer as you make last-minute changes to a presentation in your hotel room.

3.7 Productive and Time saver:

In a Bluetooth-enabled business world, the cellular phone could provide a link to everything beyond the 10-meter range limitation. Without even removing their Bluetooth-enabled cell phone from their briefcase or luggage, mobile professionals would be able to check voice mail, send a fax, receive email, verify inventory levels, and surf the Internet through their laptop computers. This would extend the concept of anytime, anywhere data access well beyond the current standard of cell phones and beepers.

3.8 More Freedom:

In considering the above scenarios, some may argue that access to wireless LANs increases our freedom and improves our professional lives because it allows us to decide when and where we do our work. Wireless technology certainly has the potential to improve society, and our personal and professional lives, in various ways; it has given us many advantages.

4.0 Negative Impact:

The convenience of Bluetooth technology cannot be denied, but neither can the ways it has negatively impacted daily living. Some of these effects can be dispensed with if boundaries are set. Here we discuss some negative impacts of Bluetooth.

4.1 Violation of Privacy:

Mobile phones and PCs usually contain private material such as family pictures, bank account numbers, passwords and so on. Unfortunately, these devices can be hacked via Bluetooth. A hacker sends the user a file (say, an image) which, when received, opens as a normal image but actually has hacking software built in. Once the user opens the image, the hacker is able to control the user's device.

4.2 Sabotage:

In the same way as with downloads from the internet, transferring data via Bluetooth may harm your device if the data contains viruses. The strength of viruses varies from one to another: some are easy to remove, while others totally damage the device once they reach it.

4.3 Health Damage:

While the topic remains controversial, some people believe the microwave radiation the phones emit can, with prolonged use, cause problems such as cancer and Alzheimer's disease.

4.4 Use in inappropriate places:

Using Bluetooth to transfer data in inappropriate places is considered harmful, and such behaviour expresses a negative attitude. For example, students in a classroom may busy themselves transferring data via Bluetooth on their hand phones while their lecturer is giving a lecture.

4.5 Sleazy Contents:

A child or a teenager may innocently receive a file containing sexual or immoral content that is not suitable for their age.

5.0 Conclusion and Recommendation:

This project was done to assess the impact of Bluetooth technology on society in the near future. We have investigated how and why Bluetooth technology has spread so widely and what impact it has had. In addition, we have explained in detail the reasons for this impact.

Mainly, the basis for this report is a survey that we conducted on 100 students and lecturers drawn from the whole population of the Multimedia University and Limkokwing University Cyberjaya campuses. We used the data collected from this survey to explore the impact of Bluetooth technology on society, treating MMU and Limkokwing University as small societies. We also made heavy use of the internet to collect information about Bluetooth technology and to find out its impact on other societies. The reason our survey covered only 100 people is that surveying a larger number would have been expensive and time-consuming.

This report might be useful to social specialists seeking to find out why some technologies attract more interest from people; to Bluetooth developers, who can work to decrease the negative impact of this technology; and to other people interested in the relationship between technology and society.

Future researchers working on the same topic should keep in mind that this research was based on a questionnaire given to people of almost the same age range, as most of them are university students. Future researchers with a wider scope can study different age ranges.

The results of the survey showed that 90% of the people who answered have Bluetooth-enabled devices. The reason for this large number of Bluetooth users is that most of the new electronic devices available nowadays include Bluetooth technology.

Moreover, about 25% of the participants with Bluetooth-enabled devices have experienced the negative impact of Bluetooth technology, such as being hacked or receiving viruses during data transfer. The reason is that most of them did not know that transferred data may carry viruses or hacking software.

Even though Bluetooth technology is already widespread, we think it will make life easier and more effective if it spreads further. For example, providing Bluetooth connections in university laboratories would make data transfer between researchers, students and lecturers easier. However, care must be taken when transferring data via Bluetooth, since the data may contain viruses or hacking software; a good solution is to use anti-virus and anti-hacking software. Moreover, children and teenagers should be supervised when using Bluetooth-enabled devices.

References:
Book:

Christian Gehrmann (2004), Bluetooth Security, Artech House Publishers.

Robert Morrow (2002), Bluetooth: Operation and Use, McGraw-Hill Professional (Telecommunications).

Tom Siep (2000), An IEEE Guide: How to Find What You Need in the Bluetooth Spec, Institute of Electrical and Electronics Engineers.

Online article:

Pyramid Media Group, Inc., Constantly Connected: Beyond WiFi and Bluetooth, http://www.findarticles.com/p/articles/mi_m0QXQ/is_2005_June_30/ai_n15341855

http://www.swedetrack.com/images/bluet00.htm

Articles from magazines:

Clive Akass (28 Jan 2006), Bluetooth to hit 100Mbits/sec, PC Magazine, pp. 41-42.

Michael Kwan (Thursday 6 April 2006), Jabra shows off Bluetooth goods at CTIA, Mobile Magazine, p. 23.

Appendices:

Appendix A: Survey on Bluetooth and its Impact on Society in the near future:

Topic: Bluetooth and its impact on the society in the near future

We are from the EHM 3066 (Engineers and Society) class. We are conducting a mini survey regarding Bluetooth and its impact on society in the near future. Kindly spend a few minutes answering this questionnaire. Your answers will be kept confidential. We would like to express our gratitude in advance for your co-operation.

Instruction: Please TICK (✓) the appropriate boxes or write your responses in the given space:

Survey on Bluetooth and its Impact on Society in the near future:

We are a group of students.

We would like to kindly invite you to participate in this survey to help us gain valuable information about Bluetooth and its impact on society in the near future.

Please tick or write your responses in the given space.

A- GENDER

o Male o Female

B- AGE

o 18-24 o 25-30 o 30-35

Other : ____

C- Occupation

Do you have any Bluetooth-enabled device?

o Yes o No

If your answer is yes, what kind of device do you have that is equipped with Bluetooth?

o Hand phone or Personal Digital Assistant (PDA) o Laptop o Personal Computer

o I don't have one o Other: _________

How often do you use Bluetooth?

o Never o Sometimes o Often o Always

What kind of data/files do you use Bluetooth to transfer?

o Image o Music o Document o Synchronisation

In your opinion, what is the impact of Bluetooth on society?

Do you support the idea of installing Bluetooth devices in all computers in your company/university computer lab in order to increase the usage of Bluetooth?

o Yes o No

Did you know that Bluetooth-enabled devices are able to receive viruses and to get hacked?

o Yes o No

Have you ever had your Bluetooth-enabled device hacked via Bluetooth?

o Yes o No

Have you ever received unwanted data from unknown people?

o Yes o No

How often do you get busy using Bluetooth to transfer data with another person in places such as classrooms, or while you are driving?

o Never o Sometimes o Often o Always

Long Term Evolution (LTE) Technology

Abstract

The 3GPP Long-Term Evolution (LTE) is a step in the evolution of the radio air interface of 3G technology to deliver "mobile broadband". It is being defined and standardized by the 3GPP to evolve the radio access technology and enhance the performance of 3G technologies to meet user expectations over the long term, i.e. 10 years and beyond. LTE aims to achieve this by improving 3G coverage, system capacity, data rates and spectrum efficiency. It also aims to reduce latency and enhance other radio performance parameters while reducing user and operator costs. These LTE requirements are to be fulfilled by the use of new multiple-access schemes on the air interface: OFDMA (Orthogonal Frequency Division Multiple Access) in the downlink and SC-FDMA (Single Carrier Frequency Division Multiple Access) in the uplink. Furthermore, Multiple-Input Multiple-Output (MIMO) antenna schemes are used to achieve higher bit rates. The first section of the article presents the evolution of 3GPP LTE, while the second section lists the physical performance targets defined by the standards. The section that follows presents the technical building blocks and the architecture of the LTE system. The article concludes by discussing the economic targets defined by the standards and the current status of the LTE system.

Introduction:

The large-scale deployment of Wideband Code Division Multiple Access (W-CDMA), or 3G, technology across the globe prompted the 3GPP to take steps towards the evolution of the 3G air interface. High-Speed Downlink Packet Access (HSDPA) was introduced in 3GPP Release 5 [1] to increase the performance of the downlink, while High-Speed Uplink Packet Access (HSUPA) was introduced in 3GPP Release 6 [2] to enhance uplink data rates. HSPA+ (High-Speed Packet Access Plus) is being introduced in Release 7 [3] to enhance the performance of HSPA-based radio networks in terms of spectrum efficiency, peak data rate and latency, and to exploit the full potential of WCDMA. The characteristics of HSPA+, such as the use of downlink MIMO (Multiple Input Multiple Output), higher-order modulation for uplink and downlink, improvements to layer-2 protocols, continuous packet connectivity and enhanced uplink, meet the immediate and mid-term needs of end users. However, operator and end-user expectations are growing rapidly, and alternative competitive access technologies are emerging continuously. To ensure the long-term competitiveness of 3G technology, the 3GPP included the "Evolved UTRA and UTRAN" work item in 2004 [3][4]. The aim of the work item is to investigate means of achieving enhanced service provisioning by improving data rates, capacity, spectrum efficiency and latency, thereby providing optimum support for packet-switched services [5][6].

Physical Air-Interface Performance Requirements of the 3GPP Long-Term Evolution [LTE]:

The requirements for the design of the 3GPP LTE system are prescribed in the 3GPP specification TR 25.913 [3] and are summarized as follows:

Providing significantly higher data rates than existing technologies such as HSDPA and enhanced uplink, with target peak data rates of up to 100 Mb/s for the downlink and up to 50 Mb/s for the uplink.
The capability to provide three to four times higher average throughput and two to three times higher cell-edge throughput compared to systems based on HSDPA and enhanced uplink as standardized in 3GPP Release 6.
Increased spectral efficiency, up to four times that of 3G technology.
Improved architecture and signalling to significantly reduce control- and user-plane latency, with a target of less than 10 ms user-plane RAN round-trip time (RTT) and less than 100 ms channel setup delay.
Support for scalable bandwidths of 5, 10, 15 and 20 MHz, including bandwidths smaller than 5 MHz for more flexibility. In order to protect the investments already made by operators, updates and modifications to the existing radio network architecture are proposed. This allows a smooth migration into other frequency bands, including those currently used for second-generation (2G) cellular technologies such as GSM and IS-95.
Support for operation in paired spectrum (Frequency Division Duplex, FDD) and unpaired spectrum (Time Division Duplex, TDD).
Support for end-to-end Quality of Service.
Support for inter-working between the existing UTRAN/GERAN and other non-3GPP systems, with handover delays of less than 300 milliseconds for real-time services and less than 500 milliseconds for non-real-time services.
Support for an enhanced Multimedia Broadcast Multicast Service (E-MBMS).
Reduced capital and operational expenditure.
Optimized support for low mobile speeds (0-10 mph) as well as support for high mobile speeds (10-30 mph).
LTE System Building Blocks:

The following technological building blocks enable the LTE system to meet the requirements prescribed by the 3GPP:

Radio Interface Technology:
In order to meet the requirement for higher data rates, a new radio transmission technology, Orthogonal Frequency Division Multiplexing (OFDM), has been selected for the downlink, and Single Carrier Frequency Division Multiple Access (SC-FDMA) for the uplink. In an OFDM system, the available spectrum is divided into multiple carriers, called subcarriers, which are orthogonal to each other. Each of these subcarriers is independently modulated by a low-rate data stream. Different bandwidths are realized by varying the number of subcarriers used for transmission while the subcarrier spacing remains unchanged; in this way, operation in spectrum allocations of 1.25, 2.5, 5, 10, 15 and 20 MHz is supported. OFDM enables transmission adaptation in the frequency domain in E-UTRA. It has several further benefits, including robustness against multipath fading and an efficient receiver architecture, and it is already used in WLAN, WiMAX and broadcast technologies.
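The orthogonality that OFDM relies on can be checked numerically: over one symbol period, sampled subcarriers at different frequencies have zero inner product, which is why each can be modulated and recovered independently. A minimal sketch (the FFT size N is illustrative, not an LTE parameter):

```python
import cmath

def subcarrier(k, N):
    """One OFDM symbol's worth of samples of complex subcarrier k."""
    return [cmath.exp(2j * cmath.pi * k * n / N) for n in range(N)]

def inner_product(a, b):
    return sum(x * y.conjugate() for x, y in zip(a, b))

N = 64  # illustrative number of samples per symbol
s5, s6 = subcarrier(5, N), subcarrier(6, N)

# Same subcarrier: energy N. Different subcarriers: (numerically) zero,
# so each subcarrier can carry an independent low-rate data stream.
assert abs(inner_product(s5, s5) - N) < 1e-9
assert abs(inner_product(s5, s6)) < 1e-9
```

The same property is what lets an OFDM receiver separate the subcarriers with a single FFT rather than per-carrier filters.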
In order to achieve higher throughput and increased spectral efficiency, and so meet the coverage, capacity and data rate requirements, LTE systems use Multiple Input Multiple Output (MIMO) antenna solutions. MIMO refers to the use of multiple antennas at both the transmitter and the receiver. MIMO beamforming can be used to increase coverage and/or capacity, while spatial multiplexing can be used to increase data rates by transmitting multiple parallel streams to a single user [7].
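The data-rate gain from spatial multiplexing can be illustrated with the Shannon capacity of an idealized channel: with n well-conditioned parallel streams and the transmit power split equally between them, capacity grows almost linearly with n at high SNR. A simplified sketch (the ideal parallel-channel assumption is ours, not a claim about real LTE channels):

```python
import math

def siso_capacity(snr):
    """Shannon capacity (bit/s/Hz) of a single-antenna link."""
    return math.log2(1 + snr)

def mimo_capacity_ideal(snr, n_streams):
    """Idealized spatial multiplexing: n parallel channels, equal power split."""
    return n_streams * math.log2(1 + snr / n_streams)

snr = 100.0  # 20 dB
single = siso_capacity(snr)          # ~6.7 bit/s/Hz
dual = mimo_capacity_ideal(snr, 2)   # ~11.3 bit/s/Hz
assert dual > 1.5 * single  # 2x2 multiplexing nearly doubles throughput
```

In practice the gain depends on the channel matrix being well-conditioned, which is why LTE adapts between multiplexing and diversity/beamforming modes.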
In order to meet the improved latency requirement, the number of network nodes involved in data processing and transport had to be reduced. A flatter architecture [8], as prescribed by the standards, leads to improved latency and transmission delay. The figure below depicts a simplified LTE system architecture; it consists of two types of network node, one in the user plane and the other in the control plane.
Evolved NodeB (eNodeB): the enhanced base station that provides the LTE air interface and performs radio resource management for it.
Access Gateway (AGW): provides the termination of the LTE bearer and acts as the mobility anchor point and packet data network gateway for the user plane.
SAE is a study within 3GPP targeting the evolution of the overall system architecture. The focus of this work is on the packet-switched domain, with the assumption that voice services are also supported in this domain. The study envisions an all-IP network [8] and support for heterogeneous access networks in terms of mobility and service continuity [9].
LTE Economic Targets – Benefits to Operators and End-users:
The performance and capacity of LTE systems, as discussed in the earlier sections, will facilitate the provisioning of high-quality, multimedia-rich applications. While users gain access to innovative services, operators can generate revenue from avenues other than voice.
Avoidance of complicated architectures and unnecessary interfaces, reuse of existing systems and spectrum, and efficient operations and management, together with the optimized performance of the radio technologies, yield an overall reduction in cost per bit. End-users benefit from access to services at low cost, and operators benefit from low OPEX and CAPEX.
Current Status and Future of LTE:

For LTE's high-performance radio interface to be a commercial success, it requires an equally high-performance core network. The impact on the overall network architecture, including the core network, is being investigated in the context of the 3GPP System Architecture Evolution (SAE), which aims at optimizing the core network for packet-switched services and includes the IP Multimedia Subsystem supporting all access technologies. The combined evolution of LTE and SAE forms the basis for 3GPP Release 8. At the time of writing, 3GPP has approved freezing the functional requirements of both LTE and SAE as part of Release 8 [10].

There is evidence of substantial industry commitment to LTE deployment in the form of contributions and intellectual input to the 3GPP LTE specification groups. Many recent press announcements from vendors and operators indicate the same [11].

References

[1] 3GPP TS 25.855, "High Speed Downlink Packet Access; Overall UTRAN Description", version 5.0.0.

[2] 3GPP TS 25.999, "High Speed Packet Access Evolution, Frequency Division Duplex", version 6.1.0.

[3] 3GPP TR 25.913, "Requirements for Evolved UTRA (E-UTRA) and Evolved UTRAN (E-UTRAN)", Release 7.

[4] 3GPP RP-040461, "Proposed Study Item on Evolved UTRA and UTRAN", www.3gpp.org.

[5] H. Ekström et al., "Technical Solutions for the 3G Long-Term Evolution", IEEE Communications Magazine, March 2006.

[6] E. Dahlman et al., "The 3G Long-Term Evolution – Radio Interface Concepts and Performance Evaluation", Proceedings of IEEE VTC 2006 Spring.

[7] http://www.3g4g.co.uk/Lte/Tutorials/RandS_WP_LTE.pdf

[8] 3GPP TR 22.978, "All-IP Network (AIPN) Feasibility Study", Release 7.

[9] 3GPP TR 23.882, "3GPP System Architecture Evolution (SAE): Report on Technical Options and Conclusions", Release 7.1.9.

[10] 3GPP TS 36.201, "Evolved Universal Terrestrial Radio Access (E-UTRA); Long Term Evolution (LTE) Physical Layer; General Description", Release 8.0.1.

[11] http://www.etsi.org/WebSite/document/Barcelona_2008.pdf

[12] http://www.ericsson.com/technology/whitepapers/lte_overview.pdf

Technology of Ultrasound Scans

2.1 Ultrasound
2.1.1 Physics of Ultrasound

Sound is a mechanical wave that travels through an elastic medium. Ultrasound (US) is sound at a frequency above 20 000 Hz, the upper limit of human hearing. Bats orientate themselves with the help of US waves at around 100 000 Hz, and ultrasound at frequencies around 200 000 Hz is used for navigation (e.g. sonar). The frequency range of diagnostic US is between 1 and 20 MHz.

When sound encounters a boundary between two media of different densities, some of the sound bounces back as an echo, a phenomenon called reflection. The rest of the sound continues through the medium but is deflected from its original path; this is called refraction. Acoustic impedance is the resistance of a medium to the propagation of sound, and the impedances of the two media determine how much sound is reflected at the interface between them. Some of the energy of the propagating sound is converted by friction into heat; this loss of energy is called absorption.

When ultrasound waves encounter a surface, a small part of their energy is scattered away in random directions while most of the sound continues to propagate, a phenomenon called scatter. Reflection, refraction, impedance, absorption and scatter are all important for image formation in diagnostic ultrasound. It is also important to be aware of artifacts: echoes that do not correspond to an anatomic structure but result from the physical properties of ultrasound propagation in the tissues. Artifacts can also be of diagnostic help; one example is the acoustic shadowing behind a gallstone, caused by near-total loss of the sound at the stone.

Diagnostic ultrasound is based on the pulse-echo principle. The smallest functional units of the transducer are the piezoelectric crystals. The crystals are embedded in the probe, and each crystal resonates at a specific frequency. A pulse is initiated from each crystal in the probe and a longitudinal sound wave propagates through the body. Some of the energy is absorbed in the tissue and some is reflected. The reflected energy is received by the probe, and the scanner calculates the depth of the interface from the time the echo takes to return.
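The depth calculation behind the pulse-echo principle is a simple time-of-flight computation: using the mean soft-tissue sound speed of 1540 m/s (discussed later in this chapter), the scanner halves the round-trip time because the pulse travels to the interface and back. A sketch:

```python
SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, mean velocity in soft tissue

def echo_depth_cm(round_trip_s):
    """Depth of a reflecting interface, from the echo's round-trip time."""
    # The pulse travels to the interface and back, hence the division by 2.
    return SPEED_OF_SOUND_TISSUE * round_trip_s / 2 * 100

# An echo returning after 65 microseconds comes from roughly 5 cm deep.
assert abs(echo_depth_cm(65e-6) - 5.005) < 0.01
```

This is also why deeper imaging takes longer per scan line: the scanner must wait for the slowest echoes before firing the next pulse.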

We can say that the human body is composed of three basic materials differing in acoustic impedance: gas with a very low impedance, bone with a very high impedance and soft tissue with an impedance somewhere in between. The large mismatch in impedance between air and tissue, and between bone and tissue ("impedance mismatch"), causes virtually 100% of the sound to be reflected at air/tissue interfaces and almost all of the sound at bone/tissue interfaces. Between different soft tissues there is only a small impedance mismatch, a fact that is the basis for diagnostic ultrasound.
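The effect of impedance mismatch can be quantified with the standard intensity reflection coefficient for normal incidence, R = ((Z2 − Z1)/(Z2 + Z1))². A sketch using illustrative textbook impedance values (the exact numbers vary between sources):

```python
def reflected_fraction(z1, z2):
    """Fraction of incident intensity reflected at an interface (normal incidence)."""
    return ((z2 - z1) / (z2 + z1)) ** 2

# Illustrative impedances in MRayl; exact values vary between sources.
AIR, SOFT_TISSUE, BONE = 0.0004, 1.63, 7.8

assert reflected_fraction(AIR, SOFT_TISSUE) > 0.99   # air/tissue: near-total reflection
assert reflected_fraction(SOFT_TISSUE, BONE) > 0.4   # bone/tissue: strong reflection
assert reflected_fraction(1.63, 1.65) < 0.001        # tissue/tissue: tiny echo
```

The tiny tissue/tissue echoes are exactly what B-mode imaging visualizes, while the near-total reflection at air explains why coupling gel is needed between probe and skin.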

Different frequencies of ultrasound are used for different diagnostic examinations. Higher US frequencies (7-16 MHz) give higher resolution but are strongly absorbed by soft tissue, and are therefore used for superficial structures. Very high frequencies (16-20 MHz) will only travel a few millimetres within tissue and are limited to intravascular and ocular examinations. Lower frequencies (3-7 MHz) are less strongly absorbed but of lower resolution, and are used for deeper structures.

There are different modes of displaying the amplitude of reflected sound waves: A-mode, M-mode and B-mode. A-mode (amplitude) displays only the depth of the interface and is mainly of historical interest. M-mode (motion) is used to display moving structures, for example in cardiac ultrasound. B-mode (brightness) is the routine US image for most surgical applications. Here the returning echoes are displayed as shades of grey, with the echo amplitudes represented by a grey level ranging from black to white. The individual image lines are stored, assessed and assembled on the monitor to create a two-dimensional B-mode image.
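The B-mode mapping from echo amplitude to grey level is typically logarithmic, because echo amplitudes span a far wider dynamic range than a display can show. A sketch of such a compression (the 60 dB display range is an illustrative assumption):

```python
import math

def to_grey_level(amplitude, dynamic_range_db=60.0):
    """Log-compress an echo amplitude (fraction of maximum) to an 8-bit grey level."""
    if amplitude <= 0:
        return 0
    db = 20 * math.log10(amplitude)  # 0 dB = strongest echo
    level = 255 * (db + dynamic_range_db) / dynamic_range_db
    return max(0, min(255, round(level)))

assert to_grey_level(1.0) == 255            # strongest echo: white
assert to_grey_level(0.001) == 0            # -60 dB: black
assert 120 < to_grey_level(0.0316) < 135    # ~-30 dB: mid-grey
```

Narrowing the dynamic range increases contrast at the cost of discarding the weakest echoes, which is why scanners expose it as a user control.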

Doppler ultrasound uses the Doppler effect. When US is reflected from a moving structure (e.g. flowing blood) the frequency of the returning waves changes, and the amount of frequency change is determined by the speed and direction of the blood flow. The use of Doppler is obvious in vascular US, but it is also useful in other areas of diagnostic ultrasound.
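The relation can be written as fd = 2·v·f0·cos(θ)/c: the shift grows with blood velocity and transmit frequency, and shrinks as the beam-to-flow angle approaches 90°. A sketch with illustrative values:

```python
import math

def doppler_shift_hz(f0_hz, velocity_m_s, angle_deg, c_m_s=1540.0):
    """fd = 2 * v * f0 * cos(theta) / c for sound reflected off moving blood."""
    return 2 * velocity_m_s * f0_hz * math.cos(math.radians(angle_deg)) / c_m_s

# Blood at 0.5 m/s insonated at 60 degrees with a 5 MHz probe:
fd = doppler_shift_hz(5e6, 0.5, 60)
assert abs(fd - 1623.4) < 1.0  # ~1.6 kHz, conveniently in the audible range
```

The cos(θ) term is why a known beam-to-flow angle is needed for quantitative velocity measurement, and why flow perpendicular to the beam produces almost no shift.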

2.1.2 History of Ultrasound

Scientists, including Aristotle, Leonardo da Vinci, Galileo Galilei, Sir Isaac Newton and Leonhard Euler, studied the phenomena of acoustics, echoes and sound waves for many centuries. It was not until 1877, though, that John William Strutt, better known as Lord Rayleigh, published a mathematical description of sound in "The Theory of Sound", which became the foundation for the science of ultrasound. Some years later, in 1880, Jacques and Pierre Curie discovered the piezoelectric effect: that an electric potential is generated when mechanical pressure is applied to a quartz crystal. This important discovery eventually led to the development of the modern-day ultrasound transducer, which contains piezoelectric crystals.

The first study of the application of ultrasound as a medical diagnostic tool was published by the Austrian brothers Karl and Friedrich Dussik in 1942. They attempted to locate brain tumours and the cerebral ventricles by measuring ultrasound transmission through the skull, and concluded that if imaging of the ventricles was possible, the interior of the human body could also be visualized using ultrasound. Unfortunately, Guttner determined in 1952 that the images produced by the Dussiks in fact reflected variations in skull bone thickness. Nevertheless, their scientific work marked the beginning of diagnostic ultrasonography in medicine, and Dussik wrote in an article a decade later: "As knife and forceps in surgery, the chemical agent in chemotherapy, the high frequency electric field in diathermy and X-ray application, so has medicine taken on a new physical tool in the last decade: the ultrasonic field".

George Doring Ludwig, working together with Francis Struther, was the first scientist to visualize gallstones, implanted in the muscles and gallbladders of dogs, with ultrasound. His studies also resulted in the finding that the mean velocity of ultrasound in soft tissue is 1540 m/s, a discovery that was to prove very important for future research. Much of his work, however, was considered restricted information because he was employed by the military, and it was therefore not published in medical journals.

John Julian Wild and Douglass Howry were also important pioneers in the ultrasound field. Wild was a surgeon who was able to visualize bowel wall thickness with ultrasound, and he also discovered a difference in echogenicity between benign and malignant tissue. Wild also developed transrectal and transvaginal transducers and a scanning device for screening patients for breast cancer. Howry built the first B-mode scanner in 1949 and, together with the engineers Bliss and Posanky, developed the first linear contact scanner. The somascope, the first circumferential scanner, built in 1954, was also developed by Howry. The problem with these scanners was that the patient had to be immobilized and immersed for a long time. In 1957-58 Howry and his colleagues developed a scanner in which the patient was strapped to the plastic window of a semicircular pan filled with saline solution; although no longer immersed, the patient still had to be immobilized for a long time. Finally, in the early 1960s, Howry developed the first hand-held contact scanner, together with Wright and E. Myers.

At about the same time Ian Donald was carrying out ultrasound research in Scotland, and in 1958 he published an article that came to be a landmark, "Investigation of abdominal masses by pulsed ultrasound". In it he describes how ultrasound dramatically changed the treatment of a woman diagnosed with advanced gastric cancer: ultrasound revealed a cystic mass, which was later resected and found to be a benign ovarian cyst. Donald contributed significantly to the field of obstetric and gynecological ultrasound, for example by discovering that the urinary bladder is a natural acoustic window for the pelvic organs and by measuring the biparietal diameter of the fetus for the first time.

A century earlier the Doppler effect had been discovered by the famous Austrian scientist Christian Andreas Doppler and presented in 1842 in a paper called "Über das farbige Licht der Doppelsterne und einiger anderer Gestirne des Himmels" ("On the coloured light of the double stars and certain other stars of the heavens"). In Lund, Sweden, the principal pioneers of echocardiography, Inge Edler and Carl Hellmuth Hertz, recorded the first echocardiogram in October 1953. Subsequently Hertz and Asberg invented the first two-dimensional real-time cardiac imaging machine in 1967, and Edler and Lindstrom registered the first simultaneous M-mode and intracardiac Doppler blood flow recordings at about the same time.

Ultrasound has developed rapidly in recent decades; the first digital scanners were released onto the market in 1976, providing better and more reproducible images.

Interventional ultrasonography dates back to 1969, when Kratochwil proposed the use of ultrasound for percutaneous drainage. The first report on the use of ultrasound in evaluating blunt trauma dates from 1971, by Kristenson in Germany.

Development is still ongoing. In the light of technological advances leading to smaller machines, combined with rapidly falling prices, it has been speculated that doctors in the future will routinely be equipped with their own ultrasound "stethoscope" for use in daily clinical work.

2.1.3 Ultrasound Instruments

It is important to have a basic knowledge of how an ultrasound image is produced. The components of a scanner include:

Transmitter: emits electrical impulses that strike the piezoelectric crystals of the transducer and cause them to vibrate, thus producing an ultrasound wave.
Transducer: a transducer converts one form of energy into another; in ultrasound it converts electrical energy into mechanical energy and vice versa. It converts the electrical energy provided by the transmitter into the acoustic pulses directed into the patient, and also serves as the receiver of reflected echoes, converting weak pressure changes into electrical signals for processing.
Receiver: when returning echoes strike the transducer face, minute voltages are produced across the piezoelectric elements. The receiver detects and amplifies these weak signals and, through time (depth) gain compensation, compensates for the differences in echo strength that result from attenuation by different tissue thicknesses. Another important function of the receiver is compression of the wide range of amplitudes returning to the transducer into a range that can be displayed to the user.
Scan Processor: detects and amplifies the backscattered energy and manipulates the reflected signals for display.
Control Console
Display: presents the ultrasound image or data in a form suitable for analysis and interpretation. Over the years imaging has evolved from the simple A-mode display to high-resolution real-time grey-scale imaging.
Recording Device: interpretation and archival storage of images may take the form of transparencies printed on film by optical or laser cameras and printers, videotape, or a digital picture archiving and communication system (PACS). Digital storage is increasingly being used for archiving ultrasound images.
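The receiver's depth compensation follows from tissue attenuation being roughly proportional to both frequency and path length; a common rule of thumb is about 0.5 dB per cm per MHz (an assumed figure here), doubled for the round trip. A sketch of the gain the receiver must apply:

```python
ATTENUATION_DB_PER_CM_PER_MHZ = 0.5  # rule-of-thumb soft-tissue figure (assumption)

def tgc_gain_db(depth_cm, freq_mhz):
    """Gain needed to equalize echoes from a given depth; x2 for the round trip."""
    return 2 * depth_cm * freq_mhz * ATTENUATION_DB_PER_CM_PER_MHZ

assert tgc_gain_db(10, 3.5) == 35.0  # deep abdominal echo at 3.5 MHz
assert tgc_gain_db(2, 10) == 20.0    # shallow echo with a 10 MHz probe
```

The frequency term in this relation is also the quantitative reason, noted earlier, why high-frequency probes are restricted to superficial structures.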
2.1.4 Transabdominal Ultrasound, Use and Limitations

Transabdominal ultrasound has been the conventional approach to imaging of the female pelvis. With this approach a full urinary bladder is required to provide a window for imaging and to displace bowel gas. Transabdominal scanning (TAS) requires deeper penetration, so a lower-frequency transducer, usually 3-5 MHz, must be used. The resolution of the images is limited by this relatively low frequency, and the technique also has great limitations in obese patients and in elderly patients, who often cannot hold a full bladder. In the study of uterine hemodynamics in pregnant patients these disadvantages may not be very significant, because the uterine arterial signals from these patients are usually strong. In the non-pregnant state, however, and especially in postmenopausal women, studying uterine hemodynamics with TAS can be very difficult.

2.1.5 Transvaginal Ultrasound, Advantages and Disadvantages

Widespread availability of ultrasound imaging in the past two decades has dramatically changed the practice of obstetrics and gynecology. These specialists rely heavily upon this technology to make major decisions about management of their patients.

Transabdominal sonography (TAS) images the pelvic organs through the anterior abdominal wall in the supra-pubic region. A distended urinary bladder is essential to displace the bowel loops and to provide an acoustic window. There are two major limitations of TAS. The first is the need to use lower frequencies for imaging because of the longer distance between the transducer and the pelvic organs. The other is the beam-degrading effect of the anterior abdominal wall, especially in obese patients. Both limitations lead to degradation in image quality.

To overcome these limitations of TAS, special transducers that could be introduced into the vagina were designed in 1985. The vaginal approach reduces the distance between the probe and the pelvic structures, allowing the use of higher frequencies. Trans-vaginal sonography (TVS) produces greatly improved resolution compared to TAS, primarily due to the higher frequencies employed and also due to the absence of beam deformation by the anterior abdominal wall. The major advantages of TVS over TAS are better image quality and avoidance of the patient discomfort caused by a full urinary bladder. A comparison of TVS and TAS is given in Table 2.1.

2.1.5.1 Indications of TVS

TVS is indicated whenever a better look at the pelvic structures is required. Common indications include the following:

Early pregnancy
Lower uterine segment in late pregnancy
Ectopic pregnancy
Pelvic masses
Retroverted or retroflexed uterus
Obese patients or patients with excessive bowel gas
Emergency cases when the bladder is empty
Follicle monitoring
Oocyte retrieval
Endometrial study to assess suitability for IVF-ET techniques
Cervical canal mucus
Doppler examination of pelvic organs
Interventional procedures

The list is not exhaustive and newer indications are continuously being added.

                      TVS                               TAS

Full bladder          Not essential                     Essential

Probe frequency       5-7.5 MHz                         3-5 MHz

Resolution            Very high                         Moderate

Field of view         Small                             Large

Contraindications     Virgins, vaginal obstruction,     None
                      premature rupture of membranes

Interventional uses   Many uses                         Limited role

Table 2.1 Comparison of TAS and TVS

2.1.5.2 Scan Technique

Once the probe and the patient have been prepared, the transducer is gradually inserted while the ultrasound image is monitored. The urinary bladder's normally consistent position in the pelvis, relative to the much more variable positions of the uterus and ovaries, makes it a good landmark for the initial assessment of transducer orientation.

Three basic scanning manoeuvres of the probe are useful to scan the pelvic organs comprehensively:

Sagittal imaging with side to side movements,
90° rotation to obtain semi-coronal images with angulation of probe in vertical plane,
Variation in the depth of probe insertion to bring different parts within field of view/focal zone.

A pelvic survey should be performed first to quickly ascertain the relative positions of the uterus and ovaries and to identify any obvious masses. This is obtained by slowly sweeping the beam in a sagittal plane from the midline to the lateral pelvic side walls, then turning the probe 90 degrees into the coronal plane and sweeping the beam from the cervix to the fundus. With multi-frequency probes, proper frequency selection is important for best results, and setting an appropriate focus in electronic arrays is equally important. With mechanical-sector fixed-focus probes, the organ of interest is brought into the focal zone by changing the depth of insertion of the probe. Proper selection of frame averaging is also important: it should be low for fast-moving structures such as the foetal heart and high for solid, immobile tissues.

For Doppler studies a steady probe position is essential and it helps if the examiner’s forearm is well supported.

2.1.5.3 Dynamic uses of the TVS probe

The ultrasonographic examination can be enhanced by placing a hand over the lower abdomen to bring pelvic structures within the field of view/focal range of the probe. Localisation of the point of maximal tenderness by the probe will help in identifying the cause of pain. Dense pelvic adhesions can be diagnosed by the ‘sliding organ sign’. In the absence of adhesions, the organs move freely past each other and the pelvic wall in response to pressure by the TVS probe tip. Absence of this free movement may suggest pelvic adhesions.

2.1.5.4 Interventional uses of TVS

There are many interventional uses of transvaginal sonography, and newer indications are constantly being added to the list. Some of the more common ones are given below:

aspiration of ova for in vitro fertilisation (IVF)
aspiration of ovarian cysts
drainage of pelvic collections
multi-foetal pregnancy reduction
non-surgical ectopic pregnancy management
early amniocentesis
chorionic villus sampling
transvaginal embryo transfer
sonohysterosalpingography
2.1.5.5 Limitation of TVS

It should be remembered that TVS provides a more limited field of view than TAS. A survey trans-abdominal scan should usually be performed prior to TVS to avoid overlooking a mass lying outside the field of view of the TVS transducer. To avoid the need for a full bladder, it has been suggested that the TVS examination may instead be followed by a TAS scan with the bladder empty; the rationale is that a mass lying outside the field of view of the TVS probe will be large enough to be seen trans-abdominally even with an empty bladder.

The advent of transvaginal sonography in 1985 has had a tremendous impact on the practice of obstetrics and gynaecology. The pelvic organs can now be imaged with a resolution not possible earlier, and the management of infertility due to female factors depends mainly on TVS. The addition of Colour Doppler to TVS now gives added information about the vascular supply of the pelvic organs. The details of foetal anatomy that can be depicted by TVS are far superior to those shown by TAS. TVS has proved very useful and has a bright future.