The process by which information passes between different computers is not magic; it is a well-understood mechanism. The open system interconnection (OSI) model is used to understand this process. Remember that computers are linked for the purpose of sharing information or resources globally.
The open system interconnection (OSI) model is an industry-standard framework used to divide the functions of networking into layers. The OSI model of explaining how information passes from one computer to another is divided into seven layers: a message begins at the top layer and moves down through the OSI layers. Layer 1 is known as the physical layer. This layer physically transmits an unstructured bit stream over a physical link. It consists of the voltages, wires and connectors that receive the signal sent from the ISP and pass it to layer 2 (Erjavec, 2004).
Layer 2 is known as the data link layer. It adds reliability to the transfer of data across the media or physical link and handles the network topology. This layer is concerned with making the delivery of data reliable, whereas layer 1 deals only with the raw delivery of data. Layer 3 is called the network layer. It provides connectivity and path selection between two end systems, supplying the functionality for delivering data from a source node to a destination node. Layer 4 is called the transport layer. It handles information flow control, fault detection and recovery, and data transport reliability, partitioning larger messages into smaller segments for easier flow across the network (Eli, 2007).
Layer 5 is known as the session layer. It manages and terminates connections between cooperating applications, providing the control structure for communication between them. Layer 6 is called the presentation layer and performs useful transformations on data, ensuring that data are readable by the receiving system. Layer 7 is the application layer. It receives data from layer 6 and delivers it to the user of the system, generally providing network services to application processes such as email.
Layer | Layer Name   | PDU      | Purpose
7     | Application  | Data     | Provides network services to application processes (such as email, FTP and Telnet)
6     | Presentation | Data     | Ensures that data are readable by the receiving system; deals with data presentation
5     | Session      | Data     | Manages and terminates sessions between applications
4     | Transport    | Segments | Handles information flow control, fault detection and recovery, and data transport reliability
3     | Network      | Packets  | Provides connectivity and path selection between two end systems
2     | Data link    | Frames   | Provides the reliable transfer of data across media using physical addressing and network topology
1     | Physical     | Bits     | Receives the signal sent from the ISP and passes it to layer 2
Source: Self
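The layered encapsulation described above can be illustrated with a small sketch: each layer wraps the message from the layer above with its own header on the way down, and the receiving end strips the headers in reverse order on the way up. The bracketed header strings are hypothetical placeholders, not real protocol fields.

```python
# Illustrative sketch of OSI-style encapsulation. Each layer wraps the
# message from the layer above with its own (hypothetical) header; the
# last header added is the outermost one on the wire.

LAYERS = ["application", "presentation", "session",
          "transport", "network", "data link", "physical"]

def encapsulate(message: str) -> str:
    """Wrap a message in one placeholder header per OSI layer, top-down."""
    for layer in LAYERS:
        message = f"[{layer}]{message}"
    return message

def decapsulate(frame: str) -> str:
    """Strip the headers at the receiving end, outermost (physical) first."""
    for layer in reversed(LAYERS):
        prefix = f"[{layer}]"
        assert frame.startswith(prefix), f"missing {layer} header"
        frame = frame[len(prefix):]
    return frame
```

Running `encapsulate("hello")` produces a string whose outermost header is the physical layer's, mirroring how the table reads from layer 7 down to layer 1.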
The term multiplexing refers to the process of merging multiple communication paths into a single signal path. It is used in both the wireless and landline telecommunication industries. The path is created by connecting each module or system that collects, calculates, exchanges or displays data to a single serial communication bus. The advantage of multiplexing is that it allows the transmission of multiple communications over a single line. There are two methods of multiplexing: frequency-division and time-division multiplexing. Frequency-division multiplexing (FDM) enables separation between channels so that several messages can be handled by the same circuit at the same time, each message being transmitted at a different frequency. It is a scheme in which numerous signals are combined for transmission on a single communications line, with each input signal sent and received at maximum speed all the time. However, if many signals must be sent along a single long-distance line, the necessary bandwidth is large and careful engineering is required to ensure that the system performs properly. FDM uses a common transmission path, but each signal is modulated onto a different carrier frequency, and switching is achieved by providing each outlet with a demodulator tuned to the appropriate carrier frequency. This is used to increase the traffic capacity of a satellite by making all carrier frequencies available to all ground stations.
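The idea of giving each message its own carrier frequency can be sketched as a simple channel plan: each channel is allotted a band of spectrum, optionally separated from its neighbours by a guard band. The frequencies, bandwidths and guard bands below are illustrative, not taken from any real standard.

```python
# Hypothetical FDM channel plan: each input signal gets its own carrier
# (centre) frequency so several messages share one line without overlap.

def fdm_channel_plan(num_channels: int, base_hz: float, bandwidth_hz: float,
                     guard_hz: float = 0.0) -> list[float]:
    """Return the carrier (centre) frequency for each channel.

    Channels occupy adjacent bands of width bandwidth_hz starting at
    base_hz, separated by an optional guard band of guard_hz.
    """
    spacing = bandwidth_hz + guard_hz
    return [base_hz + i * spacing + bandwidth_hz / 2
            for i in range(num_channels)]
```

For example, three 100 Hz channels above a 1000 Hz base with 10 Hz guard bands are centred at 1050, 1160 and 1270 Hz, so adjacent carriers are always at least one full bandwidth apart.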
Time-division multiplexing (TDM) is a method of putting multiple data streams into a single signal by separating the signal into many segments, each of very short duration. Each individual data stream is reassembled at the receiving end based on timing. The circuit that combines signals at the source end of a communication link is known as a multiplexer. It accepts the input from each individual end user, breaks each signal into segments, and assigns the segments to the composite signal in a rotating and repeating sequence. If many signals must be sent along a single long-distance line, careful engineering is required to ensure that the system functions properly. An asset of time-division multiplexing is its flexibility: the scheme allows for variation in the number of signals being sent along the line and constantly adjusts the time intervals to make optimum use of the available bandwidth. More generally, multiplexing in modern communication systems combines many paths of communication into a single path; in a DVD player, for example, multiplexing translates the separate files into the proper DVD format (Mark, 2001).
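The rotating, repeating assignment of segments described above can be sketched directly: a multiplexer interleaves fixed-size segments from each stream in round-robin order, and the demultiplexer reassembles each stream from its time-slot position. This toy version assumes equal-length input streams.

```python
# Sketch of time-division multiplexing: segments from several input
# streams are assigned to the composite signal in a rotating, repeating
# sequence, then reassembled at the receiving end by slot position.
# Assumes all input streams have equal length (zip truncates otherwise).

def tdm_multiplex(streams: list[bytes], segment: int = 1) -> bytes:
    """Interleave fixed-size segments from each stream in round-robin order."""
    chunks = [[s[i:i + segment] for i in range(0, len(s), segment)]
              for s in streams]
    out = bytearray()
    for group in zip(*chunks):          # one time slot per stream per round
        for piece in group:
            out += piece
    return bytes(out)

def tdm_demultiplex(signal: bytes, n_streams: int, segment: int = 1) -> list[bytes]:
    """Reassemble each stream based on its time-slot position."""
    outs = [bytearray() for _ in range(n_streams)]
    pieces = [signal[i:i + segment] for i in range(0, len(signal), segment)]
    for i, piece in enumerate(pieces):
        outs[i % n_streams] += piece
    return [bytes(o) for o in outs]
```

Multiplexing `b"AAAA"` and `b"BBBB"` yields the composite `b"ABABABAB"`, and demultiplexing recovers both originals from their timing alone.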
Internet2 is a joint project of over 120 research universities, working with partners in industry and government, to develop a new family of advanced applications that meet emerging academic requirements in research, teaching and learning. The group came together to address the challenges facing the next generation of university networking. The most important is creating and sustaining a leading-edge network capability for the national research community; the second is directing network development efforts so that a new generation of applications can fully exploit the capabilities of broadband networks; and the last is accelerating the transfer of new network services and applications to all levels of educational use and to the entire internet community, both nationally and internationally. The objectives behind Internet2 emerged when researchers and academics found that the current internet lacked the applications and the capability to process and access very large, complex data. Internet2 was intended to help a new, knowledge-rich generation exploit opportunities across the world, since it was seen as a way of building better connectivity for accessing applications and reaching other sites (Guang, 2006).
Protocols such as TCP normally provide reliable end-to-end delivery by helping recover from network errors, but they are inadequate for real-time applications such as voice, remote imaging or time-sensitive data collection, which require quality of service and minimal loss. Such applications cannot be supported by the commodity internet (Donald & Derek, 1963).
Internet2 is developing and deploying advanced network applications and technologies for research and higher education, accelerating the creation of tomorrow's internet. Its advanced networks include Abilene and middleware. Abilene is an advanced backbone network that supports the development and deployment of the new applications being created within the Internet2 community; it connects regional network aggregation points. Middleware is a layer of software between the network and the applications that provides services such as identification, authentication, authorization and security. Today, internet applications usually have to provide these services themselves, whereas middleware would supply them to a consistent high standard. While the original internet is open to almost anyone with a computer, access to Internet2 is limited to a few, and its backbone is made up entirely of large-capacity fiber-optic cables. Internet2 is therefore expected to speed up data transfer, for example to 10 gigabits per second or more (Alexander Russo, 2005). It was targeted at higher education and teacher education, preferably at the university level (Richard, 1997).
Although at present Internet2 is available to a limited number of institutions, it is expected that the technology will become available to everyone in the near future, enhancing the possibility of collaboration between the United States and other countries around the world. ZigBee (802.15.4) is the name of a specification set of high-level communication protocols designed for small, low-power digital radios based on the IEEE 802.15.4 standard for wireless personal area networks. It was developed with the aim of enabling reliable, cost-effective, low-power, wirelessly networked monitoring and control products based on an open global standard. The targeted products and applications of ZigBee (802.15.4) are medical monitoring and home, industrial and building automation and control.
A Bluetooth (802.15.1) system operates in the 2.4 GHz unlicensed band. It uses more power than ZigBee and its protocols are more complex, making it too expensive to embed in virtually every kind of device imaginable. It uses spread-spectrum technology, with each transmitting device hopping pseudo-randomly in frequency at a rate of 1600 hops per second. The modulation used is binary Gaussian-shaped frequency-shift keying (GFSK). Bluetooth wireless technology is set to revolutionize the personal connectivity market by providing freedom from wired connections for portable handheld devices. A Wi-Fi (802.11a/b/g) system, also referred to as IEEE 802.11, is a term for certain types of wireless local area networks (WLANs) that use specifications conforming to IEEE 802.11b. It has been used in areas such as airports and hotels, known as hotspots, where people on the move can access Wi-Fi networks to reach the internet and receive email. Wi-Fi uses low-power microwave radio to link one or more groups of users together, or to link two buildings; it can span several kilometers point to point but cannot be used where trees obstruct the path. Wi-Fi is thus a wireless local area network technology that uses high-frequency radio signals, and being wireless it can hand data off to 3G (third-generation) networks such as T-Mobile's.
References
Clifford, L. (1999). Information about Internet2. McGraw-Hill Publishers, New York.
Donald, W. & Derek, H. (1963). Communication Networks for Computers. Amazon.com Publishers, London.
Elinet, S. (2000). Wireless Telecom FAQs. McGraw-Hill Professional Publishers, New York.
Eli, N. (2007). Absolute Beginner's Guide to Networking (4th ed.). Safari Books Publishers, UK.
Erjavec, S. (2004). Automotive Technology: A Systems Approach. Thomson Delmar Publishers, New York.
Guang, Z. (2006). Body Sensor Networks. McGraw-Hill Publishers, New York.
Mark, L. (2002). Firewall Policies and VPN Configuration. Amazon.com Publishers, London.
Mark, P. (2001). What Is the Matter with the Internet: A Critical Theory of Cyberspace. University of Minnesota Press.
Richard, V. (1997). Motor Control Electronics. McGraw-Hill Professional Publishers, London.
Robern, B. (2003). Electronic Measurement and Instrumentation. Safari Books Publishers, UK.
The monitor and the television were, in the past, distinct viewing devices that accepted signals and displayed them on their screens. Computer monitors accepted signals from the computer's Central Processing Unit through connectors such as the Mini Sub-D15, which, unlike a television's inputs, carry no audio circuits, and were driven by specialized adaptors such as monochrome or graphics adaptors. Until the recent past, media and communication systems have been separate and distinct, with the television associated with broadcasting television programs while the monitor was associated with displaying content on the computer.
However, in recent times these separate functions have begun to merge, and the boundaries of these once-separate entities have become broader and more encompassing. Today it is common to watch TV on a PC monitor, and more and more desktops and notebooks are manufactured with built-in TV tuners. It would not be surprising to find a 42-inch PC-TV combo on the technology market, and this collapse of disparate technology, equipment and services into a set of common and ubiquitous technology, equipment and services (Internet Industry Association, 2002) is called convergence.
This paper attempts to analyze the convergence between the television and the monitor.
Main body
Television was traditionally a medium for broadcasting programs produced by television channels, but it is developing into an interactive medium of communication. The monitor, which was initially a simple device for displaying content from the computer's Central Processing Unit, has now evolved into a multitasking device that consumers can use not only as a monitor screen but also as a television. Recent innovations in high-definition plasma monitors have changed the face of television and the technology used in it.
The convergence between the television and the computer industry has enabled some of the most mind-blowing innovations in the field of technology. The latest is the large-screen, high-resolution television that is powered by a Pentium processor and doubles up as a multimedia computer monitor. The new Personal Computer is converged with a television monitor, resulting in a unique display that combines the optimal features of a high-resolution monitor with the convenience of a high-quality television (Business Wire, 1996). This new monitor allows users to watch television without actually switching on the computer. These modern-day monitors incorporate television-enhancing technology such as built-in cable-ready tuners and can function as Personal Computers as well as television sets (Business Wire, 1996). Additionally, the monitors include built-in speakers, removing the need to install external speakers and saving time, energy, and financial resources. What is extremely pleasing is that monitors are now also geared for multimedia entertainment through CD-ROMs (Business Wire, 1996).
Conclusion
Thus we see that televisions and monitors are converging with the help of the latest technological innovations, which enable the two to function in unison so that their individual features are enhanced and offer greater advantages to users.
References
Internet Industry Association. (2002) IIA Convergence Virtual Taskforce.
Business Wire, (1996). Leading Convergence Company Ships New PC+TV Monitor Multimedia Options Without Disrupting TV Viewing Habits.
The purpose of this project is to provide a fast, effective means of secure telecommunications to protect government interests from parties that intend harm to national security. The scope of this project is limited to personnel and equipment involved in Government Special Programs and Security as required by our agency, and does not include gear provided by outside agencies.
Introduction
Information can, without exaggeration, be considered one of the determining resources of development. In the modern world, information actively influences all spheres of vital activity, not only of individual states but of the global community. However, in certain cases information can be used not for a good cause but to the detriment of the interests of the individual, the society and the government. Therefore the role of information safety within the system of national security not only increases substantially but becomes a priority. Accordingly, the protection of communication channels occupies a special place in the overall structure of a complex system of information protection.
Communication channels are among the most vulnerable components of information systems; within their structure it is possible to identify a large number of potentially dangerous threats through which malefactors can access the information system. Modern technologies and methods of protecting telephone conversations from interception counteract practically all the means of realizing such threats. Nevertheless, where government interests are concerned, the priority given to communication security can drive decisions to change communication platforms and upgrade existing communication principles. In that regard, new technical solutions would allow the issues of communication security to be solved in a more comprehensive way without sacrificing aspects such as flexibility, price, speed, and the development and implementation of new functions.
This report proposes replacing the existing governmental standards for securing communications, STU-III and STE, with a comparatively new and effective approach to communication: Voice over Internet Protocol. The report provides a general overview of the existing technologies and their drawbacks, giving recommendations on how to overcome the obstacles in developing effective and secure communication platforms.
Background
As of 2008, US government secure communications are based on Secure Terminal Equipment (STE), which replaced the Secure Telephone Unit, Third Generation (STU-III). STE is capable of operating over the PSTN, Tri-TAC, or direct Radio Frequency (RF) connections, although it is mostly used through the Integrated Services Digital Network (ISDN) (NSA, 2007). The current implementation of secure communications using STE and STU-III follows the specifications listed in table 1.
Even with security as the main factor, the technical characteristics of STU-III and STE exhibit many shortcomings that do not comply with modern communications features. These shortcomings are apparent in the following:
Specific equipment and vendor requirements.
Slow connectivity rates: compared to the mass adoption of broadband connections, ISDN's maximum of 128 kbps does not satisfy communication demands, which include, but are not limited to, voice clarity, caller ID and video transmission.
The requirements for additional efforts and investments to secure the calls.
High price.
Following the progress of secure communications, it can be seen that the choice is limited. Although the encryption of STU-III and STE is solid, reliance on outdated characteristics prevents the secure government communications sector from being updated accordingly. STU-III, for example, still relies on the public switched telephone network (PSTN), whose drawbacks can be summarized as follows:
It was developed specifically for voice transmission, so data traffic must follow voice-oriented paths.
The inability to develop and implement new capabilities quickly. The PSTN is based on infrastructure whose owners independently develop applications for the hardware; this monopolization prevents all clients' needs from being satisfied.
The impossibility of merging data, voice and video networks within the PSTN. Although broadband access seems to solve the bandwidth issue, merging different networks requires a complete restructuring of the PSTN for the change to occur at the end user's terminal (Davidson, 2007).
Thus, it is recommended to use IP networks for establishing secure communications. This implementation is commonly called Voice over Internet Protocol, better known as VoIP. The recommendation specifically calls for using encrypted networks for making calls. Addressing the disadvantages of STU-III and STE, VoIP offers the following characteristics:
Minimal maintenance, with none required of the user
Features such as caller ID for all parties in the special programs database
High connection rates, and accordingly higher voice clarity
The phone's slim design saves space
Cheaper equipment, leading to lower long-term costs
Operational rates much faster than those of STU-III and STE
Reduced call costs
Proposed solution
Overview
The proposed solution implies the use of a technology commonly known as VoIP. According to how and where the connection is formed, several variants can be distinguished, such as Internet Protocol telephony (IP telephony), internet telephony, LAN telephony, and often DSL telephony. The technical implementation in all cases is identical and carries the designation VoIP.
In VoIP the main point is not so much using the internet as a medium for speech transfer as using the IP protocol and supporting technologies to provide reliable, high-performance transfer of voice data over packet networks. Originally, VoIP was implemented as voice data transfer between software installed on two personal computers. Today the application of VoIP is mostly concerned with functioning within private networks and over public networks. The technology of transferring voice data on networks with IP routing is attractive first of all because of its universality: speech can be transformed into a stream of IP packets at any point of the network infrastructure, in an operator's backbone network, on the borders of a territorially distributed network, in a corporate network, and even in the end user's terminal. Eventually it is likely to become the most widespread technology of packet telephony, as it is capable of capturing all market segments and is well adapted to new conditions of deployment (Witowsky, 1998).
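The transformation of speech into a stream of IP packets can be sketched concretely: VoIP media streams are typically carried in RTP packets, which prepend a 12-byte header (per RFC 3550) to each block of voice samples. The field values below are illustrative; a real implementation also manages sequence and timestamp state per session.

```python
import struct

# Minimal sketch of packetizing voice samples into RTP packets.
# The 12-byte header layout follows RFC 3550; values are illustrative.

def rtp_packet(payload: bytes, seq: int, timestamp: int, ssrc: int,
               payload_type: int = 0) -> bytes:
    """Build one RTP packet: version 2, no padding, extension or CSRC."""
    byte0 = 2 << 6                      # version=2, P=0, X=0, CC=0
    byte1 = payload_type & 0x7F         # M=0, 7-bit payload type
    header = struct.pack("!BBHII", byte0, byte1,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + payload
```

With 20 ms of G.711 audio (160 bytes), each packet is 172 bytes: 12 bytes of RTP header plus the voice payload, before UDP and IP headers are added further down the stack.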
With respect to its reliance on IP networks, VoIP can be categorized into two approaches: IP-enabled and IP-centric. The IP-enabled architecture can be considered a combined solution in which VoIP terminals are provided to the end user while the core processing and switching are based on a Time-Division Multiplexing (TDM) circuit switch; see fig. 1. In the IP-centric architecture, on the other hand, the switching functions are performed solely by an IP-based core switching system, and connectivity to TDM, the PSTN and other networks is implemented through dedicated channels; see fig. 2 (DISA, 2006).
Potential Areas of Concern
VoIP technology, the transfer of voice data using the IP protocol, is not only a promising technology but a reality. Increasingly, companies use it as a cheap way to communicate with remote departments, and conferences using these protocols are held more and more often. Analysts' figures for market coverage differ, but they converge on one point: VoIP is growing at tremendous rates and becoming an ever more significant service. In particular, IDC estimates residential VoIP customers in the US at anywhere between 12 million and 44 million by 2010 (VoIP by the Numbers: Subscribers, Revenues, Top Service Providers, Blogs and more, 2006).
As the area of concern is security, in order to understand potential threats it is necessary to distinguish accurately between VoIP's two main components: voice, the vocal component important from the point of view of the end user (quality and clarity of voice data, delays, etc.), and IP, the network component. That is the basic difference between VoIP technology and the traditional telephone system, which is separated from the remaining infrastructure. Certainly, the connection of external modems and similar devices to a telephone exchange is limited; however, regular checks of the exchange's configuration are seldom conducted, and security management interacts poorly with traditional telephone system operation. A conventional IP network, by contrast, presents several areas of concern.
When building a VoIP infrastructure, a set of devices is added to the existing network topology: gateways, proxies, registrars, locators, and the IP telephones themselves. Each of these elements, regardless of whether it is a specialized device with built-in software or an ordinary computer with a general-purpose OS and additional software, can be accessed on the network like any other computer.
Each VoIP element has its own processor, software and TCP/IP stack, all of which can be attacked. Attacks can target the device as well as the transmission. Attacks such as Denial of Service (DoS) can overload the network with false voice packets, reducing productivity or even preventing the transfer of voice and data.
An attacked gateway can be used to organize unauthorized calls. If the connection-establishment process is vulnerable, a call can be intercepted and redirected to a different address. Voice packets can be intercepted and the entire communication session reconstructed and heard in real time. Softphones, implemented on computers by corresponding software, can be attacked by installing Trojan programs that allow the attacker not only to make unauthorized calls but also to mount further attacks on the VoIP infrastructure. In general, VoIP threats are summarized in table 2.
Solution
The proposed VoIP solution, considering the security concerns, can be divided into several approaches that can be summarized as follows:
Encryption
Encryption is an approach similar to the one implemented in STE and STU-III. Numerous encryption options can secure the voice data in VoIP, such as TACLANE encryption (Tanase, 2004), VPN setups, the IPSec protocol, and the Secure RTP protocol. Encrypting the voice data is also known as Secure Voice over IP (SVoIP). The key requirement for the protocol is speed, as the speed and efficiency advantage of VoIP can be eliminated by encryption overhead reducing the usable bandwidth. Such a solution requires compatible encryption algorithms among all terminals.
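The per-packet protection that Secure RTP provides can be sketched in simplified form: SRTP (RFC 3711) authenticates each packet with an HMAC-SHA1 tag truncated to 80 bits, so a receiver can reject any packet that was forged or tampered with in transit. Real SRTP also encrypts the payload with AES in counter mode and includes the packet index in the tag computation, both omitted here for brevity.

```python
import hmac
import hashlib

# Simplified sketch of SRTP-style packet authentication: append a
# truncated HMAC-SHA1 tag (80 bits, as in RFC 3711) to each packet and
# verify it at the receiver. Payload encryption is omitted.

TAG_LEN = 10  # 80-bit truncated authentication tag

def protect(packet: bytes, auth_key: bytes) -> bytes:
    """Append a truncated HMAC-SHA1 tag to an RTP packet."""
    tag = hmac.new(auth_key, packet, hashlib.sha1).digest()[:TAG_LEN]
    return packet + tag

def verify(protected: bytes, auth_key: bytes) -> bytes:
    """Check the tag and return the original packet, or raise ValueError."""
    packet, tag = protected[:-TAG_LEN], protected[-TAG_LEN:]
    expected = hmac.new(auth_key, packet, hashlib.sha1).digest()[:TAG_LEN]
    if not hmac.compare_digest(tag, expected):
        raise ValueError("authentication failed: packet rejected")
    return packet
```

The 10-byte tag per packet is the kind of overhead the paragraph above warns about: it is small, but on a 160-byte voice payload every layer of protection eats into the bandwidth advantage of VoIP.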
All-inclusive platforms
All-inclusive platforms are commercial products for integrating and migrating to VoIP, such as the HDX and SLICE 2100 offered by REDCOM (REDCOM, 2009). These platforms facilitate the installation of VoIP solutions while additionally providing security over the communication lines. Their disadvantage is cost, as switching the entire network can be expensive, although installation and configuration can be easier during migration.
Voice over Secure Internet Protocol (VoSIP)
This solution is based on securing each element of a VoIP network. A secure VoIP network can be built by securing the transport devices within the IP-centric VoIP network. This includes the physical security of components such as routers, switches, gateways and servers. Additional measures include standard safety procedures within the network, such as applying the latest security patches and updates and using software and hardware firewalls (McCarthy, 2007).
Implementation
Installing VoIP begins with selecting an appropriate network architecture or, where applicable, adjusting existing networks to meet VoIP demands. Building the network from scratch requires equipment including, but not limited to, the following.
Equipment
One Closed Partition
A partition enforces mandatory access control. An open partition refers to systems that do not have specifically enforced access control, while closed means that mandatory access controls are enforced and only specified accounts can download data from that partition. The source will provide a closed partition at a predetermined level.
Computer and Peripherals
The computer providing VoIP will be used to access email, create office documents, and load data from CD or DVD to the network. These are the minimum computer requirements:
Processor, monitor, anti-virus software, Microsoft Windows XP Professional, and Microsoft Office Professional, and a printer.
Multiprocessor distributed processing
Optional redundant control per shelf
Computer Telephony Integration (CTI)
Uninterruptible Power Supply
Cabling
Media Converters/Transceivers
Power cables
VOIP Capable Phones
System Capacity
4,000 ports (non-blocking)
16,000 ports (traffic engineered)
Up to 512 ports per shelf
The recommended security implementation is a hybrid of VoSIP and SVoIP: Secure Voice over Secure Internet Protocol (SVoSIP). This implementation is the most suitable as it raises the security level to one that satisfies governmental needs. The network layout can be seen in the architecture of the Department of Defense implementation guide, which can be considered among the highest levels of security; see fig. 3 (DISA, 2006). In this way, the recommended installation of VoIP technology would satisfy the government's special needs in terms of security.
List of Illustrations
Table 1 (NSA, 2007).

Terminal                        | Data Rates
STU-III/Low Cost Terminal (LCT) | Supports voice/data rates of 2.4, 4.8, 9.6, and 14.4 Kbps
Communication can be defined as the act of transmitting information (Merriam-Webster Online Dictionary). It is a process whereby people assign and transmit meaning in order to establish a common understanding. Effectively communicating a message requires both intrapersonal and interpersonal skills: listening, speaking and questioning, as well as processing, observing and analyzing.
The rapid change happening in the world today affects how people communicate with each other, both at the business and the interpersonal level. Advances in technology have made communication and interaction with other people more accessible.
The use of computer-mediated communication both enhances and inhibits interaction between people. Computer-mediated communication has been applied in education, and research suggests that this method can enhance student learning, increase satisfaction and motivation, and reduce students' feelings of isolation. On the other hand, some aspects of face-to-face communication are diminished in computer-mediated communication, and comfort with, and trust in, the medium being used is also an issue.
This paper discusses to what extent computer-mediated communication enhances or inhibits interaction. The first section defines computer-mediated communication; the second defines interaction; the third presents the factors that might enhance or inhibit interaction.
Definition of Computer Mediated Communication
Computer-Mediated Communication, or CMC, refers to any communicative transaction that happens by means of two or more computers linked via a network (McQuail, 2005). Originally, the term referred only to communicative transactions between computers or formats attributed to computers, such as electronic mail, instant messages and chat rooms.
Electronic mail, or simply e-mail or email, refers to any means of composing, sending, receiving and storing generally text-based communication through digital communication systems. In the beginning, electronic mail systems designed by different companies were often incompatible and could not interoperate. The establishment of the Internet in the early 1980s paved the way for efforts to standardize on a single electronic mail standard, the Simple Mail Transfer Protocol (SMTP). The protocol was originally defined in RFC 821, introduced in 1982, and later became Internet Standard 10.
The modern electronic mail system is based on a store-and-forward model. In this model, the mail server, rather than the users themselves, receives, sends, and stores the messages. The user only needs to connect to the mail infrastructure, through a personal computer or another network-enabled device, for the duration of the transmission of a message or its retrieval from the server. It is very infrequent that an email is transmitted directly from one person's device to another's.
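The store-and-forward idea can be illustrated with a short sketch using Python's standard library: the client composes a message locally and then hands it to an SMTP server, which stores and forwards it on the sender's behalf. The addresses and the server hostname below are placeholder assumptions, not real endpoints, so the actual send is shown but left commented out.

```python
import smtplib
from email.message import EmailMessage

def compose_message(sender, recipient, subject, body):
    """Build a simple text-based email message."""
    msg = EmailMessage()
    msg["From"] = sender
    msg["To"] = recipient
    msg["Subject"] = subject
    msg.set_content(body)
    return msg

msg = compose_message("alice@example.com", "bob@example.com",
                      "Hello", "This is a plain-text message.")

# Handing the message to an SMTP server (hostname is a placeholder);
# the server then stores it and forwards it toward the recipient:
# with smtplib.SMTP("mail.example.com") as server:
#     server.send_message(msg)
print(msg["Subject"])
```

Note that the sender's program never contacts the recipient's device directly; it only talks to a mail server, which is exactly the store-and-forward model described above.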
When electronic mail was first introduced, it could only support text messages in the ASCII character set; today, the system supports any type of media format, and messages can include attachments such as image, audio, and video files.
Instant messaging is considered a form of real-time communication because the sending of information and its receipt by the other party happen at almost the same time. Instant messaging involves communication between two or more individuals in a typed text format; the text-based messages are transmitted between devices linked on a network such as the Internet or an intranet.
The difference between instant messaging and electronic mail lies in synchronicity: instant messaging happens in real time. That said, some instant messaging systems permit users to send messages to other users who are not logged on, which in that context erases the primary distinction between instant messaging and electronic mail.
Instant messaging systems can also allow users to save a previous conversation for future reference. When message logging is enabled, instant messages are recorded in a message history file on the computer, which removes the persistence advantage that electronic mail holds over instant messaging.
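A message history file of this kind is conceptually simple: each message is appended as a timestamped line, and the log can be read back later. The following is a minimal sketch, not any particular client's format; the file name and tab-separated layout are illustrative assumptions.

```python
import tempfile
from datetime import datetime, timezone
from pathlib import Path

def log_message(history_file: Path, sender: str, text: str) -> None:
    """Append one timestamped, tab-separated line to the history file."""
    stamp = datetime.now(timezone.utc).isoformat(timespec="seconds")
    with history_file.open("a", encoding="utf-8") as f:
        f.write(f"{stamp}\t{sender}\t{text}\n")

def read_history(history_file: Path) -> list:
    """Return every logged message text, oldest first."""
    lines = history_file.read_text(encoding="utf-8").splitlines()
    return [line.split("\t", 2)[2] for line in lines]

# Illustrative conversation written to a temporary history file.
history = Path(tempfile.mkdtemp()) / "chat_history.log"
log_message(history, "alice", "hi bob")
log_message(history, "bob", "hi alice")
print(read_history(history))
```

Because the log survives after the chat window is closed, it gives instant messaging the same persistence that email gets automatically from its servers.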
Chatrooms are generally described as a form of synchronous conferencing. In reality, chatrooms today support both synchronous and asynchronous conferencing, so the term can refer to anything from real-time online chat and instant messaging to online forums that act like digital, graphical social environments.
Through online chat, people can exchange text-based messages with other users in the same chatroom synchronously, in real time. Historically, the first and oldest chatrooms could only support the transmission of text-based messages. Talkomatic, introduced on the PLATO system in 1974, is claimed to be the first text-only chatroom. Another significant text-based chatroom was the Freelancing Round Table, which gained popularity during the 1980s, and another popular text-based chat system is Internet Relay Chat, or IRC. Today, chatrooms such as the one provided by Yahoo! allow text and voice chat at the same time. The growing recognition of chatrooms paved the way for the development of instant messaging. Other noteworthy chatrooms include the one provided by AOL and various web chat sites.
Another notable feature of chatrooms is the graphical user interface, or GUI, which permits users to choose from different icons that identify them and to change the appearance of the chat environment.
At present, other text-based interactions are also considered computer-mediated communication; one example is text messaging (Thurlow et al., 2004).
Studies of computer-mediated communication usually center on the social effects of communication channels supported by computer technology. Much recent research focuses on so-called social networking on the Internet, which is supported by social software.
Social networking, or social network services, has gained popularity over time. It centers on creating online communities for people who share common interests and activities. Almost all social network services are web-based, and they provide users with various methods of interacting and communicating with each other, such as electronic mail and instant messaging.
Fresh and new ways of communicating and sharing information have been introduced by social networking systems. Social networking websites are currently used by millions of people around the globe, and it is now almost evident that social networking will occupy a great part of everybody's everyday life.
Social networks have been the most talked-about media channel for many years now. Their major strength is the capability to bring together millions of different users; their major drawback is the difficulty of converting the quantity of signed-up clients into monetary value.
It is true that electronic mail and websites also have the main features of social network services. What separates social network services from the two is the idea of ownership of the service: social network services allow users to create their own home page.
The most popular social network service providers are MySpace, Friendster, Facebook, Twitter, LinkedIn, Tagged, Hi5, Nexopia, Bebo, dol2day, Skyrock, Orkut, Xiaonei and Cyworld.
Attempts to standardize social network services in order to eliminate the redundancy or duplication of entries have been unsuccessful because they raised issues regarding the privacy of the users.
Computer-mediated communication spans a variety of fields, including Internet studies, and scholars from these different fields study the phenomena that occur within it.
An example of a study of phenomena associated with computer-mediated communication is the work of J. B. Walther. In his research, he looked at how humans utilize computers and other digital media in handling interpersonal interaction, as well as how they use these media in forming and maintaining impressions and relationships (Walther, 1992; 1996). Such research has often covered the dissimilarities between offline and online interactions. At present, studies of computer-mediated communication are moving toward the view that computer-mediated transactions must be considered a very important part of a person's everyday life (Haythornthwaite and Wellman, 2002). Other studies of computer-mediated transactions focus on the utilization of paralinguistic aspects; in computer-mediated communication, the paralinguistic aspect refers to the use of emoticons.
Emoticons started off as text representations. Emoticons are basically textual portrayals of the mood or facial expression of the writer of a message. They are mostly utilized to give the recipient an idea of the temper or tenor of the message, and they have the ability to change or enhance the interpretation of a text message. The word emoticon is a combination of two English words, emotion (or emote) and icon. In web forums and instant messaging, emoticons in text format can be automatically changed into a specific image corresponding to the emotion portrayed by the text emoticon; these images are also referred to as emoticons. Graphical emoticons can be simple or complex. The most popular of all emoticons are the smiley face :-) and the frowny face :-(.
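The automatic substitution that chat clients perform can be sketched as a simple lookup-and-replace over the message text. The mapping below is a tiny illustrative sample, not any real client's emoticon table, and the bracketed descriptions stand in for the graphical images a client would insert.

```python
import re

# A tiny, illustrative mapping; real clients support far more emoticons
# and substitute images rather than text descriptions.
EMOTICONS = {
    ":-)": "smiling face",
    ":-(": "frowning face",
    ";-)": "winking face",
}

def describe_emoticons(message: str) -> str:
    """Replace each known text emoticon with a description of the mood."""
    pattern = re.compile("|".join(re.escape(e) for e in EMOTICONS))
    return pattern.sub(lambda m: f"[{EMOTICONS[m.group(0)]}]", message)

print(describe_emoticons("Great news :-) see you soon ;-)"))
```

A real client would render an image at each match instead of a bracketed label, but the recognition step is the same pattern match shown here.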
The use of emoticons can be traced back to the 1800s, especially in casual and humorous literature. Emoticons on the Internet can be attributed to a proposal by Scott Fahlman in a message he sent on the nineteenth of September, 1982.
The sequential analysis of computer-mediated interaction and the different sociolects or terminologies used in different computer-mediated environments are also receiving much attention from researchers (Herring, 1996; Markman, 2006).
Sociolects are defined as language varieties used by or associated only with a specific group of people or society. Groups categorized by different characteristics, such as socio-economic status, age, gender, and occupation, may each have their own distinct variety of a language.
Pragmatic rules in the computer-mediated communication environment, such as turn-taking (Garcia and Jacobs, 1999), and the different registers and styles are also being studied by scholars. Computer-mediated discourse analysis refers to the study of the languages used in text-based forms of computer-mediated communication (Herring, 2004).
Styles are the varieties of language associated with different topics or aspects of society. There are the so-called languages of politics, religion, business, advertising, and many more; each of these belongs to a particular situation or period of time.
Registers are the varieties of language categorized according to the specific purpose for which the language is used, particularly in its social setting.
Other research on computer-mediated communication is concerned with the way people interact in social, professional, and educational settings. In these kinds of settings, the way the people involved communicate differs not only because of the environment itself but also because of the method of communication the participants use. The use of information and communications technology affects the communication and interaction among people. Computer-supported collaboration is the term used for research on communication aimed at achieving collaboration, or so-called common work products.
The most popular forms of computer-mediated communication are electronic mail; chat through video, audio, and text conferencing; and instant messaging. Online bulletin boards, MMOs, and list-servs are also popular forms. These communication platforms are quickly changing as new inventions and technologies are developed. Weblogs, or simply blogs, have already gained popularity, and the exchange of RSS data has made it even easier for users to publish their own blogs.
Computer-mediated communication can be categorized into synchronous and asynchronous modes. In the synchronous mode, the users or parties in the conversation are online at the same moment. The asynchronous mode, on the other hand, has no restriction in time: the parties can respond to messages at a time different from when the messages were sent.
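The asynchronous mode can be pictured as a mailbox that decouples sending from receiving. The sketch below uses an in-memory queue purely as an illustration of that decoupling; it is not a real messaging system.

```python
import queue

# An in-memory mailbox standing in for an asynchronous channel such as
# email: the sender deposits messages whenever it likes, and the
# recipient retrieves them later, at its own convenience.
mailbox = queue.Queue()

# The sender posts two messages while the recipient is "offline".
mailbox.put("Meeting moved to 3pm")
mailbox.put("Bring the draft report")

# Later, the recipient comes online and drains the mailbox; the order
# of the messages is preserved, but no moment of simultaneous presence
# was ever required.
received = []
while not mailbox.empty():
    received.append(mailbox.get())

print(received)
```

A synchronous channel such as instant messaging, by contrast, would deliver each message as it is sent, requiring both parties to be present at the same moment.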
Definition of Interaction
Interaction refers to the type of action that occurs between two or more objects when they affect each other. The concept of a two-way effect is essential to interaction, as opposed to a one-way causal effect. The term interconnectivity is strongly associated with interaction; it refers to the interactions between and within systems.
The concept of interconnectivity is widely used in various fields such as non-linear dynamics, biology, network theory, cybernetics, and ecology. Interconnectivity simply says that all the participants of a system closely interact with each other; they rely on each other precisely because they are within the same system. It should be recognized that it would be very difficult, or even impossible, to examine a system through its parts considered independently of each other. The combination or merger of several simple interactions might lead to emergent phenomena.
Different kinds of sciences define interaction in different ways.
In media art, the term interactivity refers to the feature of a medium characterized by its level of accessibility to the masses.
In physics, a fundamental interaction, or fundamental force, is a process by which elementary particles interact with one another. Nature has four fundamental interactions: the electromagnetic, gravitational, strong, and weak interactions.
In the field of sociology, social interaction is the continuous change in social actions among individuals and groups. These individuals and groups change their actions and reactions in response to the actions and reactions of the other parties. Social interactions are the foundation of social relations.
In the field of medicine, the interactions between the effects of combinations of different medicines are monitored, usually by pharmacists. In molecular biology, the study of the interactions between genes and their metabolites is the study of molecular pathways.
In the field of communications, interactive communication happens when the sources or parties in the communication process alternately transmit messages to one another. It should be noted that interactive communication differs from transactive communication, in which the parties send messages simultaneously. Transactive communication includes newer modes of communication, such as teletext, teleshopping, video, video on demand, the Internet, computers, tele-conferencing, and many more; tele-communication can also be considered transactive. On the other hand, the use of cell phones, mobile phones, pagers, electronic mail, and instant messaging is considered interactive communication. These can be classified further into three types: interpersonal, group, and mass. Interpersonal communication includes the use of the telephone and other services provided by telephone companies; group communication includes tele-conferences and video-conferences; and mass communication involves the Internet and the World Wide Web.
In the parlance of distance education, interaction refers to the exchange of information, ideas, and opinions between and among learners and teachers, frequently through the use of technology, aimed at helping to improve and facilitate the learning process (Global Distance Education Net glossary).
Factors that might enhance or inhibit Interaction
Interaction between individuals can be affected by many factors. Some factors tend to enhance interaction, while others inhibit effective interaction. To identify which factors affect interaction and people's capability to communicate, the process of communication and interaction must first be understood. The process begins with the sender, who has an idea in mind, usually the product of research or thought. The sender must put the idea into words or actions, as the case may be, and then transmit the message to the recipient. When the recipient receives the message, he or she should understand it and draw conclusions from it; only after understanding and drawing meaning from the message should the recipient give feedback to the original sender. This simple process of sending, receiving, and providing feedback can run into problems when factors that inhibit interaction are present. At the same time, improving these factors leads to an enhancement of the interaction process.
One factor that affects interaction is the competence of the parties. Competence refers to a person's ability to communicate in a way that is accepted by others and that effectively achieves his or her goals. A person without much competence will experience difficulty putting his or her ideas into a form whose meaning other people can understand. People who can easily transform their ideas into a form that is easily understood and accepted enhance the interaction between the parties.
Another factor that affects interaction is the language barrier between the parties. Effective communication requires understanding between the parties involved; if the parties do not share a common language, it is very difficult to communicate effectively, and interaction between them suffers. Problems with language barriers do not arise only from differences in the language itself; sometimes, even though the language is the same, problems still occur. This is especially true when the words used carry different meanings: a word or statement may have both a denotative (dictionary) meaning and a connotative meaning. Problems occur when the recipient mistakenly perceives the message in a way different from what the sender intended.
Misperception of one person by another may also inhibit interaction. Perception can be defined as the way you view other people and, conversely, the way other people view you. This perception, which is mental in nature, significantly affects the way messages are interpreted.
Differences in culture might also affect interaction. There are things that are appropriate in one culture but deemed inappropriate in another, and this encompasses both the verbal and nonverbal features of a message. Furthermore, the context of the message, that is, the time, the place, and the relationship between the two interacting parties, must also be considered.
Environmental factors, whether visual, auditory, or individual, can affect interaction. Visual factors such as lighting, distractions, distance, the talker's face, viewing angle, and vision can inhibit or enhance interaction. Poor lighting and physical distractions can affect a person's ability to interpret a message; the same applies to the distance, vision, and viewing angle of the people interacting, which ultimately affect the way they receive and understand the message being conveyed.
Auditory factors such as noise, distance, and echo can also hinder or enhance interaction. Individual factors such as fatigue, illness, stress, ventilation, attitude, preparation, and the situation of the parties involved in the interaction are also important to consider.
Conclusion
Many different features of interaction have been greatly affected by computer-mediated communication, and several of these issues are being studied. Such issues include impression formation; the formation of relationships in computer-mediated communication has received much attention, as have deceitful behavior, lying, and group dynamics.
Computer-mediated communication is analyzed and compared to other methods of communication through characteristics believed to be present in all methods of communication: anonymity, synchronicity, and persistence. Different methods of communication exhibit these characteristics to different degrees. For example, instant messaging is highly synchronous by its very nature, but it lacks persistence, or recordability, because users cannot retrieve previous conversations once the dialog boxes are closed unless a message log is enabled or the user has copied the entire conversation manually.
Electronic mail and message boards, on the other hand, lack synchronicity because the time from sending a message to receiving a reply varies greatly. While instant messaging is low in persistence, electronic mail and message boards excel in this characteristic, because the messages sent and received through them can be automatically saved. In general, computer-mediated communication differs from other methods of communication because it is transient and naturally multimodal, and it lacks the codes of conduct that govern other forms of communication (McQuail, 2005). Computer-mediated communication has the ability to break the physical and social barriers present in other methods of communication; thus, it enhances interaction among people who cannot physically communicate with each other because they are not in the same physical location.
Computer-mediated communication gives language learners massive opportunities to enhance their skill in the language they want to learn. A study by Warschauer discussed the effects of using electronic mail and discussion boards in language classes and concluded that the use of information and communications technology serves as a bridge between speech and writing (Warschauer, 2006).
The anonymity, privacy, and security of users of computer-mediated communication depend on the specific program used or website visited. Some programs and sites support communication that provides the user full anonymity, privacy, and security; others fail in this respect. Most studies are concerned with the significance of taking into account the social and psychological implications of these factors.
Interaction can be enhanced by computer-mediated communication because it reduces issues with the parties' confidence and trust. Some people are more comfortable talking through chat or electronic mail than talking to other people in person; this enhances the interaction between the people communicating, because the anonymity and privacy features of computer-mediated communication leave users freer to express themselves. At the same time, inhibitions to effective interaction may also arise: deception and lies can easily go undetected. Because of the lack of physical contact, it is very hard to distinguish truth from lies, whereas in physical interaction the tone of voice, body language, and other cues can help people detect whether a person is lying.
Computer-mediated communication inhibits interaction because it is susceptible to misinterpretation, especially in text-based messages, where the tone and emotion of a message cannot be easily recognized. The use of emoticons may help reduce this problem. Furthermore, computer-mediated communication today also includes other forms of communication, such as voice and video, through which it is almost like talking to the other party in person.
The most important thing computer-mediated communication does is eliminate the physical barrier to interaction. Before, it was impossible for people in different physical locations to communicate with each other on a physical level. The traditional mail system helped eliminate this barrier but lacked synchronicity: it takes quite a long time from the moment a message is sent to the moment the response reaches the original sender. The telephone also helped eliminate the physical barrier, though it is mainly auditory in nature, so much misinterpretation can still occur; and although it offers synchronicity, it lacks persistence unless a person records the entire conversation. Through computer-mediated communication, people who do not share the same physical location can now interact with both synchronicity and persistence. Computer-mediated communication also provides features that let you see the other person on camera and talk to them using microphones.
Computer-mediated communication permits more effective and more efficient communication in the sense of the timely receipt and acknowledgement of, or response to, a message. Some computer-mediated communication systems allow users to communicate not only through text but also via microphones and webcams, so that you can talk to the other party and see them at the same time.
In an educational setting, computer-mediated communication has the ability to enhance students' learning. It also has the power to reduce the feelings of isolation that students experience in a traditional educational setting. Furthermore, increased satisfaction and motivation in the course being studied can also be expected.
References
The markets get anti-social with social networks. Deloitte TMT Predictions. Web.
Markman, K. M. (2006). Computer-mediated conversation: The organization of talk in chat-based virtual team meetings. Dissertation Abstracts International, 67 (12A), 4388. (UMI No. 3244348).
Ahern, T.C., Peck, K., & Laycock, M. (1992). The effects of teacher discourse in computer-mediated discussion. Journal of Educational Computing Research, 8(3), 291-309.
Bannan-Ritland, B. (2002). Computer-mediated communication, elearning, and interactivity: A review of the research. Quarterly Review of Distance Education, 3(2), 161-180.
Boyd, Danah and Ellison, Nicole. Social Network Sites: Definition, History, and Scholarship. Journal of Computer-Mediated Communication, volume 13, issue 1, 2007.
Bruns, Axel, and Joanne Jacobs, eds. Uses of Blogs, Peter Lang, New York, 2006.
Cockrell, Cathy, Plumbing the mysterious practices of digital youth: In first public report from a seminal study, UC Berkeley scholars shed light on kids use of Web 2.0 tools, UC Berkeley News, University of California, Berkeley, News Center, 2008.
Cooper, M.M., & Selfe, C.L. (1990). Computer conferences and learning: Authority, resistance, and internally persuasive discourse. College English, 52(8), 847-869.
Garcia, A. C., & Jacobs, J. B. (1999). The eyes of the beholder: Understanding the turn-taking system in quasi-synchronous computer-mediated communication. Research on Language & Social Interaction, 32, 337-367.
Global Distance Education Net. Glossary. Web.
Grudin, Jonathan (1994). Computer-Supported Cooperative Work: History and Focus. Computer (IEEE).
Gunawardena, C.H., Nolla, A.C., Wilson, P.L., Lopez-Isias, Jr. et al. (2001). A cross-cultural study of group process and development in online conferences. Distance Education, 22(1), 85-122.
Haythornthwaite, C. and Wellman, B. (2002). The Internet in everyday life: An introduction. In B. Wellman and C. Haythornthwaite (Eds.), The Internet in Everyday Life (pp. 3-41). Oxford: Blackwell.
Herring, S. (1999). Interactional coherence in CMC. Journal of Computer-Mediated Communication, 4(4). Web.
Herring, S. C. (2004). Computer-mediated discourse analysis: An approach to researching online behavior. In: S. A. Barab, R. Kling, and J. H. Gray (Eds.), Designing for Virtual Communities in the Service of Learning (pp. 338-376). New York: Cambridge University Press.
Hiltz, S. R. (1994). The virtual classroom: Learning without limits via computer networks. New Jersey: Ablex Publishing Corporation.
Hughes, S. C., Wickersham, L., Ryan-Jones, D. L., & Smith, S. A. (2002). Overcoming social and psychological barriers to effective on-line collaboration. Educational Technology and Society. Florida, USA.
Jenkin, Martha N. (2007). Barriers to effective communication at work. Alliance Training and Consulting, Inc. Web.
Lapadat, J.C. (2003). Teachers in an online seminar talking about talk: Classroom discourse and school change. Language and Education, 17(1), 21
Lauz, Kimberly A. Barriers to effective communication. 2009. Web.
Leinonen, P., Jarvela, S., & Lipponen, L. (2003). Individual students' interpretations of their contribution to the computer-mediated discussions. Journal of Interactive Learning Research, 14(1), 99-122.
McQuail, Denis. (2005). McQuail's Mass Communication Theory. 5th ed. London: SAGE Publications.
Poole, D.M. (2000). Student participation in a discussion-oriented online course: A case study. Journal of Research on Computing in Education, 33(2), 162-176.
Severin, Werner J., Tankard, James W., Jr., (1979). Communication Theories: Origins, Methods, Uses. New York: Hastings House.
Thurlow, C., Lengel, L. & Tomic, A. (2004). Computer mediated communication: Social interaction and the internet. London: Sage.
Vonderwell, S. (2002). An examination of asynchronous communication experiences and perspectives of students in an online course: A case study. The Internet and Higher Education, 6, 77-90.
Walther, J. B. (1996). Computer-mediated communication: Impersonal, interpersonal, and hyperpersonal interaction. Communication Research, 23, 3-43.
Walther, J. B., & Burgoon, J. K. (1992). Relational communication in computer-mediated interaction. Human Communication Research, 19, 50-88.
Warschauer, M. (1998). Electronic literacies: Language, culture and power in online education. Mahwah, NJ: Lawrence Erlbaum Associates.
Warschauer, M. (2006). Laptops and literacy: learning in the wireless classroom: Teachers College, Columbia University.
Complex business applications, e-commerce, and transaction automation demand robust and precise security procedures. Corporations employing the Internet as a means of carrying out business operations can be more productive and successful if their decisions uphold the requirements of security-conscious consumers. At present, Internet consumers insist on strict security protocols to safeguard their welfare, privacy, interactions, and resources.
Public key cryptography facilitates security properties such as privacy, integrity, authentication, and non-repudiation. Nevertheless, to implement such security features effectively, a carefully drafted management plan is required to govern the security infrastructure. The public key infrastructure (PKI) provides a cornerstone on which other systems, modules, applications, and security components are built. A PKI is an indispensable element of the general security policy and is aligned with other security measures, business operations, and risk management initiatives (Conklin, 2004).
In this document, we look at the issues that require attention when deciding whether a PKI should be provided by in-house facilities or by commercial services.
Discussion
A Public Key Infrastructure (PKI) comprises the computer hardware, software, personnel, policies, and procedures required to generate, manage, store, distribute, and validate digital certificates. It binds cryptographically generated public keys to user identities through a certificate authority (CA); this binding is established through the registration and issuance procedure. The PKI function that warrants this binding is known as the Registration Authority (RA). In some cases, the expression trusted third party (TTP) is used synonymously with certificate authority (CA) (Rothke, 2005).
When a corporation's network security requirements call for digital certificates for transactions, it has to decide where to procure the certificates. Certificates may be purchased from a commercial, third-party certificate authority such as VeriSign or Thawte, or an in-house facility may be set up to issue one's own certificates. The three primary considerations when deciding between an internal PKI and a commercial PKI are cost, liability, and reputation (Conklin, 2004).
For a medium-sized enterprise, as in this case, an external commercial PKI is highly recommended for the following reasons. Procuring a large number of certificates from a commercial provider can be costly, in which case an in-house facility could cut costs; however, where only a few certificates are to be issued, as here, a commercial facility is the much more feasible option. Secondly, in the event of a disaster such as data loss or system failure, the liability is owned by the certificate issuer, and implementing risk management frameworks to deal with such a crisis is a painstaking and costly affair; for an in-house facility, such frameworks would have to be designed by a dedicated team. Hence, employing the services of a commercial provider is the better choice for a medium-sized company.
Lastly, a dedicated commercial provider is better known and enjoys a stronger reputation in the market. Consequently, the level of trust placed in a commercial outfit is much higher than in its in-house counterpart, and more customers will be willing to use a system warranted by a reputable commercial provider (Rothke, 2005).
To protect a wireless network and enhance its security, the following measures should be in place:
The default value of the system ID, called the Service Set Identifier (SSID) or Extended Service Set Identifier (ESSID), should be changed.
Identifier broadcasting should be disabled.
Wi-Fi Protected Access (WPA) or a stronger encryption standard should be enabled.
Hardware as well as software firewalls should be installed.
Anti-hacking tools should be installed on the systems using wireless connectivity as a last line of defense (Conklin, 2004).
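As a minimal sketch, the checklist above can be expressed as an automated configuration audit. The field names and the list of factory-default SSIDs below are illustrative assumptions, not taken from any particular product:

```python
# Hypothetical factory-default SSIDs; a real audit would use a fuller list.
DEFAULT_SSIDS = {"default", "linksys", "netgear"}
ACCEPTABLE_ENCRYPTION = {"WPA", "WPA2", "WPA3"}

def audit_wifi(config: dict) -> list:
    """Return a list of checklist violations for an access-point config."""
    problems = []
    if config.get("ssid", "").lower() in DEFAULT_SSIDS:
        problems.append("SSID still set to a factory default")
    if config.get("broadcast_ssid", True):
        problems.append("identifier broadcasting is enabled")
    if config.get("encryption") not in ACCEPTABLE_ENCRYPTION:
        problems.append("encryption weaker than WPA")
    if not config.get("firewall_enabled", False):
        problems.append("no firewall installed")
    return problems

weak = {"ssid": "linksys", "broadcast_ssid": True, "encryption": "WEP"}
print(audit_wifi(weak))   # all four checklist items are violated
good = {"ssid": "office-net", "broadcast_ssid": False,
        "encryption": "WPA2", "firewall_enabled": True}
print(audit_wifi(good))   # []
```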
Conclusion
Medium-sized companies such as the one in this case should prefer a commercial PKI over an in-house approach, as it decreases costs, transfers liability, and takes advantage of the reputation and level of trust enjoyed by dedicated commercial service providers. In addition, wireless networks must be secured by following a strict, understandable, and clearly communicated policy and by ensuring that basic security measures are in place.
References
Conklin, A. (2004). Principles of Computer Security: Security and Beyond. NY: McGraw-Hill Technology Education.
Rothke, B. (2005). Computer security: 20 things every employee should know. NY: McGraw Hill Professional.
TomTom Navigator is a modern navigation software product that is built on GPS technology and used on personal digital assistants (PDAs), pocket PCs, and smartphones. The current release of the system is Navigator 7. It differs from the previous edition, Navigator 6, in offering additional features such as the ability to add frequently used functions to the program's main screen, as well as allowing end users to report map corrections and share them with other end users. The program is distributed on SD cards as well as on DVDs.
Innovation Background
Innovation is a comprehensive process through which an organization generates creative and fresh scientific ideas (inventions) and translates them into original, practical, and commercially viable products, services, and trading practices for (prospective) financial advantage. Research findings indicate that competitive advantage rests on the capability to build an economy driven not by cost efficiencies but by ideas and intellectual expertise. The nature of innovation has undergone a metamorphosis, and its intrinsic definition has changed dramatically over time. The conventional view of innovation centred on individuals who toiled in laboratories, itching to unmask great discoveries. In modern society, the attention has shifted away from the individual towards a global phenomenon that hinges on collaboration. Amazon's chief executive, Jeff Bezos, describes it as going down blind alleyways (Blayse & Manley, 2005).
Yet every time one goes down such an alleyway, it may open up into a gigantic, expansive boulevard. Innovation begins when we break free and reject the inert. Unlike invention, which emerges from effort, experimentation, and a sense of lack, innovation rests closely on expertise and leadership: earmarking prospective areas in which to concentrate innovative effort and designing the ideal atmosphere for innovation to flourish. CEOs play an integral role in making innovation a reality in a methodical way. As with executing a corporate strategy, realizing innovation entails making deliberate choices. Options may exist in abundance, yet highlighting and sharpening the winning options requires ingenuity, and this is a role that calls for a shrewd CEO. A new technology is a product or process that a company has not previously employed in its construction operations, whereas an invention is the idea behind that new technology. The development of new technology rests on the crutches of invention. At the other extreme, an innovation is the first business transaction that engages the technology. Some scholars have described the blueprint of a new technology as the comprehensive design of modest knowledge; hence it is the execution of the invention that brings the innovation into existence (Abernathy & Clark, 1985).
Strategic Competition
For TomTom Navigator to compete, the company has to be smarter and more agile than the big players in the market, something it has already proved it is good at. The strategy should be three-dimensional: outpacing the competition, stabilizing its own customer base, and putting down roots in the long-term future. Apple Inc. shook up the video rental industry by opening the floodgates.
In such a landscape, anyone who stays at the top of the wave already has an advantage, since most rivals will not be able to keep up with the pace. The biggest advantage that TomTom has over new entrants such as TimeTrak is that they are entering a game which TomTom has already mastered. As such, offering new service features, whether centred on technological variation or on service improvement, will keep TomTom far ahead.
TomTom's EU expansion is an example of this move; although its tangible fiscal value may not be realized optimally, it will keep the company ahead. A subsequent move for keeping TomTom ahead of the pack is enhancing the recommendation engine CineMatch.
Stabilizing Clientele Base
Whereas TomTom currently has great momentum in building its subscription base, the company ought to capitalize on this pattern and try to enhance the volume of its clients, in a bid to attain critical mass with respect to market share. Its elevated churn levels should be the immediate concern.
Planting Deep Roots in the Future
It is understandable that the penetration of digital technology will be the prime factor in reordering the value chain for audio-visual media content delivery in the future. TomTom's management is already distributing DVDs through physical as well as digital infrastructures. TomTom should merge with TimeTrak in a bid to cement its position in the mapping industry.
Copyright infringement charge
Legal
Microsoft filed a patent infringement lawsuit against TomTom asserting that the device manufacturer infringes patents related to the FAT32 file system, which is Microsoft's intellectual property.
Differentiation
TomTom markets two forms of products: navigation devices and navigation software for installation on mobile devices. The navigation devices and the mobile devices with the software installed are known as units. TomTom units present a driving interface with an oblique bird's-eye view of the road, as well as a direct-overhead map view. They employ a GPS antenna to show the precise surroundings and offer map-based as well as spoken instructions on how to drive to the chosen destinations.
Various TomTom models integrate with Bluetooth-enabled mobile phones to display traffic congestion maps, make hands-free calls, and read SMS texts aloud. Under ordinary circumstances, the different models vary at the software level. The hardware incorporates special features such as FM transmission, Bluetooth, hands-free calling, and advanced positioning technology.
TomTom GO
This is a multifaceted GPS navigation instrument. It incorporates a touch screen, speaker, USB port, and internal lithium-ion battery, in addition to the TomTom HOME program. It recharges, synchronizes, and updates its data by linking to a Windows or Mac PC running the TomTom HOME program through a USB cable. Various models are Bluetooth-enabled, allowing links to a smartphone.
TomTom ONE and ONE XL
The TomTom One is the base model for car navigation. The difference between the TomTom One XL and the TomTom One is the size of the touch screen. Neither version of the One contains the extra functions added in the Go models, such as text-to-speech, Bluetooth hands-free calling, or the MP3 jukebox. Nevertheless, the One has the capacity to receive traffic as well as weather updates using a receiver that is sold separately. The reduced software capability places fewer demands on the hardware, which allows the One to be sold at a considerably lower price than the Go.
TomTom RIDER
These are portable models for motorcycle and motor scooter users.
Wireless technologies are attracting attention because wireless components can provide temporary connections to an existing cabled network, can help provide backup to an existing network, and can extend portability, taking the network beyond the limits of physical connectivity. A vast range of wireless technologies is being adopted by individuals and even corporate companies, and the question is: wireless technology is here with us, what next? In this context, therefore, we specifically explain the importance of, and the issues surrounding, two such technologies: social media, and voice and messaging.
Social Media
Following the advent of the Internet, individuals felt the need to extend the advantages of device interconnectivity by sharing information and social life experiences. In essence, people use tools like social networks in order to see and to be seen in a social interaction design. Social media merges people's interest in direct and immediate communication with the indirect or mediated means of producing media's conceptual forms, thus presenting a good environment for wireless connectivity. Social media also promotes the authenticity and truth that interpersonal communication possesses, and which can only be imitated by mass media (Chan, 4).
In response to consumer behaviours, preferences, and acceptance, social software systems differ in their theme, user interface, and genre. For instance, dating sites such as afrointroduction.com and eharmony.com deal extensively in personal information. Career networking sites like linkedin.com also focus on individuals, but present only the professional details required. Both of these networks are therefore biographical and representational. Facebook and MySpace deal with people more dynamically than other sites, for they produce social networks: groups, events, news, and scenes. Blogging and discussion sites such as techrepublic.com also engage in news generation, emphasizing different viewpoints, perceptions, and expertise more than an individual's personality. Social media such as YouTube engage users in their content through clips, ripped movies, and music posted by the users themselves. Chan asserts that social media enables two-way communication, as opposed to the mass media (5). Social media can therefore be used for marketing, advertising, entertainment, communication, and playing social games.
However, social media requires higher bandwidth because content is transmitted in both directions. In regard to security, this kind of technology raises ethical issues relating to the privacy of individuals' data. The advancement of networking through this technology enables individuals or organizations to reproduce data from one location to another and to access personal data from remote locations; this means that social media has rendered some laws obsolete or severely crippled them. Also, many people in offices find it easy to watch Internet video and chat with friends through these media, which affects a company's throughput and leads to unattained goals. Nevertheless, social media is deemed to be a technology that will continue to enhance communication and relationships within societies.
Voice and Messaging
Because of the need for faster and more reliable transmission of information between distant locations, technologists came up with an easier way of conveying voice and messages that takes advantage of wireless connectivity. Voice and messaging technologies enable users to communicate through voice and messages; examples include pagers, phones, and duplex business radios. Gupta outlines that voice and messaging devices can be categorized as analog or digital, depending on the way in which they encode and decode signals. Advanced Mobile Phone System (AMPS) is an analog standard, while the digital protocols include the Global System for Mobile Communications (GSM) and Code Division Multiple Access (CDMA). These devices operate within networks run by carriers.
The AMPS standard is built on the early electromagnetic spectrum allocation that assigns frequency ranges between 800 and 900 megahertz (MHz) to cellular telephony. This enables voice and messaging service providers to receive and transmit signals between cellular phones using the allocated frequency ranges (Gupta). As a result of the pitfalls of analog systems, digital systems were invented to support encryption, compression, and Integrated Services Digital Network (ISDN) compatibility. One such system is the GSM network, which functions in the 1,850 to 1,990 MHz frequency range. CDMA converts voice into digital information, which is then transmitted over the wireless network as a radio signal.
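The frequency allocations cited above can be captured in a small lookup. This sketch uses only the ranges mentioned in this section; real allocations vary by region and carrier:

```python
# Illustrative ranges in MHz, taken from the figures quoted above.
BANDS = {
    "AMPS (analog)": (800, 900),
    "GSM (digital)": (1850, 1990),
}

def standard_for(freq_mhz: float) -> str:
    """Name the standard whose band contains the given frequency."""
    for name, (low, high) in BANDS.items():
        if low <= freq_mhz <= high:
            return name
    return "unknown"

print(standard_for(850))    # AMPS (analog)
print(standard_for(1900))   # GSM (digital)
print(standard_for(2400))   # unknown
```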
This technology offers choices to individuals, service providers, and phone manufacturers. It is, however, difficult to enjoy the features of GSM with a phone that does not support the service, and making choices is sometimes hard, especially when it comes to cost. Still, mobile phones today offer advanced features such as cameras, Internet access, games, and even music at a reduced price, making wireless technology affordable. This type of technology is also prone to transmission impairments such as noise and distortion, especially with analog signals, and people tend to lie on the phone and similar devices, which reduces consumer loyalty to voice and messaging products. In essence, voice and messaging technology is deemed to increase the dynamism of communications systems.
Conclusion
The inherent difficulty of setting up cable networks is a factor that will continue to push wireless environments towards greater acceptance. Wireless connectivity can be useful for networking busy locations such as reception areas, distant buildings whose users collaborate in some way, and even historic buildings and structures in which it is difficult to lay cables.
Works Cited
Chan, Adrian. Social Media, Mass Media. Social Integration Design Reading Notes. (2007): 4-11. 2009. Web.
Gupta, Ruchi. Wireless Technologies: Voice and Messaging. 1999. Web.
With the emergence of the Internet, humanity discovered a new way of communicating. This approach seems to be taken for granted today, but, in fact, it is hard to believe how people managed without the Internet in previous times. The Internet has become a new kind of media owing to its scope of information and the number of people sharing its services; moreover, the number of users is growing all over the world, and today one could hardly find a place where there is no Internet coverage. In this respect, the development of a new type of communication, and a new style and culture of communication, is necessary for commercial structures. The most progressive part of mankind, the youth, constantly checks information via the Internet in order to find common interests and not lag behind the times. Here lies an opportunity for businesses to develop their services or product lines while making as many users as possible aware of such initiatives. This paper deals with the significance of interactive Internet communication, in particular its use for commercial interests.
Lister (2003) writes that communication via the Internet was a cultural achievement of its time and that it allowed huge investments to find a place on the world wide web owing to the global interest of people worldwide. What is more, the cultural peculiarities of today's generations rest on the necessity of mastering Internet and computing technologies in order to be fashionable and advantaged in society. The flow of time forgives no delay of mind within individuals; in other words, it is better for people to follow technological progress, not vice versa.
Franklin (2004) suggests that the faster change of social frameworks in the twentieth century determined the appearance of Internet communication in different spheres of activity, and in business particularly: "As Internet technologies are appropriated by commercial interests with access to huge financial resources, technical know-how, and market access, popular imaginaries have been put to work in selling this commercial vision of communication futures" (19). Thus, in the contemporary landscape of technological appliances and supplements expressed and implemented in the Internet community space, people have already become accustomed to using high technologies for their strategic purposes.
Turning to the modern estimation of Internet opportunities, it is vital to point out the Web 2.0 technologies implemented in different services, namely social networking, mashups, Internet communities, video-sharing sites, etc. (Solomon & Schrum, 2007). With the invention of Web 2.0 technologies, communication on the Internet was transformed from a passive into an active form. Thereupon the emergence of blogs, for example, let users express their feelings and opinions, sharing them with the rest of the users all over the world. Lister (2003) continues the list of relevant services with e-mailing, participation in Usenet, Bulletin Board Systems, chat rooms, etc. The overlapping of personal and business interests is now almost invisible and hardly felt, owing to the mutuality of the approaches.
There are several examples of such collaboration. For instance, the social networks Facebook, Twitter, and MySpace are used both by ordinary users and by business structures seeking to reach different communities. The Internet has become the most intensive place for conducting business on a global scale. Thus, by means of Twitter, one can not only communicate with peers and colleagues but also find information about goods and services, or even suitable educational establishments. Wood and Smith (2005) note that Internet communication is a complex of means comprising packet switching, multimedia, interactivity, synchronicity, and hypertextuality. People can gain an imaginary experience using these virtual technologies. For example, YouTube is a well-known video-sharing service that stores millions of video clips from all over the world. It is a way for many people to make a name for themselves: a would-be singer, speaker, poet, etc. Show business scouts for talent on YouTube and then promotes new celebrities onto the world arena. So this service, an interactive means of communication, is also commercialized. Commercial interests are likewise represented on sites where information about, for instance, a well-known music project is accompanied by various items carrying that project's label. All in all, when someone visits Internet resources, he or she unintentionally runs across various advertisements. Amazon.com, Google, eBay, and many other web portals provide communication on terms advantageous to both provider and user.
Alongside the trend of changing the form of relationships, a particular style of language for communication has appeared. The popular IM abbreviations and chat acronyms of today are used to express users' attitudes toward each other and toward particular matters or events. Among them are BRB (Be Right Back), BOL (Best of Luck), ROTFL (Rolling on the Floor Laughing), and others (D'Silva, 2007).
To sum up, the world of Internet relationships has grown into a huge and wide field of interest sharing. Personal, social, religious, business, and other kinds of interests are pursued through Internet communication. The commercialization of the Internet is still going on; nevertheless, nowadays such a transformation of communication between individuals is taken for granted, and there is, in fact, nothing surprising about it.
References
D'Silva, R 2007, Chat Lingo: Popular IM Abbreviations and Chat Acronyms, viewed 2007. Web.
Franklin, M 2004, Postcolonial politics, the Internet, and everyday life: Pacific traversals online, Routledge, London.
Lister, M 2003, New media: a critical introduction. Routledge, London.
Solomon, G & Schrum, L 2007, Web 2.0: new tools, new schools, ISTE (International Society for Technology in Education), Washington, DC.
Wood, AF & Smith, MJ 2005 Online communication: linking technology, identity, and culture, Ed. 2, Routledge, London.
The following report looks at Internet communications and the use of Really Simple Syndication (RSS) on web sites to enhance the way information is passed to web site visitors. RSS feeds make the latest information from various websites available to users without the need to visit each of the web sites: the application pushes the information to them, and they can choose which information to read. To set up RSS one needs the Internet, an RSS reader, and a database with the information to be shared. RSS makes it possible to share information between incompatible databases because it uses XML files, which can be used with different databases. This sharing of information can also raise a privacy issue, as information may be accessed by unauthorised persons. Hence, it is important to protect users' information, which is done by implementing privacy policies.
Introduction
An Internet communications report is relevant in the field of communication today. The way of communicating has changed, and more people are relying on the Internet to get information. This information can be obtained from web sites, blogs, online publications, and so on. It is therefore important to look into the area of Internet communications in order to improve the way information is made available and used over the Internet.
The following report will explain how RSS is used on web sites to make communication easier. It was requested by Westy, a web site owner who uploads comments about his TV show. Many people comment, and he is finding it difficult to cope with the numerous emails he gets. He needs to know about RSS so that he can use it on his site, avoid responding personally to every email, and still retain control of his site. He needs to know how RSS is set up and maintained, how to monitor information posted on his web site, how to protect his users' privacy, the benefits he can get from its installation, and how to attract people to his web site.
Methodology
The information used in this report was collected from 8 October 2009 to 22 October 2009. The data for the topic Internet Communications - Research Report was collected from secondary sources through online research. A number of Internet articles were used in compiling this report. It was not possible to gather information from primary sources due to time and financial constraints.
Findings
RSS, which stands for Really Simple Syndication, is a format designed for delivering regularly changing web content. The format is used mostly by news-related websites, online publishers, and weblogs, which syndicate their content as an RSS feed to the people who may need it. The format is based on XML, and RSS feeds use XML files that contain information conforming to the RSS specification (RSS, para. 1). RSS is not restricted to news and can be used in other ways, such as announcing updates to books (Pilgrim, para. 1). RSS enables one to get content posted on news sites and blogs without looking for it, as it is pushed to users in their RSS readers, so users always get fresh, up-to-date news and information (What is RSS, para. 3). A web site owner can maintain an RSS feed to send notifications to the people interested in the site. The notifications are made possible by computer programs known as RSS aggregators, which are designed to access RSS feeds from websites automatically, so a client can receive RSS feeds from many different websites. An RSS notification gives basic information: a title that describes the item and a link to the web page where the full details of the information can be found.
This is an example of an RSS feed:
Title: Sidewalk contract awarded
Description: The city awarded the sidewalk contract to Smith Associates. This hotly contested deal is worth $1.2 million.
To set up RSS one can use desktop applications or online services. It is difficult to write RSS manually, and most people prefer to use online services, where one submits a feed to search engines as well as directories (McCartney, para. 1). One requires an XML code. XML makes it easy to create a feed that can be generated automatically because it is a publicly accessible text file. With XML one creates data that is easily accessible, which simplifies data storage as well as sharing. Contrasted with HTML, it is better because with HTML one would need a lot of time to edit the data every time changes occur; with XML, the data is stored in separate XML files, so no changes are required in the HTML, since JavaScript can read the external XML file and update the content of the HTML page. XML also makes it easy to share data: computer systems use databases with incompatible formats, whereas XML stores data in plain text that can easily be shared (Using XML, para. 1-3). This way, no human effort is required to update a web site (Angelius, para. 2). To make an RSS feed public one needs the Internet on which to put the feed, a database that holds descriptions of the content, and a server-side scripting language that can access the database, as well as an RSS reader. A reader can be obtained freely or bought; popular RSS readers include NewsFire (used on Mac), pRSS Reader (free software for Pocket PC), YeahReader (for Windows), Liferea (for Linux), and Bloglines (which reads feeds online).
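The process described above, generating a feed automatically from stored data rather than editing markup by hand, can be sketched with Python's standard library. The channel name and URLs are illustrative, and the item reuses the sample from the findings:

```python
import xml.etree.ElementTree as ET

def build_feed(channel_title, channel_link, items):
    """Build an RSS 2.0 document from a list of item dictionaries."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = channel_title
    ET.SubElement(channel, "link").text = channel_link
    for entry in items:
        item = ET.SubElement(channel, "item")
        for tag in ("title", "link", "description"):
            ET.SubElement(item, tag).text = entry[tag]
    return ET.tostring(rss, encoding="unicode")

feed = build_feed("City News", "http://example.com/news", [
    {"title": "Sidewalk contract awarded",
     "link": "http://example.com/news/sidewalk",
     "description": "The city awarded the sidewalk contract to "
                    "Smith Associates."},
])
print(feed)
```

Because the item data lives in a database or plain data structure, regenerating the feed whenever the data changes requires no manual editing, which is the advantage over hand-maintained HTML described above.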
Maintenance of RSS feeds is very important so that users are given current information. RSS matters because it gives users information without their having to visit web sites. One can subscribe to an RSS maintenance website or buy software created for that purpose, because creating and maintaining RSS might be difficult at first for a novice. For example, there is the Feed Editor, which edits RSS and offers additional functionality. It makes sense to use RSS because it updates information automatically, which reduces the chance of forgetting to update the information on one's web site.
The web site can be monitored to ensure that people do not leave libellous comments or merely place ads, by using software that can detect such messages. Users should also be required to give their email addresses as a sign of accountability, and the site owner should reserve the right to remove any comment found to be libellous; this will discourage visitors from publishing such messages. The site owner can also delay publication so that messages are checked for suitability before they appear. Visitors should be given conditions to follow and advised against leaving derogatory messages. To limit the number of ads a user can place, the site owner should set a limit and perhaps require users to do something, such as pay a charge, before they are allowed to publish their advertisements. This will ensure that the number of advertisements is reduced.
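A first line of automated moderation along these lines might look like the following sketch. The banned-word list, the ad heuristics, and the per-user limit are all illustrative assumptions, not part of any real moderation product:

```python
BANNED_WORDS = {"libelword"}   # placeholder for a real libel/abuse word list
MAX_ADS_PER_USER = 1

def hold_for_review(comment: str, ads_already_posted: int) -> bool:
    """Return True if a comment should be held back for manual checking."""
    text = comment.lower()
    # Flag comments containing words from the banned list.
    if any(word in text for word in BANNED_WORDS):
        return True
    # Crude advertisement heuristic: sales language or a link.
    looks_like_ad = "buy now" in text or "http://" in text
    if looks_like_ad and ads_already_posted >= MAX_ADS_PER_USER:
        return True
    return False

print(hold_for_review("Great show last night!", 0))          # False
print(hold_for_review("Buy now at http://spam.example", 1))  # True
```

Held comments would then go into the delayed-publication queue described above for a human to approve or remove.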
Embedding posts from YouTube into a web page requires one to copy the code in the box labelled Embed, found in the box named About This Video, and paste it into the web page. Such a video is also embedded on other websites that are linked using RSS. To prevent videos from being embedded on other web sites, one clicks the button named My Uploaded Videos in one's account, then clicks Edit Video Info and, under Sharing Options, selects the option stating that external sites may not embed and play the video (Embedding Video, para. 1, 2).
Protecting people's privacy is very crucial because personal information can be used by malicious people to hurt someone. The site owner should have a clear privacy policy on how the information given by users will be used, and the site should not ask for personal information that is not relevant to it. Giving an email address can be made optional. Visitors should also be advised about giving personal information on the site and made aware of how their privacy may be affected by the information they give there (Cyber Ethics, Solution, para. 1). Site visitors should not publish comments that give out information such as physical addresses or phone numbers. The privacy policy should be well stated, easily understood, and free of verbose language. The site owner should also join a privacy guarantee programme that will monitor how the site uses visitors' information.
Money can be made from web sites. Selling products on one's web site, enabled by e-commerce, can earn money; Westy could sell products such as T-shirts. The site can also refer visitors to other sites that sell products and get paid a commission. Another way of making money is by introducing subscriptions to the site, either as a one-time payment or renewed at regular intervals. The site owner can also sell advertising space to businesses that want to reach the kind of audience the site attracts.
RSS helps site owners attract visitors to their web sites through its quality: if one provides an attractive RSS feed, the site can become popular, because the feed is also sent to other sites, where visitors can see it and follow the link provided. Site owners should also update their feeds often so that visitors get relevant information in good time. The site should be kept simple and easy to understand; this makes it user friendly, and visitors will find it simple to use. The login procedure should also be kept simple, as this increases RSS subscribers. The main page should have an attractive design to capture the attention and interest of visitors so that they find the site useful to their needs. Furthermore, the web site should focus on its theme, which can be brought out in the choice of text, colour, and images. The site is basically for posting comments by viewers of the show; this means that Westy should write about things or topics that will interest the web users and make them want more. If this is done they will keep visiting the site and even recommend it to their friends, as they feel it is worthy of their time. Text boxes will give visitors a chance to comment or give their suggestions about the show or about other visitors' comments. Replies should be programmed so that visitors feel that the site owner takes time to read their comments; this will make them loyal to the web site and thus increase traffic (Endo, p. 1). Video marketing can also be used to increase visitors to a site. YouTube is very popular, and videos uploaded to it are watched by many people; by uploading a creative video related to the services or products offered on one's web site, one can increase visitors because they are more likely to see the video. A link to the website can be included in the YouTube video, which ranks highly on search engines due to the high traffic YouTube receives.
The other way is to use online sites that allow free classifieds. Such sites get many visitors daily, so placing an advertisement on a site such as Craigslist will direct many of those visitors to the web site that placed the advert (Endo, p.1).
Summary
We are in a new era of communication based on the internet. Most people use this tool to find information on many topics, whether for entertainment, general knowledge, or academic research. This information is hosted on web sites, so it is imperative that web sites be user friendly and that users can benefit from the information they carry. The way of communicating has changed, and people want communication that is easy and fast to access. This has led to the development of products like RSS, which make getting information from a number of web sites easy, since information is made available as soon as it is updated or changed. This ensures that people always have the latest information. Site owners should update their web sites so that they can transmit information to their users in real time. This can be done by incorporating RSS feeds, which use XML files to deliver information to users wherever they are, as long as they have access to the internet. On the other hand, web site use has implications for privacy: users should be protected against the exploitation or harm that may be caused by exposing their information to strangers. Web sites can also be used to make money, as owners can sell advertising space, products, or services through e-commerce. Customers who buy such products or services are required to give personal credit card information, and this information has to be protected from falling into the wrong hands.
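To illustrate the point about RSS, a feed is simply an XML file listing a channel and its latest items; a subscriber's reader fetches it periodically and parses out anything new. The sketch below parses a minimal RSS 2.0 document with Python's standard library. The feed contents (the show title and example.com links) are hypothetical, invented for illustration, and not taken from this report:

```python
import xml.etree.ElementTree as ET

# A minimal RSS 2.0 feed. A real site would serve this as an XML file
# and regenerate it whenever new content is posted.
feed = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Westy's Show Updates</title>
    <link>http://example.com/</link>
    <description>Latest comments and episode news</description>
    <item>
      <title>New episode posted</title>
      <link>http://example.com/episode-12</link>
      <pubDate>Mon, 05 Oct 2009 09:00:00 GMT</pubDate>
    </item>
  </channel>
</rss>"""

# A feed reader parses the XML and lists new items with their links.
root = ET.fromstring(feed)
for item in root.iter("item"):
    print(item.findtext("title"), "->", item.findtext("link"))
```

Because the format is standardized, any RSS-capable reader or aggregator site can consume the same file, which is why publishing a feed spreads a site's updates beyond its own pages.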
Recommendations
Web site owners should upgrade their web sites with the latest technology to reach their users more efficiently and keep them updated with the latest information. They should adopt RSS and encourage their users to do so.
Due to the internet's many uses, the way of sharing information has changed. People can share information with others all over the world at the click of a button. To do so, they must sign up with providers of such services, and in doing so they are required to give personal information. They can also be traced through their computers' IP addresses. This information can be misused to harm them; thus, the privacy of internet users should be safeguarded. This should be a collective responsibility among users, site owners, and the government, and strict privacy policies should be enforced to protect internet users from hackers.
Information exchanged in e-commerce transactions should be safeguarded by the businesses involved to prevent exposing credit card numbers to hackers, who may use that information to steal money from people's bank accounts.
Web site owners should also strive to ensure that the information posted on their sites by visitors is appropriate. The information should be filtered so that derogatory messages do not end up on the web sites, as they may lower the sites' reputation.
Web site owners should take advantage of the traffic their sites receive to make some money. However, this should be done legitimately, as many web sites con people out of their hard-earned money.
Works cited
Angelius, Ladd. Set Up a Simple Syndication Feed Using RSS. Devx.com. 2009. Web.
Endo, Leonardo. How to Increase Website Visitors in Only 24 Hours or Less for Free. EzineArticles.com. 2009. Web.
The life of modern human society is often accompanied by dangerous incidents and crises that might affect individuals or be comprehensive and global in scope. Needless to say, crises can result from human activity as well as from natural causes. Scholars argue about the extent of the danger posed by each of these crises, but the fact that both crisis types need adequate management and communication strategies is beyond doubt. This paper considers the advantages and failures of the strategic communication plans used by Maple Leaf and the US Federal Emergency Management Agency (FEMA) to deal with the Listeria crisis and Hurricane Katrina respectively. The report examines the main factors that conditioned the strategies' failures in these crises and then offers an alternative strategic communication plan that might help avoid the same mistakes in the future.
Hurricane Katrina and Maple Leaf Listeria Crises Comparison
Background
The contexts in which the topic of the current report will be developed are two disasters, one natural and one man-made, that shook North American society not so long ago. Hurricane Katrina in 2005 is reported to have taken 9,000 lives and caused damage equivalent to almost $100 billion (Barnes, 2008). The Maple Leaf Listeria crisis began on August 23, 2008, and resulted in numerous people reporting food-borne illnesses (Maple Leaf, 2009). The company was forced to shut down its manufacturing facilities and recall its recent products from all markets, which cost Maple Leaf an estimated $20 million (CBS, 2008). The strategic communication plans for the two crises differed drastically: FEMA failed to provide sufficient information to the public or any timely help to Katrina's victims, while Maple Leaf introduced a relatively successful strategy that allowed the company to at least fight to restore its positive image.
Roles and Priorities
In more detail, FEMA was in charge of managing the tragedy of 2005 in New Orleans. The organization's role was to react to the hurricane's consequences and provide help to its victims in the first hours, as this time is critical. However, FEMA failed in this role and gave reason to call it a public relations embarrassment (Kolstein, 2006). Maple Leaf, on the other hand, assessed its priorities in the crisis properly and attributed much importance to public relations. The fact that the company did not hide or deny the Listeria outbreak and publicly apologized for its consequences added considerably to its international and domestic image and the respect it commands (Adams, 2008). Thus, it becomes evident that Maple Leaf performed better in identifying its role in crisis management and in treating communication as a priority.
Key Stakeholders
The reason for this difference in the quality of approaches to crisis management and strategic communication might be found in the key stakeholders of FEMA and Maple Leaf. The former is a government-run agency that deals with all emergency cases in the United States (Kolstein, 2006). FEMA is a non-profit organization, and this might be the factor that reduced the motivation of its employees when they faced the need to react urgently to Katrina and communicate the tragedy to the public. At the same time, Maple Leaf understood that its key stakeholders, apart from investors and the Board of Directors, are its customers. Drawing on this, Maple Leaf felt the need to establish communication with these key stakeholders, which allowed the company to save face in the potential storm of public outrage.
Overarching Strategy and Goals
Considering the overarching strategies of FEMA and Maple Leaf might also help identify the reasons for the crises and the companies' different responses to them. The two major overarching strategies in crisis management, according to Wilson & Ogden (2008), are training and public relations (p. 176). FEMA reportedly failed in both during the Katrina crisis: Kolstein (2006) argues that FEMA was unable even to access the territory hit by the hurricane, which evidences a lack of training, or to communicate the extent of the tragedy to the public in time. On the contrary, Maple Leaf's trained personnel made proper decisions in a timely manner, stopped the operation of the company's plants, recalled the Listeria-hit products, and publicly apologized, proving that better training and public relations strategies had been adopted in this company.
Primary and Secondary Tactics
The primary and secondary public relations tactics might also have caused the discussed failure of FEMA and helped Maple Leaf overcome its crisis properly. These tactics include the access to and collection of the needed data and the communication of those data to the respective agencies and the public (Claywood, 1997, p. 194). FEMA did not manage to access the territory hit by Katrina until four days after the hurricane and, as Barnes (2008) argues, could not provide data on victims in the area to any federal agency. By contrast, Maple Leaf had its primary and secondary tactics at the proper level, which allowed the company to access the crisis data, make decisions on their basis, and communicate both the data and the decisions to the public.
Timeline and Resources
The use of time and resources in handling Hurricane Katrina and the Maple Leaf Listeria crisis was also drastically different. Barnes (2008) reports that during the first five days after the hurricane hit, there were no communication opportunities between the city and the federal authorities. The city government was also out of reach, and the FEMA workers who accessed the flooded areas could not properly communicate with the rest of the world. Although FEMA was not at fault for this condition, the improper use and rejection of internationally provided resources might be considered one of FEMA's improper policies in managing Katrina's effects.
Maple Leaf, in its critical situation, needed only a day to collect data on the Listeria outbreak and react to it by stopping its manufacturing facilities and recalling the products. The resources the company used to deal with the crisis included its own funds of about $20 million and a powerful media campaign launched to inform the public about the problem and apologize to all potential victims of Listeria-infected products (CBS, 2008).
Measurements and Evaluations
The procedures involved in measurement and evaluation also differed and, accordingly, had different levels of efficiency. For example, FEMA is reported to have needed about five days to access the territory flooded by Katrina and measure the extent of the damage caused (Kolstein, 2006). In the context of damaged communications and transportation issues, such a long time for preliminary assessment and reaction to the hurricane was a huge mistake that left numerous people without accommodation or food until federal help arrived in New Orleans. Maple Leaf, by contrast, proved effective in reacting instantly: according to Adams (2008) and CBS (2008), it took the company only one day to measure and evaluate the crisis effects and take the respective management steps.
Recommendations
Research Focus
Based on the comparative analysis presented above, the strategic communication plans and overall crisis management strategies implemented by FEMA and Maple Leaf to handle the Hurricane Katrina and Listeria crises respectively differed completely in their success. Thus, the following recommendations mainly concern FEMA and its PR policies and techniques for crisis management. The research focus of these recommendations includes the stakeholder strategy, priority assessment, overarching strategies, public relations tactics, the use of time and resources, and the speed of reaction and decision making, which is critical during a crisis.
Stakeholder Strategy
The stakeholder strategy should be more flexible and better supported with training and practical activity opportunities. In other words, to be prepared for strategic communication and other public relations activities in case of a crisis or another emergency, the company should be more aware of the needs of its key stakeholders and, most importantly, should realize who those key stakeholders are and how their interests relate to the company's interests. Such a clear stakeholder strategy will allow mutual communication to develop between these stakeholders and will also enable the company to address its key stakeholders when it needs resources or funding. Of course, the company will in turn be completely responsible to its stakeholders for the success of its performance (Wilson & Ogden, 2008, pp. 128-130).
Tactics and Messages
The primary and secondary public relations tactics should also be improved for better performance in critical situations. First, it is necessary to improve the data collection procedures with respect to their speed and efficiency. To do this, the company should provide its employees with the respective training and establish contact with other organizations that have, or might have, access to data that can be of help during a crisis. The company should also promote and develop its positive public image. This can be achieved by developing a media strategy and establishing contacts with the mass media. It will also involve formulating the company's messages in a manner that concentrates on the need to help people in emergencies and makes public relations one of the company's basics (Claywood, 1997, p. 197).
Success Measurement
It is also natural that a company dealing with crisis management and public relations should have a clear and properly developed strategy for measuring the success of its activities. As the company deals with strategic communications planning and public relations in emergencies, it should establish a standard of success based on previous research in the area and on reports highlighting the performance of other companies in the same field. More specifically, the success measures should include reaction time and the comparative rates of successful performance when the company managed to prevent an emergency versus when it had to deal with its consequences (Claywood, 1997, p. 121).
Legal and Ethical Considerations
The company's strategic communication plan should be compiled in strict compliance with the legal requirements of the country. Thus, the company should disclose any information it obtains about the possibility of man-made or natural disasters and inform the public as soon as such information is received (the only exception being the legally agreed concealment of information to avoid panic and unpredicted consequences) (Wilson & Ogden, 2008, p. 130). The company's legal responsibility should also cover its privacy policy, which is likewise part of the ethical considerations of this strategic communications plan. Accordingly, when communicating crisis-related information and retrieving data about crisis victims, the company should follow the requirements of privacy, politeness, and political correctness in dealing with people of various races, ethnic groups, religious beliefs, and sexual preferences.
Conclusions
Summing up the data presented in this report, it is necessary to restate that crises and disasters, whether man-made or natural, are integral parts of human life. Crisis management techniques, and the importance of public relations and communications within them, should be properly considered by any organization dealing with crisis effects. The examples of FEMA and Maple Leaf show how important the proper organization of PR and crisis management activities is for the overall settlement of a crisis. FEMA failed to fulfill its duties and direct functions while fighting Hurricane Katrina's effects on the city of New Orleans, as the emergency help and the communication of the tragedy's extent were carried out with great delay and done poorly. On the other hand, Maple Leaf proved to have a properly developed PR department and crisis management policies, which allowed the company to identify, assess, and start managing the Listeria outbreak within a day of its happening. The above recommendations focus on the improvements to FEMA's PR and strategic communication plans that would help avoid failures similar to the Katrina one in the future.
References
Adams, C. (2008). The Maple Leaf Food Crisis: One Month Later. Web.
Barnes, N. (2008). FEMA: Hurricane Katrina Case Study Communications Focus. Web.
CBS. (2008). How Maple Leaf Foods is handling the Listeria outbreak. Web.
Claywood, C. (1997). The handbook of strategic public relations & integrated communications. New York: McGraw-Hill.
Kolstein, L. (2006). FEMA: A Public Relations Embarrassment. 2009. Web.
Maple Leaf. (2009). Food Safety at Home. Web.
Wilson, L. & Ogden, J. (2008). Strategic communications planning: For effective public relations & marketing (5th ed.). Dubuque, IA: Kendall/Hunt Publishing.