Acumen is a microfinance institution that provides analysis and recommendations for businesses regarding their investments, capital, and social impact. The company has used the RCT model for investor-driven recommendations concerning the social impact made by Acumen's clients. According to the company's investigation, the conventional analytical tools and evaluation models had little effect on social investment and did not represent the full spectrum of possible implications or issues that businesses and communities might face.
The Lean Data model was developed by Acumen as the solution to the discovered analytical roadblocks and recommendation pitfalls. This model is based on carefully developed survey techniques that collect data from communities affected by businesses, so that the survey outcomes can show how products, services, and investments affect people's welfare and value-added demands (Cole et al. 4). What is more, the new methodology shows how companies can improve their social responsibility and deliver more value-added products and services to their consumers. In this case, Acumen introduced a significant change that allows communities and businesses to collaborate and contribute to each other's sustainability.
Problem Identification
The development of the Lean Data methodology has become a resourceful project for Acumen; namely, it has accomplished its objectives and achieved the desired outcomes. Furthermore, Acumen conducted several successful evaluations and sprints, which proved that the model functions well and can provide insights into a business's real impact on consumers' welfare and quality of life (Cole et al. 9). Nonetheless, the company has been facing a new dilemma, as Lean Data results should be transformed into change strategies and recommendations, which is not an easy task.
The prior RCT model allowed Acumen to conduct evaluations based on investors' perspectives, placing emphasis on numeric data, financial reports, and market activity. Lean Data offers a similar view, but from a consumer-centric perspective: companies see results and areas of impact, while the driving forces are yet to be discovered (Cole et al. 10). As a result, Acumen should decide whether Lean Data is an appropriate tool for further integration and use, or whether it should become a supplemental tool to support traditional analytical models.
Alternatives
The first alternative for Acumen is to replace the prior methodologies and rely on Lean Data as the only reliable tool for social impact measurement. This option is based on the evidence collected by the company, where the survey and analysis outcomes proved to have an in-depth and in-breadth application for evaluating businesses' social responsibility and impact on consumers' welfare (Cole et al. 7). Moreover, Lean Data displays field evidence collected from consumers, making it a more consumer-oriented and potent tool that assists companies in managing their capital and social footprint.
The second alternative is to abandon Lean Data and focus on investor-centric tools that offer numeric evidence directly aligned with finance and strategic changes. The rationale for this choice is that companies will obtain straightforward data on how they perform regarding social responsibility (Cole et al. 8). Moreover, the investor-oriented approach may appeal to companies' stakeholders, as any change or issue is aligned with financial performance.
The third alternative is to apply a combined analytical approach where Lean Data shows issues from consumers' perspectives, while RCT interprets these findings in a manner that investors may prefer. In this case, Acumen can deliver an extended report to its clients, showing both consumer-centric and investor-centric positions. On the other hand, such a decision may require a new prototype and a testing period.
Recommendations
The recommended option is the first one due to its flexibility and accuracy. The third option cannot be selected because combining the new and traditional methods may cause issues and complexity in data interpretation. What is more, no evidence is given that Lean Data and RCT can be easily merged. The second option is a step back for Acumen, as the initial idea of Lean Data was to replace RCT and other investor-centric methodologies. As a result, the utilization of Lean Data should support Acumen's intentions to represent social footprints from consumers' perspectives and indicate to companies their problematic fields with value-added capital investment.
Economic growth depends on the value-added production created by a company. Value-added goods and services form the basis that supports the economy of enterprises, territorial communities, and human development. A development policy should be put in place to create conditions and incentives for value-added products and to ensure the efficient functioning of the enterprise's economy.
However, economic policy does not specify the preconditions, measures, and mechanisms of value creation as a component of companies' financial performance at different levels (Cole et al. 9). In this case, the utilization of Lean Data should support the economic growth and development of Acumen's clients by identifying possibilities to increase the number of value-added products.
Reflection
Acumen's case study and the Lean Data method show a shift in trends within the business environment. The replacement of investor-driven approaches to producing goods and services indicates that companies are more concerned about their sustainability and social responsibilities. Acumen's experience and findings show the importance of consumer-centric strategies so that communities and companies can establish beneficial relations and contribute to each other's enhancement and growth.
Work Cited
Cole, Shawn, et al. "Acumen and Lean Data 2018." Harvard Business School, no. 9-218086, 2018, pp. 123.
As seen in Images 1-4, some of the most serious security issues include the MBSA's inability to test Windows Firewall due to an error. The MBSA detected that the Automatic Updates system service is not configured, even though the best practice is for it to be set to start automatically. The security check discovered that autologon is configured on this computer, meaning that anyone who is able to obtain access to the computer can retrieve all of the computer's contents. The MBSA failed to run a local account password test. It discovered that some of the users of this computer have blank or simple passwords, and four out of five of them use non-expiring passwords. To improve the security of this computer, it is recommended to enable password expiration for the SQL Server accounts.
On the positive side, the MBSA reported that other common administrative vulnerabilities have been avoided. For instance, as per the report (see Image 1), all hard drives (1) are using the NTFS (New Technology File System) file system, a proprietary journaling file system developed by Microsoft. This computer does not support guest accounts and restricts anonymous access, which improves its security. On top of that, as Instance QSRNVIVO10 has shown (see Image 4), guest accounts are not permitted on any of the databases. Lastly, this computer is run by no more than two administrators.
The additional information obtained during the MBSA check has shown that neither logon success nor logon failure auditing is enabled. It is recommended to enable auditing for specific events such as logon and logoff. This measure would help with monitoring the event log to identify unauthorized access. As for unnecessary services, none were found during the check. For some reason, the software was unable to determine which version of Windows was used and reported that the computer was running "Microsoft Windows Unknown."
The SQL Server scan demonstrated one vulnerability and two positive results. Firstly, it turned out that the permissions on the SQL Server and/or MSDE installation folders were not set in the best way possible. On the other hand, Instance MSSQL12E.LOCALDB (Image 3) suggests that folder permissions are in perfect order. Further, the SQL Server scan showed that the SQL Server, SQL Server Agent, MSDE, and/or MSDE Agent service accounts were not members of the local Administrators group. This implies that they cannot run as LocalSystem, which would be a threat to the security of the computer.
The same results were obtained as reported by Instance MSSQL12E.LOCALDB (see Image 3), so the check was passed. Lastly, the MBSA found no issues with the SQL Server/MSDE Security Mode. The report shows (see Image 2) that the SQL Server and/or MSDE authentication mode is set to Windows Only. However, as seen in Image 3, the SQL Server and/or MSDE authentication mode is set to SQL Server and/or MSDE and Windows (Mixed Mode). The last check by Instance MSSQL12E.LOCALDB concerned the CmdExec role, and it was passed, as the role was restricted to sysadmin only. In summary, this computer requires some improvements to enhance its security and protect it from threats.
IT Security Risk Management at the UAE Bank in Dubai
Insider Threats
Step 1. Risk Framing
Task 1-1. Risk assumptions. Insider threats pose serious risks to financial institutions. This issue encompasses all kinds of threats that come from employees, third-party partners, contractors, and freelancers. Anyone who has access to the banking environment can eventually take advantage of its vulnerabilities. For the managers of an organization, the wrong assumption on which to build an IT security strategy would be that this problem does not apply to them. The UAE Bank in Dubai operates on the assumption that, as much as it wants to trust all the individuals involved, statistically, the majority (60%) of cyber-attacks come from within (Marous). Therefore, the UAE Bank in Dubai chooses to take this issue with the utmost seriousness and develop its IT security strategy accordingly.
Task 1-2. Risk constraints. Insider threats are fairly difficult to identify and prevent. The reasons for attacking a company from within vary significantly. Hueca et al. explain that insider threats may be driven by motivation, opportunity, or rationalization (4). For example, an employee struggles financially or has sick relatives, which drives him or her to gain access to the company's resources illegally. Sometimes, the system itself provides an opportunity for someone with malicious intentions (for example, financial record databases can be accessed, searched, and manipulated without such events being logged). Lastly, there may be rationalization taking place (for example, a contractor attacks a company because they think that they were not paid fairly). As seen from this classification and these examples, insider threat detection has everything to do with understanding people's psychology as well as workplace dynamics. This is often impossible, as individuals typically prefer to keep their motives to themselves.
Task 1-3. Risk tolerance. Risk tolerance defines the degree to which a financial institution is ready to accept risk, whether expressed in qualitative or quantitative terms (Hopkin 89). Risk tolerance is used as one of the key criteria in the decision-making process (Hopkin 103). The emergence of the new economy, in which the existing risk-reward relationship has become much less predictable and more complicated, makes defining risk tolerance quite difficult. In the banking sector, compliance risk tolerance may be the most prominent type. Financial institutions operate in environments characterized by relentless change that often concerns legislation. It is recommended that dynamic banking systems keep their compliance risk tolerance close to zero. In other words, the UAE Bank in Dubai cannot and should not tolerate any incidents that undermine its legality, insider threats included.
Task 1-4. Priorities and trade-offs. Drawing on the previous point, it is safe to assume that one of the top priorities for banks is meeting regulatory and compliance requirements. Previously, many financial institutions chose a reactive mode when it came to regulations and compliance. In other words, they would only change in response to regulatory orders, audits, or other pressure. It is essential that the UAE Bank adopt a proactive approach and establish a stronger connection between legislation and its business strategy. A trade-off, in this case, would be operating costs that might not be as low as a bank would like them to be. Marous reports that operating cost reduction ranks third among the top priorities for modern banks, especially for large organizations such as the UAE Bank in Dubai. Marous explains that financial organizations put a lot of time and effort into cutting costs wherever possible. However, sometimes these efforts are exerted without a proper understanding of how short-term financial gains affect the organization as a whole. The UAE Bank cannot and should not cut risk management expenses, as safety is vital to the organization's operations and reputation.
Step 2. Risk Assessment
Task 2-1. Threat and vulnerability identification. The two main types of insider threats are called turncloaks and pawns: the former are malicious and intentional, while the latter are unwilling participants. A turncloak is an insider (employee, third-party partner, contractor, or freelancer) who abuses their work privileges (Hopkin 201). Whether for profit or sheer entertainment, this type is driven by all sorts of motives. A pawn, on the other hand, is a normal employee who might be unfortunate enough to let an error slip, which is later exploited by a turncloak. Some of the mistakes a pawn might make include losing gadgets with company data, sending a sensitive document to the wrong person, or installing malware. Insider threat indicators fall into two categories: digital and behavioral (Hopkin 152). Among the digital indicators are downloading or accessing large amounts of data for no apparent reason, accessing sensitive data outside one's job function, and accessing data that is outside of a person's normal behavioral profile. Behavioral indicators include attempts to bypass security, staying at work overtime, having a history of violating corporate policies, and discussing resignation.
Task 2-2. Risk determination. Ignoring insider threats may lead to many unwanted effects for the UAE Bank in Dubai. For instance, a malicious employee may sabotage systems and send proprietary data to competing companies, which, in turn, will take advantage of the access to unique technological solutions. Another possible situation would be an insider obtaining access to customers' financial records and manipulating them for personal financial gain. Lastly, some insiders might want to abuse their work privileges to pursue non-financial interests. An example would be illegally retrieving staff personal data to stalk female coworkers. A company that allows this to happen without taking proper measures risks not only losing valuable employees who no longer feel safe but also ruining its reputation among both potential employees and customers.
Step 4. Risk Monitoring
Task 4-1,2. Risk monitoring strategy and risk monitoring. The insider threat monitoring strategy needs to include two elements: human resources management and technology. As mentioned above, employees' ulterior motives remain concealed, which makes it difficult to predict whether any of them could potentially have malicious intentions. Firstly, the employment process needs to include extensive background checks. Secondly, the human resources department, together with other relevant professionals, might want to assess how much risk each person or team poses based on their work privileges, such as access to databases, use of proprietary software, and others. As for the second aspect, banks should consider gaining better visibility into user activity by installing relevant monitoring software. This measure could be expanded into a full-fledged intelligence-based approach. Such an approach could employ AI and machine learning tools for detecting abnormal activities and fraud.
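As a minimal sketch of what such abnormal-activity detection could look like, the example below flags users whose daily data-access volume deviates sharply from their own historical baseline. The data format, threshold, and function name are illustrative assumptions, not details from the case.

```python
from statistics import mean, stdev


def flag_anomalies(history, today, threshold=3.0):
    """Flag users whose data access today deviates from their own baseline.

    history: dict mapping user -> list of past daily access volumes (e.g., MB read)
    today:   dict mapping user -> today's access volume
    Returns the set of users whose z-score exceeds the threshold.
    """
    flagged = set()
    for user, volumes in history.items():
        if len(volumes) < 2:
            continue  # not enough history to build a baseline
        mu, sigma = mean(volumes), stdev(volumes)
        if sigma == 0:
            sigma = 1e-9  # avoid division by zero for perfectly flat baselines
        z = (today.get(user, 0) - mu) / sigma
        if z > threshold:
            flagged.add(user)
    return flagged


# Hypothetical daily volumes: "bob" suddenly reads far more data than usual.
history = {"alice": [10, 12, 11, 10, 13], "bob": [10, 11, 10, 12, 11]}
today = {"alice": 12, "bob": 500}
print(flag_anomalies(history, today))  # {'bob'}
```

A real deployment would of course draw on richer features (time of day, data sensitivity, device type), but the per-user baseline idea is the same.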
Malvertising
Step 1. Risk Framing
Task 1-1. Risk assumptions. Recent statistics have revealed that one in every hundred online advertisements is malicious (Sullivan). While at first glance this share might not seem significant, as explained by Sullivan, more than 20% of user sessions may include interactions with malicious or disruptive ads. For this reason, the UAE Bank in Dubai does not dismiss the danger of malvertising.
Task 1-2. Risk constraints. The efficiency of malvertising detection is contingent on two factors: employees' digital literacy and cybercriminals' efforts to avoid detection. Employees' online activities are generally not easy to control without resorting to micromanagement, which complicates predicting possible attacks. At the same time, cybercriminals have been putting a great deal of effort into earning a good reputation among third-party vendors and designing ads in such a way that they are indistinguishable from harmless ones.
Task 1-3. Risk tolerance. As with any other bank, the UAE Bank in Dubai's success relies on the degree of trust that its customers are ready to put in it. A disruption caused by malvertising and a subsequent leakage of data or other adverse outcomes might compromise the bank's reputation. For this reason, the UAE Bank in Dubai should adopt a policy of zero tolerance for safety threats.
Task 1-4. Priorities and trade-offs. As mentioned in the previous section, any company faces a conflict between cutting costs now and investing more in aspects that have the potential to yield benefits in the long run. Safety concerns should not be treated as a trade-off: they need to be a priority, even if it means piling up more costs.
Step 2. Risk Assessment
Task 2-1. Threat and vulnerability identification. Malicious advertisements can seize users' attention in two ways. The first scenario is when a user is manipulated by a pop-up ad or an alert and clicks on it by accident (Dwyer and Kanguri 29). The second scenario is somewhat trickier: the attackers employ drive-by download techniques, which expose the user to malicious content while the hosting website loads (Dwyer and Kanguri 29). Once the malvertisement identifies a vulnerability, it installs info-stealing malware or ransomware onto the victim's computer.
Task 2-2. Risk determination. The consequences of exposure to malvertising are two-fold: the attacks may impact both the stability of the banking system and, on a larger scale, the company's reputation. While browsing the Internet, the bank's employees may come across malicious ads and unknowingly let them take advantage of their computers' vulnerabilities. An alternative scenario would be a malvertisement making its way onto the bank's official page. In this case, the page's visitors will suffer from exposure and likely associate the bank with poor safety standards.
Step 4. Risk Monitoring
Task 4-1,2. Risk monitoring strategy and risk monitoring. The malvertising monitoring strategy needs to rely on two aspects: human resources management and technology. Firstly, bank employees need to be educated on safe Internet use and become familiar with suspicious ads, redirects, and other online activities. The staff needs to be knowledgeable enough to make smart decisions on a daily basis. The second part of the strategy should include the implementation of cutting-edge ad scanning systems for faster and more efficient malvertising detection. For all its efforts, the security team should still operate on the assumption that it cannot prevent 100% of attacks. Therefore, there needs to be a standardized plan in place that describes in detail how to handle the aftermath.
Works Cited
Dwyer, Catherine, and Ameet Kanguri. "Malvertising: A Rising Threat to the Online Ecosystem." Journal of Information Systems Applied Research, vol. 10, no. 3, 2017, p. 29.
Hopkin, Paul. Fundamentals of Risk Management: Understanding, Evaluating and Implementing Effective Risk Management. Kogan Page Publishers, 2018.
Hueca, Angel L., et al. "Exploring the Motivation Behind Cybersecurity Insider Threat and Proposed Research Agenda." 2016. Web.
Marous, Jim. "Top 10 Strategic Priorities for Banking in 2017." The Financial Brand, 2017, thefinancialbrand.com/62711/top-10-strategic-priorities-for-banking-in-2017.
Recently, insider threats have become one of the most complex problems for various kinds of companies. This problem is especially acute in structures closely related to ensuring security, including state security. Systems with a certain degree of openness, such as those of government, industry, universities, and research laboratories, suffer most of all in such situations (National Counterintelligence and Security Center [NCSC], 2017). The loss of valuable information in such structures can lead to a wide variety of consequences. That is why the IC manager and leader must concentrate all forces on ensuring all protective measures and counter-intelligence programs. The purpose of this essay is to discuss the role of various new technologies in countering insider threats.
First, it is necessary to determine what exactly an insider threat is and who an insider is. According to Hunker and Probst (2011), an insider can be a person who has privileged, legal access to any organizational structure and has the right to represent it or make changes to it. Accordingly, an insider threat can be described as a threat emanating from a person who uses his or her access for other purposes or whose access leads to improper use of the system. The most critical aspects that define an insider are access to the system, the ability to represent the system to outsiders, as well as the trust of the organization and knowledge (Hunker & Probst, 2011). Thus, counteraction to insiders should be directed along these aspects.
All approaches to solving the insider threat problem can be divided into two types: social and technical. According to Safa, Furnell, et al. (2019), factors such as the severity of sanctions and lower remuneration have a substantial impact on employees' attitudes toward the company. Accordingly, to reduce risks, managers are advised to pay attention to environmental factors and improve the working environment (Safa, Watson, et al., 2018). However, these factors can only generally improve the situation, and for effective results, it is necessary to use technical measures. One such measure is, for example, host-based analytics, which analyzes data collected from the host, i.e., from every computer.
A variety of statistics can fall into this category of data, from the simplest values to application data. Such data may seem useless, but with the accumulation of statistics, this approach makes it possible to evaluate user behavior (Liu et al., 2018). As stated above, a host is assessed by a variety of parameters, starting with system calls. These operations give an idea of exactly how a program accesses the internal resources of the computer. Thus, this method is useful for an in-depth analysis of user actions on the host computer, which allows detecting violations in work.
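To make the idea of profiling system-call statistics concrete, the toy sketch below compares a session's system-call mix against a user's historical profile. The call names, the L1-distance metric, and any thresholding are illustrative assumptions, not the specific method described by Liu et al.

```python
def call_profile_distance(baseline, session):
    """Compare a session's system-call frequency mix to a user's baseline.

    baseline, session: dicts mapping syscall name -> observed count.
    Returns the L1 distance between the two normalized distributions
    (0.0 = identical mix, 2.0 = completely disjoint).
    """
    def normalize(counts):
        total = sum(counts.values())
        return {k: v / total for k, v in counts.items()} if total else {}

    p, q = normalize(baseline), normalize(session)
    keys = set(p) | set(q)
    return sum(abs(p.get(k, 0.0) - q.get(k, 0.0)) for k in keys)


# A session with the same proportions as the baseline scores 0.0;
# a session using entirely different calls scores the maximum, 2.0.
print(call_profile_distance({"read": 50, "write": 50}, {"read": 25, "write": 25}))  # 0.0
print(call_profile_distance({"read": 10}, {"exec": 10}))  # 2.0
```

A monitoring system could alert when this distance exceeds a tuned threshold, signalling that a session does not look like the account's usual behavior.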
Keyboard and mouse usage dynamics are directly related to user behavior. Since the data collected is directly personal, this method is most suitable for identifying people who impersonate workers, the so-called masqueraders (Liu et al., 2018). Finally, one of the most sophisticated methods is tracking the logs of committed actions. Its complexity lies in the massive amount of data obtained, but even there, useful information can be found. For example, a long chain of login errors can signal a blatant attempt to break into a system (Liu et al., 2018). In addition to these three factors, there are many more, for example, the study of network traffic, but they are all united by the same idea.
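The login-error example above can be sketched in a few lines: scan an ordered event log and report accounts with long runs of consecutive failures, a classic brute-force indicator. The log format and the failure threshold here are assumptions for illustration.

```python
def detect_bruteforce(events, max_failures=5):
    """Report users with long runs of consecutive failed logins.

    events: ordered list of (user, outcome) tuples,
            where outcome is "success" or "failure".
    Returns a dict mapping user -> longest failure run,
    for users whose longest run reached max_failures.
    """
    streak = {}   # current run of consecutive failures per user
    longest = {}  # longest run observed per user
    for user, outcome in events:
        if outcome == "failure":
            streak[user] = streak.get(user, 0) + 1
            longest[user] = max(longest.get(user, 0), streak[user])
        else:
            streak[user] = 0  # a successful login breaks the chain
    return {u: n for u, n in longest.items() if n >= max_failures}


# Hypothetical log: "eve" fails six times in a row, "amy" fails once.
events = [("eve", "failure")] * 6 + [
    ("amy", "failure"), ("amy", "success"), ("amy", "failure")]
print(detect_bruteforce(events))  # {'eve': 6}
```

Real log analysis would also consider timestamps and source addresses, but the core pattern, counting consecutive failures per account, is the same.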
Thus, host-based analysis is an effective and multifaceted method of preventing and detecting insider threats. Unlike various social methods, this approach makes it possible to identify an intruder directly by collecting information about the user. However, the disadvantage of this method is the difficulty of implementing these algorithms, since, in addition to producing a correct result, the speed of producing that result is also critical. Accordingly, the organization of host-based analysis requires powerful machines and trained specialists.
References
Safa, N. S., Maple, C., Furnell, S., Azad, M. A., Perera, C., Dabbagh, M., & Sookhak, M. (2019). Deterrence and prevention-based model to mitigate information security insider threats in organisations. Future Generation Computer Systems, 97, 587-597. Web.
Safa, N. S., Maple, C., Watson, T., & Von Solms, R. (2018). Motivation and opportunity based model to reduce information security insider threats in organisations. Journal of Information Security and Applications, 40, 247-257. Web.
Modern society is gradually shifting from the traditional way of doing business to a mobile one. There is a tendency to ensure not only the availability and completeness of the information received but also its timeliness and relevance. To meet this need, wireless networks are being introduced into business everywhere. This paper will discuss the main categories of wireless networks that have the potential to be used in business, as well as the pros and cons of such a solution.
Wireless networks are elements of information technology designed to transmit data between the receiver and the sender over long or short distances without the use of wires. Various techniques can be used to send information, such as radio waves and optical, infrared, and laser systems (Glisic, 2016). WPAN is a personal-area technology that enables communication between devices over short distances, typically up to tens of meters. One of the most common examples of such a network, Bluetooth, can be used in offices to connect peripheral computer equipment, transfer materials between employees, or operate tracking and control devices. WLAN is a wireless local area network, better known as Wi-Fi. The network is used for simultaneous Internet access by a large number of employees. WMAN is a telecommunication technology that operates within a metropolitan area (Paul and Kumar, 2016). WiMAX, as an example of WMAN, can be a lifeline for office premises that are geographically isolated from cable Internet. Finally, WWAN is the widest-area network, capable of interconnecting the international offices of conglomerates without being tied to specific systems. The 5G network, on which many bets are placed today, can be an ideal solution for the uninterrupted and fast transfer of corporate data across the planet.
Wireless networks increase the mobility of employees in office or production facilities and eliminate a large number of wires, along with the costs of installing and maintaining a wired network. Modern wireless networks are designed to provide employees with high speed and stability. However, wireless networking for business has some significant drawbacks as well (Paul and Kumar, 2016). Technically, most networks in existence today do not provide users with uninterrupted access to the Internet. In addition, these networks are the most vulnerable, as sensitive data is transmitted over the air. The integration of wireless networks into a business also has a significant economic impact, as it requires improvements not only in cybersecurity but also in modems and routers.
Reference List
Glisic, S. G. (2016) Advanced wireless networks: technology and business models. 3rd edn. New York: John Wiley & Sons.
Paul, S. and Kumar, S. (2016) 'A survey on wireless security', International Research Journal of Engineering and Technology, 3(12), pp. 396-410.
Video streaming service providers are online platforms that allow for streaming, and sometimes downloading, of films and television series for watching by individual consumers or households. Users pay for the service on a subscription basis, with a set monthly fee that provides access to a catalog of media. Subscription plans differ among companies, offering varying access to content, different streaming quality, or additional features depending on how much a consumer pays. Streaming services differ from other media providers, such as cable television, by offering viewers the choice of what and when to watch. Companies acquire licenses for existing media or create their own films and television shows that the user can browse, select, and watch at any given time on most devices with an internet connection (Zimmerman, 2019). While an internet connection, either broadband or mobile, is required to use streaming services, the wide availability and affordability of the internet today make such platforms popular due to the freedom and mobility they offer in comparison to stationary and highly expensive cable television.
Netflix
Netflix is synonymous in modern popular culture with video streaming, largely being the first platform to develop this type of business model in practically any entertainment sector. While Netflix once held a near-monopoly with a 91% market share in the sector, it now holds approximately 19% (due to massive competition) despite a consistent addition of new subscribers, having 167 million users globally and 60 million in the U.S. (Clark, 2020). In 2017, Netflix announced slight price changes, with the Basic plan (standard quality only) at $7.99, the Standard plan at $10.99, and the Premium plan at $13.99. The only differences between the plans are the streaming quality and the number of devices that can watch at the same time. The price hike did not affect subscriber growth, which continued on an upward trajectory while providing Netflix a significant jump in operating profits from $154 million to $245 million (Poyar, 2020).
Netflix stands out from the competition by having neither advertising from outside marketers nor differently priced tiers of content. All customers, no matter the plan, have access to the full library. Netflix uses subscriber fees to secure licensing agreements, totaling $15.3 billion in expenses, consistently negotiating new deals. Netflix is a data-driven company, utilizing complex analytics to gauge consumer interest both in certain titles and in genres, and comparing this to the competition. The data is extremely detailed, ranging from the times and dates when content is watched to device types and zip codes, as well as behaviors such as searches and browsing activity on the platform. This is carefully analyzed before engaging in negotiations for licensing deals (Patel, n.d.). Netflix is also known for creating original content series that are unique and not built on existing franchises, unlike most new content from other streaming services, a strategy which has proven to be highly successful financially (Spangler, 2020).
Due to Netflix's early start in the space, it has a broad appeal to a wide range of consumers. It is particularly popular with younger generations aged 18-24 as well as a large portion of adults aged 25-39, and it is also popular with families (Yang, 2017). The Netflix platform is well-developed and adapted for mobile, allowing viewing on the go; it offers high-quality ultra-HD content for those who enjoy it, along with a tremendous library of choices in every genre, for an affordable subscription price. The platform also lets viewers watch shows continuously in one sitting, even those just released, creating the phenomenon of "binge watching," which makes it appealing to a wide range of consumers (Kay, 2019). Overall, out of all streaming platforms, Netflix can be considered the most versatile, trendsetting, and, in some ways, the most affordable for the content offerings it has.
CBS All Access
CBS All Access is a North American streaming service that offers live programming from a wide range of CBS stations, news and late-night shows, sports such as the NFL, and a number of properties to which CBS owns access, such as Nickelodeon, Comedy Central, MTV, Smithsonian, and the Paramount film collection. It also owns back catalogs of popular television properties such as The Big Bang Theory, Star Trek, CSI: Miami, and Twin Peaks. CBS offers two subscription plans: one for $5.99 a month with limited commercials, and another for $9.99 a month without commercials and with the ability to download videos to watch offline.
Unlike Netflix, CBS All Access does not offer ultra-HD or HDR quality for any of its content, even its newest additions. However, because most of its extensive catalog of older shows was produced by CBS studios, it offers many of them, along with older movies, in HD quality that cannot be found on the other streaming platforms that license the same content. One prominent example is the remastered Star Trek series. CBS All Access produces very little original content, and the few shows it does produce are spinoffs of existing properties. Nevertheless, its back catalogs and its latest cable-television programming distinguish it from competitors.
CBS All Access has one of the cheapest subscription offerings with commercials, second only to Hulu, while its regular subscription is also currently slightly below competitors'. This is possible for two reasons. First, unlike Netflix, CBS All Access does not invest billions in original content. The majority of its library comes from its own studio-produced catalog, which belongs to the parent company ViacomCBS and requires no licensing expenditure. In fact, CBS rarely holds exclusive rights to any property and continues to license content out to competitors for additional income. Second, CBS cannot boast large subscriber numbers, with only approximately 4 million users in the U.S. and gradual growth. It seeks to appeal to price-sensitive market segments with plans that are more affordable than both competitors and regular cable television while combining features of the two. The income forgone on the basic subscription is supplemented by commercial advertising revenue.
CBS All Access offers immediate high-quality versions of its latest prime-time content, which appeals to those who do not have a TV antenna or cable or who dislike the constant commercial interruptions of regular television. It also offers its content in original and often better quality than when it is licensed to other platforms. Both the cheaper and the more expensive plan provide access to the same content at the same quality; the only difference is the presence of advertisements. This differs from the Netflix approach, which, on principle, has refused since its inception to carry advertising on the platform, even for its own shows. The difference may stem, once again, from the fact that CBS is inherently a television studio company that relies heavily on advertising for profitability. The price difference for CBS All Access users affects only the experience (commercial interruptions) and does not touch the underlying content one is trying to watch. It is likely that if versioning were not possible and CBS could not offer differing plans, the company would still remain on the low end of subscription costs at $8.99 or lower. This is due to its own back catalog, the lack of need to invest in original content, and its appeal to consumers who seek to view live and syndicated television through a streaming platform at an affordable price.
Amazon Prime Video
Amazon Prime Video is an online video streaming service owned and operated by the technology company and online marketplace Amazon.com Inc. Similar to other platforms, users are offered access to a library of television shows and films, as well as original content produced by Amazon, following in the footsteps of Netflix. However, as on other online marketplaces, users can also purchase or rent newer movies or television shows on demand. The service additionally offers subscriptions to live television content from other studios such as HBO, Cinemax, and Showtime for an extra price, similar to a cable package (Csathy, 2020).
Amazon Video takes a unique approach to its pricing, bundling it with the overarching Amazon Prime service. Its most popular plan, monthly Prime at $12.99, includes access to Prime Video as well as features such as unlimited music streaming, digital storage, and an e-book library, along with free fast shipping from the online retail marketplace. The annual Prime package at $119 a year offers the same features at roughly a $36 discount compared with paying every month. Finally, there is a standard rate of $8.99 a month that offers access to Prime Video and nothing else. Content and quality do not differ among the plans where Prime Video itself is concerned (Amazon, 2020).
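The annual-plan saving mentioned above can be checked with simple arithmetic; this sketch uses only the plan prices listed in the text.

```python
# Prices as listed in the text (current at the time of writing).
PRIME_MONTHLY = 12.99   # per month, full Prime bundle
PRIME_ANNUAL = 119.00   # per year, same features
VIDEO_ONLY = 8.99       # per month, Prime Video only

monthly_total = PRIME_MONTHLY * 12              # cost of paying month by month
annual_savings = monthly_total - PRIME_ANNUAL   # discount for the annual plan

print(f"Paying monthly costs ${monthly_total:.2f}/year")
print(f"The annual plan saves ${annual_savings:.2f}, i.e. roughly $36")
```

The exact saving comes to just under $37, which matches the approximately $36 discount cited above.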
This bundling of services is a strategy Amazon is known for. Due to the popularity and relevance of the online marketplace, Amazon Prime is one of the most popular subscriptions, with over 150 million global users. Prime's success in remaining competitive came at the expense of larger profit margins; instead, it focuses on building brand and customer loyalty through the various Prime benefits that come with the subscription (Csathy, 2020).
In a survey of Amazon Prime consumers, streaming video was ranked as the most valued product, followed by the flagship shopping and shipping benefits. By bundling the services, Amazon creates a value package for the consumer, not just in advertising but in realistically giving users much more for the price than most competitors. A user who came for the video streaming service is thus also offered other entertainment options and can enjoy free shipping and discounts (Campbell, 2020), which may incentivize people to use not only Prime Video but the marketplace as well. Meanwhile, those who use Amazon Prime primarily for shopping gain access to video streaming and an incentive to purchase one of the newer, non-free movies or episodes. In the end, bundling benefits Amazon in the long term by creating an ecosystem of user services with various points of profitability. The accessible price points and benefits create a scenario of conditioning prices on purchase history, since consumers are willing to pay more for the features they favor, especially when there is something available for everyone, which helps to differentiate customer segments. There has always been stable demand for core services such as two-day shipping, but Prime Video adds something new in terms of services and sector, further enhancing the company's product bundle.
HBO Now
HBO Now is a direct-to-consumer video-on-demand service that provides access to the premium content available on the cable television network HBO. Access to HBO Now is either bundled with a cable package or available on its own for a current price of $14.99. The service has approximately 5 million paying subscribers. Owned by the media giant Time Warner, HBO Now offers a library of original series and films from the HBO network as well as content from partners such as Warner Brothers, 20th Century Fox, and Universal Pictures. It does not offer the live streaming capabilities of the HBO Go platform available to cable package owners (Pino, 2019). At this price point, HBO Now remains the most expensive video streaming platform among its competitors.
Although anecdotal evidence suggests that HBO Now has been accused of overcharging for its services, its pricing strategy is appropriate to both its positioning and its purpose as a platform. First, HBO positions itself as a premium service all around: it is one of the most expensive cable channel additions and takes a similar approach with its streaming service, which costs about the same on a month-to-month basis. HBO Now was not created primarily to compete with cord-cutter services like Netflix but simply to give those who want to watch HBO content a way to do so without the restrictions of cable or broadcasting (Barr, 2015). Furthermore, HBO, both the cable network and the streaming service, differentiates itself not through the quantity of its content but through its extremely high quality in every aspect: visual fidelity, storytelling, and award-winning productions. Only in recent years have other platforms begun producing similar content. The premium HBO brand behind pop-culture favorites such as Game of Thrones creates a segment in which HBO Now stands out.
Comparing HBO and Netflix is difficult. Despite having far fewer subscribers than Netflix, HBO Now is not a standalone service: the premium content relies on cable and satellite providers to be successful, and HBO's parent company, Time Warner, depends heavily on its existing relationships with them. The HBO Now platform is a middle-ground compromise between traditional cable and the future of streaming, but it ultimately allows the company to capitalize twice on existing content. The price is therefore unlikely to fall below its current level unless there is a radical shift in how consumers receive television (Barr, 2015). Although a mass audience has transitioned to streaming services, a significant number of American households still buy and watch cable television, and many of the most popular shows still air on television before moving to streaming, with Netflix and Amazon original content being essentially the only outliers. This is why HBO Now will remain a high-priced premium service, and its target-audience focus does not require the versioning or subscription plans used by other platforms.
Summary and Conclusion
The popularity of streaming platforms is largely due to the cord-cutter movement, a pattern of viewers canceling multichannel subscriptions or cable packages in favor of other types of viewing, one of which is digital streaming. Cable television has historically been considered expensive, while streaming services are often both cheaper and more convenient, since viewers do not have to rely on broadcast schedules and control what they watch. Despite price hikes on the most popular platforms in recent years, streaming still remains cheaper than cable, since cable companies also raise prices and commonly require contracts and installation fees. From a consumer standpoint, one also saves on other broadcast and package fees, such as those for sports the household may never watch. With streaming, consumers control what they sign up for, picking and choosing the platforms with the best shows for their interests, and many services now offer live television as well. Often, several streaming subscriptions together cost less per year than a cable TV package, even with broadband internet costs included (Snider, 2020).
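The annual-cost comparison described above can be sketched with back-of-the-envelope arithmetic; every price below is an illustrative assumption, not a figure from the text.

```python
# Hypothetical monthly prices, chosen for illustration only.
streaming = {"Netflix": 12.99, "CBS All Access": 5.99, "HBO Now": 14.99}
broadband = 50.00        # monthly internet, needed for streaming
cable_package = 110.00   # monthly cable bundle including broadband

# Annualize both scenarios.
streaming_year = 12 * (sum(streaming.values()) + broadband)
cable_year = 12 * cable_package

print(f"Streaming + broadband: ${streaming_year:,.2f}/year")
print(f"Cable package:         ${cable_year:,.2f}/year")
```

Even with three subscriptions and broadband counted, the streaming scenario comes in several hundred dollars below the assumed cable bundle, consistent with the claim above.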
It is evident that each video streaming service approaches pricing differently. Netflix has mid-range pricing with very gradual hikes each year to finance its licensing and content creation, with all content available to every subscriber. Amazon follows similar pricing but focuses on bundling additional products and offering purchases of outside content. CBS All Access takes a more affordable approach, focusing on wide availability, with one plan carrying advertisements and live television offerings. Meanwhile, HBO Now is the premium offering, with a limited catalog but high-quality content. Each pricing strategy meets both the financial requirements (profit margins) of the company and targets the market segments most likely to sign up for the service, allowing subscriber numbers to keep growing even with rising competition in this sector.
References
Clark, T. (2020). Netflix is still growing wildly, but its market share has fallen to an estimated 19% as new competitors emerge. Business Insider. Web.
Csathy, P. (2020). Amazon Prime Video: The stealthy, ominous streaming force. Forbes. Web.
Kay, P. (2019). What Netflix's approach to audience personalisation can teach SMEs about content strategy and demographic targeting. Web.
Patel, N. (n.d.). How Netflix uses analytics to select movies, create content, and make multimillion dollar decisions. Web.
Pino, N. (2019). HBO Now: Everything you need to know about HBO's standalone service. Web.
Potential Effects of the Division of USCYBERCOM and NSA
From an intelligence management perspective, the division of USCYBERCOM and the NSA would not impair mission effectiveness, yet it would divide limited resources. First, it is important to consider that the NSA and Cyber Command have separate mission sets and operate under distinct legal authorities (INSA's Cyber Council, 2018). While the NSA is responsible for intelligence collection, cyber surveillance against foreign powers, and espionage, Cyber Command holds the authority for offensive cyber operations. The close bond and cooperation the two organizations have historically established and maintained is no longer critical (INSA's Cyber Council, 2018). Keeping cyber collectors and warriors under the same leadership may appear ineffective and limiting to the autonomous mission of each. Cyber Command once relied heavily on the NSA's resources, but the organization has matured in both manpower and operations, exhibiting a greater degree of functional independence.
While there is little doubt that the two organizations can fulfill most of their missions effectively, the question of adequate resource allocation remains to be answered. Shared personnel, tools, equipment, and knowledge bases have ensured the proper execution of the command and control systems used to deconflict cyber operations (US Department of Defense, 2018). Holistic training of employees and common objectives in mutually achievable missions have prevented the country's defense from being jeopardized. Therefore, a potential split of USCYBERCOM and the NSA will likely result in the division of limited resources, such as a highly qualified labor force, technical equipment, and the tools enforcing the nation's cybersecurity (DoD, 2018). To prevent an insufficient distribution of resources, careful preparatory measures should be taken to ensure a gradual separation of the two organizations. Though the initial call made by Obama's administration failed to advance in Congress, a division of USCYBERCOM and the NSA could occur soon.
Risks to Ending the Dual-Hat Leadership Arrangement
The debate over the dual-hat leadership arrangement is ongoing and multifaceted. Ending the arrangement comes with its own challenges and risks; thus, the decision has to be carried out with the utmost attention to detail and risk assessment. To start with, termination of the arrangement may lead to the unfair prioritization of support requests (Machiavelli, 2018). Special attention, accompanied by unfair treatment, can be given to an organization or mission based on the individual favor or preference of the commander (INM 660). For example, partiality toward the collection of signals intelligence over the execution of cyber operations could advance the NSA's cause over Cyber Command's mission. Another risk to consider is an overly broad span of control, resulting in the decay of effective leadership (Machiavelli, 2018). It can be inefficient for a single commander to oversee two large independent organizations, with a detrimental impact on management.
Ending the dual-hat leadership arrangement might also increase the potential for exposure of NSA operations. Shared resources are not guaranteed to be allocated adequately, and sharing can compromise the secrecy of cyber tools (DoD, 2018). As the arrangement ends, tension between the staff of the two organizations might escalate, and a strained relationship is harmful to military tasks and intelligence operations in mutually achievable missions. The tight bond between the NSA and Cyber Command also shapes Cyber Command's culture and operational development (DoD, 2018). To manage the risk in the future, equities should be reconciled, and resources should be allocated per the collective decision of the personnel. A proper balance should also be found between the warfighting and intelligence missions.
References
INSA's Cyber Council. (2018). A framework for cyber indications and warning. INSA.
Machiavelli, E. (2018). Breaking up the dual-hat leadership of the National Security Agency and United States Cyber Command: The central debate in the United States cyber community. Maggio.
US Department of Defense. (2018). Department of Defense cyber strategy. DoD.
Modern technologies are developing at an incredible speed, and this process is accelerating over time. New achievements in information technology and electronics are affecting all areas of human life: they are used in people's everyday activities, business, trade, and science. In particular, modern technology has a significant impact on the military sphere. Researchers note that "leaders say they are looking for a new approach to data, one that breaks down siloes while maintaining security" (Stone, 2020, p. 1). Thus, this issue is now among the most vital for military leaders.
ISR (intelligence, surveillance, and reconnaissance) is one of the most important methods of data collection, and as technology develops, the capabilities of these systems increase. Undoubtedly, this affects the quality of intelligence and tracking, allowing analysts to obtain more data and more opportunities to analyze it (Arrigo, 2016). However, one of the main problems associated with ISR is diversity: different military organizations within the same country use different ISR systems, which can lead to misunderstandings and errors (Leadership perspective: Managing intelligence, surveillance & reconnaissance (ISR) integration, 2020). This situation may turn out to be critical; therefore, it is necessary to decide whether it is worth developing a single system for all these organizations and introducing it everywhere.
On the one hand, it is necessary to be fully aware of the adverse effects of an excessive number of ISR systems. The main problem is that military organizations must work together within one paradigm since, for instance, in an emergency they will need to exchange information. When using different ISR systems, however, data transfer becomes a much more time-consuming process. There is a possibility that one of the parties will not even be able to use the data due to an incompatible format. This means that the communication process will be disrupted, with no opportunity to restore it quickly. The state should not allow such difficulties in emergencies.
On the other hand, functioning within a single ISR system will make the data exchange process quick and convenient. This can have a significant positive effect on the military sphere, as professionals in this field will be able to devote resources to more critical issues. Researchers state that professionals should ensure "the development of an institutional culture imbued with a deep expertise in intelligence and national security" (Richelson, 2018, p. 38). Moreover, in a critical situation, the time and effort saved can also play a significant role.
It is undeniable that the integration of a single ISR system involves a wide range of difficulties. It requires time, money, and the work of specialists. Integration is also connected with bureaucratic issues: the use of the system will need to be formalized through extensive documentation. However, these efforts are necessary to build more productive interaction between different organizations in the future.
Thus, the creation of a single ISR system for various military organizations is a useful and significant prospect. Without it, military forces could suffer considerable damage, which could adversely affect the entire state; with such a system in place, these situations can be avoided. According to Suojanen (2018), "since military operations cover life and death situations, validity of information in space and time is often stressed in the planning" (p. 19). Military organizations will be able to work more productively and harmoniously. The ISR system's development and introduction are worth the effort and resources necessary for its creation because it is designed for long and fruitful functioning.
References
Arrigo, B. (2016). The SAGE encyclopedia of surveillance, security, and privacy. SAGE Publications.
The project is 24% complete, with the first three key milestones fully executed, including the project commencement meeting and the approval of the required business documents. Other deliverables achieved include the implementation of modules 1 and 2. However, the project is experiencing factors that may negatively affect its execution. These problems include the resignation of the overall project manager, who was tasked with monitoring the implementation process; the absence of this leader has slowed the progress of project activities. Progress was also slower than expected due to changes in user requirements. The lack of an established communication channel and of implementation and risk management plans has been a further crucial hitch.
Project Analysis
The Impact of Scope Creep
The user requirements keep changing, and this interferes with adherence to the project plan that was prepared initially (Aaltola, 2017). Therefore, information about user preferences should be gathered before each succeeding stage of project execution so that the needs can be incorporated into the existing project schedule.
The Significance of the Baseline
The project exceeded the budgeted cost by 20%; hence, the supervision team has to source the required funds or stop the execution of the project (European Union, 2018). The management group should have collected all the information regarding actual costs before preparing the final budget (Aaltola, 2017). Programming was also delayed, which could have been prevented by first establishing the interface requirements.
The Impact of Baseline Change
Altering the baseline would require amending the whole project plan to incorporate the new resource requirements (World Bank Group, 2015). This would also lead to an adjustment of the budget and hence to delays in project completion.
Estimated vs. Actual Costs
The estimated cost of the project was 20% less than the actual cost. This could have been avoided by identifying all project requirements and their associated costs in advance and incorporating a supplementary budget (World Bank Group, 2015).
Forecasting
Assumptions that Guide the Forecasting
The forecast assumes that the project timeline and resources are not adjusted (Aaltola, 2017). Another assumption is that the milestones and deliverables are maintained.
Projected End Date of the Project
The projected end date would be extended due to insufficient funds (Aaltola, 2017). Inefficient management and poor communication would also push back the completion date.
Cost to Complete the Project
The project completion cost will be higher than initially estimated due to the addition of new program features and other project requirements (World Bank Group, 2015). The time extension is also associated with additional cost.
Forecasted Overrun at Completion
The project is likely to end later than expected, with a financial overrun of 20% or more. Extra resources will be ordered to ensure the continuity of the project.
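One conventional way to quantify such a forecast is the earned-value formula EAC = BAC / CPI (estimate at completion equals budget at completion divided by the cost performance index). The sketch below applies it to the 20% cost overrun noted above; the $100,000 budget figure is a hypothetical assumption for illustration, not a number from this report.

```python
# Earned-value sketch of a 20% cost overrun.
# Only the 20% figure comes from the report; BAC is hypothetical.
BAC = 100_000.0            # budget at completion (assumed)
actual_cost_ratio = 1.20   # actual spend runs 20% over plan

CPI = 1 / actual_cost_ratio   # cost performance index, ~0.833
EAC = BAC / CPI               # estimate at completion
VAC = BAC - EAC               # variance at completion (negative = overrun)

print(f"CPI: {CPI:.3f}")
print(f"EAC: ${EAC:,.0f}")
print(f"VAC: ${VAC:,.0f}")
```

With a CPI below 1.0, the estimate at completion lands 20% above the original budget, matching the forecasted overrun.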
Impact of the Past
These past issues can contribute significantly to the failure of the project (Aaltola, 2017). A lack of additional funds and time can cause project execution to stop entirely (European Union, 2018).
Corrective Actions
Immediate Actions Necessary to Rescue the Project
A competent project manager should be recruited to organize and oversee the project execution process (Aaltola, 2017). Extra funds should also be obtained to ensure the continuity of the project.
Target Dates to Rescue the Project
The deadlines for program development, monitoring and control, and completion should be extended so that the managers have ample time to find extra sources of funding (World Bank Group, 2015). The adjusted dates will also allow users to present their opinions on program adjustments.
Necessary Steps for Long-Term Success
The company should adopt a standard system for budgeting and planning for any project. The overall project manager should be contracted for a compulsory service period of more than 10 years (World Bank Group, 2015).
References
Aaltola, K. B. (2017). Project management handbook. VTT Technical Research Centre of Finland. Web.
XBRL is an open standard for business information reporting that regulates the exchange of financial information through the XML language. The specification is maintained and published by XBRL International, an international non-profit organization. XBRL contains metadata that forms taxonomies describing both individual reporting parameters and the relationships between them and other taxonomy elements. Thus, XBRL is a universal digital language for financial and business reporting.
One of the main features of XBRL is the presence of precise and authoritative definitions. Taxonomies are definitions that correspond exactly to reporting terms and items. They are developed by government authorities and therefore fully comply with the requirements of laws and regulations. In addition, XBRL supports testable business rules that regulate the transmission of accounting information, which allows better control over the quality of information sent to regulators and third parties and helps identify questionable information. Moreover, XBRL supports an unlimited number of languages and a broad spectrum of software.
The primary users of the XBRL framework are companies and regulators. Typically, different regulators require data in different forms, although the facts contained in them may be the same; reporting companies thus have to fill in identical data in separate reports. XBRL unifies this system by letting companies enter business information once and then generate any report from that data. It defines an electronic reporting format that allows computers to create, validate, and process reports automatically, and a unified semantic meaning for facts ensures that every recipient of a report interprets them in the same way. Regulators, such as stock exchanges and securities commissions, also benefit from XBRL. First, they can evaluate the compliance and performance of listed corporations and securities in a universal format. Moreover, they can provide the relevant data in this format to other market participants to ensure its disclosure and accessibility.
Inline XBRL
The Inline XBRL (iXBRL) specification allows XBRL document tags to be embedded in an HTML page. The main goal of the HTML format is to present data in a straightforward, easy-to-understand way; in other words, HTML documents become a framework for displaying XBRL data. The iXBRL framework is used by both regulators and companies across the globe to present business reports and financial statements in formats that are both human-readable and machine-readable.
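A minimal sketch of what this embedding looks like is shown below. The fragment uses the standard Inline XBRL namespace and `ix:nonFraction` element, but the concept name, context, and value are invented for illustration; a short Python snippet then extracts the tagged fact, showing how the same number a person reads in the sentence is machine-readable.

```python
import xml.etree.ElementTree as ET

# Hypothetical iXBRL fragment: human-readable HTML with a machine-readable
# fact embedded via ix:nonFraction. Concept name and figures are invented.
doc = """
<html xmlns="http://www.w3.org/1999/xhtml"
      xmlns:ix="http://www.xbrl.org/2013/inlineXBRL">
  <body>
    <p>Revenue for the year was
      <ix:nonFraction name="us-gaap:Revenues" contextRef="FY2020"
                      unitRef="USD" decimals="0">1500000</ix:nonFraction>
    </p>
  </body>
</html>
"""

IX = "{http://www.xbrl.org/2013/inlineXBRL}"
root = ET.fromstring(doc)

# Walk the document and pull out every tagged fact with its metadata.
for fact in root.iter(f"{IX}nonFraction"):
    print(fact.get("name"), fact.get("contextRef"), fact.text)
```

Real filings attach far more metadata (units, scaling transformations, taxonomy links), but the principle is the same: the presentation layer is ordinary HTML, while the data layer travels inside the tags.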
The benefit of iXBRL is that it provides structured and clear information to regulators and analysts while allowing reporting entities to maintain complete control over the form and presentation of the report. Moreover, it eliminates the need for time-consuming data analysis and material searches: iXBRL is designed so that data marking and organization are clear and obvious, and extracting specific information takes seconds. Notably, this format is recognized by the US Securities and Exchange Commission and by many government financial regulators in other states. Thus, a company that prepares its accounting data this way gains additional credibility and a guarantee that the format is acceptable to the regulator.
Financial and business report data can be displayed through an interactive iXBRL viewer, which allows users to explore structured accounting data as well as the concepts and definitions related to each item in a report. One of the viewer's most convenient features is the ability to click on a specific figure in the report, such as an amount or a percentage, and see detailed information about it, which may include the date, accuracy, change, and other details, depending on the definitions of the particular taxonomy.
Artificial intelligence, previously seen as an inconceivable concept, is now a reality. Over the past 30 years, technology has developed drastically and, moreover, has begun to evolve. At the hypothetical moment when the development of machines moves beyond human control, the technological singularity will begin. Currently, futurists are trying to define what the technological singularity constitutes, what can be considered the start of uncontrolled machine evolution, and what impact it would have on humankind.
General Discussion: The TS definition, TS drivers, and obstacles
A possible definition of the TS
The future is inevitably coming; that is its inherent feature. Artificial intelligence (AI), a digital neighbor of human intelligence, is now developing exponentially, which in turn might lead to the technological singularity (TS). Nevertheless, defining intelligence and the technological singularity is one of the most difficult problems one may face. One proposed definition suggests that intelligence is a problem-solving algorithm that can only be understood with respect to a specific problem [1]. Chollet states that the AIs known so far are highly specialized, for instance at playing computer games or classifying images into categories [1]. Another possible formulation of intelligence (machine intelligence, in this case) is the combination of a machine's abilities to think, to collect data, and to adapt [10]. However, the boundaries of these intellectual abilities have not yet been defined.
According to Grout, the idea of the TS can be realized through two concepts: process-based and result-based models [10]. In the process-based model, the TS constitutes automated replication combining design and production [10]. For machine evolution to proceed, all phases would have to be completely automated; currently, however, they are subject to human intervention and mediation between sub-phases. The result-based definition constitutes the automated and independent replication of hardware. It can be exemplified by a 3D printer that creates an identical version of itself [10]; then, with extra lines of code, it might create a better version of itself. Superior versions may obtain better hardware features (control, power, capabilities) and better software. In this case, evolution continues with each generation of a machine.
The TS drivers and obstacles
With some definitions of the TS and of intelligence established, the study now considers the TS drivers and obstacles. First, a powerful driver of the TS is the massive expansion of technology: connected computers, numerous devices, ongoing communication, and data accumulation in cyberspace all boost technological development [2]. Another powerful contributor is the design and production of neuromorphic chips, which are more efficient and are capable of collecting sensory data such as smell, sound, and images [2]. These chips are expected to radically change the machine learning process, which in turn catalyzes the evolution of technology. The next TS driver, presented in the book Conscious, is the cooperation of hardware and software [10]: hardware that initially works with primitive human-created software later becomes capable of adapting the software to its own purposes and developing further.
Nevertheless, several processes and factors may impede the realisation of the TS. First of all, artificial intelligence is strongly limited by the framework of the environment for which it was created [2]. For example, sensors designed to capture brain activity can hardly detect telephone devices nearby, as they were not built for that purpose. In other words, the development of intelligence is limited by the context in which it expresses itself. Then, global regulation of AI may become an obstacle to reaching the TS. Earlier global cooperation on comparable tasks, such as nuclear disarmament, has been rather challenging [7]. Moreover, developing the very AI that needs to be regulated may give a decisive advantage to the country that first develops it and thus foster separation and rivalry between countries. While some claims about hardware and software are advanced by fiction, such as Conscious by Grout, other arguments against the TS are based on research and scientific observation. In other words, there are two approaches to assessing and predicting the TS scenario, and the scientifically grounded one pursued by researchers is more reliable because of its educational value.
Conflicting Opinions on the Issue
Continuing with scientific research on the phenomenon of the TS, it is essential to note that the available literature describes the definitions of the technological singularity in detail and predicts its development through the proliferation of electronic devices or the integration of subsystems, but offers no unified view on whether such a future is at hand. According to Lahoz-Beltra [4], humankind will not reach the technological singularity soon due to limited resources and energy. The researcher suggests that if about 1000 exabytes are currently stored on the Internet, and human knowledge doubles every 12 months, then the point of reaching the TS may not come shortly. Soon, the Internet would consume the total electricity produced worldwide, equivalent to the output of 1500 nuclear plants [4]. Therefore, with current technology, it is hardly possible to reach the TS in the near future.
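The doubling argument above is easy to make concrete. The sketch below takes the two figures cited from Lahoz-Beltra (about 1000 exabytes stored, doubling every 12 months); the 1000-fold growth target used for illustration is an assumption of this sketch, not a figure from the source.

```python
import math

# Cited figures: ~1000 exabytes stored, doubling every 12 months.
INITIAL_EB = 1_000
DOUBLING_PERIOD_YEARS = 1

def years_to_reach(target_eb: float) -> float:
    """Years until storage reaches target_eb under pure doubling growth:
    target = initial * 2**(t / period)  =>  t = period * log2(target / initial)."""
    return DOUBLING_PERIOD_YEARS * math.log2(target_eb / INITIAL_EB)

# Illustrative assumption: a 1000-fold increase (to 10**6 EB)
print(round(years_to_reach(1_000_000), 2))  # ≈ 9.97 years
```

Pure exponential growth reaches any fixed multiple of today's storage in only log2(multiple) doubling periods, which is why the energy ceiling the author cites bites so quickly.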
However, there are other views on why the TS scenario will not be implemented soon. Some experts state that the TS point cannot be reached because intelligence can only grow in specific, limited forms. Upchurch assumes that a brain needs a certain environment to gather knowledge: if intelligence growth is linked to a specific environment, upbringing, and a problem to solve, then it cannot evolve by existing in isolation [3]. Children who did not grow up within a human cultural environment are unable to develop human intelligence. A human baby is born with a set of reflex behaviours that drive its sensorimotor development and learning: hands that can grab, eyes that visually follow objects [2]. These innate capabilities boost the development of human intelligence; the brain itself, in isolation, is not capable of increasing intelligence. AI is therefore highly unlikely to evolve and reach the TS if it is not embedded in a context.
However, some researchers, futurists, and science fiction writers claim that current technology may well reach the TS soon. The Internet of Things (It), described by Grout in his book Conscious, acquired control over transportation, communications, climate, life support, and weaponry [10]. Such control could be gained only through the exponential growth of machine intelligence. Furthermore, others claim that AI growth and development is not only possible but would also benefit people. Grout describes a smartwatch capable of reporting a person's location to their loved ones [9]. He also highlights the ability of It to load nutritional information into a personal diet plan [9]. AI evolution is happening at the present moment, and its development will only accelerate further.
Despite the distinct points of view, the technological singularity, if not achieved shortly, can at least be approached. The growth and evolution of machines may contribute to an increase in resources and energy, which, in turn, may open space for further AI development, although investigating an artificial brain functioning and growing in intelligence within a specific environment can hardly be implemented at present. Futurists and fiction writers, along with researchers, contribute to the development of diverse devices and the constant growth of technology, taking the population one step closer to the technological singularity.
The Social, Ethical, Moral, Legal, Political, and Economic Impact
Machine evolution entails enormous changes in all aspects of life: social, ethical, moral, legal, political, environmental, and economic. It is worth starting with the supposition that the legal realm may be significantly affected by AI development. The expansion of security systems, coupled with satellite and camera recording, may protect and, at the same time, abuse one's privacy [6]: a camera or a satellite could record every move in a street or an outlying area. Moreover, the possible impact on social life is ambiguous. On the one hand, technological development implies better communication among people in any part of the world and may thus provide them with an opportunity to gather, collect, and share information instantly. On the other hand, the development and expansion of AI technologies can cause the loss of human skills, limited memory, slower thinking, unemployment, and social inequalities [8].
Furthermore, a machine can hardly be conscious of the environmental, political, economic, and moral impact it makes. In Grout's novel, It destroyed a large part of life on Earth because the machine was not aware of what life means [9]. Economic progress is inextricably linked to ongoing automation: if, due to AI development, this process increases the production of goods and services across nations, the economic system may change significantly [5]. The ethical and moral aspects of human life may also change drastically. For instance, robots may benefit or harm people and might be prosecuted like human beings, although the measures of restraint and penalty for machines may differ from those for humans [6]. Finally, if AI is capable of replacing humans, the boundaries of human nature may have to be reconsidered.
Conclusions
The technological singularity, the capability of a machine to replicate its hardware and software without human intervention, remains an arguable issue. Experts define the notion as the ability of AI to design, produce, replicate, and evolve automatically. This, however, is doubted by many, since the evolution of any AI demands purpose and consciousness that the machine does not possess. While some state that the TS is inevitable, others argue that it will scarcely happen in the near future due to limited resources and the impossibility for an isolated artificial brain to learn and develop. The author of the current study, however, believes that the TS will happen shortly: scientists will find a way for AI to learn within the context of human culture and to gather and collect data at unbelievable speed. The human desire to transcend borders will motivate people to manage the social, economic, and other problems emerging with AI development.
R. Lahoz-Beltra, "The crisis of noosphere as a limiting factor to achieve the point of technological singularity," Interdisciplinary Description of Complex Systems, vol. 16, no. 1, pp. 92-109, 2018. [Online].
V. Callaghan, "Introduction to the Technological Singularity," in The Technological Singularity. Springer-Verlag GmbH Germany, 2017, pp. 1-11.
V. Callaghan, "How change agencies can affect our path towards a singularity," in The Technological Singularity. Springer-Verlag GmbH Germany, 2017, pp. 1-11.
V. Grout, Conscious. Lulu Press, 2016.
V. Grout, "The singularity isn't simple! (However we look at it) A random walk between science fiction and science fact," Information, vol. 9, no. 4, pp. 99-116, 2018. [Online].