The purpose of this study is to evaluate the research methods in two articles devoted to the investigation of prenatal development, by Bouchard (2011) and O'Connor, Monk, and Fitelson (2014). The analysis of the research constituents will help to identify the relationships between research instruments, internal and external influences, ethical considerations, and overall research outcomes.
Research Design
Bouchard (2011) employed an experimental research design to investigate the interrelations between psychosocial variables and prenatal attachment. In this case, the psychosocial variables included neuroticism, quality of union with the partner, and attachment to family members (Bouchard, 2011). The researcher assessed these indicators in a sample of 161 adult couples in the third trimester of pregnancy to determine their influence on the quality of the couples' current attachment to the unborn child. The study allowed Bouchard (2011) to analyze the effects of the independent psychosocial variables on the dependent variable of prenatal attachment across distinct participants.
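To illustrate how moderation effects of this kind can be tested statistically, the following is a minimal Python sketch using synthetic data and hypothetical variable names; it does not reproduce Bouchard's (2011) actual dataset or analytic procedure.

```python
# Illustrative moderation analysis with synthetic data; variable names are
# hypothetical and do not reproduce Bouchard's (2011) actual dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 161  # sample size reported in the study
df = pd.DataFrame({
    "union_quality": rng.normal(0, 1, n),
    "neuroticism": rng.normal(0, 1, n),
})
# Simulate an attachment score whose link to union quality weakens as neuroticism rises.
df["prenatal_attachment"] = (
    0.5 * df["union_quality"]
    - 0.3 * df["union_quality"] * df["neuroticism"]
    + rng.normal(0, 1, n)
)

# The interaction term tests whether neuroticism moderates the union-attachment link.
model = smf.ols("prenatal_attachment ~ union_quality * neuroticism", data=df).fit()
print(model.summary().tables[1])
```

A significant coefficient on the interaction term (union_quality:neuroticism) is what would indicate a moderation effect of the kind the study describes.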
In their study, O'Connor and colleagues (2014) investigate the relationships between the variables of maternal mood and developmental outcomes in infants. The authors obtained information about the multiple study participants from databases. They did not have the opportunity to alter the independent variables to evaluate the cause-and-effect interrelations between maternal anxiety, depression, and adverse outcomes in children. This means that the researchers applied a correlational design.
Research Methods
The sampling technique in the study by Bouchard (2011) is convenience sampling. The data collection tools include several scales for assessing paternal and maternal indicators of attachment to the fetus. Other instruments included neuroticism subscales, the Dyadic Adjustment Scale, the Adult Attachment Scale, and the London Measure of Unplanned Pregnancy (Bouchard, 2011). The instruments were applied to compute global scores and correlate them with prenatal attachment.
O'Connor and colleagues (2014) collected data from large databases such as PubMed and the Cochrane Library. The data analysis method implemented in the research is a selective literature review. The authors included various quantitative papers in their review to evaluate prenatal maternal mood's relevance to the psychological, cognitive, and biological development of infants. O'Connor and colleagues (2014) introduce the theoretical and historical background of the problem and focus on empirical evidence linking the study variables.
Summary
Bouchard (2011) found that both paternal and maternal quality of prenatal attachment is affected by previous experiences and emotional states, but in different ways. A high level of partners' union quality was correlated with women's strong prenatal attachment only when the participants were not characterized by a high level of attachment to their caregivers and their neuroticism level was low. On the contrary, a high level of union quality positively affected males' prenatal attachment when the participants had a strong psychological attachment to their parents (Bouchard, 2011). The obtained results allowed the researcher to identify the cause-and-effect relationships between the variables and to find out how unique protective psychological mechanisms affect the level of prenatal attachment.
The literature review conducted by O'Connor and colleagues (2014) revealed a direct link between infants' exposure to prenatal maternal distress, depression, and anxiety and adverse psychological and biological developmental outcomes. It was observed that exposure to prenatal distress might affect the fetus in a dose-response pattern, which means that the adverse impacts on prenatal development may be clinically detected (O'Connor, Monk, & Fitelson, 2014).
The observations made by the authors have significant medical implications: the researchers suggest that effective prenatal intervention is possible for both mothers and children. The ability to provide effective psychological and physiological interventions may help families to reduce the potential emotional and financial burden associated with disorders developed due to prenatal distress exposure.
Strengths and Limitations
The major strength of the experimental study conducted by Bouchard (2011) is the opportunity to evaluate the cause-and-effect relationships between the variables. According to the researcher, the inconsistency in previous research findings on prenatal attachment was associated with the failure to examine the interrelations between the predictors of attachment (Bouchard, 2011). The researcher filled the gap identified in the literature review by evaluating the psychosocial variables in couples. However, the sample included only first-time parents in order to eliminate the interference of previous pregnancy experiences, and the inclusion of participants from a broader demographic background could have increased the representativeness of the research data.
The selective review of literature conducted by O'Connor and colleagues (2014) provides a structured, consistent, and systematic evaluation of the principal findings in the field of prenatal development and the effects of exposure to prenatal distress. The study demonstrates that further investigation of the subject may help to elaborate efficient prenatal interventions for potential disorders and to prevent adverse developmental outcomes through patient education. Although qualitative studies are usually associated with a high potential for bias and subjectivity, the researchers' professional skills helped to minimize subjectivity and maintain scientific rigor throughout the study.
Evaluation of Internal and External Factors
The major internal factor influencing both studies is the researchers' ability to track the causal relations between the introduced concepts, the program of the study, and the research outcomes (Bleijenbergh, Korzilius, & Verschuren, 2011). The internal structure of the articles should be consistent and logical; otherwise, the research results will lack validity. The studies conducted by Bouchard (2011) and O'Connor and colleagues (2014) are associated with a high level of consistency because their outcomes are logically interrelated with the selected research methodology.
The external factors influencing the outcomes of the qualitative study include the consistency of the conducted analysis with theories in the psychological discourse. O'Connor and colleagues (2014) managed to increase the external validity of their results by considering multiple perspectives and rationalizing distinct points of view. The external validity of the quantitative study conducted by Bouchard (2011) primarily depends on the sample size and sampling technique. The researcher used an appropriate sample selection, as it allowed her to answer the formulated questions, and a large sample size supports the generalization of the data.
Ethical Issues
The assessment of prenatal developmental abnormalities has many ethical implications. Prenatal diagnosis usually creates significant distress in pregnant women and their families (Zizzo et al., 2013). Therefore, the data accumulated in prenatal assessment are associated with a high level of participant vulnerability, which requires careful use of the obtained information in research.
Bouchard (2011) assessed the psychological indicators of the study participants, and she had to follow the principles of informed consent and confidentiality. The disclosure of data obtained through the examination of psychosocial variables may have a significant impact on the social and psychological identity of the participants. Therefore, by following the principle of confidentiality, Bouchard (2011) reduced the risk of damaging the participants' well-being.
Since O'Connor and colleagues (2014) implemented the literature review method, their major ethical consideration was compliance with the APA standards of referencing and formatting the retrieved information. There was no direct contact with study participants, so the authors could not violate the ethical principles of prenatal assessment.
Bouchard, G. (2011). The role of psychosocial variables in prenatal attachment: An examination of moderational effects. Journal of Reproductive & Infant Psychology, 29(3), 197–207. Web.
O'Connor, T. G., Monk, C., & Fitelson, E. M. (2014). Practitioner review: Maternal mood in pregnancy and child development implications for child psychology and psychiatry. Journal of Child Psychology & Psychiatry, 55(2), 99–111. Web.
Zizzo, N., Di Pietro, N., Green, C., Reynolds, J., Bell, E., & Racine, E. (2013). Comments and reflections on ethics in screening for biomarkers of prenatal alcohol exposure. Alcoholism, Clinical and Experimental Research, 37(9), 1451-1455. Web.
The Research and Development (R&D) functions of organizations play a significant role in the formulation and implementation of business strategy. R&D personnel are responsible for developing new products and enhancing the company's existing products in line with the successful implementation of the organization's strategy.
The R&D function is responsible for the transfer of complex technology, the adjustment of processes to suit local markets, the modification of processes in line with local raw materials, and the adjustment of company products to specific tastes and requirements. Organizational strategies targeting product development, diversification, and penetration of new markets demand that company products be improved and new products created to fit the expected changes. This paper looks at the three major R&D approaches for implementing strategies.
Major R&D Approaches for Implementing Organizational Strategy
Most successful organizations employ R&D strategies that link the available external opportunities to the company's strengths and objectives. What this means is that a properly formulated R&D policy must connect the available market potential to internal organizational capabilities. A company can use any of the following R&D approaches for the implementation of strategies.
The first approach the organization can adopt is to be the first company that promotes new technological products. Though this approach seems attractive and exciting, it is fairly dangerous since the company is likely to delve into untested products. It is also significantly more costly.
This approach has a high rate of failure since companies are testing new products and processes. Nevertheless, this approach can offer the company a temporary monopoly of certain products that the organization can exploit to maintain competitive advantage. If a company fails, other companies can seize the initiative and learn from the mistakes made by the initiating company.
The second approach requires that the organization becomes an innovative imitator of certain successful products and processes from other companies. This approach offers the benefit of a reduction in costs often associated with start-up and program initiation. The company can also benefit from reduced risk of failure since the products have already been successfully adopted.
The company allows pioneer organizations to develop new products and establish the marketability of the product. The company then follows suit and develops a similar but improved product. The success of this strategy relies extensively on the abilities of the R&D personnel and the marketing department of the organization.
The final R&D approach is for the company to adopt the production of a low-cost product that is similar to one already in the market. The company can achieve lower-cost production by embarking on mass production of products similar to those recently introduced.
The success of this approach relies on the fact that once a product has been accepted by consumers, the issue of pricing becomes central in determining sales of the product. Unlike the previous strategic R&D approaches, this approach requires less expenditure in R&D, but significant plant and equipment investment for mass production.
The success of any of these approaches requires effective communication between R&D departments and other functions of the organization that are instrumental in executing various forms of generic organizational strategies. Conflict between any of the concerned functions (such as marketing, information technology, R&D, and finance) can be controlled by clearly outlining policies and company objectives.
Conclusion
The research and development function of an organization is important in ensuring organizational growth and in obtaining and maintaining a competitive advantage. Strategic R&D should be aligned with organizational objectives. The three approaches described above can be used by an organization in implementing a well-informed strategic business plan.
The wireless cell phone services industry is an integral part of modern society, as communications play an ever greater role in human life and wireless operators offer faster services at lower prices. Research in Motion is one of the world's leading software providers for the mobile communications industry, developing its network and maintaining numerous branches in Canada, the United States, and Australia. The history of the company, although not long, exemplifies a path of development that allowed Research in Motion to become a world leader within 25 years of its existence. The financial statements of the company also reflect the gradual development of Research in Motion, with respective annual increases in profit, growth in sales rates, and so on. Being a progress-oriented company allows Research in Motion to consider the demands of customers worldwide and to conform to those demands in any market.
Introduction
The wireless cell phone services industry is an integral part of modern society, as communications play an ever greater role in human life and wireless operators offer faster services at lower prices. Today's market for cell phone services is highly competitive, with such giants as Verizon Wireless, AT&T, and Sprint Nextel fighting for new customer bases. In this market, Research in Motion has recently managed to become one of the most influential suppliers of software and wireless mobile devices. Having developed from a local Canadian telecommunications company into one of the largest businesses of today, Research in Motion possesses the image of a reliable business partner whose reputation has never been damaged by any cases involving Research in Motion as an infringer of patent law. The following paragraphs focus on the history of Research in Motion's development and present a picture of the current state of affairs in the company.
Background
History
Research in Motion's history begins in 1984, when the company was founded by Mike Lazaridis and his friend Jim Balsillie (RIM, 2009). Creating innovative and efficient wireless communication devices for mobile operators, the company was first involved in work with RAM Mobile Data and Ericsson as their partner in the development of the Mobitex wireless data network (Research in Motion, 2009). In 1995, the company began to be financed privately; in other words, the funds of private and institutional investors made Research in Motion a privately owned enterprise (RIM, 2009). As one of the most promising Canadian companies, Research in Motion obtained substantial funding from Working Ventures Canadian Fund Inc., which amounted to a $5 million investment. An initial public offering later helped the company raise a further $30 million (Research in Motion, 2009). At the turn of the century, Research in Motion began publicly offering its shares on the Toronto Stock Exchange. The Interactive Pager 950, developed in 2000, was the first major advance the company accomplished in its independent history (RIM, 2009).
Founders
The actual founders of the company under consideration are the already mentioned Mike Lazaridis and his friend Jim Balsillie, who have dedicated their lives to the sphere of wireless technology and now hold the leading positions in Research in Motion's organizational structure (RIM, 2009). In 2008, the company was recognized as one of the most influential Canadian companies and even entered the list of "Canada's Top 100 Employers". In 2009, moreover, the company's global initiatives began to develop, reflected first of all in the announcement of plans to launch training centers and production premises in Australia and other regions of the world (RIM, 2009). Currently employing about 12,000 people, Research in Motion also occupies one of the most prominent places in the wireless communications market. Its performance allows it to withstand competition from such companies as Apple Inc., Seven, Nokia, and Samsung and to be a rightful competitor to them (Research in Motion, 2009).
Major Products
The major products and services provided by Research in Motion in the initial years of its development were limited to wireless communication devices. The Interactive Pager 950 was the first step in the company's entry into the Canadian and, later, the American market. In more detail, Research in Motion was the first company to introduce a range of wireless devices able to operate on CDMA, GSM, and even Motorola iDEN networks (RIM, 2009). The joint name for this variety of devices is BlackBerry, due to their form and color, and their major function is to assist office workers in any situation where they might need help memorizing professional or personal data, names, dates, and the like (Research in Motion, 2009). Some other products offered include the Single Mailbox Integration device, the Good Technology device, portable wireless e-mail devices, etc. (RIM, 2009).
Major Current Services
As for the current state of the company, it continues its international development and technological progress. In more detail, Research in Motion now has branches in Canada, the United States of America, and Australia, and plans to expand into many other countries (Research in Motion, 2009). Its orientation toward multiple markets pushes the company to invent ever more new devices and products. BlackBerry technology is the most famous product by Research in Motion: BlackBerry devices are equipped with e-mail and messaging systems, a Web browser, an organizer, and the SureType and SurePress technologies (Research in Motion, 2009).
Financials
The financial aspects of Research in Motion's performance are the most important side of the firm's existence. The following figures characterize the latest five years of the company's performance. The growth of the company's net income from one fiscal year to another is substantial. In 2005, the net income of Research in Motion before taxes equaled $205.6 million, while the next year saw a drastic increase of this figure to $347.7 million (Research in Motion, 2009a). Moreover, 2007 brought almost double the net income of 2006: the company earned $631.6 million (Research in Motion, 2009a). Such a progression of net income growth can be explained by the combination of the company's successful strategic planning, aimed at satisfying the needs of diverse customers by offering both simple and sophisticated technology, and the correct work of the sales department (Shankar, 2008). Further on, in 2008 the net income exceeded a billion dollars, amounting to $1,293.9 million, while 2009, irrespective of the economic recession and stagnation, brought the company even higher net income: $1,892.6 million (Research in Motion, 2009a).
However, as compared to its main competitors, Research in Motion still has room for improvement. The recent net income figures of Apple, Inc. equal $3,496.0 million in 2007 and $4,834.0 million in 2008, almost three times as large as what Research in Motion earned for the same period (Research in Motion, 2009a). Nokia Corporation's net income figures are also substantial, even though the financial crisis has affected them considerably: Nokia Corporation had $7,205.0 million in 2007, while 2008 brought the company only $3,988.0 million (Nokia Corporation, 2009). Nevertheless, the figures presented are still higher than those of Research in Motion, and this fact evidences that the latter still finds itself in the middle of the market while planning to compete with the most successful market players.
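As a quick check on these growth claims, the short Python sketch below computes year-over-year growth rates from the Research in Motion net income figures quoted above; the figures come from the essay's cited sources, and the code itself is only illustrative.

```python
# Year-over-year net income growth for Research in Motion (figures in $ millions,
# as quoted above from Research in Motion, 2009a).
net_income = {2005: 205.6, 2006: 347.7, 2007: 631.6, 2008: 1293.9, 2009: 1892.6}

years = sorted(net_income)
for prev, curr in zip(years, years[1:]):
    growth = (net_income[curr] - net_income[prev]) / net_income[prev] * 100
    print(f"{prev} -> {curr}: {growth:+.1f}%")
# 2005 -> 2006: +69.1%   2006 -> 2007: +81.7%
# 2007 -> 2008: +104.9%  2008 -> 2009: +46.3%
```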
Conclusions
To conclude, today's market for cell phone services is highly competitive, with such giants as Verizon Wireless, AT&T, and Sprint Nextel fighting for new customer bases. In this market, Research in Motion has recently managed to become one of the most influential suppliers of software and wireless mobile devices. Having developed from a local Canadian telecommunications company into one of the largest businesses of today, Research in Motion possesses the image of a reliable business partner whose reputation has never been damaged by any cases involving Research in Motion as an infringer of patent law. The company's strategic planning is focused on long-term initiatives that would allow success in the future, while the current services and goods offered by the company allow it to maintain considerable net income growth rates and customer base expansion.
The field of project management is constantly changing. Once a phenomenon referred to strictly when addressing issues of engineering or construction management, project management is now a separate research area. Lately, in conjunction with the rapid growth of information technology for business, professional project management has grown from specific groups of processes for certain areas of activity into a generally accepted management system. Nowadays, project management is applied to a wide range of business initiatives, providing a theoretical framework for operating project portfolios. It is therefore crucial to take into consideration the existing body of scholarly literature on project management conceptualization in order to understand the concepts and strategies of project management. The following paper focuses on analyzing the development of project management and its thematic perspectives. The ultimate goal is to connect past developments and current trends with potential advancements in the future. The patterns and progressions in the history of project management's development are the main sources of inquiry into the future of the concept. The paper contextualizes project management, offers a variety of different perspectives on the concept, and makes informed predictions as to what challenges and opportunities are in store for the field of project management.
Introduction
Project management (PM) is a subject that has risen to prominence over the past decades. Companies use projects as a way of coordinating work, as they bring a semblance of structure to a somewhat chaotic stream of day-to-day responsibilities on the agenda. Many enterprises have decided to switch to thinking of their initiatives in terms of short- and long-term projects rather than specific tasks. Initially only a sub-discipline of engineering, PM has grown to encapsulate a "dominant model in many organizations for strategy implementation, business transformation, continuous improvement and new product development" (Winter et al., 2006, p. 638). Nevertheless, despite its wide distribution and the availability of extensive tools aimed at successfully achieving project goals, this area of knowledge is still quite young and needs to be studied carefully. Another important factor in the use of professional tools of the project approach is the need to adapt project management tools to the needs of a specific project.
Thus, a substantial body of literature has been formed, which now serves as the main source of knowledge related to the history and development of project management. For example, a study by Collyer et al. (2010) provides qualitative research on methodology for improving project management in dynamic environments, the complexity of human behavior in actual environments, and a new theory to support change. Another study, from Evans et al. (2009), investigates project planning in a specific environment, which limits how widely its findings can be generalized beyond the offered setting. Finally, Giezen's (2012) research develops a deviant case study design with narrative interviews and explains why megaprojects may be improved by reducing complexity through simplification.
Project planning is especially relevant today, as the globalization and digitalization of the world take their toll on the business industry, demanding new approaches and innovative thinking. The research agenda allows one to understand the current trends and findings in the field of project management, giving a comprehensive overview of it. Therefore, the purpose of this paper is to analyze the existing body of literature centered around project management in order to determine past trends in PM conceptualization and predict some of the future ones.
Contextual background
Projects are the driving force behind change for any organization. Accordingly, the way an organization creates business value through the implementation of project activities depends on its capabilities and resources, as well as on its project management strategy. Based on this assumption, the project approach and the variety of existing project tools are favorable to use if the company has the knowledge to implement the planned initiatives. In order to build this knowledge, there is a need to understand the concepts of project management, its current trends and tools, and the theoretical framework behind them. An enhanced project management practice can provide enterprises with new effective solutions in the field of operating multiple projects at once and improve current strategies. For this reason, it is crucial to study the project management research agenda, as it offers an overview of a vast assortment of theories, findings, case studies, and concepts.
Firstly, for the purpose of this paper, it is imperative to conceptualize project management and the studies surrounding project organization and coordination. The term "project management" in its traditional and most well-known meaning refers to "the processes, tools, techniques and concepts to manage the execution of the project" (Geraldi and Soderlund, 2017, p. 57). However, over recent decades, there has been a steady departure from classic PM definitions. In the scholarly community, project management is no longer regarded as a separate activity (Garel, 2013). The stream of academic literature focused on project management is now active, yet this has not always been the case. Arguably, the early era of PM literature dates all the way back to the mid-1980s (Aubry, Hobbs, and Thuillier, 2008). Therefore, the insights gathered over more than three decades of studies provide modern-day researchers with a substantial theoretical framework. In the context of this paper, the primary objectives behind analyzing the existing body of literature on project management relate mainly to the assessment of the most prominent existing studies.
Perspectives to consider
When identifying the different perspectives of project management development, it is imperative to use a framework to aid in completing a qualitative analysis of the research. For this paper, the thematic framework described by Padalkar and Gopinath (2016) is incorporated. It implies the categorization of the common themes in PM research into four groups, which include deterministic themes, explanatory themes, non-deterministic themes, and finally, general themes. It is imperative for the purpose of this paper to examine each of these themes in detail in order to determine what nuanced perspective each of them offers.
Deterministic Thematic Perspective
The early era of project management research is characterized by a heavy focus on a deterministic perspective, primarily centered around the themes of scheduling and methods of control. Deterministic evaluation methods are of a more quantitative nature, establishing a set of KPIs and milestones to be reached at certain points. According to Padalkar and Gopinath (2016), the peak of this thematic perspective dates back to the early to mid-1980s, signaling that there has been a reduced interest in the deterministic approach ever since. Their work, "Six decades of project management research: Thematic trends and future opportunities", discusses how project management research has developed throughout the years, offering an evolutionary perspective.
Moreover, the study sets out to explore the existing literature on the topic, which allows the reader to anticipate the trends that may arise in the future. The deterministic view posits that projects ought to be evaluated based on their "iron triangle" performance, which includes such key indicators as cost, quality, and schedule (Jha and Iyer, 2007; Pollack, Helm, and Adler, 2018). Thus, all projects were considered fixtures with clearly set, deterministic attributes. Proponents of such a perspective argued that improving a project's efficacy implied optimization of its schedule.
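For illustration only, the deterministic "iron triangle" view can be caricatured in code as a project with three fixed attributes checked against fixed targets; the names and numbers in this Python sketch are invented and do not come from the cited studies.

```python
from dataclasses import dataclass

@dataclass
class IronTriangle:
    """Toy representation of the deterministic 'iron triangle' view of a project."""
    cost: float         # spend to date, in currency units
    schedule_days: int  # elapsed duration
    quality: float      # e.g., fraction of acceptance tests passed, 0..1

    def within_targets(self, budget: float, deadline_days: int, min_quality: float) -> bool:
        # Deterministic evaluation: success means meeting all three fixed targets.
        return (self.cost <= budget
                and self.schedule_days <= deadline_days
                and self.quality >= min_quality)

project = IronTriangle(cost=95_000, schedule_days=118, quality=0.97)
print(project.within_targets(budget=100_000, deadline_days=120, min_quality=0.95))  # True
```

The later thematic perspectives discussed below can be read as objections to exactly this reduction of a project to a few fixed numbers.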
Explanatory Thematic Perspective
The rise of the explanatory thematic perspective coincides with the slow loss of interest in determinism in the 1980s. The new era signified a newly found fascination with seeking explanations for various project organization phenomena. Thus, the main themes include "project performance and success, success and failure criteria, and antecedents of project performance" (Padalkar and Gopinath, 2016, p. 1308). The authors state that there is a need to strengthen the explanatory approach, which can help evaluate projects through qualitative rather than quantitative assessment, as happens with the deterministic perspective. Success factors remained the primary theme during the active rise of the explanatory era, leading to the prominence of a number of sub-themes, including project efficiency, the contributing role of leadership in project success, stakeholder and human resource management, and so on. Although this thematic perspective is now saturated, it still reigns as the dominant mode of theoretical inquiry into project management.
Non-Deterministic Thematic Perspective
Once certain academics started to express doubts about the relevance of the explanatory and deterministic perspectives, a new thematic approach emerged. In the mid-2000s, "following calls for treating projects as complex social systems and examining the "actuality of projects" (..), and to examine project complexity (…), researchers turned to themes of interdependence and complexity" (Padalkar and Gopinath, 2016, p. 1309). The study gives a valuable insight, stating that theory building requires a stronger non-deterministic approach, focusing more on the theoretical framework of project management strategies and tools. This marked the emergence of the non-deterministic perspective. Apart from the themes mentioned in the quote above, explanatory themes such as project risk, project value, and organizational contexts continued to develop. The explanatory approach differs from the deterministic one in its general direction: while the deterministic perspective focuses on evaluating the progress of a project, the explanatory perspective discusses the various aspects that factor into a project's success.
General Themes
The project management research agenda is quite vast, as its importance has risen with the growth of digitalization and globalization. Some of the most common general themes are research typology and methodology. For example, a study by Wang and Gibson (2010) discusses the correlation between pre-project planning and project success using mathematical models. The authors state that planning makes it possible to ensure a high probability and high level of goal achievement, based on the systematic preparation of decisions. Pinto (2013), in his research, elaborates on the consequences that human errors might bring into a project, affecting the operational activity of the whole organization. The article explains the most common reasons for project failures in different industries, while also offering advice on how to avoid them. In addition, studies on the general concepts of project management can include research integration and the adoption of PM-related research findings across various sectors, industries, and scholarly fields.
Significant historical shifts in the study of project management
In terms of the development of project management, as mentioned earlier, there are three dominant eras, each of them signaling a shift in the conceptualization of project management and the phenomena associated with it. The shift from a deterministic to an explanatory approach signaled a move from a means orientation to an ends orientation, although both perspectives still largely supported the assumption of tractability until the non-deterministic era shattered that assumption. Additionally, Padalkar and Gopinath (2016) indicate that each era had distinct ontological assumptions. The deterministic era implies a reductionist approach, which means that the studies of the period prioritize conceptual and analytical frameworks. Finally, the non-deterministic era combines empirical approaches with purely conceptual ones.
Furthermore, in order to get a better grasp of the developmental journey of PM-associated literature, it is crucial to highlight the prominent trends. Across all thematic perspectives, there has been a declining trend: the lack of excitement about older themes, combined with low interest in new ones, results in an overall decline. Over the period of 2000-2015, the most influential theme has been project methods (Padalkar and Gopinath, 2016). Despite relatively high interest in project methods under both the deterministic and non-deterministic perspectives, the theme continues to decline (Seymour and Hussein, 2014). As for the most engaged theme, success factors show the most progress and no signs of convergence so far. Success factors are not the only explanatory theme that is part of the positive trend: project risk is another explanatory theme, now the third largest overall in the literature focused on project management.
The second theoretical strand of thinking dates back to the 1960s and 70s and is defined as an approach centered around organizational structure as a determinant of success and efficacy. Winter et al. (2006) clarify that this era is represented by the scholarly works of Lawrence and Lorsch, Galbraith, Toffler, and Mintzberg. Lastly, the third theoretical era emerged in the 1980s. According to Winter et al. (2006), studies of this era are still of much relevance today, as they emphasize "a broader view of projects, recognizing the importance of the front-end, and of managing exogenous factors" (p. 640). It is evident that, as time passed, the focus of researchers shifted towards applying project management insights in practice, leading to the popularity of such themes as strategy and learning.
When discussing the development and diversification of research pertaining to project management, it is integral to mention the International Journal of Project Management (IJPM), along with the Journal of Project, Program, and Portfolio Management (JPPPM) and the Journal of Modern Project Management. Analyzing their articles can serve as a way of tracing the gradual shift of PM from a phenomenon associated exclusively with engineering to the widely accepted and applied process now known as project management. These journals capture the dynamic nature of the field.
If prior to the 2000s the focus of scholarly journals was on engineering and construction management, it has now changed to broader organizational topics. According to Geraldi and Soderlund (2017), over the past two decades, the publication "has become more rooted in social science and management/organization studies in a braided sense" (p. 57). For example, the study conducted by Crawford et al. (2006) concludes that there has been a rising trend for IJPM to cover such topics as marketing, project outcomes, and evaluation. Rather than focusing on what has traditionally been associated with PM, such as quality assessments and task coordination, researchers are now more open to studying more specific or targeted aspects of PM. Thematic trends in publications in one of the field's leading journals are indicative of larger trends, demonstrating the development of project management research and conceptualization.
Future possibilities and existing constraints
With its engineering roots, project management is a phenomenon that developed as an extension of a rational, practice-oriented approach. The fact that PM will always be rooted in such technical concepts leads to a whole set of limitations (Shenhar and Dvir, 2007). It is virtually impossible, or at least exceptionally challenging, to engage with taken-for-granted assumptions. Geraldi and Soderlund (2017) indicate that project management conceptualization is often associated with the inability to contribute "to theory on a more general level, and if not being recognized as a scholarly field of inquiry as such" (p. 67). These are the primary constraints worth mentioning.
As for the future of project management research, it is evident that there are numerous possible directions of development. Currently, PM is moving in the direction of sustainability and digital tools, which marks, perhaps, the beginning of a new era in the history of PM research. Silvius (2017) emphasizes that the combination of sustainability and project management creates a new school of thinking in the field of PM. Picciotto (2020), from an international perspective, also states that PM needs to reconsider its approaches, as they are too rigid and goal-oriented, and thus not fitting for today's projects.
There is also possibly going to be a shift towards concepts related to social agenda and cooperation as a part of project management (Morris, 2010). Value creation will most likely become a top priority for project managers, citing such objectives as “creating value” and “facilitating benefits” as the primary ones. Conceptualization will no longer be regarded as static, prioritizing an ongoing conceptualization process instead.
Conclusions
In conclusion, it is evident that the field of project management has changed considerably over the past couple of decades. The thematic perspectives have evolved from deterministic to explanatory and then to non-deterministic. Despite that, a deterministic approach is still highly relevant in the current discourse surrounding PM-associated practices. However, PM has transitioned to include a wide range of themes, such as organizational effectiveness, success factors, risk management, and governance. While it might be challenging to make predictions about the future of the field of project management, it is apparent that "hard" and static convictions about PM will be replaced by an understanding of this phenomenon as a dynamic one.
References
Aubry, M., Hobbs, B., and Thuillier, D. (2008) ‘Organisational project management: An historical approach to the study of PMOs’, International Journal of Project Management, 26(1), pp. 38–43. doi:10.1016/j.ijproman.2007.08.009
Crawford, L. et al. (2006) ‘Uncovering the trends in project management: Journal emphases over the last 10 years’, International Journal of Project Management, 24(2), pp. 175–184. doi:10.1016/j.ijproman.2005.10.005
Collyer, S. et al. (2010) 'Aim, fire, aim: Project planning styles in dynamic environments', Project Management Journal, 41(4), pp. 108–121.
Evans, J. et al. (2009) 'Discrete return lidar in natural resources: Recommendations for project planning, data processing, and deliverables', Remote Sensing, 1(4), pp. 776–794.
Garel, G. (2013) ‘A history of project management models: From pre-models to the standard models’, International Journal of Project Management, 31(5), pp. 663–669. doi:10.1016/j.ijproman.2012.12.011
Geraldi, J. and Söderlund, J. (2017) ‘Project studies: What it is, where it is going’, International Journal of Project Management, 36(1), pp. 55-70. doi:10.1016/j.ijproman.2017.06.004
Giezen, M. (2012) 'Keeping it simple? A case study into the advantages and disadvantages of reducing complexity in mega project planning', International Journal of Project Management, 30(7), pp. 781–790.
Jha, K.N. and Iyer, K.C. (2007) ‘Commitment, coordination, competence and the iron triangle’, International Journal of Project Management, 25(5), pp. 527–540. doi:10.1016/j.ijproman.2006.11.009
Morris, P.W.G. (2010) ‘Research and the future of project management’, International Journal of Managing Projects in Business, 3(1), pp. 139-146. doi:10.1108/17538371011014080
Padalkar, M. and Gopinath, S. (2016) ‘Six decades of project management research: Thematic trends and future opportunities’, International Journal of Project Management, 34(7), pp. 1305–1321. doi:10.1016/j.ijproman.2016.06.006
Picciotto, R. (2020) 'Towards a "new project management" movement? An international development perspective', International Journal of Project Management, 38(8), pp. 474–485.
Pinto, J.K. (2013) 'Lies, damned lies, and project plans: Recurring human errors that can ruin the project planning process', Business Horizons, 56(5), pp. 643–653.
Pollack, J., Helm, J. and Adler, D. (2018) 'What is the iron triangle, and how has it changed?', International Journal of Managing Projects in Business, 11(2), pp. 527–547. doi:10.1108/IJMPB-09-2017-0107
Seymour, T. and Hussein, S. (2014) ‘The history of project management’, International Journal of Management & Information Systems (IJMIS), 18(4), pp. 233–240. doi:10.19030/ijmis.v18i4.8820
Shenhar, A.J. and Dvir, D. (2007) ‘Project management research—The challenge and opportunity’, Project Management Journal, 38(2), pp. 93–99. doi:10.1177/875697280703800210
Silvius, G. (2017) 'Sustainability as a new school of thought in project management', Journal of Cleaner Production, 166, pp. 1479–1493.
Svejvig, P. and Andersen, P. (2015) ‘Rethinking project management: A structured literature review with a critical look at the brave new world’, International Journal of Project Management, 33(2), pp. 278–290. doi:10.1016/j.ijproman.2014.06.004
Wang, Y.-R. and Gibson, G.E. (2010) 'A study of preproject planning and project success using ANNs and regression models', Automation in Construction, 19(3), pp. 341–346.
Winter, M. et al. (2006) ‘Directions for future research in project management: The main findings of a UK government-funded research network’, International Journal of Project Management, 24(8), pp. 638–649. doi:10.1016/j.ijproman.2006.08.009
The study of Jupiter has always fascinated scientists, as this giant sphere is of great interest. On September 5, 1977, NASA launched the Voyager 1 automatic interplanetary station, weighing 723 kg, into space. The project was approved in 1972, and over 40 years of flight the spacecraft has traveled almost 20 billion kilometers from Earth, becoming the farthest artificial object ("Voyager 1 sees Jupiter's great red spot"). Voyager 1 was the first spacecraft to take detailed images of the planet's satellites. The station's closest approach to Jupiter took place on June 6, 1979 ("Voyager 1 sees Jupiter's great red spot"). This was one of the pictures the station took when it was close to Jupiter.
At first sight, the picture resembles a painting, but a closer look reveals unusual swirls. They remind me of enormous hurricanes that sweep away everything in their path. From above, these vortexes also look like sandy hills. The red spot is the most distinguishable feature; perhaps it is the epicenter of all the storms on the planet. The whole globe has many swirls across its surface. In general, Jupiter is quite a unique planet in terms of structure.
From the scientific perspective, this feature is called the Great Red Spot, the most remarkable sight visible on the surface of Jupiter from a distance. It is an immense vortex in the Solar System, capable of swallowing a whole planet such as Earth. The spot was once noticeably larger than it is today: it could accommodate not one but three Earth-sized spheres at once. The spot is located at about 22° south latitude and moves parallel to Jupiter's equator. Scientists have tried to identify what this spot is and to trace its lifespan.
Any kind of research is a complicated and time-consuming process. This is especially true if the researcher does not start with a clearly defined topic and needs to broaden and narrow it in the course of the research. There are numerous important aspects that need to be taken into account when carrying out research. First of all, an initial research question has to be set. After this, the process of research as such takes place. The initial research question serves only as a basis for the research; one of its main contributions is that it provides the researcher with one or more keywords to start with. After this, information may be collected from primary, secondary, and tertiary sources; in the course of this process, and depending on the results, the researcher obtains additional keywords that help to broaden and limit the topic and eventually lead to the final research question. This is an approximate plan of how this research on Escherichia coli is going to be performed; thus, this report will carefully evaluate and articulate the information needed to develop the final research question, describe the search strategies that were implemented during the research, and eventually present the final research question that will be used for writing the research paper.
Research Process
To begin with, a research question that will guide the research needs to be formulated. Since the research, in general, is going to deal with E.coli, it seems most relevant to explore this bacterium's effect on the human organism. Therefore, the initial research question is: Which effect does E.coli have on the human organism? One of the thought processes used to arrive at this research question was generalization. At the beginning of the research process, one definite fact about E.coli was that it was harmless for humans, because it was identified as a bacterium that, as was discovered in the course of the search process, is present in the human intestine (Gorbach, Bartlett, & Blacklow, 2004). Owing to generalization, the suggestion that E.coli may be harmful was not considered, because something living in the human organism from the beginning of its formation can hardly do it any harm. Such a conclusion seemed to lead the research to a dead end because it presupposed that subsequent development of the topic was impossible. Nevertheless, this problem helped to re-focus the research; it was clear that the thought process needed to be changed to get to another aspect of the topic.
Another thought process, critical thinking, made it possible to assume that, if E.coli is a bacterium, then, like the majority of other bacteria, it can be harmful under certain circumstances. Thus, by means of drawing parallels with already known types of bacteria, a guiding question for the research arose: What are these circumstances? This significantly broadened the topic at this stage of research. The matter is that, at the beginning, it was assumed that E.coli, as a non-harmful bacterium, would be considered in terms of its properties, the environment in which it lives, and, perhaps, even its therapeutic use. With this new information in mind, the scope of the future research was broadened; it was decided not only to explore the bacterium in terms of its characteristics, but also to find out what exactly is responsible for the bacterium's changing its properties and becoming alien to the organism in which it lives. Besides, this made the research process much easier because information on the harmfulness of E.coli is more abundant.
Furthermore, using modifications of the keywords (discussed later), new data regarding the bacterium under consideration were obtained. It was found that E.coli has harmful and harmless strains. Harmful, or pathogenic, strains of E.coli are capable of causing different infections in the human organism (Barness & Barness, 2003). This is where another possibility to broaden the scope of the research arises. At this stage, it is possible to re-direct the research and explore different pathogenic strains of E.coli and the different infections that they cause. For example, some of them cause local and systemic infections, such as diarrhea and the majority of other nosocomial infections (Handbook of Diseases, 2004). This possibility will be considered more attentively when the research paper itself is written. So far, it is enough to explore the properties of E.coli, the circumstances under which it can become harmful, and how this can happen.
This is the next stage of the research, where broadening of the topic takes place. If there are pathogenic strains of E.coli, then there should be a way they get into the human organism and cause infections in it. Since the overall topic is related to food, it is possible to suggest that certain strains of E.coli become harmful for the human organism when they get into the blood through the digestion process. This is where the research is slightly re-directed to the ways in which E.coli may get into food and/or drinking water. The modification of the keyword (also discussed later) revealed that E.coli, or, to be more exact, its particular pathogenic strain, E. coli O157:H7 (Tadataka, Alpers, Kaplowitz et al., 2003), may cause food poisoning. This is a vast area for research. For instance, definite diseases connected with food poisoning may be identified, and the ways to treat these diseases may be researched. Thus, as is known, colitis is one of the most widespread diseases that develop because of food poisoning (Roy, 2002). This is why it seems relevant to explore its connection with the presence of E.coli in food or drinking water (which narrows the research at this stage).
This all eventually led me to the final research question. Considering the information that I obtained during this research process, I have formulated the following research question: When and how can E.coli be harmful to the human organism? This research question encompasses most of the aspects concerning the bacterium in question. It presupposes that general information about E.coli will be provided, its pathogenic and non-pathogenic strains will be discussed, the possibility of the bacterium's pathogenic strains getting into the human organism will be considered, and the effect of these strains on human health will be evaluated. The research question that has been formulated is a pure research question. It does not deal with finding solutions to a definite problem; instead, the problem is critically evaluated and described in detail, with its various aspects being taken into consideration.
Search Strategies
Above all, it should be mentioned that a variety of keywords should be tried to achieve success in research. The following table (a screenshot made from a database) reflects the progress of my research process at one of the databases that I have used:
It can be seen what I started with (the mere introduction of the word 'E.coli' to get general information about the bacterium) and what direction my research took after this. For example, after discovering that E.coli is present in the human intestines, I arrived at the conclusion that this bacterium can be both harmful and harmless. From the first source I found (Infectious Diseases), I learnt that E.coli had pathogenic and non-pathogenic strains. This is why I modified my keyword to 'E.coli strains' to verify this information and, further, to 'E.coli pathogenic strains' to be able to broaden the topic of the research. This allowed me to find out that pathogenic strains of E.coli can cause numerous infections, which resulted in another modification of the keyword into 'E.coli infections'.
Since the results I obtained with the use of this keyword were not numerous, I used a synonymous keyword, 'E.coli diseases,' which returned far more results. The keyword was modified several times with the progress of the research and its further re-directions. I used 'E.coli in food' to discover whether this bacterium can be present in it, as well as 'E.coli in drinking water' for the same reason. Further, I remembered colitis as one of the most widespread consequences of food poisoning and used such combinations as 'E.coli colitis' and 'E.coli food colitis' to find out whether E.coli can cause colitis if the bacterium gets into food (or drinking water). Finally, I used the keyword 'E.coli diseases' to discover which other diseases, apart from colitis, may be caused by this bacterium. All this testifies to the fact that I used a variety of keywords, including synonyms, during the search for information. The keywords mentioned, however, present only a part of my research performed at a definite database. The keywords that I used with a regular search engine (for general information) and other databases were also numerous and diverse, but my experience with this particular database reflects the progress of my research best of all.
With other databases and search engines, I used a variety of spellings to obtain more results. For instance, I usually started with entering the full name of the bacterium, 'Escherichia coli,' and only then reduced it to 'E.coli'; besides, when wishing to obtain information about E.coli strains, I entered 'E.coli strains,' 'E.coli harmful strains,' 'E.coli strains causing infections,' 'E.coli pathogenic strains,' etc., and I modified the keyword when researching the presence of E.coli in food by entering 'E.coli food,' 'food poisoning E.coli,' 'food poisoning bacteria,' etc.
Moreover, my search for information was broken into multiple queries, as I researched the topic gradually. For example, I chose a definite aspect of the problem and explored it; when the information obtained was sufficient, I moved to another aspect and, consequently, performed another query. I also used several databases, though my search queries did not change much depending on the database. They all revealed approximately the same information and sometimes even the same sources. This is probably connected with the limited scope of the topic; besides, my research will be more descriptive, which is why general data about the problem under consideration will hardly vary from database to database.
It is worth mentioning that I did not use Boolean operators (such as AND, OR, NOT), which is why their efficiency for my research process cannot be discussed here. The matter is that my research had a definite general direction: to explore the E.coli bacterium and the different issues related to it. I did not have to make any comparison or consider two issues that were alike, which is why Boolean operators were simply not applicable to my research this time.
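For reference, since Boolean operators were not actually used in this search, the following is a small, purely illustrative Python sketch of what AND/OR/NOT filtering over keyword records looks like; the records and terms are invented, not drawn from any real database.

```python
# Illustrative Boolean keyword filtering; records and terms are made up.
records = [
    {"title": "E. coli strains in drinking water", "keywords": {"e. coli", "water", "strains"}},
    {"title": "Food poisoning outbreaks",          "keywords": {"food", "poisoning"}},
    {"title": "E. coli and colitis",               "keywords": {"e. coli", "colitis"}},
]

def matches(record, all_of=(), any_of=(), none_of=()):
    kw = record["keywords"]
    return (all(t in kw for t in all_of)                       # AND: every term present
            and (not any_of or any(t in kw for t in any_of))   # OR: at least one present
            and not any(t in kw for t in none_of))             # NOT: none present

# 'e. coli' AND ('food' OR 'colitis') NOT 'water'
hits = [r["title"] for r in records
        if matches(r, all_of=("e. coli",), any_of=("food", "colitis"), none_of=("water",))]
print(hits)  # ['E. coli and colitis']
```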
Lastly, I also attempted to use subject headings, but they proved to be inefficient in my case, so I went back to variations of keywords. Usually, however, searching for information with the use of subject headings is quite helpful for me because they allow retrieving only the most relevant information. They have a controlled vocabulary that allows limiting the search and accessing only those sources that have a direct relation to the topic. In my case, subject headings were not helpful because I narrowed and broadened the search all the time and thus needed all the information that had at least some relation to my topic, not only the most specific. The database that I used most of all was Ovid. The scope of this database is extremely wide; it has sources (books, journals, etc.) on a number of topics, including biology, chemistry, medicine, nursing, and environmental science. The database is easy to operate, and the sources that I found there were the most relevant.
Source Evaluation
Since I reviewed a variety of sources in the course of my research, I needed to evaluate each of them to select the ones to use in the paper. The main criteria I paid attention to were:
Date of publication (I avoided consulting sources published before 2000);
Relevance to the topic (I checked whether the whole source was about the problem I explored, or only a chapter of it);
Completeness of information (whether the source presented the information I was looking for).
These criteria were used to screen sources: a source was rejected if it failed to meet any one of them, as sketched below.
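Purely as an illustration (I applied these criteria by hand, not with code, and the field names below are my own invention), the rejection rule can be expressed as a simple filter:

```python
# Illustrative screening filter: a source is rejected if it fails
# any single criterion (date, relevance, completeness).
sources = [
    {"title": "Source A", "year": 1998, "relevant": True,  "complete": True},
    {"title": "Source B", "year": 2005, "relevant": True,  "complete": True},
    {"title": "Source C", "year": 2007, "relevant": False, "complete": True},
]

def keep(source):
    """Keep a source only if it satisfies all three criteria."""
    return source["year"] >= 2000 and source["relevant"] and source["complete"]

selected = [s["title"] for s in sources if keep(s)]
print(selected)  # ['Source B']
```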
Sometimes there were sources that seemed to say the same things. Choosing among them was rather difficult, but I usually gave preference to those that were more up to date. The main strategy I used to determine the quality of the sources was screening them for reliability (Booth, Colomb, & Williams, 2008). I paid attention to the publishers, the authors’ use of other sources in their work, and, as already mentioned, how current the source was.
Conclusion
This report has presented how the research on a specific topic was carried out. It has traced the development of the working research question into the final research question, presented the search strategies used during the search for information, and discussed how the sources found during the research were evaluated. Finally, it has explained why certain strategies (such as the use of subject headings or Boolean operators) were not applied in this particular research process.
References
Barness, E. G., & Barness, L. A. (2003). Clinical use of pediatric diagnostic tests. Philadelphia, PA: Lippincott Williams & Wilkins.
Booth, W. C., Colomb, G. G., & Williams, J. M. (2008). The craft of research. Chicago: University of Chicago Press.
The observations made by the authors have significant medical implications: the researchers suggest that effective prenatal intervention is possible for both mothers and children. The ability to provide effective psychological and physiological interventions may help families reduce the potential emotional and financial burden associated with disorders that develop due to prenatal distress exposure.
Strengths and Limitations
The major strength of the experimental study conducted by Bouchard (2011) is the opportunity to evaluate the cause-and-effect relationship between the variables. According to the researcher, the inconsistency in previous research findings on prenatal attachment was associated with the failure to examine the interrelations between the predictors of attachment (Bouchard, 2011). The researcher filled the gap identified in the literature review by evaluating the psychosocial variables in couples. However, the sample included only first-time parents, to eliminate the interference of previous pregnancy experiences; the inclusion of participants from a broader demographic background might have increased the representativeness of the research data.
The selective review of literature conducted by O’Connor and others (2014) provides a structured, consistent, and systematic evaluation of the principal findings in the field of prenatal development and the effects of exposure to prenatal distress. The study demonstrates that further investigation of the subject may help to develop efficient prenatal interventions for potential disorders and to prevent adverse developmental outcomes through patient education. Although qualitative studies are usually associated with a high potential for bias and subjectivity, the researchers’ professional skills helped to minimize subjectivity and to maintain scientific rigor throughout the study.
Evaluation of Internal and External Factors
The major internal factor influencing both studies is the researchers’ ability to track the causal relations between the introduced concepts, the program of the study, and the research outcomes (Bleijenbergh, Korzilius, & Verschuren, 2011). The internal structure of an article should be consistent and logical; otherwise, the research results will lack validity. The studies conducted by Bouchard (2011) and O’Connor and colleagues (2014) show a high level of consistency because their outcomes follow logically from the selected research methodology.
The external factors influencing the outcomes of the qualitative study include the consistency of the conducted analysis with theories in the psychological discourse. O’Connor and colleagues (2014) managed to increase the external validity of their results by considering multiple perspectives in the text and rationalizing distinct points of view. The external validity of the quantitative study conducted by Bouchard (2011) primarily depends on the sample size and sampling technique. The researcher used an appropriate sample selection, as it allowed her to answer the formulated questions, and a large sample size increases the likelihood that the data may be generalized.
Ethical Issues
The assessment of prenatal developmental abnormalities has many ethical implications. Prenatal diagnosis usually creates significant distress in pregnant women and their families (Zizzo et al., 2013). Therefore, the data accumulated in prenatal assessment involves highly vulnerable participants, which requires careful use of the obtained information in research.
Bouchard (2011) assessed the psychological indicators of the study participants, so she had to follow the principles of informed consent and confidentiality. The disclosure of data obtained through the examination of psychosocial variables may have a significant impact on the social and psychological identity of the participants. Therefore, by following the principle of confidentiality, Bouchard (2011) reduced the risk of damaging the participants’ well-being.
Since O’Connor and colleagues (2014) implemented the literature review method, their major ethical consideration was compliance with the APA standards for referencing and formatting the retrieved information. There was no direct contact with the study participants, so the authors could not violate the ethical principles of prenatal assessment.
References
Bouchard, G. (2011). The role of psychosocial variables in prenatal attachment: An examination of moderational effects. Journal of Reproductive & Infant Psychology, 29(3), 197–207. Web.
O’Connor, T. G., Monk, C., & Fitelson, E. M. (2014). Practitioner review: Maternal mood in pregnancy and child development – implications for child psychology and psychiatry. Journal of Child Psychology & Psychiatry, 55(2), 99–111. Web.
Zizzo, N., Di Pietro, N., Green, C., Reynolds, J., Bell, E., & Racine, E. (2013). Comments and reflections on ethics in screening for biomarkers of prenatal alcohol exposure. Alcoholism, Clinical and Experimental Research, 37(9), 1451-1455. Web.
Human visual perceptual processes proceed from global processing to local processing (Navon, 1977, p. 353). This is based on the principle of global precedence (Navon, 1977, p. 354). It follows that visual perception begins with the recognition of global features before local features are recognized, processed, and synthesized (p. 355). Developing awareness of the global features is critical to visual processing (Navon, p. 353). Visual perception therefore follows a hierarchical order, because visual processing depends on stimulus development (Navon, pp. 355-356).
Testable hypotheses
The two testable hypotheses in the Navon study sought to test the inevitability of global processing (Navon, p. 368), thereby confirming the principle of global precedence (Navon, p. 353). The principle of global precedence holds that local features are recognized only after global features have been visualized. The second hypothesis was meant to test the conflict between global cues and local cues (Navon, p. 353). By measuring the rate of subjects’ responses to pairs of simple patterns, it was envisioned that the priority of global or local visual perception could be identified.
The design of the study
The design of the study (Navon, p. 353, p. 368) involved subjects’ responses to an auditory character. The study was designed to test the processes through which visual perception develops. It used a visual stimulus that consisted of large characters (representing the global level) made up of small characters (representing the local level). The goal was to determine whether the viewer would recognize the large characters first, as opposed to the small characters that formed them. Whichever character, large or small, was recognized first would help determine the mechanism through which visual processing is initiated and develops.
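To make the stimulus concrete, here is a minimal sketch (my own reconstruction, not Navon’s original materials) of such a hierarchical figure: a global letter drawn out of local letters:

```python
# A global letter H rendered from copies of a local letter: when the local
# letter differs (e.g., S), this is the classic "conflicting" stimulus.
GLOBAL_H = [
    "#...#",
    "#...#",
    "#####",
    "#...#",
    "#...#",
]

def navon_figure(bitmap, local_letter):
    """Draw the global shape using the local letter for the filled cells."""
    return "\n".join(
        "".join(local_letter if cell == "#" else " " for cell in row)
        for row in bitmap
    )

print(navon_figure(GLOBAL_H, "S"))  # a large H made of small S's
```

A viewer showing global precedence reports the large H faster than the small S’s that compose it.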
Findings of the study
The results showed an error rate of 3.3% (Navon, p. 369). The differences were non-significant at the 0.05 level (p. 370). The consistency (conflict) effect was significant when attention was directed, at p < 0.01 (Navon, p. 370). The mean latency in the conflicting trials for the locally directed condition was higher than for the globally directed condition (Navon, p. 370).
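To illustrate what such a latency comparison involves (using invented reaction times, not Navon’s data, and assuming SciPy is available), the condition difference could be tested as follows:

```python
# Invented reaction-time samples (ms), for illustration only.
from statistics import mean
from scipy import stats

global_directed = [560, 545, 580, 555, 570, 550, 565, 575]
local_directed  = [620, 640, 610, 655, 630, 645, 625, 635]

t_stat, p_value = stats.ttest_ind(local_directed, global_directed)
print(f"global-directed mean: {mean(global_directed):.0f} ms")
print(f"local-directed mean:  {mean(local_directed):.0f} ms")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
# A higher local-directed latency is the pattern consistent with
# global precedence.
```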
Discussion of the results
The results established that respondents recognize global features more readily than local features. The subjects first recognized the large character (global level) before they recognized the small characters that formed it (local level). The subjects demonstrated increasing reliance on global features as opposed to local features. This implies that human perceptual processes proceed from the global level, through intermediate, coarse-grained analysis, and finally to the fine-grained analysis of the local level (Navon, p. 370). This proved the first hypothesis, on the inevitability of global processing, to be true. The findings established that if subjects have no specific feature to identify, or attention is not directed, the viewer’s attention is drawn to the global features. The study established that attention could be directed to local features if the subject has prior knowledge or awareness of the global features. Subjects therefore cannot fully control perceptual processing, regardless of their efforts to ignore parts of the scene. This implies that for a subject to demonstrate local processing, the subject should have awareness of the global level (Navon, p. 368). This confirmed the word-letter phenomenon; hence, a higher level of visual perception has the capacity to improve local-level processing (Navon, p. 357).
The recognition of the letters H or S at the global level [Experiment 3(1)] indicated that the global level had higher priority (Navon, p. 369). The differences observed in Experiment 3(2) established that subjects could allocate attention to the local level if they had prior awareness of the global level. The variation in speed of response, based on mean latencies (Navon, pp. 369-370), indicates the influence of global awareness on local-level processing. This demonstrates that when people attend to the global level, they have limited capacity to attend to the local level (Hughes et al., 1984). It further shows that attention is indivisible between two dependent elements. Based on mean latencies, local processing does not interfere with global-level processing, but global-level processing can interfere with local processing.
Conclusion
Human perceptual processing, therefore, develops from global processing (Robertson et al., 1993; Duncan, 1984). The global level has properties of the local level. Local-level processing involves the identification of individual entities (the small letters that form the global level). This means that global-level processing is an important step towards local processing (Rensink, O’Regan, & Clark, 1997). This satisfies the principle of global precedence. Processing of the local level is negatively influenced by global processing if a viewer does not have global-level awareness. This provides proof that visual perception is not instantaneous, as implied by Gestalt psychological theory.
References
Duncan, J. (1984). Selective attention and the organization of visual information. Journal of Experimental Psychology: General, 113, 501-517.
Navon, D. (1977). Forest before trees: The precedence of global features in visual perception. Cognitive Psychology, 9, 353-383.
Rensink, R.A., O’Regan, J.K., & Clark, J.J. (1997). To see or not to see: The need for attention to perceive changes in scenes. Psychological Science, 8, 368-373.
Robertson, L.C., Egly, R., Lamb, M.R., & Kerth, L. (1993). Spatial attention and cuing to global and local levels of hierarchical structure. Journal of Experimental Psychology: Human Perception and Performance, 19, 471-487.
Medicine identifies blood groups in each person, determined by the antigens they carry. When a non-native antigen is introduced into the human body, complications begin as the body struggles against it and the defense system is activated (Zhu et al., 2017). The four major blood types are A, B, O, and AB. Each blood group is defined by specific antigens located on a person’s red blood cells (Fan et al., 2018). Each group has its own characteristics, which is why the transfusion of blood between people is possible only if these factors coincide or if one of the groups lacks the relevant antigens.
The O blood type is referred to as the “universal red cell donor” because it has no antigens (A, B, or Rh), while group A contains A antigens, group B contains B antigens, and group AB contains both. Therefore, O-type blood can be safely used for people with other blood groups (Zhu et al., 2017), which makes the O group uniquely suitable for patients of any group.
Moreover, people with different blood types have different Rh factors, which can be positive or negative. Most people have a positive Rh factor, while a negative one is rare. An even smaller percentage of the population has both blood group O and a negative Rh factor (Zhu et al., 2017). If a person with a negative Rh factor is transfused with Rh-positive blood, antibodies against the Rh antigen may begin to be produced, which leads to complications.
At the same time, the AB blood type is the only type whose plasma is suitable for any other group, because it contains neither anti-A nor anti-B antibodies. This allows the transfusion of plasma from AB blood to people with any other group without complications (Zhu et al., 2017). Universal blood types for transfusion are always in high demand in hospitals.
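As a compact way of summarizing these rules (my own illustration, not drawn from Zhu et al., 2017), red-cell compatibility can be expressed as a subset check on antigens: a donor is safe when the recipient’s blood already carries every antigen the donor’s cells bear:

```python
# ABO/Rh red-cell compatibility as an antigen subset check.
def antigens(blood_type: str) -> set:
    """Antigens carried by red cells of, e.g., 'A+', 'O-', 'AB+'."""
    group, rh = blood_type[:-1], blood_type[-1]
    ags = set(group) - {"O"}      # type O carries no A/B antigens
    if rh == "+":
        ags.add("Rh")
    return ags

def red_cell_compatible(donor: str, recipient: str) -> bool:
    return antigens(donor) <= antigens(recipient)

# O- carries no antigens, so it suits every recipient:
print(all(red_cell_compatible("O-", r) for r in ["A+", "B-", "AB+", "O-"]))
# AB+ red cells, by contrast, suit only AB+ recipients:
print(red_cell_compatible("AB+", "A+"))  # False
```

(Plasma compatibility runs in the opposite direction, which is why AB plasma is the universal plasma type.)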
As a result of the global financial crisis, it became necessary to consolidate clearing houses in order to make clearing safer and to reduce systemic risk. This creates a platform from which independent clearing houses either merge or link with each other to serve major international exchanges. It also becomes easier to clear various asset classes, among them exchange-traded derivatives and inter-bank interest rate swaps (Hull, 67). An example of such a firm is the London Clearing House-Clearnet Group Limited (LCH.Clearnet).
Derivative exchanges as derivative contracts
Exchange-traded derivatives are standardized derivative contracts transacted on a regulated futures exchange; they can be futures, calls, or puts, and they normally have a set expiry date (Kolb and Overdahl, 62). This essay will discuss the necessity and the advantages of consolidation as compared to a single central counterparty (CCP). It will also discuss the economies-of-scale benefits that come with consolidation and the technical issues that arise.
Various ways in which derivatives are traded
There are various ways in which derivatives are traded: over-the-counter (OTC) derivatives, as opposed to exchange-traded derivatives (ETDs) cleared through a central counterparty (CCP), where trading is regulated by the Commodity Futures Trading Commission (CFTC) and conducted on a regulated exchange. Exchange-traded derivatives were traditionally cleared only through a CCP, but OTC trading has recently emerged for them. Concerns raised over the credibility of CCPs pushed many OTC market participants to embrace OTC clearing; currently, derivative exchanges offer both OTC products and ETDs.
Role of market participants
Consolidation is preferred by market participants, and it does not necessarily have to be achieved through mergers or acquisitions; a few CCPs can simply be linked together. This can be done independently, as long as the two clearing houses enter into an agreeable contract. Market participants prefer the presence of several cross-border clearing firms that operate together and with each other. These firms must support position transfers and create cross-margin agreements by tackling the legal, political, and technical hurdles that come with cross-border trading. Cross-margining is done in order to make optimal use of capital and collateral. It can apply either to the same type of product or to different products, such as combining trading in derivatives and other underlying assets such as bond futures, swap futures, or equity options. Cross-margin trading can also span different markets, e.g., the Chicago Mercantile Exchange (CME) and LCH; trading across markets is meant to enhance global trading in derivatives and securities.
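As a stylized illustration of why cross-margining economizes on collateral (the positions and the flat 10% margin rate below are invented figures; real CCP margin models are far more elaborate), offsetting positions at two venues can be margined on their net rather than gross exposure:

```python
# Offsetting exposures held at two clearing houses (invented notionals).
MARGIN_RATE = 0.10
position_at_ccp_a = +10_000_000   # long interest-rate futures
position_at_ccp_b = -8_000_000    # short swaps hedging the futures

# Margined separately, both gross exposures attract margin.
separate = (abs(position_at_ccp_a) + abs(position_at_ccp_b)) * MARGIN_RATE
# Cross-margined, only the net exposure does.
crossed = abs(position_at_ccp_a + position_at_ccp_b) * MARGIN_RATE

print(f"separate margin: {separate:,.0f}")   # 1,800,000
print(f"cross-margined:  {crossed:,.0f}")    # 200,000
```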
Consolidation and clearing of derivatives
Consolidation and clearing through a single center is a method increasingly preferred by financial and non-financial firms, as more products become available and the difference in the prices of OTCs and ETDs narrows (Pickup). Market participants have discovered the advantages of clearing through one center, which can net down and cover deals across various products. Examples are the London Clearing House (LCH)-Clearnet and the Depository Trust & Clearing Corporation’s National Securities Clearing Corporation (DTCC-NSCC).
The benefits of consolidation are found and fully utilized only on condition that the systems are fully integrated and interoperable, with the benefits of full cross-margining. Consolidation across borders has been hindered by differing national regulations, lack of trust among participants, and national interests. A central counterparty may not gather as many benefits from a risk-management point of view (European Central Bank). A CCP must have enough capital and strong risk-management standards so as to provide a guarantee in case of default; market participants may not want to deal with an undercapitalized CCP.
Maximization of economies of scale
Economies of scale can be maximized by creating a single global CCP for OTC derivatives. Increasing the number of transactions reduces the cost of each transaction, as administration and technology infrastructure expenses are spread over a larger volume; there are, however, no clear rules in this area. Firms that merge have the advantage of gaining a market more easily than their standalone counterparts, and an acquisition or a merger is more beneficial than creating a new branch or company. A big company is more efficient, as only a single operational process is required.
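A toy calculation (the cost figures are invented for illustration) shows how the per-transaction cost falls as cleared volume grows:

```python
# Average cost per transaction = fixed infrastructure cost spread over
# volume, plus a constant marginal cost (all figures assumed).
FIXED_COST = 50_000_000      # annual platform/administration cost
VARIABLE_COST = 0.20         # marginal cost per cleared transaction

for volume in (10_000_000, 50_000_000, 200_000_000):
    avg = FIXED_COST / volume + VARIABLE_COST
    print(f"{volume:>12,} transactions -> {avg:.2f} per transaction")
```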
The main advantage of CCPs is that they reduce systemic risk by enforcing and maintaining quality risk-management standards. A single global CCP would also reduce counterparty risk. From the trader’s point of view, the number of legal contracts is reduced, because the number of counterparties the trader has to deal with falls.
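The contract-count reduction is easy to quantify (my own illustration, not taken from the cited works): n participants trading bilaterally need n(n-1)/2 relationships, but only n when every participant faces a single CCP:

```python
# Bilateral links grow quadratically; links via a CCP grow linearly.
def bilateral_links(n: int) -> int:
    return n * (n - 1) // 2

def ccp_links(n: int) -> int:
    return n   # each participant faces only the CCP

for n in (5, 20, 100):
    print(f"{n:>4} participants: {bilateral_links(n):>5} bilateral links"
          f" vs {ccp_links(n):>3} via a CCP")
```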
There are, however, technicalities that arise with consolidation. Legal and political issues come with cross-border integration, brought about by differing rules and regulations in different countries. A single CCP also has the disadvantage that, in case it fails, the impact on the market would be large and could even paralyze trading before a solution is found; in this respect, multiple CCPs would be preferred. For a single CCP that clears foreign exchange, the impact of failure would affect currency trading and could have even greater financial effects. A single CCP would also face challenges in coordinating across jurisdictions. From the traders’ point of view, competition is a healthy ingredient in the market, and competition is optimal. If consolidation leads to one single clearer, there is a danger of a monopoly arising, and a monopoly may lead to increased transaction costs, which is a disadvantage to the trader.
Benefits of consolidation
The benefits of consolidation cannot be overstated, as witnessed by the success of LCH-Clearnet. However, to counter the disadvantages mentioned above, there should be strong risk-management strategies. To reduce counterparty risk through multilateral netting, there should be interlinking and cross-margining from a statutory and contractual point of view. Thus, for consolidation to work at its best, several issues need to be addressed: quality risk-management standards; improved infrastructure; cooperation from public-policy decision makers at the global level; and the definition of efficient market structures. There should also be frequent supervision and regulation conducted by an independent body. This can be enhanced by the Committee on Payment and Settlement Systems (CPSS) and the International Organization of Securities Commissions (IOSCO) revising the international standards to accommodate this new trend.
Works Cited
European Central Bank. The Eurosystem's Policy Line with Regard to Consolidation in Central Counterparty Clearing. European Central Bank, 2009. Web.
Hull, John. Options, Futures and Other Derivative Securities. 2008. Print.
Kolb, Robert, and James Overdahl. Financial Derivatives: Pricing and Risk. New York: Wiley, 2009. Print.
Pickup, Paul. Consolidation of Cash and Derivative Markets Technology. World Federation of Exchanges. 2008. Web.