Developer and Software Project Manager: The Importance of Interaction

In the age of rapid technological development and the emergence of hundreds of new computer and mobile applications, the role of the software project manager is only growing. The developer interacts with the project manager quite a lot during the creation of a program. There are two main reasons why their communication is significant for successful project implementation. First, the software project manager is responsible for explaining to the developers and other team members how their future product should look and what characteristics it should have; accordingly, they assign the tasks, inform developers about the deadline, and provide them with all the necessary tools. Second, since the software manager leads a team, which also requires communication, they should oversee the process of the program's development and know at what stage it is (What is software project management?, n.d.). Hence, interaction between the developer and the software project manager is a prerequisite for successfully implementing a project.

The Answer to Darcie's Post

Timely and effective interaction between the developer and the software project manager is crucial in many respects. It is difficult to disagree that the need to solve emerging problems and fix current errors is one of the reasons why these professionals should communicate. It is also necessary for the manager to ensure that the developer has all the tools they require. However, they should interact for other, no less important reasons as well: for instance, the manager assigns the tasks to the developer, which they can discuss. Moreover, the manager should monitor whether the developer can complete their work by the deadline and whether its features meet the initial requirements. Thus, there are many reasons why a developer needs to communicate with the software manager.

Reference

What is software project management? (n.d.). Wrike. Web.

Virtualization and Software-Defined Networking

Introduction

It should be noted that the area of network design and management is developing rather rapidly, mainly because organizations have to adapt quickly to a changing environment. All the alterations are based on accumulated experience to ensure quality improvements and revisions. The purpose of this paper is to review the trends in the areas of virtualization, software-defined networking, and network security during the past three years.

Virtualization and its Subsets

Virtualization is a trend that has already gained significant momentum, and according to experts in the industry, it will continue to grow. The term denotes the abstraction of computing processes and the provision of a system that encapsulates the implementation from the user. The trend covers the virtualization of both platforms and resources. Platform virtualization implies full or partial emulation, address space virtualization, and operating-system-level virtualization (Dixit, Politis, & Papathanassiou, 2015). Resource virtualization involves combining and aggregating components, clustering computers, encapsulation, and partitioning. Among the listed categories, the server consolidation segment is the fastest-growing area. Over the years, VMware has been the leader in the virtual server market; however, competition is constantly increasing, and such services as Microsoft Office, SQL, and SAP are soon to be virtualized as well (Dixit et al., 2015). The main development trends in this subset are the virtualization of infrastructure and storage facilities, as well as mobile virtualization.

Cloud computing is the main trend helping companies host their servers. It is a data processing technology in which IT resources are pooled across different hardware platforms and accessed by the user through the Internet. Cloud computing makes resources available in any setting (Salih & Edreis, 2016). The technology also offers greater feasibility and efficiency, as well as flexibility and scalability. Many organizations and enterprises have started switching to cloud computing because it is not only convenient but also saves resources (Salih & Edreis, 2016). Therefore, the main trend within the server consolidation subset is the abandonment of physical in-house servers.

Another tendency is the spread of the virtualization approach to the consumer market. At its core, consumers have received an opportunity to work on one computer using several operating systems (Salih & Edreis, 2016). Virtual desktops became implementable due to type 2 hypervisors, whose functionality allows a server to serve many users without the need for a complicated or sophisticated interface. In addition, virtualization has become implementable on embedded devices.

Software-Defined Networking (SDN) and its Subsets

Because network traffic and the number of devices connected to the network are constantly growing, the configuration of networks is becoming more complex. To simplify this process, it is necessary to take measures to change the operation of networks and their management. SDN is an approach that implies the reformation of the network structure, the separation of management from data transmission, and the automation of equipment administration (Mishra & AlShehri, 2017). With a traditional architecture, the network is represented by a set of functional blocks in whose nodes the processing of large data packets occurs. In SDN, standard equipment is used to build the network, while the services required in each node are implemented in software.
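To make this separation concrete, the following conceptual Python sketch (not a real SDN API; all class and rule names are invented for illustration) shows a centralized controller installing forwarding rules on otherwise standard switches:

```python
# Conceptual sketch of SDN's control/data plane separation (illustrative only).
class Switch:
    """Data plane: standard equipment that only forwards per installed rules."""
    def __init__(self, name):
        self.name = name
        self.flow_table = []

    def install_rule(self, match, action):
        self.flow_table.append((match, action))


class Controller:
    """Control plane: management logic is centralized here, not in the nodes."""
    def __init__(self, switches):
        self.switches = switches

    def push_policy(self, match, action):
        # One decision, propagated to every switch in software.
        for switch in self.switches:
            switch.install_rule(match, action)


controller = Controller([Switch("s1"), Switch("s2")])
controller.push_policy(match={"dst": "10.0.0.5"}, action="forward:port2")
```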

According to experts in the industry, within a few years, the SDN market volume will grow to 35 billion USD, and almost 45% of expenses on transmission networks will be associated with this approach. The main trend in this area is that SDN is likely to encompass service providers, cloud-based commercial centers, and large corporate data centers (Mishra & AlShehri, 2017). The advantage of this strategy is that organizations will be able to adapt easily to changing business requirements. This will be achieved through a centralized environment created by a flexible network configuration and secure access to each network segment.

Apart from that, software-defined networking has the potential to become quite common across the majority of operating systems. The benefit of this approach is that a unified infrastructure can be reprogrammed for the current needs of the company. One of the main trends in this area is the growing penetration of BYOD (bring your own device). Image 1 shows the growing interest in this trend, which is likely to increase over time. Therefore, it can be assumed that container virtualization technologies will become more popular among the general public as well (Mishra & AlShehri, 2017). This trend implies that employees will use their personal devices to work with corporate resources, which will create greater comfort in the workplace.

Image 1. The graph shows Google Trend statistics (the interest of the general population and scientific community) on the BYOD trend (BYOD, n.d.).

SD-WAN is another subset in which various trends are actively developing. As a rule, this technology is used to reduce costs. In it, the work of the software-defined distributed WAN is provided by a controller in the data center of the head office. This controller is equipped with specialized software, and the main and backup Internet access channels can be connected to it directly. The controller also acts as the main router and installs applications that ensure network security (Mishra & AlShehri, 2017). However, this trend has certain barriers to implementation because several controllers can furnish logically inconsistent configurations to switches. Such conflicts are difficult to resolve without delays; therefore, it can be assumed that this trend will be developed further so that such incidents are eliminated.

Network Security and its Subsets

It should be stressed that information systems are rather vulnerable; therefore, they have to be protected using appropriate security measures. The risk of attacks is growing, and security specialists have to develop and employ advanced measures to protect company data. The main trend in network security is the changing nature of threats. The main reason for this tendency is the political setting, which has been volatile during the past several years. Such instability has directly affected network security, impacting both private and public organizations and resulting in increased expenditures (Sperotto, Hofstede, Dainotti, Schmitt, & Rodosek, 2015). Mainstream malware has become one of the significant issues since it started to affect different systems. However, the greatest problems have been experienced by Apple and iOS in particular: certain actions of the company have led to the infection of the system through the code. Over time, the iOS platform has become one of the main targets for security violations.

The trends in the perimeter security subset are also linked to the increasing number of threats experienced by organizations and enterprises. This tendency has encouraged alterations in legislation. In particular, fines have been introduced to ensure better data protection: if companies do not comply with data security rules, they will be subjected to penalties. In terms of the physical security subset, the trends center on ransomware prevention and detection (Sperotto et al., 2015). Within the past three years, several severe ransomware attacks took place. For instance, the Petya ransomware affected multiple companies and organizations, including banks and large-scale enterprises, leading to the loss of information, data, and company resources. Considering that this type of threat is more complex than regular malware, stronger approaches, such as a defense-in-depth strategy, are required to secure the data.

Network security infrastructure consolidation also has important trends. In particular, companies tend to initiate workforce education on social engineering to ensure that security breaches caused by the actions of workers are eliminated (Sperotto et al., 2015). Employee education is one of the best ways to ensure that security is overseen by all the parties who have access to any corporate information. Since security threats have become more advanced and intricate, it is important to ensure that employees are aware of the different forms of attacks and know the ways to avoid breaches.

Conclusion

Thus, it can be concluded that trends in network design and management are varied. They are connected to the needs of companies and organizations, which change over time and require them to adapt to the current setting. Some of the trends discussed reflect the intention of enterprises to reduce expenses by eliminating physical in-house servers and other measures, and some tendencies have emerged due to the increasing number of threats that come from the external environment. Therefore, the trends are pushed not only by the intention to eliminate redundant processes or aspects of network performance but also by the negative manifestations, which can result in security breaches and data loss.

References

BYOD. (n.d.). Web.

Dixit, S., Politis, C., & Papathanassiou, A. T. (2015). Network virtualization in the wireless world [from the guest editors]. IEEE Vehicular Technology Magazine, 10(3), 27-29.

Mishra, S., & AlShehri, M. (2017). Software defined networking: Research issues, challenges and opportunities. Indian Journal of Science and Technology, 10(29), 1-9.

Salih, B. M., & Edreis, H. A. (2016). Comparison between virtualization and cloud computing. International Journal of Science and Research (IJSR), 5(6), 195-199.

Sperotto, A., Hofstede, R., Dainotti, A., Schmitt, C., & Rodosek, G. D. (2015). Special issue on measure, detect and mitigate-challenges and trends in network security. International Journal of Network Management, 25(5), 261-262.

Software Testing: Manual and Automated Web-Application Testing Tools

Introduction

The purpose of the software testing process is to identify all the defects existing in a software design (Bahl, 2015, p. 316). Testing is the process of exercising and evaluating a system or system components by manual or automated means to verify that the system satisfies specified requirements or to identify differences between expected and actual results. People who perform software evaluations are called testers. Testing requires a tester to assume the role of a user and to use all the features of the application to confirm its correct behavior. Testers use specific guidelines to evaluate each web application (Bharti & Vidhu, 2015, p. 134). The purpose of this research is to perform an applied study of manual and automated web-application testing tools; the outcomes are then compared to determine the right tool for software testing.

Background of Study

In this fast-changing and competitive business environment, manual testing has made it difficult for organizations to evaluate their sites and applications. As a result, software testers adopted automated testing processes that reduce the challenges of manual testing (Bhateja, 2015, p. 2). Web-application testing highlights the defects of most site applications. This kind of testing also ensures that the functionality of web applications and web-related services works properly. It also offers reusability and extensibility of tests across numerous languages, platforms, and browsers (Bhatt, 2017, p. 3). It is important to note that manual and automated testing provide different results on web applications. Manual-testing tools have their own challenges, which limit their usability: manual testing is a tedious process, is not reusable, has no scripting capability, and requires time and effort (Chaudhary, 2017, p. 2). Automated testing addresses these issues by automating the stages of manual testing using automated tools such as Selenium and QuickTest Professional.

Testing is a basic part of the software development process. With the huge growth in software systems, the main concerns for developers are quality and security (Chauhan & Singh, 2014, p. 3). An effective software-testing tool ensures quality, which strengthens user confidence. Numerous open-source applications are available for enhancing software quality by diminishing software defects.

Literature Review

Online applications, or web applications, are application software that runs in a web browser and is written in a language the browser can render. Web browsers are software applications (for example, Google Chrome) that enable users to display, retrieve, and interact with content located on the pages of a website. Communication is carried out through a two-way data exchange between the server and the client (Dubey & Shiwani, 2014, p. 7).

Sites or web applications support countless uses and functionalities on the web. Common examples include Amazon.com for shopping, Facebook for social networking, Google for search, and Yahoo Mail for email, to mention a few (Gautam & Nagpal, 2016, p. 4). Web applications are designed with three layers, each with its own functions and objectives: the database layer, the operational layer, and the user interface layer. The database layer stores the entire data of the application, including the creation, update, and deletion of individual records, attributes, and values in the application data. The operational layer directs all the functions that run the application (user requests and responses); this layer can be thought of as the administrator who ensures that all tasks are completed and assigned to the proper areas of work. The user interface layer enables users to interact with the application's capabilities, for example, pictures, user profiles, and website navigation (Hanna, El-Haggar, & Sami, 2014, p. 6).

Automated Testing

Automated software testing is useful for large projects. It is dependable and effective for end-user applications. Automated software testing tools conduct tests with predefined activities and compare the expected and actual outcomes (Rathi & Mehra, 2015, p. 4). If the expected and actual results align, the application is working properly; if they do not, the code is altered and the test initiated again.
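As a minimal illustration of this loop, the following Python sketch (the add function is an invented stand-in, not a tool from the study) runs a predefined test and compares the expected and actual outcomes:

```python
# Minimal automated-test sketch: predefined input, expected vs. actual result.
import unittest


def add(a, b):
    """Function under test (a placeholder for real application logic)."""
    return a + b


class TestAdd(unittest.TestCase):
    def test_expected_matches_actual(self):
        # If expected and actual align, the application works properly;
        # otherwise the code is altered and the test is initiated again.
        self.assertEqual(add(2, 3), 5)


if __name__ == "__main__":
    unittest.main()
```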

Manual Testing

Manual testing is a technique used by software developers to run tests by hand. There are numerous manual testing approaches, including black-box, white-box, integration, system, and unit testing (Islam, 2016, p. 30).

Research Methodology

To advance knowledge of software testing tools and practices for web applications, this research performs an applied study of manual and automated software testing tools (Shah, Shah, & Muchhala, 2014, p. 3). The study then compares both kinds of software-testing tools and recommends the appropriate scope and use that each tool is best suited to serve (Mahmood & Sirshar, 2017, p. 53).

Testing Instrument

Selenium is an open-source testing tool that supports diverse kinds of testing of web applications. Selenium is not a single tool; it comprises four components: Selenium IDE, Selenium RC, Selenium WebDriver, and Selenium Grid. Selenium provides the ability to create test scripts in various programming languages and to perform various types of testing, for example, functional and regression testing, to mention a few (Sharma, 2014, p. 17). Selenium can be integrated with different frameworks to yield a hybrid framework, which makes software testing easier.
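For instance, a minimal Selenium WebDriver test written in Python, one of the languages Selenium supports, might look as follows (the URL and expected title are placeholders, not pages from the study):

```python
# Minimal Selenium WebDriver sketch: open a page and verify its behavior.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a Chrome driver is available
try:
    driver.get("https://example.com")
    assert "Example Domain" in driver.title  # functional check on the title
    heading = driver.find_element(By.TAG_NAME, "h1")
    print("Page heading:", heading.text)
finally:
    driver.quit()  # always release the browser session
```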

Evaluation Metrics

The evaluation metrics include script generation, versatility, preparation for automation, ease of use, operating system support, cost, test result generation, and other factors. The parameters for testing include computer brand, operating system, RAM, processor, and browser. Figure 1 shows the test requirements for the web applications.

Figure 1. Software testing requirements.

| Feature | Testing device |
| --- | --- |
| Computer brand | Hewlett-Packard |
| Operating system | Windows 10, 64-bit |
| Random-access memory | 4 GB |
| Processor | Intel Core i3 |
| Browser | Chrome or Mozilla Firefox |
| Software testing tools (Selenium IDE, WebDriver, and WATIR) | Mozilla application |

Test Results

The functional tools were tested based on their reliability. The variables for testing include browser, programming language, operating system, framework support, and web elements. Others include pop-up support, installation process, script execution speed, learning curve, web compatibility, cost, database support, and test generation.

To write the test code, the tester should generate and understand the HTML content of the web page. To identify web page elements, special browser tools can be used (Patil & Temkar, 2017, p. 3).

WATIR is another software-testing tool that supports most web browsers. Like Selenium, it does not provide testing for Windows applications. UFT/QTP, however, works well with both web and Windows applications and has a built-in tool for object identification. The functional testing tool is compatible with most web add-ons. However, QTP is not compatible with the most recent versions of browsers and operating systems, although it supports recent versions of Internet Explorer, a large portion of Chrome versions, and some Firefox versions; it also performs API testing.

Conclusions

Generally, automated and manual testing are considered unique approaches, and separate methodologies are used for their execution. Manual and automated software testing are complementary because the limitations of one are addressed by the other. Manual testing can be valuable for discovering bugs in unique situations where the requirements change continually and in circumstances where automated trials cannot be guaranteed. Automated testing, meanwhile, has numerous advantages, for example, repeatability, consistency, and better, more powerful handling of test cases (for situations where a substantial number of test cases must be executed). The contrast between manual and automated software testing is that automated testing is most appropriate where repetitive work must be done; nevertheless, it cannot eliminate the bugs in an application without the assistance of manual testing. Along these lines, automated software testing tools are great at breadth but not at depth. After this general investigation, it is difficult to rank these tools, given the various factors involved.

Selenium offers the opportunity to work with various kinds of browsers and frameworks, as well as the flexibility to choose among numerous programming languages. However, its use is constrained to web applications. Thus, software-testing tools have advantages, restrictions, and uses for specific kinds of testing, depending on the scope of the applications. No software-testing tool is perfect or best, although Selenium has been favored by most software enterprises, analysts, and engineers. Selenium's assortment of features and functionality, its capacity to integrate with different systems, its support for a number of programming languages, and its zero cost give it a competitive edge. It is important to note that not all applications are web-based; hence, there will always be demand for UFT or other testing tools that can perform testing in both web- and Windows-based applications.

References

Bahl, K. (2015). Software testing tools & techniques for web applications. International Research Journal of Engineering and Technology, 4(5), 316.

Bharti, T., & Vidhu, E. (2015). Functionality appraisal of automated testing tools. International Research Journal of Engineering and Technology, 3(1), 134.

Bhateja, N. (2015). A study on various software automation testing tools. International Journal of Advanced Research in Computer Science and Software Engineering, 5(6), 1-3.

Bhatt, D. (2017). A survey of effective and efficient software testing technique and analysis. J. Institute of Engineering and Technology, 1(1), 1-4.

Chaudhary, S. (2017). Latest software testing tools and techniques: A review. International Journal of Advanced Research in Computer Science and Software Engineering, 7(5), 1-3.

Chauhan, R., & Singh, I. (2014). Latest research and development on software testing techniques and tools. International Journal of Current Engineering and Technology, 4(4), 1-5.

Dubey, N., & Shiwani, S. (2014). Studying and comparing automated testing tools: Ranorex and Testcomplete. International Journal of Engineering and Computer Science, 3(5), 1-8.

Gautam, S., & Nagpal, B. (2016). Descriptive study of software testing & testing tools. International Journal of Advanced Research in Computer Science and Software Engineering, 4(6), 1-8.

Hanna, M., El-Haggar, N., & Sami, M. (2014). A review of scripting techniques used in automated software testing. International Journal of Advanced Computer Science and Applications, 5(1), 1-9.

Islam, N. (2016). A comparative study of automated software testing tools. Culminating Projects in Computer Science and Information Technology, 12(1), 1-92.

Mahmood, H., & Sirshar, M. (2017). A case study of web based application by analyzing performance of a testing tool. J. Education and Management Engineering, 4(1), 51-58.

Patil, R., & Temkar, R. (2017). Intelligent testing tool: Selenium web driver. International Research Journal of Engineering and Technology, 4(6), 1-4.

Rathi, P., & Mehra, V. (2015). Analysis of automation and manual testing using software testing tool. International Journal of Innovations & Advancement in Computer Science, 4(2), 4.

Shah, G., Shah, P., & Muchhala, R. (2014). Software testing automation using Appium. International Journal of Current Engineering and Technology, 4(5), 1-4.

Sharma, R. (2014). Quantitative analysis of automation and manual testing. International Journal of Engineering and Innovative Technology, 4(1), 17.

Syntax Code Writing in Statistical Software

It is clear that conducting an analysis of quantitative data using the IBM SPSS software package often requires performing numerous operations so as to compute the statistics for the given data, produce various types of data representation such as histograms, graphs, and scatter plots, conduct transformations of the data, or run statistical tests. In order to carry out these operations, it is possible and often easy to utilize menu selections so as to have the SPSS perform a certain set of actions. However, when working with large sets of data that require repetitive implementation of the same operations, or when dealing with a number of similar sets of data for which the utilization of the same actions is needed, it might be more convenient to use SPSS syntax programs so as to carry out the analyses (Field, 2013).

While in the past, prior to the creation of the Windows operating system, it was difficult to write a syntax program, this task can be performed much more simply nowadays. First, a list of the syntax commands required for carrying out a certain operation can easily be obtained via the menu selections. To get it, it is necessary to specify the action that is to be carried out by SPSS (for instance, to click Graphs → Legacy Dialogs → Histogram and select the variable for which a histogram is to be created) and then click the Paste button instead of OK.

As a result, the SPSS will open a Syntax window in which the needed syntax code will be displayed. By combining a number of such codes, it is possible to create syntax programs the implementation of which will result in SPSS running the specified operations without the user having to carry out the same series of menu selections many times (George & Mallery, 2016).
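For example, pasting the histogram selection described above produces SPSS syntax along the following lines (the variable name income is a placeholder):

```
GRAPH
  /HISTOGRAM=income.
```

Running this code from the Syntax window reproduces the same histogram without repeating the menu selections.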

Therefore, producing syntax programs that do not contain numerous errors is rather simple nowadays because it is possible to use the Paste command to display the syntax that corresponds to the specified menu selection.

References

Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Thousand Oaks, CA: Sage Publications.

George, D., & Mallery, P. (2016). IBM SPSS Statistics 23 step by step: A simple guide and reference (14th ed.). New York, NY: Routledge.

Data Coding in Statistical Software

Data coding is of paramount importance if a proper analysis of this data is to be carried out. In particular, data coding plays a critical role when it is needed to use statistical software in order to process the data. Because of this, it is crucial to properly code the data which is to be entered into IBM SPSS.

In particular, it is necessary to code information in ways that the statistical software can recognize. Any user of SPSS should be aware that this package is not capable of telling apart pieces of data which are entered in different styles. That is, SPSS will not differentiate between symbols written in italics, in bold, or underlined; nor will it tell the difference between symbols written in different fonts, colors, and so on. Therefore, the symbol 2, however it is formatted, will be read by the package as the same symbol 2. Thus, it is necessary to employ symbolically different units for coding if SPSS is to differentiate the data properly (George & Mallery, 2016).

It is also noteworthy that SPSS will differentiate between capital and lowercase letters (for instance, A and a), but employing such coding might be confusing for the user, which is why the use of numbers is advised. In addition, SPSS works better with numerical than with alphanumerical data; for instance, it is possible to calculate statistics (the mean, standard deviation, etc.) and run tests for a variable measured on a Likert scale if it is coded using numbers, but it is impossible to do so if letters are employed for coding (Field, 2013).
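As an illustration, a Likert-type item coded with numbers can be given descriptive value labels and then analyzed; a minimal SPSS syntax sketch (the variable name satisfaction and its labels are placeholders) is:

```
* Numeric coding with descriptive value labels.
VALUE LABELS satisfaction
  1 'Strongly disagree'
  2 'Disagree'
  3 'Neutral'
  4 'Agree'
  5 'Strongly agree'.
DESCRIPTIVES VARIABLES=satisfaction
  /STATISTICS=MEAN STDDEV.
```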

Therefore, it is recommended to utilize numbers for coding variables, for this type of data will not be confused by the user, and will be properly processed by the statistical software.

References

Field, A. (2013). Discovering statistics using IBM SPSS statistics (4th ed.). Thousand Oaks, CA: Sage Publications.

George, D., & Mallery, P. (2016). IBM SPSS Statistics 23 step by step: A simple guide and reference (14th ed.). New York, NY: Routledge.

PeopleSoft Inc.'s Software Architecture and Design

PeopleSoft Inc. was a company that provided enterprise performance management (EPM), human resource management systems (HRMS), financial management solutions (FMS), supply chain management (SCM), and customer relationship management (CRM) software systems (Kurtz 2012). Besides this, the company also provided software solutions for enterprise performance management, as well as student administration, to large institutions, corporations, and the government.

The company existed independently until 2005, when the Oracle Corporation purchased it. PeopleSoft had based its products on a client-server approach since its establishment. Advancements resulted in new versions of the application, and with time the entire suite shifted to a web-centric design named the Pure Internet Architecture (PIA) (Anderson 2006).

With the PIA architecture, a company using the ERP application could access all its business functions in a web browser (Kurtz 2012). Even after its acquisition by the Oracle Corporation, the application can still function as an ERP. In addition, it can also be used as single modules, for instance, Student Administration in isolation. Presumably, the community college in question has an integrated PeopleSoft ERP system that encompasses student management, human resources systems, payroll systems, and inventory systems (Anderson 2006).

The ERP software, as well as the documentation that comes with it, is proprietary (Matthias 2005). Its use is permitted only under a license agreement that normally contains restrictions on use and disclosure. The PeopleSoft ERP software enables an advanced planning and budgeting system: it supports synchronized top-down target setting as well as bottom-up budget preparation in support of continuous enterprise-wide planning. This suggests that the planning department at the community college does not have a hard time doing planning and budgeting.

The system also guarantees the community college security. The ERP software provides options of read-only, read-write, or no access for security purposes. These dimensions assure the institution of the total security of its data and freedom from sabotage. The software can document and test risks and controls, which is important, since undetected errors could have serious consequences. The software is efficient, but its efficiency comes at a cost: it is a very expensive but effective application (Anderson 2006).

The software has proven especially important to colleges and institutions of higher learning (Kurtz 2012). The campus solutions data help clients extract data from their transaction systems and build sophisticated analyses and reports. A current version of the ERP has new campus solutions with three data marts: student records, admissions, and student financials. Therefore, the community college is able to produce operational and analytical reports regarding students' records, admissions, and recruiting, and to file reports regarding student financials. In addition, these data marts can store data on enrollment and course catalogs (Kurtz 2012).

On the other hand, the technical school uses a proprietary system that is close to a typical commercial off-the-shelf (COTS) system (Anderson 2006). Apparently, this system was developed by the technical school's students during their course of study; it functions similarly to the PeopleSoft system and has equal capabilities. A COTS system is any non-developmental item of supply that is commercial and can be sold in substantial quantities in the commercial market, for instance, computer software, hardware systems, or any free software. COTS entails a template modified for a precise use, a property that has made it easier for the students to construct the software.

It is possible to buy the components of a COTS system, and one does not necessarily need to develop it from scratch. This has the overall effect of reducing system development costs as well as long-term maintenance costs. Furthermore, the system reduces the initial development time. Therefore, a typical COTS system reduces initial costs and development time. However, the system developed by the technical students differs somewhat from a standard COTS system, as it experiences rather dissimilar challenges, contrary to expectations (Kurtz 2012).

The system in the technical school is equivalent to a COTS system. Commercial off-the-shelf software is pre-built and normally acquired from a third-party vendor (Anderson 2006). Research shows that organizations currently prefer this kind of application because it has many desirable features, cost being the leading one, as it is cheap. It is a stable application, as many organizations use it, which proves that it is a reliable program. COTS software is usually the output of the combined efforts of many experts and is therefore relatively complex (Matthias 2005). Thus, it is likely that the system in the technical school is relatively complex, since many students combined their knowledge to come up with the software, and it is expectedly of high quality, since many students have put their skills into its construction.

However, using COTS software puts the security of the data of the institution using it in jeopardy (Anderson 2006). The system in the local technical school has several components similar to such a system and is thus vulnerable to security challenges. A recently conducted study showed that only approximately 14% of companies conduct security reviews on commercial applications brought in house, and over half of the companies do not conduct any assessment at all.

Companies rely on vendor reputation and legal agreements or lack policies dealing with COTS altogether. This means there is limited visibility into the risks introduced into their software by COTS components. It is possible that no assessment takes place at all in the technical school, as such an assessment requires a lot of expertise, and this means compromising the security of its data. Vendors of COTS software are not always stable and may go out of business without any prior notice, which can have devastating effects on the institution using the application. In addition, it is possible to purchase solutions of poor quality that may cause loss of data (Kazman 2004).

It is evident that the system in the community college is more efficient than the system in the technical school. The two can be merged to maximize the benefits and minimize the challenges. A standard solution derived from the two alternatives should be in a position to secure the data stored. Properties that put data security in jeopardy in COTS software should not appear in the new solution formed from the merge (Kazman 2004); instead, the strong properties that guarantee security in the ERP system should feature in it. The ERP software has proven to be very expensive, while COTS software is considerably cheap.

A COTS application is cheap because it can be obtained from several vendors in the market. The new solution should exhibit the data-mart aspect to ease the administration of students. This means that properties of the two software applications that inhibit the client from extracting information from the databases should not feature in the new solution. The data marts should therefore be a strong feature of the new solution, as they give full access to the databases and make compiling comprehensive reports much easier (Kurtz 2012).

Admissions data marts, student records data marts, and student financial data marts should characterize the new solution (Beydeda 2005). Student records data marts should store data on enrollment and course catalogs, organized systematically to enable the institution to prepare ad hoc reports. The sample reports from the student records data marts ought to include information on class sizes, grade distribution, and graduation suitability.

In addition, reports from this category should include overall statistical reports with year-to-year comparisons. Admissions data marts should show the life cycle of admissions and provide reports to sustain the admissions process. Sample reports from these databases should monitor applicants and applications, in addition to other measures, by prospect profile data. The student financial data marts are important for merging the students' financials with the general financial system (Kurtz 2012).

The solution will have some cost shortcomings, as it is bound to be expensive, and the expected initial time of system development is long and thus time-consuming (Carter 2004). However, the advantages of the new solution outweigh the disadvantages, with the data marts solving most of the problems encountered. The three data marts, the student financial data mart, the admissions data mart, and the student record data mart, solve the principal issue of student management (Beydeda 2005).

References

Anderson, L. (2006). Understanding PeopleSoft. Upper Saddle River, NJ: John Wiley & Sons.

Beydeda, S. (2005). Testing commercial-off-the-shelf components and systems. New Mexico: Springer.

Kazman, R. (2004). COTS-based software systems: Third International Conference, ICCBSS 2004, Redondo Beach, CA, USA, February 1-4, 2004, proceedings. New Mexico: Springer.

Kurtz, D. (2012). PeopleSoft for the Oracle DBA. London: Apress.

Matthias, E. (2005). Advanced planning in fresh food industries: Integrating shelf life into production planning (Contributions to Management Science). New Mexico: Springer.

Explore Factors in IBM SPSS Statistical Software

The Explore command in IBM SPSS produces an output that includes several statistics for one variable, either across the whole sample or across subsets of the sample (Kent State University, n.d.). To divide the sample into subsets while utilizing the Explore command, it is necessary to move the categorical variable by which the sample will be split into subsets to the "Groups based on" field.

Therefore, employing a variable as a factor allows for calculating the statistics separately for each group into which the sample is divided by that variable. It often might be useful to use such variables as gender or race as factors. For instance, the variable gender may be used when it is needed to compare the average income that males and females earn, for which aim it is required to calculate the means, standard deviations, and other statistics separately. The variable race might allow for comparing the mean of some other variable across the representatives of different races in the sample. In short, variables that provide a meaningful division into groups can be used as factors.
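In SPSS syntax, the Explore procedure corresponds to the EXAMINE command; a minimal sketch of the income-by-gender example above (the variable names are placeholders) is:

```
EXAMINE VARIABLES=income BY gender
  /PLOT BOXPLOT HISTOGRAM
  /STATISTICS DESCRIPTIVES.
```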

On the other hand, variables that do not split the sample into meaningful groups are not usually used as factors. For instance, it makes no sense to use a variable such as a participant's ID number, because an ID carries no internal meaning; in addition, there would be as many groups as participants (George & Mallery, 2016).

It also makes no sense to use certain continuous variables as factors, because there would be too many groups. For instance, using an overall course grade on a 100-point scale as a factor would produce too many groups, and such a division would not be meaningful. On the other hand, using the 6-point scale (A-F) would allow for a meaningful division into a manageable number of groups.

References

George, D., & Mallery, P. (2016). IBM SPSS Statistics 23 step by step: A simple guide and reference (14th ed.). New York, NY: Routledge.

Kent State University. (n.d.). SPSS tutorials: Descriptive stats for one numeric variable (Explore). Web.

Project Failure, Basics of Project Planning & Alternative Scheduling Software Tools & Techniques

From a lack of communication to overall unfavorable work circumstances, projects can fail if managers do not put enough planning into their execution. A project manager who is fully in possession of all the necessary information will be more effective at acquiring, managing, motivating, and empowering the project team, countering all the determinants which may work against them (Project Management Institute, 2017, p. 309). Thus, utilizing the project management and scheduling tools that exist and are easily accessible in the 21st century may help facilitate the planning mechanism that is essential to project execution. Recognizing these software techniques and resources, as well as securing their inheritability for the future benefit of other teams, is a vital part of creating a positive and productive work environment. The proper implementation of such tools may even help to counter circumstances that are outside the team's or even the company's scope.

The Main Reasons for Project Delay and Ways to Counteract Them

The delay of projects stems from both internal and external circumstances. Contractor tardiness, client-end setbacks, changes conveyed from upper management, and even the climate can affect the maturity of a project, with circumstances often layering to create concurrent delays (Mubarak, 2015). However, the main reason behind project setbacks is a lack of proper planning, due to incomprehensive analyses or managers not taking into account what-if situations designed to expose the failure-susceptible areas of a project (Project Management Institute, 2017). Therefore, time management becomes a distinct skill that may be taught, with the Project Management Institute (n.d.) providing a wide variety of educational literature on the topic. Another essential aspect of proper project execution is the correct allocation of resources, based on realistic expectations and goals, without which the team may work inefficiently (Project Management Institute, 2017). Thus, proper project-planning techniques may mitigate the main causes of setbacks, recognize critical activities, and secure the indispensable teamwork spirit.

A client needs analysis is essential to achieving proper project maturity through this kind of effective planning, recognizing not only possible setbacks but also the goals of various customers. Assuring the consumer of both the team's reliability and integrity through the creation of a needs-reflective plan, as a road map to delivering a quality product or service, is an essential part of project management (Kerzner, 2017, p. 706). Therefore, understanding the customer's needs is central to creating a plan to address them, as projects are seldom standalone and instead are supposed to resolve a real issue (Project Management Institute, 2017). Unsatisfied stakeholders are one of the main reasons for company failure, and a client needs analysis must aim to discern the client's motivation behind the project, their resources, and their end goal (Milosevic & Martinelli, 2016). Effective project planning should imperatively reflect all the wants of the customer to produce a satisfactory level of work.

The Usefulness of Project Management Software

Modern-day technologies may not only help perfect the project-planning period of any undertaking but also counteract some of the disadvantages of paper-only methods. Fast-tracking and resource optimization through smoothing or leveling are the two main methods of potentially improving project scheduling, but both retain execution drawbacks, such as the need to monitor the critical path continuously (Project Management Institute, 2017). For example, the software FastTrack Schedule 10 allows handling not only activity durations but also start and end times and the critical path, as seen in Figure 1 and Figure 2. In Figure 3, it is possible to see that the application also allows the optimization of resources through their assignment and reassignment, with either a variable or a fixed project end date. These capabilities are all essential for smoothing and leveling, as well as fast-tracking techniques, to control adequately all aspects required for the project.
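As a toy illustration of the critical-path idea (this is not FastTrack Schedule itself; the task names and durations are invented), the longest chain of dependent tasks determines the earliest possible project finish:

```python
# Toy critical-path sketch: earliest finish = duration + latest dependency finish.
tasks = {            # task: (duration_days, dependencies)
    "design":  (3, []),
    "build":   (5, ["design"]),
    "test":    (2, ["build"]),
    "docs":    (2, ["design"]),
    "release": (1, ["test", "docs"]),
}

earliest_finish = {}

def finish(task):
    if task not in earliest_finish:
        duration, deps = tasks[task]
        earliest_finish[task] = duration + max((finish(d) for d in deps), default=0)
    return earliest_finish[task]

print("Earliest project finish (days):", max(finish(t) for t in tasks))  # 11
```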

Figure 1. Sample Activities and Linked Relevant Information.
Figure 2. Sample Project Schedule with Critical Path.
Figure 3. Sample Project Resource Allocation between Activities.

FastTrack Schedule 10 also helps manage a project's realization throughout all the stages of its life cycle:

  1. Initiation: helps judge the practicality of the project's execution with the accessible resources, as seen in Figure 4;
  2. Planning: allows determining resource availability, as seen in previous illustrations, and helps visualize the project's budget, as seen in Figure 5;
  3. Execution: permits creating tasks and sub-tasks, as seen in Figure 6;
  4. Closure: assists in analyzing goals through the provision of achievement statistics, as seen in Figure 7.

Thus, FastTrack Schedule 10 may augment all aspects of the project life cycle, mainly because of its easy-to-grasp layout, helpful tools, and shareability between various teams.

Figure 4. Sample Project Resources with Calculated Cost and Work Hours.
Figure 5. Sample Project Budget Allocation with only Resource Cost.
Figure 6. Sample Activity, Task, and Subtask.
Figure 7. Sample Project Status Layout.

Inheriting Knowledge and Lessons from Previous Projects

The creation of a practical legacy is pivotal to success within various industries. Re-using existing templates is one way of achieving the transferability of knowledge, but not the most efficient one (Milosevic & Martinelli, 2016). Reviews, seminars, conferences, and other knowledge management practices secure the inheritability of essential information, which may then be used to create even more advanced methods of project management (Project Management Institute, 2017). As the figure in charge of codifying essential knowledge during the project's lifespan, the project manager is demonstrative of the credibility of the company (Milosevic & Martinelli, 2016). Thus, the creation of a project management office (PMO), which acts out both supportive and directive roles, makes sure that knowledge is passed on from team to team and useful techniques are preserved (Project Management Institute, 2017, p. 48). Teams successfully sharing the skills acquired from previous projects secure the advancement of companies, as doing so decreases the time spent finding already discovered solutions to recurring problems.

Biggest Challenge in Project Scheduling

Project scheduling remains a demanding activity, which circumstances both within and outside the company always affect. Of the possible influencing circumstances, changing regulations by the authorities may be one of the biggest challenges to proper planning, as the company and the project team could be forced unexpectedly into unforeseen working conditions. Since outputs need to comply with all applicable standards, requirements, regulations, and specifications, legislative changes overrule all non-compliant plans and necessitate reworking the scheduling plan to meet the new criteria (Project Management Institute, 2017, p. 298). While contractors may be negotiated with, government-instituted laws are imperative and inviolable. Thus, government regulations become a potential and particular environmental risk in project planning, which companies and their teams must anticipate and consider when drafting schedules that aspire to be realistic (Milosevic & Martinelli, 2016, p. 388). Overcoming the biggest challenge in project schedule drafting is therefore not impossible, but it requires foresight and extensive knowledge of the current legislative climate in the industry of interest.

Conclusion

While there is no replacement for a good project manager, using tools that help achieve better results is an indispensable part of creating stellar projects. The proposed FastTrack Schedule 10 software allows creating a plan that is easy to share and follow, which may be made easily accessible to both clients and team members. It also facilitates finding and identifying factors that may potentially delay work at any stage, in addition to tracing the relevance of the project to the end goals of the client. Furthermore, re-utilizing the created templates and holding seminars on how to perfect their use is an ideal way of making sure that these lessons remain preserved and built upon by future project managers. Therefore, as a tool, FastTrack Schedule 10 plays the role of both an organizer and a problem-finder, as it helps team members and executives visualize the workflow and compare their achievements against a preset rubric.

References

Kerzner, H. (2017). Project management: A systems approach to planning, scheduling, and controlling (12th ed.). Hoboken, NJ: John Wiley & Sons.

Milosevic, D. Z., & Martinelli, R. J. (2016). Project management toolbox: Tools and techniques for the practicing project manager (2nd ed.). Hoboken, NJ: John Wiley & Sons.

Mubarak, S. A. (2015). Construction project scheduling and control (3rd ed.). Hoboken, NJ: John Wiley & Sons.

Project Management Institute. (2017). A guide to the project management body of knowledge (PMBOK guide) (6th ed.). Newtown Square, PA: Project Management Institute.

Project Management Institute. (n.d.). Time management. Web.

Scrum: Software Development Process

Abstract

The world is developing very fast, and numerous changes are taking place in different fields. Computerized systems and digital solutions have added new life to a number of fields. Scrum is a software development process that ensures high quality and performance. This paper deals with Scrum terminology, case studies, and other related issues.

Introduction

Scrum is one of the methodologies of agile software development. There are a number of processes involved in software development, and different algorithms are based on sets of rules for producing an effective solution. Scrum is an iterative, incremental process for software development, used within agile software development. Different companies are using this technique in combination with different software products, developing new, advanced solutions that meet the requirements of today's market.

Scrum is composed of different sets of roles and is widely used in software management; it plays a significant role in the software development process. The main role performed by Scrum is the maintenance of the work and processes carried out by the project manager, who controls the development team and stakeholders. Basically, the software development process involves a number of steps, such as information collection, requirements analysis, problem analysis, design, coding, integration, evaluation, and testing. The SDLC (software development life cycle) ideally defines the steps involved in software development and evaluates software quality based on a number of steps, usually seven. Following the SDLC approach is useful when developing software, as it promotes high software quality and performance. However, information gathering, requirements analysis, testing, coding, and development cannot all be tracked, as no criteria are defined for each phase to ensure quality and performance.

There are no accurate methods for determining quality and the level of completion at every stage. It has been reported in numerous sources that a large amount of software fails every year, and the main reason for software failure is a lack of proper project management. Improper management cannot judge project quality and performance at development time. In a number of software houses, different teams develop different modules and have no communication medium for resolving issues. As a result, developers face poor customer satisfaction, bad reviews, project failure, financial loss, lower return on investment, etc. (Button and Dourish, 192).

Lower productivity and software quality lead an organization towards downfall. Software development is a complicated process and needs proper communication and connection between team members during the development cycle; proper coordination and communication result in an effective solution. Since a number of teams are usually involved in developing software, proper monitoring and communication save time and effort during development. There is always a strong need for communication with the client or customer in order to capture their requirements properly at every phase. After requirements analysis, it is really important to implement all the customer's requirements in the form of an efficient computerized solution. Scrum is widely used in a number of fields for developing software. Scrum improves software development quality by providing a reliable means of monitoring solutions during the development phase. It improves transparency among team members and makes the process transparent and effective (Hartmann, 195). The primary goal of Scrum is to provide a communication medium and promote free and fair communication among team members. Scrum ensures that everyone on the team knows the status and development position at every stage. It provides the feasibility and ease of retrieving ongoing project details at any stage: with the aid of Scrum, everyone can see and check the status of the ongoing project during its course, and the process is also made transparent to customers, developers, and other team members. Scrum reduces risk by monitoring every phase of development throughout the project. It provides ease and support in dealing with users' rapidly changing requirements in limited time, handling them more easily than other software development processes. Scrum provides reliable and fast communication among team members, which helps greatly in resolving the issues that arise from time to time during the development phase. This paper deals with a Scrum overview, advantages, and case studies; the later sections provide a Scrum analysis and reasons for Scrum failure.

Scrum Overview

Scrum is a software development process that provides a software development framework widely used in the real world for delivering efficient computerized solutions. The basic advantage of Scrum is its approachability: it offers easy learning with online tools and requires little effort to start. Scrum provides better risk management and early indication of upcoming problems and issues in the development process. Scrum offers loosely coupled functions and sets of rules implemented using Scrum tools. It provides strong risk management, a transparent process, and effective communication. The Scrum development process promotes customer involvement throughout the development process, which saves time, effort, and money (Sharp et al, 233). It improves software quality and risk management and yields a higher return on investment. Scrum is specially designed according to the needs of today's market and software development requirements. In 1986, a few researchers introduced a new theory for improving software development speed and flexibility; in 1991, they referred to this approach as Scrum, and the approach proposed advanced implementation methods. In 2001, Ken Schwaber teamed up with Mike Beedle and wrote a book named Agile Software Development with Scrum (Frank Maurer, 56).

Scrum Terminology

Several terms are used in describing the Scrum process: Scrum workflow, architecture, sprint, release, product owner, team member, product backlog, and so on.

Scrum Workflow

Scrum is composed of a number of steps, including requirements analysis and testing; schedule and cost are estimated in the early phases of development, and new features are identified as backlog items.

Sprints: Sprints are the timeboxed iterations of Scrum, each with its own planning and review meetings, and they play an important role in developing high-quality software. They resolve many of the issues and problems that arise in the development phase and enable every member to check the status of the project at any stage.

Mainly, three actors are involved in the Scrum process: the Scrum master, the product owner, and the team.

The product owner is responsible for Scrum planning: they own the backlog of features and prioritize the backlog list based on business value.

The team is responsible for design, development, testing, and the other engineering processes. The Scrum master is a leader responsible for team management, and proper management and technique result in an effective solution.
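
To make these roles and artifacts concrete, the following minimal Python sketch shows a backlog ordered by the business value the product owner assigns, from which the team pulls only as much work as fits its sprint capacity. The names (BacklogItem, plan_sprint) and the numeric fields are illustrative assumptions, not part of any standard Scrum tooling.

    from dataclasses import dataclass

    @dataclass
    class BacklogItem:
        # Illustrative fields; real backlog tools track many more attributes.
        name: str
        business_value: int   # assigned by the product owner
        estimate_days: float  # estimated by the team

    def plan_sprint(backlog, capacity_days):
        # The product owner's value ordering drives selection; the team's
        # capacity limits how much work is pulled into the sprint.
        ordered = sorted(backlog, key=lambda item: item.business_value, reverse=True)
        sprint, used = [], 0.0
        for item in ordered:
            if used + item.estimate_days <= capacity_days:
                sprint.append(item)
                used += item.estimate_days
        return sprint

    backlog = [
        BacklogItem("User login", business_value=8, estimate_days=3),
        BacklogItem("Report export", business_value=5, estimate_days=4),
        BacklogItem("Dark mode", business_value=2, estimate_days=2),
    ]
    print([item.name for item in plan_sprint(backlog, capacity_days=5)])
    # -> ['User login', 'Dark mode']

A greedy selection like this is only one possible policy; the point is the division of responsibility: value ordering belongs to the product owner, while capacity decisions belong to the team.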

Comparison

A number of software development processes are available for the development phase. Scrum is often ranked best among them because of its effectiveness, time savings, and proper monitoring at every phase of project development; it reduces the chances of failure and provides transparent monitoring to all team members. Usually, the waterfall and spiral methodologies are compared with Scrum. The waterfall method works on a set of rules that requires complete information before moving on to the next step: it follows a well-defined process in which requirements, coding, testing, implementation, and all other steps must be finished before the next one begins. The requirements and planning phase fixes the completion dates, the final product, and the project cost, so any unpredictable or unexpected change can greatly reduce the probability of success. The spiral and waterfall models both work efficiently, but the lack of proper management and monitoring in these processes reduces their success rate. The spiral model likewise requires a well-defined process in which several steps must be completed before moving on to the next. It is an iterative method, more iterative than most other processes, but it assumes that requirements, once gathered, will remain the same throughout the cycle; the iterations and completion dates can vary only partially.

The spiral model improves on the waterfall model, but its success rate is still lower than Scrum's. The Scrum method requires periodic sprint meetings during which any changes to the project are planned, and at the end of each sprint cycle or iteration a releasable feature is implemented. Thanks to these periodic meetings, sudden changes and unanticipated risks are managed better under the Scrum development process (Rogers, 75).
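
As a rough sketch of why this matters, the toy loop below replans from the backlog at every sprint boundary, so a requirement that arrives after the project starts still ships in the next increment, whereas a plan frozen up front would miss it. The task names and the mid-project change are invented purely for illustration.

    # Toy contrast with a frozen plan: here the backlog is re-read at every
    # sprint boundary, so late-arriving work is absorbed into the next release.
    def run_sprints(backlog, sprints):
        releases = []
        for sprint in range(1, sprints + 1):
            if sprint == 2:
                # A requirement discovered mid-project enters the backlog
                # and is prioritized at the next sprint planning meeting.
                backlog.insert(0, "urgent security fix")
            if backlog:
                increment = backlog.pop(0)  # highest-priority item first
                releases.append(f"sprint {sprint}: released {increment}")
        return releases

    backlog = ["login page", "report export", "dark mode"]
    for line in run_sprints(backlog, sprints=3):
        print(line)
    # sprint 1: released login page
    # sprint 2: released urgent security fix
    # sprint 3: released report export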

Analysis

Every process has limitations, requirements, and a set of rules to follow, and the same is true of Scrum. Scrum is successful in many cases, but under some conditions it also fails, and there are a number of preventive measures that should be taken to avoid Scrum failure. The first topic that needs to be covered in this domain is Scrum and CMM.

Scrum improves the quality of software development and management; it provides ease and support in monitoring the project. The Capability Maturity Model (CMM) can be compared with Scrum in order to identify the strengths of each. A post-mortem analysis is done at the end of each sprint cycle: at this stage all mistakes and problems are noted in order to prevent them in the next cycle, and these notes are used to keep such mistakes from recurring during later sprints. Scrum thus improves the process step by step, requiring rules and techniques that need to be followed, and it performs continuous process refinement during the post-mortem analysis at the end of each sprint phase. Scrum is powerful in many situations and is capable of dealing efficiently with changing requirements throughout the development process. It empowers an organization to categorize features and deliver high-quality software according to the customer's business rules. Scrum promotes effective communication in order to produce effective solutions, teaches the team to stay focused on the work, enhances employees' capabilities, and increases the accountability of each member.
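
A minimal sketch of this post-mortem bookkeeping follows; the dictionary layout and function names are my own illustration, not a prescribed Scrum artifact. Problems noted when one sprint closes become the checklist consulted when the next one opens.

    # Post-mortem notes recorded at the end of each sprint.
    retrospectives = {}

    def close_sprint(sprint, problems):
        # Record what went wrong so the next sprint can guard against it.
        retrospectives[sprint] = problems

    def open_sprint(sprint):
        # Consult the previous sprint's notes before planning the new one.
        for problem in retrospectives.get(sprint - 1, []):
            print(f"sprint {sprint}: guard against '{problem}'")

    close_sprint(1, ["tests written too late", "unclear acceptance criteria"])
    open_sprint(2)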

Scrum provides high transparency, better risk management, and proper project monitoring at every phase (Beck, 2001). With its aid, every team member can view the status and details of every phase of the development cycle, so Scrum handles requirement changes, unpredictable events, and personnel changes better than other traditional processes. Scrum is suitable for small as well as large projects and can deal with large amounts of data given proper monitoring. Large projects, however, usually involve demanding customers and a number of development and testing teams working across geographies, and it would be difficult for each of them to attend the Scrum meetings. It is also difficult for management to trust the whole team and make the project transparent to everyone. In most cases, the customer does not like to participate in the development of the project; customers usually prefer simply to sketch their requirements and receive a complete solution by the deadline.

Customers do not like to attend meetings and review development phases, and Scrum solves this problem to a large extent because it follows an incremental approach during the development phase. Sometimes it is difficult to keep hold of the complete system as a whole and relate the pieces that comprise it; dealing with large projects is rarely easy, and a number of software development processes fail at this stage. According to Ken Schwaber, a 15-minute meeting should be held daily during the software development process in order to produce an effective solution that meets user requirements.

Scrum Failure

Each software development process has limitations: it works well up to a point, but beyond a certain level it fails. Scrum requires the effective involvement of every team member and follows a set of rules that all members must observe; no one should skip them. If any team member refuses to attend the meetings and report on their tasks, the transparency and effectiveness of the Scrum meetings are compromised. These meetings give life to the functioning of Scrum, helping to resolve the conflicts and dependencies that occur during the development process. In some cases, multiple product owners raise conflicting requirements during a meeting. Often, management does not understand the need for and importance of Scrum and does not believe in Scrum terminology and techniques.

Scrum usually fails when there is too little communication and the process is handled improperly; it also fails due to a lack of training of both the Scrum master and the team. Lack of communication and lack of training are the two main reasons for Scrum failure. The Scrum master must understand the Scrum roles and clearly communicate the backlog and its progress to both the product owner and the team members. Such failures can be controlled by hiring a certified Scrum master, who deals efficiently with Scrum terminology and techniques. Efficient and effective handling of Scrum can prevent its failure in large organizations, and proper communication ensures software quality and a better return on investment. There are situations where Scrum fails and situations where it works well; its success depends heavily on how it is used and deployed, and improper use of Scrum can lead an organization toward failure.

Scrum Fits

Scrum fits in a number of cases. It is at its best in small organizations dealing with small amounts of data, where the data is less complicated and the team size is small. Scrum provides excellent risk management, reduces risk factors, and can easily deal with customers' changing requirements throughout the process. Scrum suits social networking companies, which face cut-throat competition, because it enables them to deliver features quickly even though requirements change frequently.

Scrum Does Not Fit

Although Scrum is an excellent process for software development, there are a number of factors in the presence of which it fails and does not work efficiently. Scrum usually fails in large organizations and does not perform well where a large team is involved in the development process; it is also difficult for a large organization to arrange meetings for large development teams. For instance, in large government projects, customers are not willing to participate in the monthly Scrum meetings and are not interested in the incremental releases prescribed by Scrum.

Case Studies

Google

Scrum is widely used by a number of organizations and is useful in many ways. It was used for the AdWords project at Google and achieved great success.

The AdWords project had a large core group and relied heavily on designs at different levels; alongside the core group, a number of stakeholders and team members were involved in the whole project. The product owners and stakeholders recognized the need for requirements analysis at the first stage of the project in order to produce effective software, and they then determined a product backlog at the start of the project. When the AdWords project started, the release burn-downs revealed a number of improperly managed delivery schedules. The Scrum masters analyzed that mismanagement and resolved the issues efficiently. After recognizing the problems, the engineers agreed to adopt the new process, since they believed it would be designed to resolve all the problems and hurdles they were facing with the existing scheme.
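
The release burn-down mentioned above is simply the remaining backlog effort, recomputed after every sprint. The figures in this sketch are invented, but a sprint in which the remaining line fails to drop is exactly the early warning of a mismanaged schedule that the Scrum masters could act on.

    # Release burn-down: remaining effort (story points) after each sprint.
    # All figures are invented purely for illustration.
    total_scope = 100
    completed_per_sprint = [12, 8, 0, 15, 20]  # the flat third sprint is the warning

    remaining = total_scope
    for sprint, done in enumerate(completed_per_sprint, start=1):
        remaining -= done
        print(f"after sprint {sprint}: {remaining} points remaining")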

Yahoo

Yahoo is one of the most popular search engines in the world, and millions of users log on to their Yahoo accounts daily. Scrum is widely used across a number of applications at Yahoo, and many teams apply it at different levels for better results. Projects such as Yahoo Music use Scrum to sustain their popularity, and some Yahoo projects use Scrum across distributed teams to connect the different themes into one; the themes and participants' intentions strongly affect decision and policy making. According to one survey, Scrum has great significance in the success of a project: it improved software productivity by up to 20 percent, collaboration improved productivity by 40 percent, and at least 80 percent of the teams wanted to continue using Scrum for future projects. The use of Scrum at Yahoo is highly effective, which confirms its success on large frameworks.

Conclusion

The world is modernized now: everyone wants the best for themselves and their family, no one is ready to compromise on their needs and requirements, and computer technology continues to gain popularity. Scrum provides a better communication medium for team members and customers. It yields effective solutions by involving the customer throughout the project, it ensures higher returns and revenues, and it reduces risk factors. However, there are some cases where Scrum does not help: it does not cope well with large meetings, large teams, or large amounts of data. Scrum is best for small organizations where requirements keep changing throughout the process, and it is currently used by social networking companies, web development companies, and the video game industry, where requirements tend to change often.

It is clear that Scrum is successfully used in a wide range of software projects across large enterprises; companies using it effectively include IBM, Nokia, Siemens, Google, and Yahoo. It is usually assumed that the team leader alone is responsible for the success of a project or organization, but the truth is slightly different: all team members, together with the team leader, contribute to an effective development process. Sprint meetings play an important role in the software development process. Scrum has gained popularity and is still gaining ground worldwide.

Works Cited

Beck, Kent, et al. Manifesto for Agile Software Development. 2001. Web.

Button, G., and Dourish, P. "On Technomethodology: Foundational Relationships between Ethnomethodology and System Design." Human-Computer Interaction, vol. 13, no. 4, 1998.

Hartmann, D. "Interview: Jim Johnson of the Standish Group." 2006. Web.

Sharp, H., Woodman, M., and Robinson, H. "Using Ethnography and Discourse Analysis to Study Software." Proceedings of ICSE, 2000. Web.

Maurer, Frank. "Agile Methods: Moving towards the Mainstream of the Software Industry." 2006. Web.

Rogers, E. Diffusion of Innovations. 3rd ed. The Free Press, New York, NY, 1983.

Software Engineering and Methodologies

The term methodology refers to a complete description of a system that considers the process, people, modeling language, project management, social structures, and products. We note, however, that not all developed systems live up to the meaning of this word: they are Object-Oriented rather than Agent-Oriented. It is with this fact in mind that I have worked to develop my experience in software engineering. Throughout my work, I have strived to note the needs of clients, which has driven me to expand my experience base and deliver products that have satisfied them.

Since I started working, I have improved my analytical skills considerably. Together with my co-workers at BTR IT Consulting Company, I have been in constant communication with our clients to determine their exact expectations regarding the features they need in particular systems. In doing so, I have gained experience in avoiding feature creep and in documenting every aspect of a project from its genesis to its end. We have been able to determine what the users are comfortable with and where they feel we need to make changes or improvements. Likewise, I have learned to direct my energy toward what conforms to clients' needs, rather than focusing narrowly on the written requirements.

At DBR Technology, I applied this principle effectively. The web designs our clients needed came with all-inclusive requirement details, yet the clients also had specific needs we had to meet, so we made sure we had effectively synchronized their needs with the documented requirements. The final products fit the users' needs perfectly and factored in all requirements.

While developing the websites at DBR Technology, we first determined the scope of each project, after which we estimated the work involved and created a project schedule. Together with our project manager, we monitored and controlled the project process, keeping the entire team and management updated on the project's progress. To support close monitoring, we held status meetings from time to time.

It is also very important that clients be taken through training before receiving completed systems. To make this easier, I have implemented the principle of involving clients in the development stage; my colleagues and I do this when necessary, and it quickens their understanding during product presentation. This worked very well when we worked for Diagram Data Company, where it was important that the personnel involved with the system understood the basics of PHP/MySQL and .NET/SQL. This resulted in a better grasp of the principles used during development and, in general, a warm welcome for the product at the end of the process. Apart from involving some of the personnel directly, we also trained the rest of our clients after we had completed the project, which helped them become acquainted with the new design faster.

I am also involved in producing system prototypes. One example is when I compiled the ideas for the web design at Diagram Data Company. The clients wanted to understand fully how the product would work, so we made a prototype that allowed them to evaluate the design proposal we had presented. This helped the users point out requirements that were not clear to them, and we made those requirements clearer and more understandable. In doing so, we controlled the prototype and consequently the whole project, which became the key factor in the good relationship that exists between these clients and us.

My colleagues and I worked on the prototype together. Though each of us had a different part to work on, there were some common components we all touched. We had also earlier created three versions of the same website, so we worked in close reference to the repositories we kept to track changes and stay coherent. Our repositories contained the names of the source code components along with their revisions and variations, the versions of the compilers and linkers used, the names of the colleagues who performed each build, and the times and dates of construction. We held meetings to discuss the best way to control versioning.
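
A minimal sketch of the kind of entry our repositories held is shown below; the field names and values are my own illustration, since any real configuration-management tool defines its own schema. Each record ties a component revision to its toolchain, author, and build time so that any of the site's versions could be reconstructed.

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class BuildRecord:
        # Illustrative schema; real configuration-management tools differ.
        component: str       # name of the source code component
        revision: str        # revision or variation identifier
        toolchain: str       # compiler and linker versions used
        author: str          # colleague who performed the construction
        built_at: datetime   # time and date of the construction

    record = BuildRecord(
        component="site-frontend",
        revision="v3.2",
        toolchain="gcc 3.4 / ld 2.15",
        author="J. Doe",
        built_at=datetime(2008, 5, 14, 10, 30),
    )
    print(record)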

In the same work, we had to manage several configurations in order to introduce the changes needed to upgrade the earlier versions. We were able to work in isolation, share the artifacts, and finally collaborate to synchronize all the changes we had made, which led to the successful development of the fourth version of the website.

We also implemented all the proposals our clients gave us in due time. Together with our clients, we made comparisons to see whether the prototype matched the specifications of the software we were building. Through my involvement in making prototypes, I have gained skill and insight in producing accurate project estimates, and it has helped me judge whether we could meet proposed milestones and deadlines. The Diagram Data Company project was the first on which we accurately adjusted the deadlines set by our clients.

We have also found that all the knowledge gained is important for future use. I have learned to document properly and thus create a good repository, which I have been doing with my colleagues at BTR IT Consulting Company. Our project manager also helped us make sure we carefully chose and transferred content to the repository after every project.

Through my work experience, I have learned that one cannot fully master Software Engineering and Methodology in class; it must be learned in practice. In this way, I have gained skills in requirements analysis, software testing, formal specification methods, project planning, estimation and control, and prototyping, and I have also seen the great significance of teamwork. In the process, I have used books to gain a deeper understanding of the methodologies, among them The Software Project Manager's Handbook: Principles that Work by Dwayne Phillips, Learning Software Organizations: Methodology and Applications by Günther Ruhe and Frank Bomarius, and New Trends in Software Methodologies, Tools and Techniques by Paul Johannesson. My main aim in studying these and other books was to apply their proposals and suggestions in my work where necessary, which has also helped me gain experience.