Tasks to complete
Goal: This project integrates concepts developed across the assignments in the second half of this class. You will identify a data-driven business problem that requires preparation of the data. This preparation involves Extracting data (from 3 or more sources) and Transforming (or cleaning) the data before Loading it into a database for analysis. In other words, you will experience, first-hand, the ETL process of data management – preparing the data for further analyses.
Options: You can take this project in one of two directions: (1) Identify a large file, clean the data, and normalize it into three or more tables, OR (2) Identify three or more large data sources, clean the data, and merge them into a denormalized table for analysis. In either case, you will need to identify what you plan to learn from the cleaned and loaded data. BOTTOM LINE: Could you do the analyses WITHOUT going through this ETL process? If so, what's the point?! A rough sketch of direction (1) follows.
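To make direction (1) concrete, here is a minimal sketch in Python/pandas (outside the required toolset, purely to illustrate the idea). The file name and column names (orders.csv, cust_name, prod_name, and so on) are hypothetical:

```python
import pandas as pd

# Hypothetical denormalized file: every row repeats customer and product details.
orders = pd.read_csv("orders.csv")  # order_id, cust_name, cust_city, prod_name, prod_price, qty

# Pull the repeated customer attributes into their own table with a surrogate key.
customers = orders[["cust_name", "cust_city"]].drop_duplicates().reset_index(drop=True)
customers["customer_id"] = customers.index + 1

# Do the same for products.
products = orders[["prod_name", "prod_price"]].drop_duplicates().reset_index(drop=True)
products["product_id"] = products.index + 1

# The fact table keeps only FKs plus the measures, ready to rejoin later.
fact = (orders
        .merge(customers, on=["cust_name", "cust_city"])
        .merge(products, on=["prod_name", "prod_price"])
        [["order_id", "customer_id", "product_id", "qty"]])
```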
Resource: This article.
In preparation for your project this term, I need you to do some digging to identify sources and ideas for a decent project.
There are a couple of decisions to be made, so I am making part of the project a “deliverable” to get you mulling them over early. Most ETL tasks involve cleaning and integration. For integration, it is vital that you have an attribute that is common across all three data sets.
Cleaning
Cleaning is one of the most important steps, as it ensures the quality of the data in the data warehouse. Cleaning should apply basic data-unification rules, such as (see the sketch after this list):
Standardizing identifiers (e.g., the sex categories Male/Female/Unknown, M/F/null, and Man/Woman/Not Available are all translated to a standard Male/Female/Unknown)
Converting null values into a standardized Not Available/Not Provided value
Converting phone numbers and ZIP codes to a standardized form
Validating address fields and converting them to proper naming (e.g., Street/St/St./Str./Str all become Street)
Validating address fields against each other (State/Country, City/State, City/ZIP code, City/Street)
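As promised, a minimal cleaning sketch in Python/pandas. The file and column names (people.csv, sex, employer, zip, street) are assumptions for illustration only:

```python
import pandas as pd

df = pd.read_csv("people.csv")  # hypothetical source

# Unify the many sex encodings into one standard vocabulary.
sex_map = {"M": "Male", "Man": "Male", "F": "Female", "Woman": "Female",
           "Not Available": "Unknown", "null": "Unknown"}
df["sex"] = df["sex"].replace(sex_map).fillna("Unknown")

# Convert nulls in free-text fields to a standardized value.
df["employer"] = df["employer"].fillna("Not Provided")

# Standardize ZIP codes to five-digit strings (drops ZIP+4 suffixes).
df["zip"] = df["zip"].astype(str).str.extract(r"(\d{5})", expand=False)

# Unify street-type abbreviations: St, St., Str, Str. all become Street.
df["street"] = df["street"].str.replace(r"\bStr?\.?(?=\s|$)", "Street", regex=True)
```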
Transform
The transform step applies a set of rules to transform the data from the source to the target (a sketch follows this list). This includes:
converting any measured data to the same dimension (i.e., a conformed dimension) using the same units so that it can later be joined,
generating surrogate keys or FKs so that you can join data from several sources,
generating aggregates,
deriving new calculated values, and
adding columns to create PKs and/or FKs.
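A minimal sketch of these transforms in Python/pandas; the file and column names (sales.csv, weight_lb, qty, unit_price, region) are hypothetical:

```python
import pandas as pd

sales = pd.read_csv("sales.csv")  # hypothetical source

# Conform dimensions: one source reports weight in pounds, the target uses kg.
sales["weight_kg"] = sales["weight_lb"] * 0.453592

# Derive a new calculated value.
sales["line_total"] = sales["qty"] * sales["unit_price"]

# Generate a surrogate key so rows can be joined with other sources later.
sales["sale_sk"] = range(1, len(sales) + 1)

# Generate an aggregate for a summary destination.
by_region = sales.groupby("region", as_index=False)["line_total"].sum()
```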
Data Integration
It is at this stage that you get the most value from the project. This typically means you are adding some attribute from a related data set that adds ‘color’ to the data – perhaps joining Census data to labor data or other demographic data. The challenge is to locate data sets that can be related.
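A sketch of the integration step, again in Python/pandas. The shared county FIPS code is the assumed common attribute; file and column names are hypothetical:

```python
import pandas as pd

census = pd.read_csv("census.csv")  # fips, population, median_income
labor = pd.read_csv("labor.csv")    # fips, unemployment_rate

# The shared identifier (FIPS) is what makes the integration possible.
enriched = labor.merge(census, on="fips", how="left")

# Rows that found no Census match signal a key-quality problem worth cleaning.
print(enriched["population"].isna().sum(), "counties failed to match")
```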
Project direction: You will need to complete a datamart with significant pre-processing (ETL) activities.
Requirements:
Problem being solved: What do you propose to learn from this data? List several of these business questions and show how your project solution (data set) could answer them.
Tools: You must complete the entire project using Visual Studio, OR you can do this with some other ETL tool of your choice, like Power BI or Tableau.
Volume: Total result data set must add up to at least 5k records, but not more than 100k.
Destination: SQL Server table(s). Depending on the direction you are taking, you can move all the data to a single CSV file and load it into SQL Server at the end, or direct the final destination tables to SQL Server.
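If you go the CSV-then-load route, a pandas-plus-SQLAlchemy sketch might look like the following. The connection string, table name, and file name are all assumptions; adjust them for your server and ODBC driver:

```python
import pandas as pd
from sqlalchemy import create_engine

# Hypothetical connection: local SQL Server, Windows authentication.
engine = create_engine(
    "mssql+pyodbc://@localhost/ETL_Project"
    "?driver=ODBC+Driver+17+for+SQL+Server&trusted_connection=yes"
)

final = pd.read_csv("final_dataset.csv")
final.to_sql("FinalResults", engine, if_exists="replace", index=False)
```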
Transformation – it must include TWO new columns (for each final destination) that are populated by (a) the current date and time, so you know when the data was brought into the final dataset, and (b) the source file name, so you know where the data came from. This may be done through SSIS or in SQL Server.
Note: Filename capturing works only when the source is a flat file. So, if your source is NOT a flat file, you may want to make a CSV file an intermediate destination and then use that file as the source. (Hint: use the Derived Column transformation to add a column.) A sketch of the two audit columns follows.
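For reference, the two required audit columns are a one-liner each outside SSIS as well; a pandas sketch (file name hypothetical):

```python
import pandas as pd
from datetime import datetime

source = "labor.csv"  # hypothetical flat-file source
df = pd.read_csv(source)

# (a) when the data was brought into the final dataset
df["load_datetime"] = datetime.now()

# (b) where the data came from
df["source_file"] = source
```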
In addition, it must include at least 3 of the following transformations: data conversion, derived column, conditional split, lookup, merge, merge join, multicast, union all, fuzzy lookup, or any of the transforms not covered in class. (A sketch of three of these follows.)
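Here is what three of those transformations look like conceptually, sketched in pandas (in SSIS you would use the corresponding toolbox items); file and column names are hypothetical:

```python
import pandas as pd

df = pd.read_csv("input.csv")  # hypothetical source

# Data conversion: cast a text column to a numeric type.
df["salary"] = pd.to_numeric(df["salary"], errors="coerce")

# Conditional split: route rows down different paths based on a condition.
high_paid = df[df["salary"] >= 100_000]
low_paid = df[df["salary"] < 100_000]

# Lookup: enrich rows from a reference table, keeping only matched rows.
ref = pd.read_csv("dept_codes.csv")  # dept_code, dept_name
df = df.merge(ref, on="dept_code", how="inner")
```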
Data sources: You are welcome to use datasets from work that have been sufficiently “anonymized.” In fact, this itself is a valuable transformation task that you can then use to protect your data and make it available for additional analysis/exploration. There are many public data sets that can be used (see the “data sources” tab).
Project ideas & Data sets [for ETL]
Goal: Explore various datasets (see below) to see what is missing in any of the data and how you can enhance it by combining info from other, seemingly unconnected, data (industry, education, poverty, and liquor shops?). The links below serve as a starting point for your exploration. Get started!
Expectation: You can take this project in one of two directions: (1) Identify three or more large data sources, clean the data and merge them into a denormalized table for analysis. OR (2) Identify a large file, clean the data and normalize it into three or more tables so that when you rejoin them, you get more accurate answers to your questions. Sometimes this process may require you to get “reference sources” so your dimension tables (destinations in Model Y above) are more complete/accurate.
In either case, you will need to identify what you plan to learn from the cleaned and loaded data.
There are two main ideas to keep in mind: (1) Cleaning badly prepared data and (2) integrating data from multiple sources. An ETL project usually involves BOTH of these.
When integrating data from more than one source, you need to make sure that the sources can be linked in the first place. In other words, is there something in common between the data sets, some kind of identifier like the PKs and FKs we use? If not, can you create one? (A sketch of deriving such a key follows.)
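If no shared identifier exists, you can often derive one. A hypothetical sketch: one source spells out state names, the other uses abbreviations, so a mapping creates the join key:

```python
import pandas as pd

a = pd.read_csv("source_a.csv")  # "state" holds full names, e.g. "Iowa"
b = pd.read_csv("source_b.csv")  # "state" holds abbreviations, e.g. " ia"

# Derive a common key on both sides (mapping truncated; extend to all states).
name_to_abbr = {"Iowa": "IA", "Illinois": "IL", "Indiana": "IN"}
a["state_key"] = a["state"].map(name_to_abbr)
b["state_key"] = b["state"].str.strip().str.upper()

linked = a.merge(b, on="state_key")
```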
As you review the following sources for ideas, look for files that can be linked. Otherwise, all you have is data!
Note: You don’t have to get ALL your data from a single source. As long as they are related, you can draw from multiple sources.
I ALREADY HAVE THE DATA SOURCES AND PROJECT BACKGROUND TO WORK WITH. YOU JUST HAVE TO DO THE PROJECT ETL AND PRESENTATION. FOR THIS PROJECT YOU NEED TO USE VISUAL STUDIO 2019 (PREFERRED), POWER BI, OR TABLEAU. I NEED THIS IN A SHORT TIME, SO PLEASE BID ONLY IF YOU ARE SURE YOU CAN DO IT. I WILL BE UPLOADING THE DATA FILES HERE.