From: route@monster.com
Sent: Monday, September 28, 2015 1:02 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03
SHILPI GOEL
571-271-2072 · shilpigoelva@gmail.com

SUMMARY:
· 8+ years of professional IT experience in analysis, architecture, design, development, testing, and implementation of client/server and n-tier software.
· Experienced IT professional with demonstrated expertise in installation, programming, configuration, and administration of MS SQL Server databases.
· Strong data modeling experience using tools such as Erwin and ER/Studio.
· Implemented conceptual, logical, and physical data models using Slowly Changing Dimensions (SCD), data warehouse, and Operational Data Store (ODS) designs.
· Worked extensively with database objects such as tables, stored procedures, functions, packages, triggers, indexes, and views in Oracle, SQL Server, and DB2.
· Extensively used the ETL tools SQL Server Integration Services (SSIS), DTS packages, and Talend Open Studio.
· Conducted requirement-gathering meetings, interview sessions, and JAD sessions to collect data-movement requirements and created source-to-target mappings.
· Hands-on with data analysis, data profiling, standardization, mapping, and metadata.
· Experience creating and deploying SSIS (ETL) packages using transformations such as Slowly Changing Dimension (SCD), Multicast, Merge Join, Lookup, Fuzzy Lookup, Conditional Split, Aggregate, Derived Column, and Data Conversion.
· Expertise with the most commonly used Talend components (tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, tHashOutput, and many more).
· Created source-to-target mappings by reading logic from Informatica mappings. Familiar with Informatica transformations such as Expression, Joiner, Mapplet, Workflow, Source Qualifier, Filter, Sorter, Router, and Sequence Generator.
· Extensively used Crystal Reports, Business Objects, and SSRS to generate reports based on user needs.
· Experience generating reports with SQL Server Reporting Services (SSRS) and with Excel spreadsheets and pivot tables.
· Proficient in using Report Manager to set up roles and security on reports.
· Experience configuring the report server and Report Manager, scheduling reports, and granting permissions to different levels of users in SSRS.
· Created drill-down, drill-through, sub-, and parameterized reports using SSRS.
· Experienced in the full life cycle and methodology for implementing data warehouse, Business Intelligence (BI), and reporting systems.
· Created metadata definitions from business rules.
· Proficient and prompt in learning and adapting to new technologies.

TECHNICAL
SKILLS:
· Business/Data Modeling Tools: MS Visio, Erwin 4.1, ER/Studio 7.1
· Data Warehouse Techniques: Slowly Changing Dimensions, Normalization
· RDBMS: DB2, Oracle 7.x/8.x/9i/10g, MS SQL Server 2008/2005/2000/7.0
· Operating Systems: Windows 9x/NT/2000, HP-UX
· Languages & Technologies: C, PL/SQL, ASP, ASP.NET, VB.NET, JavaScript, XML
· Other Tools: Rational Rose, MS Office, MS Project, ClearQuest, ClearCase, SSIS, Talend 4.0
· Reporting Tools: SSRS, Business Objects 6.0

EDUCATION & CERTIFICATION:
Bachelor of Engineering in Computer Science, SSGMCE, India, 1999.
Certification in ASP.NET and VB.NET from NOVA Community College.
Microsoft Certified Database Administrator in SQL Server 2008.

PROFESSIONAL
EXPERIENCE:

ETL Developer / Data Analyst
Pennsylvania State University, PA
May 2013 – Present

The Pennsylvania State University (Penn State) Outreach department manages prospective students' communication. Working as Data Analyst for the data warehouse.

Role:
· Created source-to-target mappings, following naming standards and abbreviations.
· Conducted JAD/requirements-gathering sessions to capture business rules and transformation logic.
· Prepared data-profiling reports covering valid values, data quality, nullability, etc.
· Led efforts to implement the data integration process with Talend Integration Suite (TIS) 4.2/5.0.
· Created ETL mappings with Talend Integration Suite to pull data from sources, apply transformations, and load data into the target database.
· Extracted data from flat files and legacy databases, applying business logic to load them into the staging database.
· Implemented population of slowly changing dimensions to maintain current and historical information in warehouse tables using change data capture (CDC).
· Designed and implemented ETL for data loads from heterogeneous sources into SQL Server and Oracle target databases, covering fact tables and SCD Type 1 and Type 2 dimensions.
· Applied common data warehousing practices for data modeling using star and snowflake schemas with facts, measures, and Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.
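As an illustration of the SCD Type 2 pattern used in the role above — a minimal Python sketch, not the actual Talend job; the table layout and column names (business_key, attr, valid_from, valid_to, is_current) are hypothetical:

```python
from datetime import date

# Sketch of Slowly Changing Dimension Type 2 versioning: when a
# tracked attribute changes, the current row is end-dated and a new
# row is inserted, preserving history. Column names are hypothetical.

def apply_scd2(dim_rows, source_row, today):
    """dim_rows: list of dicts keyed by business_key, attr,
    valid_from, valid_to, is_current."""
    current = next(
        (r for r in dim_rows
         if r["business_key"] == source_row["business_key"] and r["is_current"]),
        None,
    )
    if current is None:
        # New business key: insert the first version.
        dim_rows.append({**source_row, "valid_from": today,
                         "valid_to": None, "is_current": True})
    elif current["attr"] != source_row["attr"]:
        # Attribute changed: close the old row, open a new one.
        current["valid_to"] = today
        current["is_current"] = False
        dim_rows.append({**source_row, "valid_from": today,
                         "valid_to": None, "is_current": True})
    return dim_rows

dim = []
apply_scd2(dim, {"business_key": 1, "attr": "Gold"}, date(2013, 1, 1))
apply_scd2(dim, {"business_key": 1, "attr": "Platinum"}, date(2013, 6, 1))
```

A Type 1 dimension would instead overwrite `attr` in place, which is why Type 2 is the variant paired with CDC when history must survive.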
Data Analyst (Contractor), Federal Reserve Board, Washington, DC, Jul 2012 – Mar 2013

Redesigned their HR model by bringing data from PeopleSoft as a source file and loading it into the target Oracle database. Created the dimension and fact tables in Talend using SCD techniques.

Role:
· Created source-to-target mapping documents.
· Conducted JAD/requirements-gathering sessions to capture business rules and transformation logic.
· Designed and created logical and physical data models using Erwin.
· Led efforts to standardize data and naming standards for data elements.
· Analyzed data from different sources (DB2, Oracle, SQL Server, and flat files) and defined the ETL process.
· Created data definitions, source-to-target mapping spreadsheets, and a metadata repository.
· Created database objects and deployment scripts for the target Oracle database: tables, views, materialized views, etc.
· Created the Physical Data Model (PDM) for the OLTP application using ER/Studio.
· Created views and queries to pull data from SQL Server, Excel, CSV, and flat-file sources.
· Created ETL mappings with Talend Integration Suite to pull data from sources, apply transformations, and load data into the target database.
· Created workflow jobs for sequential and parallel processing of ETL jobs.
· Provided documentation of ETL and workflow jobs.
· Conducted test runs, debugged and fixed issues, and uploaded to the SVN-based versioning system for deployment.
· Applied common data warehousing practices for data modeling using star and snowflake schemas with facts, measures, and Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.

Skills: Oracle 11g, PL/SQL, ER/Studio, MS Office, Talend 4.1.2, Erwin 7.0
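A source-to-target mapping document of the kind produced above can be thought of as a lookup that drives the transformation step. A minimal Python sketch, assuming invented field names (not the actual PeopleSoft or target schema):

```python
# Sketch of a source-to-target mapping driving a transformation step.
# Each entry maps a source column to a target column and a transform.
# All field names here are hypothetical, for illustration only.

MAPPING = {
    # source column: (target column, transformation)
    "EMP_NM":  ("employee_name",   str.title),
    "HIRE_DT": ("hire_date",       lambda v: v.strip()),
    "DEPT_CD": ("department_code", str.upper),
}

def transform_row(source_row):
    """Apply the mapping to one source record."""
    return {target: fn(source_row[src])
            for src, (target, fn) in MAPPING.items()}

row = transform_row(
    {"EMP_NM": "jane doe", "HIRE_DT": " 2012-07-01 ", "DEPT_CD": "hr"}
)
```

Keeping the mapping as data rather than code is what makes the spreadsheet form of the document practical: analysts can review the column-to-column logic without reading the ETL job itself.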
Data Analyst (Contractor), Apollo Group, Chicago, IL, Jan 2012 – Jul 2012

Apollo Group is the parent company of University of Phoenix and other universities/colleges. Worked as ETL Developer on a data migration, integration, and data warehouse project bringing data from four universities into one central repository, from which the data warehouse receives regular updates.

Role:
· Created source-to-target mappings, following naming standards and abbreviations.
· Conducted JAD/requirements-gathering sessions to capture business rules and transformation logic.
· Prepared data-profiling reports covering valid values, data quality, nullability, etc.
· Created database objects and deployment scripts for the target Oracle database: tables, views, materialized views, etc.
· Created views and queries to pull data from SQL Server, Excel, CSV, and flat-file sources.
· Created ETL mappings with Talend Integration Suite to pull data from sources, apply transformations, and load data into the target database.
· Used ER/Studio for logical and physical data modeling and source-to-target mapping for ETL development.
· Created workflow jobs for sequential and parallel processing of ETL jobs.
· Provided documentation of ETL and workflow jobs.
· Conducted test runs, debugged and fixed issues, and uploaded to the SVN-based versioning system for deployment.
· Used Talend joblets and commonly used Talend transformation components such as tMap, tDie, tConvertType, tFlowMeter, tLogCatcher, tRowGenerator, tSetGlobalVar, tHashInput, and tHashOutput.
· Applied common data warehousing practices for data modeling using star and snowflake schemas with facts, measures, and Slowly Changing Dimensions (SCD) Type 1, Type 2, and Type 3.

Skills: Oracle 11g, PL/SQL, ER/Studio, MS Office, Talend 4.1.2
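The data-profiling reports mentioned in the roles above (valid values, nullability) amount to a single pass over the rows collecting per-column statistics. A small illustrative Python sketch; the sample columns and data are invented:

```python
# Sketch of a data-profiling pass: per-column null counts and the
# number of distinct valid values. Sample data is invented.

def profile(rows):
    """Return {column: {"nulls": n, "distinct": m}} for a list of dict rows."""
    report = {}
    for row in rows:
        for col, val in row.items():
            stats = report.setdefault(col, {"nulls": 0, "values": set()})
            if val is None:
                stats["nulls"] += 1
            else:
                stats["values"].add(val)
    return {col: {"nulls": s["nulls"], "distinct": len(s["values"])}
            for col, s in report.items()}

result = profile([
    {"student_id": 1, "campus": "Phoenix"},
    {"student_id": 2, "campus": None},
    {"student_id": 3, "campus": "Phoenix"},
])
```

In practice the same counts would come from SQL aggregates (COUNT(*), COUNT(col), COUNT(DISTINCT col)) against the source tables, but the shape of the report is the same.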
DATA ANALYST (Contractor), Freddie Mac, McLean, VA, Oct 2011 – Dec 2011

The business analytical goal was the creation of a master business rules spreadsheet (MBRS) with a deterministic algorithm aimed at ensuring strict enforcement of the identities. The MBRS should reflect the goal of integrating the SF Credit Reporting EUC into the Enterprise Data Mart; hence it must consolidate a multi-waved source/target lineage analysis along three main categories: population, transformation, and timing.

Role:
· Conducted JAD sessions for requirements gathering and analysis for data-intensive projects.
· Worked on the Single Family portfolio.
· Led efforts to standardize data and naming standards for data elements.
· Created data definitions, source-to-target mapping spreadsheets, and a metadata repository.
· Received data from files and FTP and moved it using the Talend ETL tool.
· Created ETL jobs to load data into a central repository, then imported the data into a staging database.
· Validated EUC output and input data elements using a bottom-up approach.
· Validated the rationalized, re-pointed code by executing it simultaneously with the original code and reconciling the respective data results.

Skills: Rapid SQL 7.7.2, Beyond Compare, MS Office, SQL Server 2008, Erwin 7.0, SAS

DATA
ANALYST/MODELER (Contractor), Kaiser Permanente, Silver Spring, MD, Feb 2009 – Jul 2011

Role:
· Conducted JAD sessions with business users and technical staff to finalize reporting requirements.
· Developed complex queries and data-extraction procedures to pull data from the data warehouse and from Business Objects reports.
· Designed the reporting database structure, including data models and DDL scripts, for ad hoc and daily, weekly, and monthly reports.
· Analyzed data needs and report data elements, reviewed and verified them with business owners, and owned the data elements.
· Independently analyzed, compiled, and trended data into meaningful results to help the department monitor regional performance and identify opportunities for improvement.
· Provided technical expertise in evaluating, developing, and implementing various aspects of security standards for the Diamond platform.

Skills: DB2, MS Access, MS Excel, SQL, Oracle, PL/SQL, ER/Studio, Crystal Reports, Business Objects XI R2, Oracle 10g, Talend

DATA
WAREHOUSE ANALYST/MODELER (Contractor), Freddie Mac, McLean, VA, Feb 2008 – Dec 2008

Role:
· Conducted JAD sessions for requirements gathering and analysis for data-intensive projects.
· Worked on different loan portfolios, i.e. Single Family and Multi Family.
· Designed and created logical and physical data models using ER/Studio and Erwin.
· Led efforts to standardize data and naming standards for data elements.
· Analyzed data from different sources (DB2, Oracle, SQL Server, and flat files) and defined the ETL process.
· Created data definitions, source-to-target mapping spreadsheets, and a metadata repository.
· Analyzed data for cleansing, standardizing, sourcing from the source database, and loading to the target for both the initial load and CDC data loads.
· Created scripts for physical database objects (tables, indexes, and views) from the model to the physical DB2 database, to be deployed by the production DBA.
· Worked with upstream and downstream users.
· Resolved production issues with the help of the DBA.

Skills: ER/Studio 7.6, Erwin 4.1.4, DB2 8, Oracle 10g, PL/SQL, TOAD, Rapid SQL, ClearCase, ClearQuest, SQL Server 2005

DATA
ANALYST (Contractor), Fannie Mae, Washington, DC, Sep 2007 – Dec 2007

· Conducted requirement gathering and analysis to check the feasibility of the data, and created physical and logical data models for walkthroughs.
· Created data workflows based on requirements to aid the development team's analysis, and conducted verification for data standardization.
· Gathered base load rules for existing tables and created technical mapping documents for the mart.
· Created a Requirements Traceability Matrix for requirements and testing coverage.
· Monitored and implemented new reference data (product types, transaction types, etc.) to be loaded into the data mart.
· Created base metadata definitions for common sourcing and location.
· Translated the conceptual data models to database objects in Oracle.
· Interacted with business units to identify updated and new mapping requirements for data profiles as the business requirements documents were updated.

Skills: MS Visio, MS Office Suite, SQL Server, Windows 2003 Server

DATA
ANALYST/MODELER (Contractor), Dun & Bradstreet (D&B), Morristown, NJ, Jun 2007 – Sep 2007

Role:
· Analyzed mappings in Informatica that load data, including facts and dimensions, from various sources into the data warehouse using different transformations.
· Interpreted the functional requirements and design documents and created source-to-target mappings by reading logic from Informatica mappings.
· Familiar with Informatica transformations such as Expression, Joiner, Mapplet, Workflow, Source Qualifier, Filter, Sorter, Router, Sequence Generator, Lookup, Aggregator, and Update Strategy.
· Used Informatica in an MS Windows / SQL Server environment.
· Attended requirements and design meetings with end users, business analysts, and developers.
· Reported all bugs to the development team and sought resolution before signing off on the release.
· Wrote SQL queries for back-end data validation.
· Created reports using SQL Server Reporting Services (SSRS), including matrix reports, sub-reports, and other complex reports.
· Designed and executed SQL queries for unit testing and report/data validation.

Skills: MS Visio, MS Office Suite, Oracle 10g, PL/SQL, XML, SQL Server 2000, T-SQL, SSIS, SSRS, TOAD, Erwin 4.1.4, Informatica PowerCenter 7, Visual SourceSafe, Windows 2003 Server, UNIX, VS 6.0

DATABASE
DEVELOPER/ANALYST (Contractor), Sprint-Nextel Corp, Reston, VA, Dec 2006 – Jun 2007

Role:
· Managed and analyzed requirements through regular meetings and sessions with business stakeholders.
· Documented requirements and processes and produced process designs.
· Acquired user requirements and developed processes, standards, and procedures for data gathering from the Ensemble system.
· Conducted JAD sessions with stakeholders to arrive at user requirements.
· Performed gap analysis to check the compatibility of the existing rebate system infrastructure with the new business requirements.
· Received the rebate data weekly through FTP (File Transfer Protocol).
· Designed a rebate-processing system to process customer rebates using shell scripts and Oracle packages.

Skills: Oracle 10g, PL/SQL, SQL*Plus 9.2, SQL Server 2000, Erwin 4.1, TOAD, XML, Windows 2003 Server, HP-UX, MS Excel, MS Word, MS Access, Business Objects 6

DEVELOPMENT OF JOB SEARCH WEBSITE – INDIA, Jul 1999 – Jan 2003

· Developed websites and internal-use web applications based on the Microsoft ASP platform.

Skills: Windows NT, IIS 4, ASP, VBScript, JavaScript, SQL Server 7.0, Visual Basic 6.0, MS Access.