From: route@monster.com
Sent: Monday, September 28, 2015 1:02 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03
OBJECTIVE:
10 years of IT experience specializing in the analysis, design, and development of ETL processes across different phases of the Data Warehousing life cycle.
EXPERIENCE:
6/2012 - Present
General Motors Ltd
1301 Orleans St
- Design and develop data integration processes using Informatica PowerCenter to load application data into Media Asset Management (MAM) reporting databases.
- Assist in the architecture, implementation, and deployment of an enterprise operational reporting data warehouse.
- Design and develop data structures for mining and information retrieval applications that support small-scale information needs as well as enterprise-wide strategic initiatives.
- Work with subject matter experts to derive data requirements and design appropriate data models.
- Design and develop ETL processes for cleansing and loading data into the operational warehouse, as well as batch and real-time solutions to support application and integration initiatives.
- Perform data profiling tasks including column analysis, structural analysis, simple rule analysis, and complex rule analysis.
- Successfully implemented a large-scale Salesforce integration project to create dashboards for revenue data in Salesforce.
- Assist with resolving data anomalies and data quality issues.
- Perform performance tuning based on Informatica Velocity best practices.
- Assist with writing and maintaining project documentation, including but not limited to formal requirements, design documents, and test plans.
- Interact with UNIX/Windows system administration and DBA groups to resolve issues.
- Support development initiatives by working with application design and development teams to reach optimal data design, quality, and integration patterns.
- Support development of ad-hoc and canned reports for business initiatives and objectives.
- Took responsibility for individual projects and ensured timely completion without data quality issues.
- As part of the data services team, worked on varied projects simultaneously.
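The profiling tasks listed above (column analysis, simple rule analysis) are normally run inside a profiling tool, but the idea can be sketched in a few lines of plain Python. This is a minimal illustration, not the actual process used on the project; the sample rows and column names are hypothetical.

```python
# Minimal column-analysis sketch: null counts, distinct counts, and a
# single-column rule check, analogous to what a profiling tool reports.
# The sample rows and column names below are hypothetical.

def profile_column(rows, column):
    """Return basic profiling stats for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

def simple_rule(rows, column, predicate):
    """Count rows whose value violates a single-column rule."""
    return sum(1 for r in rows if not predicate(r.get(column)))

rows = [
    {"account_id": "A1", "revenue": "100.0"},
    {"account_id": "A2", "revenue": ""},
    {"account_id": "A1", "revenue": "-5.0"},
]

print(profile_column(rows, "revenue"))   # {'count': 3, 'nulls': 1, 'distinct': 2}
print(simple_rule(rows, "revenue",
                  lambda v: bool(v) and float(v) >= 0))  # 2 (empty and negative)
```

Running the profile before development surfaces nulls and out-of-range values early, which is the point of the simple-rule step.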
4/2010 - 5/2012
United Automobile Insurance Group, Inc.
- Participated in design reviews, code reviews, and brainstorming sessions for various projects and tasks.
- Coordinated with offshore and remote teams to ensure smooth project execution and to resolve data issues.
- Gave presentations and documented various Informatica processes and best practices for enterprise-level reference.
- Worked on R&D projects and implemented them successfully as part of the Data Services Center of Excellence (COE).
- Employer: Confidential, Atlanta GA
3/2009 - 8/2011
HCL Technologies Ltd
Mumbai
- Analyzed, designed, developed, implemented, and maintained parallel jobs using IBM InfoSphere DataStage.
- Involved in design of dimensional data models: star schema and snowflake schema.
- Generated DB scripts from the data modeling tool and created physical tables in the database.
- Worked on SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
- Created routines (Before/After, Transform functions) used across the project.
- Experienced in PX file stages, including the Complex Flat File stage, Data Set stage, Lookup File stage, and Sequential File stage.
- Implemented Shared Containers for multiple jobs and Local Containers within a job, as per requirements.
- Adept knowledge and experience in mapping source to target data using IBM DataStage 8.x.
- Implemented multi-node declaration using configuration files (APT_CONFIG_FILE) for performance enhancement.
- Experienced in developing parallel jobs using various development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).
- Debugged, tested, and fixed the transformation logic applied in parallel jobs.
- Involved in creating UNIX shell scripts for database connectivity and executing queries during parallel job execution.
- Used the DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.
- Experienced in using SQL*Loader and the import utility in TOAD to populate tables in the data warehouse.
- Successfully implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
- Deployed different partitioning methods such as Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and performance gains.
- Repartitioned job flows based on the best available DataStage PX resource consumption.
- Created universes and reports in Business Objects Designer.
- Created, implemented, modified, and maintained simple to complex business reports using the Business Objects reporting module.
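The Type II slowly-changing-dimension loading mentioned above is normally built with DataStage stages, but the core rule — close out the current dimension row and insert a new versioned row when a tracked attribute changes — can be sketched in plain Python. This is an illustrative sketch only; the column names (cust_id, city, eff_date, end_date, current) are hypothetical.

```python
from datetime import date

# Minimal SCD Type II sketch: when a tracked attribute changes, the
# current dimension row is end-dated and a new current row is inserted.
# Column names (cust_id, city, eff_date, end_date, current) are hypothetical.

def apply_scd2(dim_rows, incoming, today):
    """Apply one incoming source record to a Type II dimension table."""
    for row in dim_rows:
        if row["cust_id"] == incoming["cust_id"] and row["current"]:
            if row["city"] == incoming["city"]:
                return dim_rows              # no change: nothing to do
            row["current"] = False           # close out the old version
            row["end_date"] = today
            break
    dim_rows.append({                        # insert the new current version
        "cust_id": incoming["cust_id"],
        "city": incoming["city"],
        "eff_date": today,
        "end_date": None,
        "current": True,
    })
    return dim_rows

dim = []
apply_scd2(dim, {"cust_id": 1, "city": "Mumbai"}, date(2010, 1, 1))
apply_scd2(dim, {"cust_id": 1, "city": "Pune"}, date(2011, 6, 1))
# dim now holds two rows: the closed-out Mumbai version and the current Pune one.
```

A Type I dimension would simply overwrite the city in place; Type II preserves the full history, which is why the effective/end dates and current flag are needed.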
3/2009 - 8/2011
Delphi Automotive Systems, LLC
- Involved in the design and development of ETL processes to load data for analytical purposes.
- Responsible for building new data marts (SEO data mart) as well as enhancing current ones with new features.
- Automated processes using UNIX shell scripts.
- Because processing of huge files from Omniture was involved, performance tuning of mappings, sessions, and workflows played a huge role in bringing down running times.
- Coordinated with Omniture to resolve source-file-related issues.
- Wrote test cases for integration testing.
- Performed data profiling of the sources using the Talend open-source tool before beginning development work, which helped identify data patterns.
- Analyzed data-related issues, proposed possible solutions, and picked the best one after consultations based on ROI.
- Worked on the Revenue data mart, which involved many mappings with complicated data calculations where the totals have to be absolutely precise.
- During development, focused on reusability and simplicity to get the best performance.
- Used advanced mapping techniques to make the most of the Informatica tool.
- Worked directly with users on requirements as well as any data-related issues.
- Documented all processes for ease of understanding by new recruits as well as existing staff.
- Provided production support and resolved critical issues.
EDUCATION:
8/2003 - 4/2005
University of Mumbai
LANGUAGES:
English - Fluent