From: route@monster.com
Sent: Monday, September 28, 2015 1:02 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to
you at the request of Monster User xapeix03
OBJECTIVE:
Experience in Interactive Media Content, Revenue, Media Asset Management, CRM, and SEO domains.
EXPERIENCE:

6/2012 - Present
Johnson Controls
2040 W Wisconsin Ave
- Developed ETL jobs and ELT processes using DataStage 8.0.
- Developed SQL scripts and UNIX shell scripts per the functional and technical specs provided by the client.
- Worked on the migration of the reporting database from Netezza to Teradata.
- Designed and modified DataStage jobs and Netezza scripts for the migration to Teradata.
- Worked extensively with Teradata SQL Assistant and BTEQ; designed complex Teradata SQL scripts called from DataStage job sequences and UNIX shell scripts.
- Designed complex DataStage jobs and sequences and tuned them for better performance.
- Prepared ETL documentation for the developed processes.
- Extracted data from different source systems (PeopleSoft, Informix, Teradata DB, SQL Server DB) and moved it into the Netezza warehouse and mart.
- Developed sequence jobs to call SQL scripts for the ELT process in the Netezza target database.
- Developed UNIX shell scripts to automate the data load processes to the target.
- Handled production support tickets and scheduled Control-M jobs.
- Scheduled jobs at the times specified by the business users, monitored them, and fixed any job failures.
- Coordinated the offshore development and testing teams.
- Prepared integration and UAT test cases and participated in UAT testing.
- Tuned many SQL scripts and other ETL processes used in this project.
- Conducted knowledge-sharing sessions for end users, the business, and offshore team members.
- Streamlined the development guidelines for using new DataStage features.
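The BTEQ-plus-shell automation described above can be sketched as a small script. This is illustrative only: the logon file, table names, and paths are hypothetical, and a DataStage sequence would typically invoke such a script through an Execute Command activity and branch on its exit status.

```shell
#!/bin/sh
# Hypothetical sketch of a shell-driven Teradata BTEQ load.
# All paths, table names, and the logon file are illustrative.
LOGDIR=${LOGDIR:-/tmp}
RUN_DATE=$(date +%Y%m%d)
BTEQ_SCRIPT="$LOGDIR/load_sales_$RUN_DATE.bteq"

# Generate the BTEQ script; credentials would normally live in a
# protected logon file referenced by .RUN FILE, never inline.
cat > "$BTEQ_SCRIPT" <<'EOF'
.RUN FILE = /secure/td_logon.txt;
INSERT INTO mart.sales_fact
SELECT * FROM stage.sales_stg;
.IF ERRORCODE <> 0 THEN .QUIT 8;
.LOGOFF;
.QUIT 0;
EOF

# Run bteq only if the client is installed; the calling job sequence
# would check the exit status and raise a notification on failure.
if command -v bteq >/dev/null 2>&1; then
  bteq < "$BTEQ_SCRIPT" > "$LOGDIR/load_sales_$RUN_DATE.log" 2>&1
  rc=$?
else
  echo "bteq not found; generated script at $BTEQ_SCRIPT"
  rc=0
fi
```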
4/2010 - 5/2012
The Northwestern Mutual Life Insurance Company
- Generated DB scripts from the data modeling tool and created physical tables in the database.
- Worked with SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
- Created routines (Before/After, Transform functions) used across the project.
- Experienced in PX file stages, including the Complex Flat File, DataSet, Lookup File, and Sequential File stages.
- Implemented shared containers across multiple jobs and local containers within the same job, as per requirements.
- Adept knowledge and experience in mapping source to target data using IBM DataStage 8.x.
- Implemented multi-node declarations using configuration files (APT_Config_file) for performance enhancement.
- Developed parallel jobs using various development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).
- Debugged, tested, and fixed the transformation logic applied in the parallel jobs.
- Created UNIX shell scripts for database connectivity and for executing queries during parallel job execution.
- Used DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.
- Used SQL*Loader and the import utility in TOAD to populate tables in the data warehouse.
- Implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
- Deployed different partitioning methods (Hash by column, Round Robin, Entire, Modulus, and Range) for bulk data loading and performance.
- Repartitioned job flows based on the best available DataStage PX resource consumption.
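A multi-node configuration file of the kind referenced above (APT_Config_file) typically takes the following shape; this is a minimal two-node sketch in which the hostname and disk paths are illustrative, not taken from the resume:

```
{
  node "node1"
  {
    fastname "etlhost1"
    pools ""
    resource disk "/ds/data/node1" {pools ""}
    resource scratchdisk "/ds/scratch/node1" {pools ""}
  }
  node "node2"
  {
    fastname "etlhost1"
    pools ""
    resource disk "/ds/data/node2" {pools ""}
    resource scratchdisk "/ds/scratch/node2" {pools ""}
  }
}
```

Pointing the APT_CONFIG_FILE environment variable at a file like this lets the same parallel job run across more nodes without redesigning the job itself.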
11/2008 - 3/2010
Manpower
- Performed extensive research and implemented slowly changing dimension logic to suit the project business rules, using initial and incremental loads.
- Derived logic for an initial reload of data into the staging table for a scenario requiring data to be reloaded from before a past cutoff time.
- Derived and implemented a method to capture rejected records from all ETL jobs into a specially designed reject staging table.
- Derived and implemented a method to insert rejected records from previous loads along with new data, using stage variables in the Transformer stage.
- Designed a Batch_run_history table and implemented a method to define incremental load start and end times for each table, so that ETL can be rolled back and rerun from the previous load start time if a job fails.
- Implemented logic to update the loaded, rejected, and unloaded records of the staging table using multiple instances of a single update job.
- Developed many complex job sequences utilizing the User Variables, Terminator, Start Loop, End Loop, Notification, and Routine activities.
- Used UNIX scripts to pass parameters into job sequences, and developed batch run tables to record load start and end timestamps.
- Backed up the DataStage project into ClearCase and restored it as and when required.
- Used IBM Tivoli Workload Scheduler for scheduling the jobs.
- Responsible for business analysis and requirements collection to develop ETL extracting data from XML CLOBs using the XML Input stage of version 8.5.
- Tuned Oracle SQL queries by creating indexes on the key columns.
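Passing load-window parameters from UNIX into a job sequence, as described above, might be wrapped like this. This is a sketch assuming the standard dsjob command-line client; the project, sequence, and parameter names are hypothetical.

```shell
#!/bin/sh
# Hypothetical wrapper passing load-window parameters to a DataStage
# job sequence via dsjob. Project/job/parameter names are illustrative.
PROJECT=DW_PROJ
SEQUENCE=Seq_Incremental_Load
LOAD_START=${1:-"2010-01-01 00:00:00"}
LOAD_END=${2:-$(date '+%Y-%m-%d %H:%M:%S')}

CMD="dsjob -run -jobstatus \
  -param pLoadStart=\"$LOAD_START\" \
  -param pLoadEnd=\"$LOAD_END\" \
  $PROJECT $SEQUENCE"

# Execute only where the DataStage client exists; otherwise print the
# command that a scheduler (e.g. Tivoli Workload Scheduler) would run.
if command -v dsjob >/dev/null 2>&1; then
  eval "$CMD"
  rc=$?
else
  echo "$CMD"
  rc=0
fi
```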
9/2005 - 10/2008
Tata Consultancy Services Limited
- Developed sequence jobs to call SQL scripts for the ELT process in the Netezza target database.
- Developed UNIX shell scripts to automate the data load processes to the target.
- Handled production support tickets and scheduled Control-M jobs.
- Scheduled jobs at the times specified by the business users, monitored them, and fixed any job failures.
- Coordinated the offshore development and testing teams.
- Prepared integration and UAT test cases and participated in UAT testing.
- Tuned many SQL scripts and other ETL processes used in this project.
- Conducted knowledge-sharing sessions for end users, the business, and offshore team members.
- Streamlined the development guidelines for using new DataStage features.
- Conducted peer reviews, planned and estimated project requirements, and reported status to business managers.

Client: Confidential, Columbus, OH.
LANGUAGES:
Languages    Proficiency Level
English      Fluent