From: route@monster.com
Sent: Monday, September 28, 2015 12:59 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03.
OBJECTIVE:
Developed parallel jobs using different processing stages such as Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply, and Filter.
EXPERIENCE:

11/2011 - Present
Electronic Data Systems Ltd
1617 Fannin St
- Generated DB scripts from the data modeling tool and created physical tables in the database.
- Worked on SCDs to populate Type I and Type II slowly changing dimension tables from several operational source files.
- Created routines (Before/After, Transform functions) used across the project.
- Experienced in PX file stages, including the Complex Flat File, Data Set, Lookup File, and Sequential File stages.
- Implemented Shared Containers for use across multiple jobs and Local Containers within a single job, as per requirements.
- Strong knowledge of and experience in mapping source to target data using IBM DataStage 8.x.
- Implemented multi-node declarations using configuration files (APT_CONFIG_FILE) for performance enhancement.
- Experienced in developing parallel jobs using various development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).
- Debugged, tested, and fixed the transformation logic applied in parallel jobs.
- Created UNIX shell scripts for database connectivity and for executing queries during parallel job execution.
- Used DataStage Director to schedule and run jobs, test and debug their components, and monitor performance statistics.
- Experienced in using SQL*Loader and the import utility in TOAD to populate tables in the data warehouse.
- Implemented pipeline and partitioning parallelism techniques and ensured load balancing of data.
- Deployed partitioning methods such as Hash by column, Round Robin, Entire, Modulus, and Range for bulk data loading and performance gains.
- Repartitioned job flows based on the best available DataStage PX resource consumption.
- Created Universes and reports in Business Objects Designer.
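The shell-script bullet above can be sketched as a minimal example of the fan-out/wait pattern used to run several extract queries in parallel. This is only an illustration: run_query is a placeholder standing in for a real sqlplus invocation, and the query names are hypothetical.

```shell
#!/bin/sh
# Sketch: launch several extract queries in parallel, then wait for all of them.
# run_query is a placeholder; the real script would invoke sqlplus here.
run_query() {
    echo "running $1"        # stand-in for: sqlplus -s "$DB_CONN" @"$1".sql
}

for q in cust_extract ord_extract item_extract; do
    run_query "$q" > "/tmp/$q.log" &   # one background process per query
done
wait                                    # block until every background query finishes
echo "all extracts complete"
```

Running the queries in background processes and joining on `wait` keeps total wall-clock time close to the slowest single query rather than the sum of all of them.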
7/2008 - 10/2011
AT&T Inc.
- Implemented logic to update the loaded, rejected, and unloaded records of the staging table using multiple instances of a single update job.
- Developed many complex job sequences utilizing User Variable, Terminator, Start Loop, End Loop, Notification, and Routine activities.
- Used UNIX scripts to pass parameters into job sequences, and developed batch run tables to update the load start and load end timestamps.
- Backed up DataStage projects into ClearCase and restored them as needed.
- Used IBM Tivoli Workload Scheduler to schedule jobs.
- Responsible for business analysis and requirements collection to develop ETL that extracts data from XML CLOBs using the XML Input stages introduced in version 8.5.
- Tuned Oracle SQL queries by creating indexes on key columns.
- Created PL/SQL stored procedures, packages, and triggers for the application, and tuned SQL queries and the database.
- Tuned sluggish ETL jobs by introducing manual partitioning, reducing stages, tuning source and target SQL queries, creating indexes on tables, etc.
- Worked extensively on table design and maintained the ETL functional mapping documents.
- Extensively used stages such as Oracle Connector (8.5), Oracle Enterprise (8.1), Transformer with loop function, Sparse and Range Lookups, and XML Input, in addition to commonly used stages such as Join, Merge, Filter, Aggregator, Funnel, Data Set, and Sequential File.
- Performed extensive research and implemented slowly changing dimension logic to suit the project's business rules, using initial and incremental loads.
- Derived logic for an initial reload of data into the staging table, covering the scenario of reloading data from before a past cutoff time.
- Derived and implemented a method to capture rejected records from all ETL jobs into a specially designed reject staging table.
- Derived and implemented a method to insert rejected records from previous loads along with new data, using stage variables in the Transformer stage.
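The parameter-passing bullet above can be sketched as a small wrapper script. The project and job names here are hypothetical, and the dsjob command line is echoed to a file rather than executed so the sketch stays self-contained; a real script would invoke dsjob directly.

```shell
#!/bin/sh
# Sketch: pass run parameters into a DataStage job sequence via the dsjob CLI.
# DW_PROJ and Seq_Load_Staging are made-up names for illustration.
PROJECT=DW_PROJ
JOB=Seq_Load_Staging
LOAD_START=$(date +%Y%m%d%H%M%S)   # load start timestamp for the batch run table

CMD="dsjob -run -param LoadStartTs=$LOAD_START -param RunMode=INCR $PROJECT $JOB"
echo "$CMD" > /tmp/dsjob_cmd.txt   # in production this would execute dsjob, not echo it
cat /tmp/dsjob_cmd.txt
```

Capturing the load start timestamp in the wrapper lets the same value be written to the batch run table and handed to the sequence, so both stay consistent for a given run.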
5/2005 - 6/2008
Infosys Technologies Limited
Bangalore
- Automated processes using UNIX shell scripts.
- Since processing of huge files from Omniture was involved, performance tuning of the mappings, sessions, and workflows played a major role in bringing down run times.
- Coordinated with Omniture to resolve source-file issues.
- Wrote test cases for integration testing.
- Performed data profiling of the sources using the Talend open-source tool before beginning development, which helped identify the data patterns.
- Analyzed data issues, proposed possible solutions, and selected the best one after consultation, based on ROI.
- Worked on the revenue data mart, which involved many mappings with complicated data calculations where the totals had to be absolutely precise.
- Focused on reusability and simplicity during development to get the best performance.
- Used advanced mapping techniques to make the most of the Informatica tool.
- Worked directly with users on requirements as well as any data issues.
- Documented all processes for the benefit of new recruits as well as existing team members.
- Provided production support and resolved critical issues.
- Worked on deployment documentation and coordinated with all teams involved for smooth deployments.
- Tested Business Objects reports as part of the overall test plan.
- Worked with heterogeneous sources and targets, and with third-party vendors.
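The data-profiling bullet above can be illustrated with a minimal pattern-profiling sketch: mask digits to 9 and letters to A, then count the distinct patterns, which is the kind of summary a profiling tool reports. The file name and column layout here are made up for the example.

```shell
#!/bin/sh
# Sketch: lightweight pattern profiling of one column in a pipe-delimited extract.
# Create a tiny sample file (hypothetical layout: id|code|date).
cat > /tmp/profile_sample.dat <<'EOF'
1001|AB123|2015-01-02
1002|CD456|2015-01-03
1003|9X777|2015-01-04
EOF

# Mask column 2 (digits -> 9, letters -> A) and count each distinct pattern.
awk -F'|' '{
    v = $2
    gsub(/[0-9]/, "9", v)     # every digit becomes 9
    gsub(/[A-Za-z]/, "A", v)  # every letter becomes A
    pat[v]++
} END {
    for (p in pat) print p, pat[p]
}' /tmp/profile_sample.dat | sort > /tmp/pattern_counts.txt

cat /tmp/pattern_counts.txt
```

On the sample above this reports two patterns, AA999 (twice) and 9A999 (once), flagging the row whose code starts with a digit as a candidate data-quality issue.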
EDUCATION:

3/2002 - 4/2005
Delhi University
LANGUAGES:

Languages    Proficiency Level
English      Fluent