From: route@monster.com
Sent: Monday, September 28, 2015 1:00 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03
OBJECTIVE:
Informatica-certified developer with expertise in the design and development of ETL methodology, supporting data transformations and data processing in a corporate-wide ETL solution.
EXPERIENCE:
6/2012 - Present
St. Vincent Indianapolis Hospital
- Analyzed, designed, developed, implemented, and maintained parallel jobs using IBM InfoSphere DataStage.
- Involved in the design of the dimensional data model (Star schema and Snowflake schema); generated DB scripts from the data-modeling tool and created physical tables in the database.
- Worked on SCDs, populating Type I and Type II slowly changing dimension tables from several operational source files.
- Created routines (Before/After, Transform functions) used across the project.
- Experienced in PX file stages, including the Complex Flat File, Data Set, Lookup File, and Sequential File stages.
- Implemented Shared Containers for use across multiple jobs and Local Containers within a single job, as required.
- Adept at mapping source to target data using IBM DataStage 8.x.
- Implemented multi-node declaration using configuration files (APT_CONFIG_FILE) for performance enhancement.
- Experienced in developing parallel jobs using development/debug stages (Peek, Head & Tail, Row Generator, Column Generator, Sample) and processing stages (Aggregator, Change Capture, Change Apply, Filter, Sort & Merge, Funnel, Remove Duplicates).
- Debugged, tested, and fixed the transformation logic applied in the parallel jobs.
- Involved in creating UNIX shell scripts for database connectivity and executing queries during parallel job execution.
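The Type II slowly-changing-dimension handling described above can be sketched as plain SQL logic outside DataStage. This is an illustrative sketch only (the customer_dim table, its columns, and the sample values are assumptions, not the actual job design): on a tracked-attribute change, the current dimension row is expired and a new versioned row is inserted, preserving history.

```python
import sqlite3

# Hypothetical Type II SCD sketch (assumed table/columns, not the real DataStage job).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.execute("""
    CREATE TABLE customer_dim (
        sk INTEGER PRIMARY KEY AUTOINCREMENT,  -- surrogate key
        customer_id INTEGER,                   -- natural key from the source system
        city TEXT,                             -- tracked attribute
        eff_date TEXT,                         -- date this version became valid
        end_date TEXT,                         -- '9999-12-31' marks the current version
        is_current INTEGER
    )
""")
cur.execute("INSERT INTO customer_dim (customer_id, city, eff_date, end_date, is_current) "
            "VALUES (101, 'Carmel', '2012-01-01', '9999-12-31', 1)")

def apply_scd2(customer_id, new_city, change_date):
    """Expire the current row, then insert the new version (Type II)."""
    cur.execute("UPDATE customer_dim SET end_date = ?, is_current = 0 "
                "WHERE customer_id = ? AND is_current = 1", (change_date, customer_id))
    cur.execute("INSERT INTO customer_dim (customer_id, city, eff_date, end_date, is_current) "
                "VALUES (?, ?, ?, '9999-12-31', 1)", (customer_id, new_city, change_date))

apply_scd2(101, 'Fishers', '2013-06-15')
rows = cur.execute("SELECT city, is_current FROM customer_dim "
                   "WHERE customer_id = 101 ORDER BY sk").fetchall()
print(rows)  # old version expired, new version flagged current
```

The same expire-then-insert pattern is what a DataStage Change Capture / Change Apply pair implements at stage level.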
4/2010 - 3/2012
Indianapolis Public Schools
- Involved in the design and development of ETL processes to load data for analytical purposes.
- Responsible for building new data marts (SEO data mart) and enhancing existing ones with new features.
- Automated processes using UNIX shell scripts.
- Because processing of very large files from Omniture was involved, performance tuning of the mappings, sessions, and workflows played a major role in bringing down the running time.
- Coordinated with Omniture to resolve source-file issues.
- Wrote test cases for integration testing.
- Performed data profiling of the sources using the Talend open-source tool before beginning development work, which helped identify data patterns.
- Analyzed data-related issues, proposed possible solutions, and selected the best one after consultation, based on ROI.
- Worked on the Revenue data mart, which involved many mappings with complicated data calculations where the totals had to be absolutely precise.
- During development, focused on reusability and simplicity to get the best performance.
- Used advanced mapping techniques to make the most of the Informatica tool.
- Worked directly with users on requirements as well as any data-related issues.
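The data-profiling step above (identifying data patterns in source columns before development) can be approximated with a short script. This is a minimal sketch, not Talend's actual profiler; the sample column values are hypothetical.

```python
from collections import Counter

def pattern(value: str) -> str:
    """Map each character to a class: 9 = digit, A = letter, others kept as-is."""
    return "".join("9" if c.isdigit() else "A" if c.isalpha() else c for c in value)

# Hypothetical sample of one source column, e.g. phone numbers from a flat file.
values = ["317-555-0101", "317-555-0199", "3175550123", "N/A"]
profile = Counter(pattern(v) for v in values)
print(profile.most_common())  # pattern frequencies reveal format outliers
```

Surfacing the distinct patterns and their frequencies before mapping work begins is what lets format outliers (here, the unformatted number and the "N/A" placeholder) be handled explicitly in the ETL logic.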
7/2008 - 4/2010
Emmis Communications
- ETL development using SQL Server Integration Services (SSIS).
- Handled large table loads using SSIS and ad hoc SQL queries.
- Created SSIS packages for data movement and database maintenance.
- Assisted the data architect in implementing physical databases from the logical design.
- Wrote complex queries and stored procedures per business needs.
- Designed, constructed, and implemented physical databases.
- Installed, maintained, and administered SQL Server 2000/2005 instances and databases.
- Scheduled and monitored all maintenance activities for SQL Server 2000/2005, including database consistency checks and index defragmentation.
- Conducted and participated in model reviews, ensuring that models accurately reflect the business rules and policies needed to satisfy the business requirements.
- Involved in performance tuning, resolving deadlock issues, and normalization and de-normalization processes.
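The large-table-load work above can be illustrated with the standard batching pattern: insert in fixed-size batches and commit per batch so no single transaction grows unbounded. This is a generic sketch using sqlite3 as a stand-in for SQL Server (the staging table and row counts are assumptions, not the actual SSIS packages).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE staging (id INTEGER, amount REAL)")

rows = [(i, i * 1.5) for i in range(10_000)]  # stand-in for a large source extract

BATCH = 1000
for start in range(0, len(rows), BATCH):
    # executemany sends one batch per round trip; committing per batch
    # bounds transaction-log growth on a real server.
    cur.executemany("INSERT INTO staging VALUES (?, ?)", rows[start:start + BATCH])
    conn.commit()

count = cur.execute("SELECT COUNT(*) FROM staging").fetchone()[0]
print(count)  # 10000
```

SSIS achieves the same effect declaratively via the data-flow task's batch/commit size settings rather than an explicit loop.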
6/2005 - 7/2008
HCL Technologies Ltd
Noida
- Maintained the inventory of all the AV equipment used in the university.
- Created new database users as required and granted them appropriate rights and privileges.
- Performed capacity planning to create and maintain databases.
- Installed new versions of the Oracle RDBMS and its tools.
- Planned and implemented backup and recovery of the Oracle database.
- Managed the sharing of resources among applications.
- Troubleshot problems related to databases and applications.
- Implemented and enforced security for the databases.
- Maintained database constraints to ensure referential integrity.
- Wrote queries to retrieve information as needed by stakeholders.
EDUCATION:
University of Indianapolis
LANGUAGES:
English: Fluent