From: route@monster.com
Sent: Monday, September 28, 2015 1:02 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03.
OBJECTIVE:
Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
EXPERIENCE:

5/2012 - Present
Scottsdale Insurance Company
- Experienced in scheduling Sequence and parallel jobs using DataStage Director, UNIX scripts and scheduling tools.
- Developed DataStage parallel jobs using processing stages such as Transformer, Aggregator, Lookup, Join, Sort, Copy, Merge, Funnel, CDC, Change Apply and Filter.
- Worked extensively with dimensional modeling, data migration, data cleansing and ETL processes for data warehouses.
- Experienced in the development and enhancement of mainframe applications.
- Experienced in mergers and acquisitions projects.
- Experienced with defect management tools such as HP Quality Center and Bugzilla.
- Worked extensively with SQL Server 2005/2000 and Oracle 9i/10g/11g databases; experienced in writing complex queries and using Oracle analytical functions.
- Used Talend for data profiling and MS Excel pivot tables to summarize and analyze large amounts of data using statistical, logical and database functions.
- Assisted in modeling various data marts and operational data stores using dimensional modeling and entity-relationship modeling.
- Demonstrated self-reliance in researching technical solutions and product functionality; worked on critical projects as a lead developer.
- Served as a liaison between the business and the development team.
- Continuously improved existing approaches by seeking opportunities to creatively transform current business practices into fresh alternative solutions.
- Worked with large datasets, including online content and traffic statistics such as unique visitors, page views and average time on site, for the Food Network and HGTV websites.
- Experienced in using Business Objects and testing reports.
- Analytical and problem-solving skills; capable of managing multiple projects.
- Experienced in developing test cases and test scripts and analyzing bugs.
9/2010 - 5/2012
Phoenix Utilities Limited
- Involved in analysis, coding and unit testing.
- Prepared procs and job JCLs for multiple projects.
- Prepared technical documentation of software requirement specifications.
- Prepared unit test plans and test cases; captured unit test results and entered them into Test Director for the programs coded.
- Involved in building a data warehouse model per the specific requirements of the business: handling large-volume data loads, automatic error handling and auditing, performance tuning for bulk data loads, and aggregation in optimized time using ETL tools.
- Involved in the analysis of the physical data model for ETL and the process flow diagrams.
- Developed SQL scripts and SQL*Loader control files for creating a development database.
- Created data governance strategies that reduced data redundancy wherever possible.
- Experienced with enterprise scheduling tools such as UC4 and Control-M, and scheduled SSIS packages using SQL Server Agent.
7/2008 - 9/2010
City of Tempe
- ETL development using SQL Server Integration Services.
- Handled large table loads using SSIS and ad hoc SQL queries.
- Created SSIS packages for data movement and database maintenance.
- Assisted the data architect in implementing physical databases from the logical design.
- Wrote complex queries and stored procedures per business needs.
- Designed, constructed and implemented physical databases.
- Installed, maintained and administered SQL Server 2000/2005 instances and databases.
- Scheduled and monitored all SQL Server 2000/2005 maintenance activities, including database consistency checks and index defragmentation.
- Conducted and participated in model reviews, ensuring that models accurately reflect the business rules and policies that satisfy the business requirements.
- Involved in performance tuning, deadlock resolution, and normalization and de-normalization processes.
4/2004 - 7/2008
Polaris
- Involved in the design and development of ETL processes to load data for analytical purposes.
- Responsible for building new data marts (SEO data mart) as well as enhancing current ones with new features.
- Automated processes using UNIX shell scripts.
- Because processing of huge files from Omniture was involved, performance tuning of the mappings, sessions and workflows played a major role in reducing run times.
- Coordinated with Omniture to resolve source-file-related issues.
- Wrote test cases for integration testing.
- Performed data profiling of the sources using the Talend open source tool before beginning development work, which helped identify data patterns.
- Analyzed data-related issues, proposed possible solutions and, after consultations, picked the best one based on ROI.
- Worked on the revenue data mart, which involved many mappings with complicated data calculations where the totals had to be absolutely precise.
- During development, focused on reusability and simplicity to get the best performance.
- Used advanced mapping techniques to make the most of the Informatica tool.
- Worked directly with users on requirements as well as any data-related issues.
- Documented all processes for ease of understanding for new recruits as well as existing staff.
- Provided production support and resolved critical issues.
EDUCATION:
University of Phoenix
LANGUAGES:
English - Fluent