From: route@monster.com
Sent: Monday, September 28, 2015 1:01 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03
Srikant Panda | E: srikant.panda57@gmail.com | P: 972 342 7076
Current Location: Dallas, TX | Visa Status: H1B
_________________________________________________________________________________
PROFESSIONAL SUMMARY
· 7+ years of extensive experience using ETL methodologies to support data extraction, data migration, data transformation, and loading, using Informatica PowerCenter 9.x/8.6.1/7.x.
· Implemented the full lifecycle of data warehouses, Operational Data Stores (ODS), and business data marts, with dimensional modeling techniques (Star and Snowflake schemas) using the Kimball and Inmon methodologies.
· Strong experience in ETL, processing data from various sources into data warehouses and data marts using Informatica PowerCenter (Designer, Workflow Manager, Workflow Monitor, Metadata Manager).
· Worked on exception-handling mappings for data quality and data validation.
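The exception-handling pattern behind such mappings can be sketched in a few lines of Python: valid rows pass through to the target, while rows failing data-quality checks are routed to a reject set with a reason code, mirroring a Router-based error-handling mapping. The field names here are invented for illustration, not taken from any project above.

```python
# Route valid rows to the target set and failing rows to a reject set,
# tagging each reject with a reason code (as a Router transformation would).
# The customer_id/amount fields are hypothetical example columns.
def route_rows(rows):
    valid, rejects = [], []
    for row in rows:
        if not row.get("customer_id"):
            rejects.append({**row, "reject_reason": "missing customer_id"})
        elif row.get("amount", 0) < 0:
            rejects.append({**row, "reject_reason": "negative amount"})
        else:
            valid.append(row)
    return valid, rejects

valid, rejects = route_rows([
    {"customer_id": 7, "amount": 120.0},
    {"customer_id": None, "amount": 10.0},
    {"customer_id": 8, "amount": -5.0},
])
print(len(valid), len(rejects))  # 1 2
```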
· Experienced in designing ETL procedures and strategies to extract data from heterogeneous source systems such as Oracle 11g/10g, SQL Server 2012/2008R2/2005, DB2 10, flat files, XML, SAP R/3, and Cassandra.
· Experience with MapReduce coding, including Java, Python, Pig programming, Hadoop Streaming, and HiveQL.
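A minimal Hadoop Streaming sketch in Python, the kind of mapper/reducer pair this bullet refers to: Streaming jobs read lines from stdin and write tab-separated key/value pairs to stdout, with Hadoop sorting mapper output by key before the reduce. The word-count logic below simulates that shuffle in-process so the example is self-contained.

```python
from itertools import groupby

def mapper(lines):
    # Emit one "word<TAB>1" record per token, as a streaming mapper would.
    for line in lines:
        for word in line.strip().split():
            yield f"{word}\t1"

def reducer(records):
    # Hadoop delivers mapper output sorted by key; sum counts per word.
    keyed = (r.split("\t") for r in records)
    for word, group in groupby(keyed, key=lambda kv: kv[0]):
        yield f"{word}\t{sum(int(c) for _, c in group)}"

if __name__ == "__main__":
    # In a real job these would read sys.stdin; here we simulate the shuffle.
    mapped = sorted(mapper(["etl loads data", "etl moves data"]))
    for out in reducer(mapped):
        print(out)
```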
· Experience in SQL, PL/SQL, and UNIX shell scripting.
· Expert in using Informatica Metadata Manager, IDE, IDQ, and Metadata Exchange exhaustively to maintain and document metadata.
· Expert in ER data modeling tools such as Erwin, ER/Studio, and Visio for developing fact and dimension tables and logical and physical models.
· Evaluate and recommend new database software, utilities, and tools, and set the strategy for implementing new technologies.
· Strong understanding of Business Intelligence and Data Warehousing concepts, with emphasis on ETL, the SDLC (Software Development Lifecycle), and quality analysis.
· Extensively involved in creating Oracle PL/SQL stored procedures, functions, packages, triggers, cursors, and indexes, with query optimization as part of the ETL development process.
· Expert in documentation and Quality Assurance, manual and automated testing procedures, with active involvement in database-, session-, and mapping-level performance tuning and debugging.
· Worked with business managers, analysts, developers, and end users to correlate business logic and specifications for ETL development in an Agile development methodology.
· Expert in planning, directing, and coordinating the activities of IT teams for software development and data conversion.

TECHNICAL SKILLS:
ETL Tools: Informatica PowerCenter 9.x/8.x/7.x, DVO, IDE, IDQ, MM, MDM, DAC, Talend
Reporting Tools: Informatica Data Analyzer, OBIEE, Cognos
Operating Systems: UNIX, Windows 98/NT/2000/XP, Solaris
Databases: Oracle 11g/10g, SQL Server, DB2, MongoDB, Greenplum, Cassandra, Teradata
Database GUI Tools: TOAD
Version Control/Project Tools: SVN, Agile, Jira
Web Development: HTML, XML
Programming Languages: Oracle SQL, PL/SQL, UNIX Shell Script, Perl, C#
Job Scheduling Tools: Autosys, Control-M, Tidal
Education: Bachelor of Engineering, Computer Science

PROFESSIONAL EXPERIENCE

GameStop, Grapevine, TX
Dec 2013 – Present
ETL Developer
Description: Design/development, migration, and system automation of the GameStop data warehouse, built on Informatica and Oracle Database on Linux, using open-source tools and scripting.
Responsibilities:
· Worked on PowerCenter tools including Designer/Repository Manager and Workflow Manager/Monitor.
· Involved in gathering business requirements, analysis, development, testing, and production support of the client application in an Agile methodology environment.
· Developed ETL templates, standards, processes, and procedures for loading information into data warehousing systems and for ensuring the reliability of the loaded data.
· Extensively used Informatica transformations such as Source Qualifier, Rank, SQL, Router, Filter, Lookup, Joiner, Aggregator, Normalizer, and Sorter, along with their transformation properties.
· Worked on extracting data from various sources such as Oracle, PostgreSQL, and MongoDB.
· Designed Greenplum database tables according to Informatica requirements.
· Fetched data from Apache Cassandra using DSE (DataStax Enterprise).
· Built and coded applications using Hadoop components: HDFS, HBase, Hive, Sqoop, Pig, Flume, and YARN.
· Installed and configured Talend ETL in single- and multi-server environments.
· Experience in DW development and testing using Informatica.
· Automated the deletion of relational connections in PowerCenter repositories using Python scripts.
· Validated mappings, mapplets, sessions, workflows, and worklet objects using Python scripting.
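A sketch of the kind of validation script described in the bullet above: scan a PowerCenter repository XML export and report mappings whose validity flag is not set. The element and attribute names here (MAPPING, NAME, ISVALID) are assumptions modeled on typical powrmart-style exports, not details taken from this resume.

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for a real repository export; the folder and mapping
# names are invented for this example.
SAMPLE_EXPORT = """
<POWERMART>
  <REPOSITORY NAME="DEV_REPO">
    <FOLDER NAME="SALES">
      <MAPPING NAME="m_load_orders" ISVALID="YES"/>
      <MAPPING NAME="m_load_returns" ISVALID="NO"/>
    </FOLDER>
  </REPOSITORY>
</POWERMART>
"""

def invalid_mappings(export_xml):
    root = ET.fromstring(export_xml)
    # .iter() walks the whole tree, so folder nesting depth does not matter.
    return [m.get("NAME") for m in root.iter("MAPPING")
            if m.get("ISVALID") != "YES"]

if __name__ == "__main__":
    print(invalid_mappings(SAMPLE_EXPORT))  # ['m_load_returns']
```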
· Worked on the Talend RTX ETL tool; developed and scheduled jobs in Talend Integration Suite. Extensively used ETL concepts to load data from AS/400 and flat files to Salesforce.
· Responsible for performance tuning in Informatica PowerCenter at the target, source, mapping, session, and system levels.
· Used the Data Warehouse Administration Console (DAC 10.1.3.4) for scheduling and executing incremental ETL loads.
· Extensively worked on Informatica Big Data Edition using Informatica PowerExchange.
· Developed ETL processing from source systems to MongoDB and PostgreSQL.
· Designed and developed MapReduce/YARN jobs on a Hadoop platform.
· Supported technical design, development, and unit testing of cloud ETL using Informatica Cloud.
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, DVO, IDQ, Workflow Monitor), SQL Server, Oracle 10g, Toad, UNIX Shell Scripting, Agile, Oracle SQL Developer, Tidal, Informatica MDM, DAC, Talend, DB2, Python, Greenplum, Cassandra, MongoDB, PostgreSQL.

CSTBrands, San Antonio, TX
Nov 2012 – Oct 2013
ETL Developer
Description: Design/development and migration of Corner Store (CSTBrands) from Valero (oil and gas); upgrade of Informatica and Oracle Database on Linux. Used Informatica Data Services to profile and document the structure and quality of all data.
Responsibilities:
· Involved in gathering business requirements, analysis, development, testing, and production support of the client application.
· Extensively worked on Talend coding and unit-testing activities.
· Programmed PL/SQL stored procedures, functions, packages, and database triggers for Oracle and SQL Server.
· Extracted, scrubbed, and transformed data from flat files, Oracle, SQL Server, DB2, and Teradata, then loaded it into an Oracle database using Informatica.
· Modified reports and Talend ETL jobs based on feedback from QA testers and users in the development and staging environments.
· Administered DAC 10.1.3.4 and Informatica 9.0.1.
· Extensively worked with Slowly Changing Dimensions (Type 1, Type 2, and Type 3) for data loads.
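The Type 2 variant mentioned above preserves history by expiring the current dimension row and inserting a fresh one when a tracked attribute changes. Here is an illustrative sketch using sqlite3 so it runs anywhere; the dim_customer table and its columns are made up for the example, not taken from the projects above.

```python
import sqlite3

# In-memory stand-in for a warehouse dimension table.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""CREATE TABLE dim_customer (
    customer_id INTEGER, city TEXT, is_current INTEGER)""")
cur.execute("INSERT INTO dim_customer VALUES (1, 'Dallas', 1)")

def scd2_upsert(customer_id, city):
    # Type 2: expire the current row if the tracked attribute changed,
    # then insert a fresh current row, preserving history.
    row = cur.execute(
        "SELECT city FROM dim_customer WHERE customer_id=? AND is_current=1",
        (customer_id,)).fetchone()
    if row and row[0] == city:
        return  # no change, nothing to do
    if row:
        cur.execute(
            "UPDATE dim_customer SET is_current=0 "
            "WHERE customer_id=? AND is_current=1", (customer_id,))
    cur.execute("INSERT INTO dim_customer VALUES (?, ?, 1)",
                (customer_id, city))

scd2_upsert(1, "Austin")   # city changed: old row expired, new row added
rows = cur.execute(
    "SELECT city, is_current FROM dim_customer ORDER BY is_current").fetchall()
print(rows)  # [('Dallas', 0), ('Austin', 1)]
```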
· Tracked production issues and provided application support in case of job failures.
· Handled production tickets and resolved them in a timely manner.
· Extensively used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules.
· Implemented user-level security for dashboards and reports based on business requirements.
· Developed the Oracle BI EE metadata repository (.rpd) model using aggregates, dimensions, hierarchies, and time-series functions.
· Checked session and error logs to troubleshoot problems, and used the Debugger for complex troubleshooting.
· Experience in an oil and gas data conversion project.
· Assisted less experienced peers and provided technical guidance.
· Experience with Oracle 11g Answers, Dashboards, Delivers, and BI Publisher.
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, DVO, IDQ, IDE, Workflow Monitor), SQL Server, Oracle 10g, Toad, UNIX Shell Scripting, Flat Files, Oracle SQL Developer, SAP ABAP, Tidal, DAC, Agile, Unix, DB2, SQL*Loader, Talend, OBIEE.

JPMorgan Chase Bank, NJ
Jan 2012 – Oct 2012
Description: A merger project between two major banks, covering their credentials and respective applications, with data and enrollment migrations for the existing companies and users, to arrive at a unified set of applications offered to all customers of both banks. Migrating application data from the source (the database of the application being merged) into the existing target (the database of the application being continued) was one of the core activities of this initiative.
Responsibilities:
· Worked with PowerCenter tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
· Extensively worked with both connected and unconnected Lookups.
· Responsible for identifying bottlenecks and fixing them through performance tuning.
· Worked with reusable objects such as reusable transformations and mapplets. Extensively worked with aggregate functions such as AVG, MIN, MAX, FIRST, and LAST in the Aggregator transformation.
· Followed the client's security policies and obtained the required approvals to move code from one environment to another.
· Extensively used the SQL override in the Source Qualifier transformation.
· Deployed Informatica code and worked on code merges between two different development teams.
· Extensively used PL/SQL programming in back-end and front-end functions, procedures, and packages to implement business rules.
· Monitored and maintained UNIX server performance.
· Maintained the SharePoint portal, uploading documents regularly and storing them on the SharePoint site.
· Performed complex defect fixes in various environments, such as UAT and SIT, to ensure proper delivery of the developed jobs into the production environment.
· Responsible for unit testing of mappings and workflows.
· Attended daily status calls with the internal team and weekly calls with the client, and updated the status report.
· Worked on extracting data from heterogeneous sources such as Oracle, SQL Server, Teradata, MS Access, and flat files.
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor, IDE, IDQ), SQL Server, Oracle 10g, Toad, UNIX Shell Scripting, Flat Files, SQL Developer, Oracle EBS 11i.

Target Corporation, MN
Feb 2011 – Dec 2011
Description: (Application Development and Maintenance) Target Corporation is one of the largest retail chains in the US and intends to transform its enterprise-wide data warehousing, data movement, and analytics technology environment to improve data quality and access controls, and to instantiate common data routines through the Data Integration (DI) architecture, with a view to corresponding cost reduction based on the simplification.
Responsibilities:
· Worked through the complete software development life cycle of the application.
· Wrote UNIX and Perl scripts for data parsing, manipulation, and updates.
· Handled production-related issues and resolved tickets in a timely manner.
· Worked with client tools such as Designer, Workflow Manager, Workflow Monitor, and Repository Manager.
· Responsible for migrating workflows from the development to the production environment.
· Encapsulated PL/SQL procedures and functions in packages to organize the application.
· Created database tables, views, and triggers.
· Responsible for migrations from development to staging and to production (deployments).
· Designed table population for one-time loads and incremental loads.
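The incremental side of such loads typically hinges on a high-water mark: only rows newer than the last recorded load timestamp are extracted, and the mark advances afterward. Below is a sketch of that pattern; sqlite3 stands in for the real source and target, and the orders/etl_control tables and their columns are invented for illustration.

```python
import sqlite3

# Source system with an updated_at column to filter on.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, "2015-09-01"), (2, "2015-09-15"), (3, "2015-09-28")])

# Target warehouse plus a control table holding the high-water mark.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE orders (id INTEGER, updated_at TEXT)")
tgt.execute("CREATE TABLE etl_control (last_loaded TEXT)")
tgt.execute("INSERT INTO etl_control VALUES ('2015-09-10')")

def incremental_load():
    # Only rows newer than the mark move; the mark then advances.
    (mark,) = tgt.execute("SELECT last_loaded FROM etl_control").fetchone()
    rows = src.execute(
        "SELECT id, updated_at FROM orders WHERE updated_at > ?", (mark,)
    ).fetchall()
    tgt.executemany("INSERT INTO orders VALUES (?, ?)", rows)
    if rows:
        tgt.execute("UPDATE etl_control SET last_loaded = ?",
                    (max(r[1] for r in rows),))
    return len(rows)

loaded = incremental_load()
print(loaded)  # 2 (ids 2 and 3 are newer than the 2015-09-10 mark)
```

A second run loads nothing, since the mark has advanced past all source rows.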
· Implemented various loads, such as daily, weekly, and quarterly loads, using an incremental loading strategy.
Environment: Informatica PowerCenter (Repository Manager, Designer, Workflow Manager, Workflow Monitor), Teradata, UDB, UNIX, Oracle 10g, Mainframes (DB2, VSAM files, sequential files).

Keane India, India
Jul 2007 – Dec 2010
Responsibilities:
· Involved in analysis, database design, coding, and implementation.
· Loaded data into multiple tables using SQL*Loader.
· Used PL/SQL extensively to develop packages, stored procedures, functions, and triggers.
· Created and maintained documents including the data quality plan design, mapping inventory, mapping specifications, change request forms, unit test plans, test case lists, and the target-source matrix.
· Used joins, subqueries, and nested queries in SQL for reports.
· Implemented the technical and functional aspects of the project.
· Interacted with users on problems faced and provided the necessary technical support by fixing bugs.
Environment: Oracle 9i, MS SQL Server 2003, PL/SQL, Windows NT, Windows 2000, UNIX.
Languages: English (Advanced)