From: route@monster.com
Sent: Monday, September 28, 2015 1:01 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend
This resume has been forwarded to you at the request of Monster User xapeix03.
Shekar TR
Senior ETL / Database Architect
shekar_tr@hotmail.com, 614 653 6843

Database architect with over 10 years of diverse experience in the Software Development Lifecycle (SDLC), architecture, database design, performance tuning, development, and enterprise reporting in the financial, healthcare, banking (retail & credit), and insurance industries.

· Database Architecture / Design
· Database Performance Tuning
· Database Development
· ETL Tools & Development
· Database Conversion / Migration
· Project Management
· Training / Mentoring
· Standards Definition
· Software Configuration Management
· Release Management

Special focus on enterprise-level database applications, utilizing major commercial database engines (Oracle, Microsoft SQL Server, and MySQL).

Experience summary:

Architecture / Database Design
· Design and build complex relational databases and data warehouses.
· Work with technical and end users to understand business requirements and identify data solutions.
· Develop strategies for data acquisition, data recovery, and enterprise-level database implementation.
· Provide leadership and direction to database developers.
· Provide in-depth expertise in data security, storage solutions, database virtualization and replication, and other complex technical tools and solutions.
· Define enterprise-level standards (coding, architectural, migration) and drive platform and tool selection.
· Facilitated data-requirement meetings with business and technical stakeholders to prepare project charters, BRDs, and risk-analysis documents, and resolved conflicts to drive decisions.
· Worked extensively with Erwin and ER Studio on several OLAP and OLTP projects, applying dimensional and relational data-modeling concepts such as star-schema modeling, snowflake-schema modeling, and fact and dimension tables.
· Worked with DBAs/SAs on Oracle Database/Exadata server upgrades, patching, ASM, Active Data Guard setup, backups, and contingency-server maintenance and deployments.
· Increased project delivery efficiency through managing, mentoring, and developing team members.
· Integrated and implemented MDM, business intelligence (BI), and data warehousing (DW/EDW) solutions.
· Perform tasks related to the full life-cycle development of software for internal and client use.
· Plan, analyze requirements, design solutions, and develop and test application code; knowledge and experience on Windows and Unix platforms using SQL (PL/SQL, T-SQL, ANSI SQL) and Unix shell scripting against Oracle databases.
· Development of ETL solutions using DataStage, Informatica, and Hyperion Essbase tools, or database built-in utilities (SQL*Loader, SSIS, etc.).
· Good experience with Hadoop environments and the open-source ETL tool Talend, compatible with big-data environments.
· Good experience developing Oracle applications with advanced utilities, including search-engine projects using Oracle Text and a security project using Oracle Context.
· Responsible for all activities related to the development, implementation, and support of ETL processes for large-scale data warehouses using Informatica PowerCenter, Hyperion Essbase 11.1.2, and the Hyperion Excel Add-in.

Performance Tuning
· Monitor performance of database management systems in VLDB environments.
· Provide performance tuning of database systems for query and data-loading performance, including data partitioning, indexing, data-model review/updates, etc.
· Effective troubleshooting of performance and tuning problems in Oracle databases and Oracle Applications, including I/O optimization, SQL tuning, and backup and recovery planning, testing, and implementation.

Project Management
· Manage multiple projects simultaneously with focus on project planning, monitoring, budgeting, resource negotiation, scope containment, conflict resolution, risk mitigation, implementation, project status reporting, documentation, and other PMO processes.
· Manage, motivate, and lead diverse cross-functional business and IT teams, including developers, business analysts, SQA analysts, and multiple outside vendors, with a proven ability to deliver projects on time and on budget.
· Proven ability to take over a project at any point in the Software Development Lifecycle (SDLC) and bring it to implementation on time, within budget, and at the expected quality.
· Proven history of collaborative leadership of global onsite and offshore teams.
· Led business redesign, automation, and performance projects, including a reinsurance business redesign, a personal-information project (Chase), and database, data warehouse, and software integration projects.
· Led an automation team; experience automating monitoring and control jobs to improve the performance of various applications and reduce the manual intervention needed to meet SLAs.
· Managed end-user requirements by defining scope and creating the communication strategy to present the various aspects of the project effectively to different audiences.
· Managed database projects by providing high-level and low-level designs and designing new applications using data-modeling tools.
· Owned and managed the implementation process, coordinating internal and external IT groups including front-end (Java/.NET) teams, business analysts, data warehousing teams, helpdesk, and quality assurance and testing teams, using a detailed communication plan, status meetings, and issue resolution through go-live and project close-out.
· Team-lead experience with consistently increasing responsibilities in team management, including requirement analysis, task allocation (WBS), project budgeting, scheduling, and managing deliverables.

Education:

Graduated with a BTech (Bachelor of Technology) in Electrical and Electronics Engineering from JNTU Hyderabad, India, in 2002.

SKILLS:

Databases: Hadoop, Oracle 12c/11g/10g, Exadata 11g, Oracle integration with DB2 and SQL Server.
Methodologies: Agile software development (Scrum, Kanban, Extreme Programming), Waterfall process, use case diagrams, sequence diagrams, activity diagrams, MS Visio, MS Office.
Business Domains: Credit cards, retail banking, insurance (reinsurance, group), and health.
Operating Systems: AIX, UNIX, Linux, Windows 2000/NT/XP.
Languages: SQL/PLSQL, Shell, Hive, JavaScript, C, and CSS.
DB Tools: TOAD 11.6, PL/SQL Developer 8.0, SQL Developer, Erwin.
DW / ETL Tools: Talend, IBM InfoSphere Information Server, DataStage 8.5/8.1 (Parallel Extender), Hyperion Essbase with Excel Add-in, and Informatica PowerCenter 8.6.
Scheduling Tools: Crontab, Autosys, PL/SQL job scheduler, and Control-M.
Utilities: WinCVS 1.3.14, VSS, SVN, PuTTY, and SSH.
Agile: JIRA, Rally, and Confluence.

Trainings Attended:
· Tata Consultancy Services, NJ, USA: PMP (Project Management Professional), PMI 4.0, 2013
· Bank of America, NJ, USA: ADF (Application Development Framework), 2013
· Tata Consultancy Services, Hyderabad: Oracle BI (OBIEE 10.1.3), 2010
· Tata Consultancy Services, Bangalore: Banking Concepts – An Overview; Oracle 10g Total Recall, 2010
· Tata Consultancy Services, Bangalore: iSecurity; IPMS & IQMS; Six Sigma, 2010

Projects Experience

NIH (NIAMS), MD, Sep’14 – till date
Lead Oracle DBA
Domain: Health
Description: NIAMS (NIH) applications are designed and maintained to facilitate automated grants processing for business users and grantees. This includes the end-to-end grants application process, review, approval, and fund-release schedules, with updates to grantees and investors.
Responsibilities:
· Involved as database team lead for a team of 4 to facilitate all development and support activities for all NIAMS applications built on Hadoop and Oracle database environments.
· Involved in requirement discussions with clients for budgeting, planning, and scheduling releases.
· Prepared database design documents and data-mapping documents for the ETL jobs, with the required DB objects in the database environment (Hadoop and Oracle 12c).
· Converted business requirements into technical requirements, HLD, and DD for customizations, and had them reviewed by the architect.
· Prepared and led multiple internal and external IT groups and resources during deployment and go-live, executing issue resolution both hands-on and through escalations.
· Active participation in decision-making and QA meetings; regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements, and design.
· Used Talend as an ETL tool to extract data from source systems (Hadoop) and load it into the Oracle database.
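The Talend extract-and-load step above is typically driven from the shell: a job built for deployment ships with a generated <JobName>_run.sh launcher that a wrapper invokes and checks. A minimal sketch of that wrapper pattern follows; the job name and paths are hypothetical stand-ins, and a stub replaces the real launcher so the sketch runs anywhere:

```shell
#!/bin/sh
# Sketch of chaining a Talend-exported job launcher from the shell.
# "grants_load" and all paths are illustrative, not the actual job.

WORK=$(mktemp -d)
LOG="$WORK/etl.log"

# Stand-in for the launcher a real Talend build would provide
cat > "$WORK/grants_load_run.sh" <<'EOF'
#!/bin/sh
echo "extract from Hadoop source (stub)"
EOF
chmod +x "$WORK/grants_load_run.sh"

if "$WORK/grants_load_run.sh" >>"$LOG" 2>&1; then
    # A real flow would continue with the Oracle load step here
    echo "$(date '+%F %T') grants_load OK" >>"$LOG"
else
    echo "$(date '+%F %T') grants_load FAILED" >>"$LOG"
fi
cat "$LOG"
```

Checking the launcher's exit status before logging success is what lets a scheduler distinguish a clean load from a failed one.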
· Involved as architect/designer in projects to redesign applications from APEX to Java and in process flows to interact with external applications.
· Managed analysis, testing, and solutions for issues (tickets) raised against different schemas in the grants-processing area.
· Managed reconciliation/correction of dividend data provided by various NIH branches.
· Reviewed development of PL/SQL stored procedures and Unix shell scripts.
· Designed and reviewed ETL jobs using the Talend tool.
· Managed and helped the team follow the complete SDLC cycle.
· Followed agile methodologies, using JIRA to plan scrum schedules and deliverables for each sprint.

Environment: Hadoop, Talend, Oracle 12c/11g, DataStage 8.1, Control-M, Unix, and SQL/PLSQL

Barclays US Cards, DE, Jun’13 – Sep’14
Oracle / Lead ETL Developer
Domain: Credit Cards
Description: Barclays marketing applications are designed and maintained to communicate offers, alerts, and notifications to Barclays customers, including Barclays partner customers. This includes all kinds of email notifications, SMS alerts, and letter offers. These applications are built with various DataStage jobs (parallel, sequence, and server) to handle the load process and send the user communications.
Responsibilities:
· Involved as team lead for ETL developers during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Service, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).
· Involved in a Hadoop POC for the ETL process using the Talend tool.
· Prepared data-mapping documents and designed the ETL jobs with the required tables in the dev environment.
· Active participation in decision-making and QA meetings; regularly interacted with the business analysts and development team to gain a better understanding of the business process, requirements, and design.
· Used DataStage and Talend as ETL tools to extract data from source systems and load it into the Oracle database.
· Involved in analysis, testing, and solutions for issues (tickets) raised against different schemas in the credit-cards marketing area.
· Reconciliation/correction of dividend data provided by various vendors.
· Automation of manual tasks that required extra effort/resources, and various other tasks.
· Development of PL/SQL stored procedures and Unix shell scripts.
· Development of parallel jobs using the DataStage tool.
· Managed onsite and offshore development teams.
· Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transformation logic to the extracted data, and loaded it into data warehouse databases.
· Created DataStage jobs using different stages, such as Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, and Row Generator.
· Extensively worked with the Join, Lookup (normal and sparse), and Merge stages.
· Extensively worked with the Sequential File, Data Set, File Set, and Lookup File Set stages.
· Extensively used parallel stages such as Row Generator, Column Generator, Head, and Peek for development and debugging purposes.
· Used the DataStage Director and its run-time engine to schedule runs of the solution, test and debug its components, and monitor the resulting executables on an ad hoc or scheduled basis.
· Developed complex stored procedures using input/output parameters, cursors, views, and triggers, and complex queries using temp tables and joins.
· Converted complex job designs into separate job segments executed through a job sequencer for better performance and easier maintenance.
· Created job sequences.
· Created a shell script to run DataStage jobs from UNIX, then scheduled this script through the scheduling tool.
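A shell wrapper of the kind just described usually calls the DataStage `dsjob` command-line interface and hands the result back to the scheduler. The sketch below shows that shape under stated assumptions: the project and job names are hypothetical, and the `dsjob` call itself is stubbed with `true` so the sketch runs without a DataStage engine:

```shell
#!/bin/sh
# Sketch of a scheduler-facing wrapper for a DataStage job (cron/Control-M).
# PROJECT and JOB are hypothetical; the real dsjob call is shown commented.

PROJECT=MKTG                 # hypothetical DataStage project
JOB=seq_email_offers         # hypothetical job sequence
LOCK="/tmp/${JOB}.lock"
LOG="/tmp/${JOB}.log"

run_wrapper() {
    # Skip this cycle if the previous run has not finished
    if [ -e "$LOCK" ]; then
        echo "$(date '+%F %T') $JOB still running, skipping" >>"$LOG"
        return 0
    fi
    : > "$LOCK"

    # Real invocation (with -jobstatus, dsjob waits and its exit code
    # reflects the job's finishing status):
    #   "$DSHOME/bin/dsjob" -run -jobstatus "$PROJECT" "$JOB"
    true                     # stand-in for the dsjob call
    rc=$?

    rm -f "$LOCK"
    if [ "$rc" -eq 0 ]; then
        echo "$(date '+%F %T') $JOB finished OK" >>"$LOG"
    else
        echo "$(date '+%F %T') $JOB failed rc=$rc" >>"$LOG"
    fi
    return "$rc"             # a real wrapper would exit with this status
}

run_wrapper
```

The lock file keeps a slow run from overlapping the next scheduled cycle, a common requirement when the scheduler fires on a fixed interval.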
· Coordinated with team members and administered all onsite and offshore work packages.
· Analyzed performance and monitored work with capacity planning.
· Performed performance tuning of the jobs by interpreting the performance statistics of the jobs developed.
· Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit, system, and functional testing; prepared test data for testing, error handling, and analysis.
· Participated in weekly status meetings.

Environment: Hadoop, Oracle 12c, DataStage 8.1, Control-M, Unix, and SQL/PLSQL

Bank of America, NJ, Jan’12 – May’13
Oracle Lead Developer
Domain: Retail Banking Financial Services
Description: The ‘eLedger’ application is designed to handle Bank of America’s general ledger, funds transfer pricing, and customer-level profitability reporting activities. The data from eLedger is used for strategic financial planning and analysis, which helps improve the business in the long run, providing an edge over competitors.
Responsibilities:
· Developed the CORPFIN application and delivered it on time with minimal issues.
· Designed and developed the outlines for the Corpfin and Recon applications using a dimensional model on Essbase.
· Reverse-engineered the reports and identified the data elements (in the source systems), dimensions, facts, and measures required for new report enhancements.
· Conducted design discussions and meetings to arrive at an appropriate data mart at the lowest level of grain for each of the dimensions involved.
· Designed a star schema for the detailed data marts and plan data marts involving conformed dimensions.
· Actively participated in development of business entities.
· Development of PL/SQL stored procedures and UNIX shell scripts.
· Scheduled the Autosys jobs for Essbase cube building.
· Prepared fortnightly status reports of project activities.
· Set up Grid Control for monitoring Exadata X2.
· Created and maintained the data-model repository per company standards.
· Conducted design reviews with the business analysts and content developers to create a proof of concept for the reports.
· Ensured the feasibility of the logical and physical design models.
· Collaborated with the reporting team to design monthly summary-level cubes to support further aggregated levels of detailed reports.
· Worked on snowflaking the dimensions to remove redundancy.
· Designed sales-hierarchy dimensions to handle sales-hierarchy reporting historically and dynamically.
· Worked with the implementation team to ensure a smooth transition from the design to the implementation phase.
· Worked closely with the ETL developers to explain complex data-transformation logic.
· Created ETL jobs using Informatica and custom transfer components to move data from Oracle source systems.

Environment: Toad, Oracle 11g (OLAP), Autosys, UNIX, SQL/PLSQL, Hyperion Essbase 11.1.2, Essbase Studio, Hyperion Excel Add-in, and Informatica PowerCenter 8.0

JP Morgan Chase, Columbus, OH, Jan’10 – Dec’11
Oracle Lead Developer
Domain: Retail Banking Financial Services
Description: The process established clearly defined access controls to ensure masking of data identified and validated as personal information, and to ensure approved users receive access only to those data elements necessary to perform their business role, through an approval process that depends on the sensitivity of the data.
Responsibilities:
· Responsible for developing the “Privacy Initiative” (PI) project using Virtual Private Database (VPD).
· Automated the testing of security policies on schemas that restrict sensitive data from users.
· Involved in analysis, testing, and solutions for issues (tickets) raised against the different info1-maintained schemas.
· Coded and implemented packages to perform PI testing and batch-job scheduling.
· Developed shell scripts to automate testing of PI-applied columns across the schemas.
· Developed UNIX shell scripts and PL/SQL procedures to extract and load data for batch processing.
· Performed tuning and optimization of SQL queries using Explain Plan.
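Explain Plan checks like the one above are often scripted so they can be rerun against each candidate query. A minimal sketch follows; the connect string, schema, and query are hypothetical placeholders, and the `sqlplus` call is only echoed so the sketch runs without an Oracle client:

```shell
#!/bin/sh
# Sketch of driving an Oracle Explain Plan check from a shell script.
# Credentials, table, and columns are illustrative, not the real schema.

WORK=$(mktemp -d)

# EXPLAIN PLAN populates PLAN_TABLE; DBMS_XPLAN.DISPLAY formats it.
cat > "$WORK/explain.sql" <<'EOF'
SET LINESIZE 200 PAGESIZE 0
EXPLAIN PLAN FOR
  SELECT c.cust_id, c.last_name
  FROM   customers c
  WHERE  c.ssn_masked IS NOT NULL;
SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT
EOF

# The real check would be run as (requires an Oracle client and login):
echo "sqlplus -s app_user@ORCL @$WORK/explain.sql"
```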
· Expertise in the use of Export/Import and Data Pump (object level, schema level, transportable tablespace, and full database).
· Experience in performance-tuning databases using various techniques. Collected performance statistics on memory usage, data storage, data manipulation, physical/logical storage, and network traffic, and implemented/managed parallelism for optimum performance using Explain Plan, SQL Trace, TKPROF, and Statspack.

Environment: Oracle 10g, Crontab, AIX, SQL/PLSQL, and Informatica PowerCenter 8.6

Swiss Re Insurance, NY, Mar’08 – Dec’09
Senior Oracle Developer
Domain: Reinsurance
Description: Automates and streamlines SRLH’s existing business processes, integrating with clients/partners where appropriate. Provides the means for capturing clean, accurate, and consistent data to improve decision-making. Automates the end-to-end flow of data within the division, transforming that data into information. Establishes a global, low-cost environment and delivers global processes and systems.
Responsibilities:
· Involved in discussions with Swiss Re personnel, based on the System Requirements Document, to cover the functionality, technical issues, and scope of work.
· Analyzed and diagnosed dividend data to identify any problems in the data.
· Analyzed business requirements for changes.
· Reconciliation/correction of dividend data provided by various vendors.
· Automation of manual tasks that required extra effort/resources, and various other tasks.
· Coded in PL/SQL and Unix shell scripting.
· Developed a data-inflation module using PL/SQL packages.
· Reviewed code for procedures developed by the team.
· Created many PL/SQL packages.
· Created many tables and stored procedures on Oracle.
· Worked on query optimization.
· Participated in performance tuning.
· Provided support for the data files by running extracts and performing data imports through SQL*Loader.
· Conducted major Swiss Re stakeholder interviews, involving SMEs, business analysts, and other stakeholders, to determine the requirements.
· Translated the business requirements into workable functional and non-functional requirements at a detailed production level using workflow diagrams, sequence diagrams, activity diagrams, and use-case modeling.
· Identified and compiled common business terms for the insurance system that became a central part of communication among project stakeholders, avoiding ambiguity.
· Worked at the conceptual/logical/physical data-model level using Erwin, according to requirements.
· Involved in exhaustive documentation for the technical phase of the project and in training materials for all data-management functions.
· Used a reverse-engineering approach to redefine entities, relationships, and attributes in the data model per new specifications in Erwin, after analyzing the database systems currently in use.
· Enforced referential integrity in the OLTP data model for consistent relationships between tables and efficient database design.
· Used a forward-engineering approach for designing and creating databases.
· Conducted design walkthrough sessions with the business intelligence team to ensure that the business’s reporting requirements were met.
· Developed data-mapping, data-governance, transformation, and cleansing rules for the master data management architecture involving OLTP, ODS, and OLAP.
· Collaborated with the ETL, BI, and DBA teams to analyze and provide solutions for data issues and other challenges while implementing the OLAP model.
· Developed and maintained a data dictionary to create metadata reports for technical and business purposes.
· Created action plans to track identified open issues and action items related to the project.
· Prepared analytical and status reports and updated the project plan as required.

Environment: Autosys, Oracle 10g, TSQL, and MS SQL Server

AVIVA Group Insurance, UK, Nov’07 – Mar’08
Senior Oracle Developer
Domain: Group Insurance
Description: The objective of the Hibernian Life & Pensions project is to enhance its group pensions processing infrastructure by building an application to significantly enhance its group pensions processing ability internally, and to provide a new online offering of group pensions products to brokers, trustees, and employers.
Responsibilities:
· Involved in user requirement gathering, analysis, design, coding, and testing.
· Involved in HLD and LLD design documents.
· Coded in PL/SQL and Unix shell scripting.
· Developed modules using SQL/PLSQL.
· Designed, developed, and modified PL/SQL scripts using packages and modules.
· Worked on automating the feed-processing module with table-based data.
· Created queries to generate complex reports using inline queries and joins.
· Created many tables and stored procedures on Oracle.
· Involved in database design and SQL query performance tuning.
· Created many batch scripts using KSH, scheduled in crontab.
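A KSH batch of the kind scheduled above typically picks up delivered feed files, loads them, and archives them with a timestamp. The sketch below shows that pickup-and-archive shape; every directory and file name is hypothetical, and the actual database load is only indicated in a comment:

```shell
#!/bin/sh
# Illustrative feed-pickup batch (KSH-style, POSIX-sh compatible here),
# of the kind scheduled from crontab with an entry such as:
#   15 2 * * 1-5 /opt/batch/process_feeds.ksh >/dev/null 2>&1
# All names below are hypothetical.

BASE=$(mktemp -d)
INBOX="$BASE/inbox"; ARCHIVE="$BASE/archive"; LOG="$BASE/batch.log"
mkdir -p "$INBOX" "$ARCHIVE"

# Demo input file standing in for a delivered vendor feed
echo "P123,2008-01-31,1000.00" > "$INBOX/positions.dat"

count=0
for f in "$INBOX"/*.dat; do
    [ -f "$f" ] || continue                      # glob matched nothing
    # A real batch would load the file here (SQL*Loader or a PL/SQL call),
    # then archive it with a timestamp so reruns stay traceable.
    mv "$f" "$ARCHIVE/$(basename "$f").$(date '+%Y%m%d%H%M%S')"
    count=$((count + 1))
done
echo "$(date '+%F %T') processed $count feed file(s)" | tee -a "$LOG"
```

Archiving rather than deleting the input is the design choice that makes a failed load replayable.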
· Created batch processes to function independently behind the scenes for data transfers and database updates.
· Worked on bug fixing and supported the production applications.
· Participated in design and development of various modules.
· Worked on design documentation and change control.

Environment: TSQL and MS SQL Server

AVIVA Insurance, UK, Oct’06 – Nov’07
Oracle PL/SQL Programmer
Domain: Insurance
Description: The British School of Motoring (BSM) is in the business of providing driving lessons. It books lessons for learners and allocates them to instructors for lesson delivery. Financial settlement with the instructors takes place based on the number of lessons provided and their lesson price. All instructors are registered as franchisees with BSM; they enter into a franchise agreement and settle on a franchise fee to be paid to BSM.
Responsibilities:
· Involved in the data migration activities.
· Involved in development of AOL objects, such as defining concurrent programs and executables for finance modules.
· Prepared requirement documents, high-level designs, and unit test plans.
· Developed and tested new scripts required to automate and drive the procedural execution of applications, for efficiency and troubleshooting purposes.
· Worked on creation of control files for SQL*Loader to load data from text files.
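A SQL*Loader control file of the kind described above names the input file, the target table, and the field layout. The sketch below generates one from the shell; the staging table, columns, and connect string are hypothetical, and the `sqlldr` invocation is echoed rather than run so no Oracle client is needed:

```shell
#!/bin/sh
# Sketch of generating a SQL*Loader control file for a comma-delimited
# text feed. Table and column names are illustrative, not the AR schema.

WORK=$(mktemp -d)

cat > "$WORK/invoices.ctl" <<'EOF'
LOAD DATA
INFILE 'invoices.txt'
BADFILE 'invoices.bad'
APPEND
INTO TABLE ar_invoices_stg
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
TRAILING NULLCOLS
( invoice_id
, customer_id
, invoice_date DATE "YYYY-MM-DD"
, amount
)
EOF

# Sample data line matching the control file's field list
echo '101,C042,2007-06-30,2500.00' > "$WORK/invoices.txt"

# The real load would be run as (requires an Oracle client):
echo "sqlldr userid=ar_user@ORCL control=$WORK/invoices.ctl log=$WORK/invoices.log"
```

The BADFILE clause is worth keeping even in small loads: rejected rows land there instead of silently disappearing.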
· Created new stored procedures, triggers, and views as part of the implementation, in the AR, AP, and GL modules.
· Created forms and reports in the AR and AP modules.
· Used SQL*Loader for data loading.
· Created Unix shell scripts for sequential execution of scripts, including data extraction, transformation and loading, and stored-procedure execution.
· Involved in client interaction at the requirements-capturing stage.
· Developed test plans.

Environment: E-Business Suite 11i Finance modules, PL/SQL, and shell scripting

AT&T, USA, Nov’05 – Oct’06
Description: The Route Management System (RMS) supports the analytical activities of the route managers who negotiate bilateral contracts with foreign carriers for the termination of international traffic. The system provides settlement direction of network minutes and the Unit Cost of Net Settlement (UCNS) by route, termination point, and product. The route managers are able to monitor and track the performance and commitments of the bilateral contracts and manage the UCNS to achieve a low, competitive cost base for international voice interconnections with over 500 carriers in over 230 countries. This allows the managers to better price, respond to competition, and achieve route profitability. RMS sources network volume and ISP and LCR rates from PISCES. An RMS ISR Agreement Data Entry Tool captures the ISR agreements. The system utilizes Oracle Financial Analyzer (OFA) to provide a flexible reporting platform for rapid evaluation of country traffic and of any changes to the fundamental processes involved in delivering international minutes. In addition, RMS provides a web version of OFA, available with RMS Online, which provides additional dial-code/termination-point network traffic detail retrieval.
Responsibilities:
· Responsible for onsite client interaction for information gathering.
· Developed an ISR data entry tool to allow users to enter the terms of new ISR or traded deals under negotiation, signed or unsigned.
· Coding and bug fixing.
· Created new stored procedures and triggers, and modified existing packages and other schema objects.
· SQL query tuning on complex reports.
· Maintained version control through Visual SourceSafe.

Environment: Perl, Solaris 2.x, KSH, Sybase 11.x, and T-SQL

AT&T, USA, Feb’05 – Nov’05
Domain: Telecom
Description: International settlements is the process whereby telecommunications administrations involved in providing telecommunications services, such as International Long Distance Service (ILDS), are compensated for the costs of jointly providing such service. Another way to look at international settlements is as a sharing of the revenues generated by partnering to provide telecommunications services. Each administration reports to the other the number of minutes it billed in a given period and the share due to the other administration. The two reports are compared, and the administration that bills more pays the other. The International Settlements Analysis Tool (ISAT) is a collection of six multidimensional Oracle Express (OES) databases containing international network and biller data. Network data contains the data that flows through the actual physical network; biller data contains the call details as they appear in the bill. ISAT provides each user with Online Analytical Processing (OLAP) access to the OES databases. Using OLAP technology, ISAT not only gives users fast access to any item in each database but also an easy-to-use tool to view and analyze the data under multiple dimensions, such as time, geography, and products.
Responsibilities:
· Developed utilities and various shell scripts used while unit-testing the code fixes.
· Created new stored procedures and triggers, and modified existing packages.
· Created reports and customized graphs to display real-time views of the data.
· Maintained version control through Visual SourceSafe.
· Worked on new modules and enhancements.
· Created views and table joins for generating reports.
· Analyzed the PL/SQL code, log files, and core dumps taken from the defective systems.
· Worked on shell scripting.
· Involved in functionality testing and integration testing.

Environment: Perl, Solaris 2.x, KSH, SQL/PLSQL, OFA, and OSA