From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:02 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Anindita Indra 

Last updated:  09/06/13

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


Ellicott City, MD  21043
US

Mobile: 21526849064   
mamon_anindita@yahoo.com
Contact Preference:  Email


 

 

RESUME

  

Resume Headline: Informatica Consultant

Resume Value: egp5ss2ziziw2ygu   

  

 

ANINDITA INDRA

Sr. Informatica Developer

Cell: 215 684 9064                                                                      Email: anindita.indra@apex-its.com




PROFILE

Around eight (8) years of total IT experience in Healthcare, Investment Banking, and Market Data, with design and development in ETL (Extract, Transform, Load) using Informatica, Oracle, Teradata, and UNIX shell scripts. Extensive experience in business data analysis, requirement analysis, user interviews, data quality analysis, production support, and UAT.

 

Expert in data warehouse concepts using Informatica Power Center 9.x/8.x (Designer, Workflow Manager, and Workflow Monitor), mainframe jobs, JCL, Mercury Quality Center, UNIX shell scripts, Teradata, SQL Developer, TOAD, JIRA, and Teradata SQL Assistant.

 

Sr. Informatica consultant at CMS (Centers for Medicare & Medicaid Services, Baltimore, MD) with detailed knowledge of Part A, Part B, and Part D data, the IDR (Integrated Data Repository warehouse), NMUD, DESY/DADS, Section 508 testing, and data analysis/testing. Conduct detailed and comprehensive business analysis while working closely with federal Government Task Leads (GTLs) and business end users/stakeholders to identify system and operational requirements and improvements.

 

SUMMARY

Informatica Development:

 

Ø      Extensive experience with CMS (Centers for Medicare & Medicaid Services) in data warehouse data quality design techniques, data modeling, and testing systems using Informatica Power Center tools such as Designer, Workflow Manager, and Workflow Monitor.

Ø      Run Informatica workflows in Test/Pre-Prod [INT] environments using Workflow Manager, Workflow Monitor, or shell scripts on a Solaris test server; analyze test results using SQL Developer/TOAD.

Ø      Experience in designing and developing complex mappings from varied transformation logic like Unconnected and Connected lookups, Router, Union, Filter, Expression, Aggregator, Joiner, Update Strategy etc.

Ø      Experience in optimization and Performance tuning of targets, sources, mappings and sessions. Designed ETL Environment involving various source and target databases like Oracle, Flat Files, DB2 (fixed width, delimited), XML, SQL Server.

Ø      Develop SQL and PL/SQL code (procedures, functions, and packages) to implement database business logic in Oracle.

Ø      Help sanitize test data before loading and ensure personally identifiable information (PII) is masked in the test DB. Used TIBCO to transfer files from the UNIX landing directory to the mainframe.

Ø      Run an Informatica workflow to remove the success file from the UNIX directory and update DB2 with the file name so that the file is available for post-processing.

Ø      Used the CMS mainframe to validate application processes. Well versed in mainframe commands/jobs and JCLs.

Ø      Prepare test data in the test database and work with DBAs to refresh the test DB periodically. Extensive experience developing, monitoring, and scheduling jobs using UNIX crontab/Tivoli.

Ø      Expertise in Data Flow Diagrams, Process Models, and Entity Relationship diagrams with modeling tools like Erwin & Visio.

Ø      Create functional requirement documents and update and maintain documentation as required by project management in Alfresco.

Ø      Involved in Developing Test Plans, Test Cases & Test Scripts in Mercury Quality Center.

Data Analysis/Production Support:

Ø      Six years of strong experience in data analysis, business requirement analysis, gap analysis, data cleansing, and root-cause investigations. Generate post-mortem reports for production job failures and data reconciliation efforts. Perform data quality and data profiling analyses and review the results.

 

SKILLSET

 

ETL Tools/Applications

Informatica Power Center 9.x/8.x, Power Exchange 9.x/8.6/8.5, Talend Open Studio, Mercury Quality Center, IBM RequisitePro, TOAD, SQL Developer, Test Director, JIRA, Alfresco

 

Databases

Oracle 11g, Toad, Oracle Enterprise Manager, Teradata 13.1.0, DB2 Connect

Operating Systems

UNIX (HP-UX)/Sun Solaris 10, NDM 4.0, Mainframe, TIBCO

Programming Languages

SQL, PL/SQL, UNIX shell scripting

Reporting/ BI

MicroStrategy, OBIEE 10, Crystal Reports/BO XI, SAS EBI

Section 508 testing

Section 508 testing using the JAWS screen-reader application.

 

EDUCATION

 

Ø      MS – Electrical Engineering                                             Temple University, Philadelphia PA

Ø      Bachelor of Engineering in Electronics and Communication                                                 VTU, India

 

PROFESSIONAL EXPERIENCE

 

Centers for Medicare & Medicaid Services (CMS), Baltimore, MD                                      Nov 2011 to Present

Sr. Informatica Consultant

 

The extract portion of DESY resides on the CMS production mainframe. It performs the physical extract of data from the appropriate data stores, such as National Claims History (NCH), the National Medicare Utilization Database (NMUD), Medicare Provider Analysis & Review (MEDPAR), and the IDR. Upon completion of the extract process, per the user's DUA, the final data either resides on the CMS mainframe for the user to manipulate and use directly, or the files are copied to tape and shipped to the end user.

 

Roles and Responsibilities:

 

Ø      Worked on DESY's redesign project to integrate with the CMS IDR (Integrated Data Repository). The COTS ETL (Extract, Transform, Load) tool Informatica was used to replace the legacy extract process built on mainframe jobs and Job Control Language (JCL) queries.

Ø      Met with business stakeholders and other technical team members to gather and analyze application requirements; created FDR/TDR and BRTM documents for projects as required and uploaded them to Alfresco.

Ø      Redesigned and created Informatica PowerCenter 9.5.1 workflows, mappings, and code, along with related shell and Java scripts.

Ø      Designed mappings using reusable Mapplets from various sources such as Oracle, DB2, and flat files, and loaded the data into Teradata targets.

Ø      Designed technical specifications for building the data mart.

Ø      Created a FastExport script to land the extract file in the UNIX landing directory.

Ø      Unit and Regression testing of the Informatica Mappings created according to the business requirements.

Ø      Create impact analysis reports based on performance-testing results.

Ø      Created Test Plan, Test Case Specification document, Test Summary Report, BRTM  for the project and maintained in Alfresco.

Ø      Understand drugs data, beneficiary data, consumer data, and the risk/importance of personally identifiable (PII) data. Ensure all PII data is masked before loading the test and performance databases.

Ø      Experience with Medicare Part A and Part B data from IDR.

Ø      Used SQL tools such as Teradata SQL Assistant to run SQL queries that validate the data. Analyzed the RDR/FDR and TDR in depth, identified key cases to test, and completed all testing in the allotted time.

Ø      Worked on the initial prototype for the Part D data integration process; created sample test data using Data Definition Language (DDL) scripts in IDR staging areas on Teradata 13.

Ø      Validate correctness of Data encryption/ data masking / record counts and other details for the production DESY. Create/Update user guide and other technical documents as required.

Ø      Actively involved in integration testing and smoke testing with team members.

Ø      Defined Mapping parameters and variables and Session parameters according to the requirements and performance related issues.

Ø      Created Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor.

Ø      Provide production support and troubleshoot or find failure reasons for Informatica workflows; work with Informatica Administrators to resolve system issues.

Ø      Work on root-cause analysis and provide detailed solutions for data issues whenever required. Worked with Oracle DBAs on performance tuning of Oracle SQL.

Ø      Worked on data analysis and compared the VDM for the Shared System Harmonization and reconciliation of the data.

Ø      Worked on SAS Enterprise guide using the Medicare VDM.
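A minimal sketch of the FastExport-driven landing-directory extract described above. Every object name, path, and credential here is an illustrative placeholder, not a value from the actual project.

```shell
#!/bin/sh
# Hypothetical sketch: generate a Teradata FastExport control file that lands
# an extract file in the UNIX landing directory. Names below are assumptions.
CTL=/tmp/desy_extract.fx

cat > "$CTL" <<'EOF'
.LOGTABLE sandbox.fx_restart_log;
.LOGON tdprod/etl_user,etl_password;   /* placeholder credentials */
.BEGIN EXPORT SESSIONS 4;
.EXPORT OUTFILE /data/landing/desy_extract.dat;
SELECT * FROM idr_stg.claims_extract;  /* assumed staging table */
.END EXPORT;
.LOGOFF;
EOF

# fexp < "$CTL"   # uncomment on a host where the FastExport utility exists
echo "FastExport control file written to $CTL"
```

The fexp invocation is left commented out because it requires the Teradata client utilities; the sketch only shows the shape of the control file.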

             

Environment: Informatica PowerCenter 9.5.1, Oracle 11g, Teradata 13.0.1, Connect:Direct 4.0, Java, UNIX, shell scripting, Alfresco, JIRA, VIPS ASSISTS, mainframe DB2T dev region, WebSphere Application Server (WAS), Message Queue (MQ) Server, WebSphere Business Integration (WBI) Server, Section 508 testing

 

 

Keystone Mercy Health Plan, Philadelphia, PA                                                     Jul 2009 to Oct 2011

Informatica Developer              

 

Roles and Responsibilities:

 

Ø      Used Informatica Power Center 8.6 to design workflows that load the Oracle data warehouse per requirements.

Ø     Met with business stakeholders and other technical team members to gather and analyze application requirements; created FDR/TDR for projects as required and uploaded them in Project SharePoint Portal.

Ø      Designed mappings using reusable Mapplets from various sources such as Oracle, SQL Server, and flat files, and loaded the data into Teradata targets.

Ø      Designed technical specifications for building the data mart.

Ø      Unit and Regression testing of the Informatica Mappings created according to the business requirements.

Ø      Create impact analysis reports based on performance-testing results and flag faulty or time-consuming code for re-evaluation. All reports are maintained in the project SharePoint portal.

Ø      Understand healthcare data, insurance plan claims, beneficiary data, consumer data, and the risk/importance of personally identifiable (PII) data. Ensure all PII data is masked before loading the test and performance databases.

Ø      Used SQL tools such as SQL Developer, TOAD, and SQL*Plus to run SQL queries that validate the test data. Analyzed the RDR/FDR and TDR in depth, identified key cases to test, and completed all testing in the allotted time.

Ø      Actively involved in integration testing and smoke testing with developers and the support team. Document all phases of testing and raise flags in project meetings for any anomalies noticed.

Ø      Defined Mapping parameters and variables and Session parameters according to the requirements and performance related issues.

Ø      Created Workflows and sessions using Informatica workflow manager and monitor the workflow run and statistic properties on Informatica Workflow Monitor.

Ø      Provide production support and troubleshoot or find failure reasons for Informatica workflows; work with Informatica Administrators to resolve system issues.

Ø      Work on root-cause analysis and provide detailed solutions for data issues whenever required. Worked with Oracle DBAs on performance tuning of Oracle SQL.

Ø      Used MicroStrategy to create reports and dashboards for business users.
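The kind of post-load record-count validation described in the bullets above might look like the following; the schema and table names are invented for illustration, not taken from the actual warehouse.

```shell
#!/bin/sh
# Hypothetical sketch of a validation query comparing source staging counts
# to warehouse target counts after an Informatica load. A mismatch points at
# rejected rows or filter logic to investigate. All names are illustrative.
SQL=/tmp/validate_counts.sql

cat > "$SQL" <<'EOF'
-- rows staged for the mapping vs. rows landed in the warehouse target
SELECT 'SOURCE' AS side, COUNT(*) AS row_cnt FROM stg.claims_in
UNION ALL
SELECT 'TARGET' AS side, COUNT(*) AS row_cnt FROM dw.fact_claims;
EOF

# sqlplus -s etl_user/pass @"$SQL"   # run against a real Oracle instance
echo "validation query written to $SQL"
```

In practice the same query would be pasted into SQL Developer or TOAD rather than run through a script; the file form just makes it repeatable after each load.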

 

Environment: Informatica Power Center 8.6.1, Oracle 10g, SQL, PL/SQL, XML, MS Access, web services, Java, UNIX, shell scripting, MicroStrategy, OBIEE 10, MS SharePoint, IBM RequisitePro

 

 

Goldman Sachs Inc., NJ                                                                                           Jun 2006 to May 2009

Informatica Development / Data Analysis

 

Roles and Responsibilities:

 

Ø      Handled all Data Analysis and Development activities (Detailed responsibilities listed above)

Ø      Used SQL tools such as SQL Developer, TOAD, and SQL*Plus to run SQL queries that validate the test data. Analyzed the RDR/FDR and TDR in depth, identified key cases to test, and completed all testing in the allotted time.

Ø      Actively involved in Integration testing and smoke testing with support team. Used Mercury Quality Center to document all tests. Worked with developers to follow-up on redesign and UAT phases of testing. Update QC as and when required.

Ø      Created use cases, prepared test data, and ran testing and validation workflows from Workflow Manager and UNIX shell scripts using pmcmd startworkflow commands; redirected script output to flat log files for debugging.

Ø      Examined Informatica workflow bad files for error tracing and checked Oracle PL/SQL procedures; ran Oracle EXPLAIN PLAN on long-running stored procedures and returned them for the addition of Oracle hints.

Ø      Created data marts for both CRM and financial applications and used effective querying and formatting tools to present the data to end users.

Ø      Designed and developed ETL routines using Informatica Power Center, including mappings with Lookup, Aggregator, and Rank transformations; Mapplets; connected and unconnected stored procedures, functions, and lookups; SQL overrides in lookups; source filters in Source Qualifiers; and data flow management into multiple targets using Routers.

Ø      Involved in migrating Informatica code through dev, test, perf, and prod environments. Cleansed and extracted data and defined the quality process for the warehouse.

Ø      Implemented logical and physical dimensional data models in the existing architecture using Erwin 7.

Ø      Provide production support and troubleshoot / find failure reasons for Informatica workflows.

Ø      Work on root-cause analysis and provide detailed solutions for data issues whenever required. Worked with Oracle DBAs on performance tuning of Oracle SQL.

Ø      Responsible for Data Analysis, Data warehouse and ETL Development, Oracle Database Design and Development, Data Conversion, Data storage, SQL, PL/SQL and UNIX.

Ø      Conducted team meetings and proposed Data Requirements, Data Mappings and Verifications. Gap analysis for staging & warehouse tables. Participated in the System analysis and data analysis.

Ø      Used UNIX scripting as pre/post-session commands to schedule loads using CRONTAB jobs

Ø      Worked with complex mappings using Expression, Aggregator, Filter, and Lookup transformations and stored procedures to develop and populate the data warehouse.

Ø      Created procedures to drop and recreate the indexes in the target Data warehouse before and after the sessions.

Ø      Created deployment groups, migrated the code into different environments.

Ø      Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Ø      Involved in configuring LDAP authentication for Informatica.

Ø      Worked on performance tuning of ETL procedures and processes; also used the Debugger to troubleshoot logical errors.

Ø      Created PL/SQL procedures and functions.

Ø      Experience working with Business Objects Testing.
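The pmcmd-driven workflow starts with log redirection mentioned above can be sketched as a small wrapper script. The domain, integration service, folder, and workflow names below are assumptions, not values from the actual repository.

```shell
#!/bin/sh
# Hypothetical wrapper: start an Informatica workflow with pmcmd and redirect
# all output to a flat log file for debugging. Every name here (domain,
# integration service, folder, workflow) is an illustrative guess.
INFA_IS="IS_Dev"
INFA_DOMAIN="Domain_Dev"
WF_FOLDER="FIN_DM"
WF_NAME="wf_load_positions"
LOG="/tmp/${WF_NAME}.log"

# -wait blocks until the workflow finishes so the exit status is meaningful
CMD="pmcmd startworkflow -sv $INFA_IS -d $INFA_DOMAIN -f $WF_FOLDER -wait $WF_NAME"

echo "running: $CMD" > "$LOG"
# $CMD >> "$LOG" 2>&1   # uncomment on a host with the Informatica client
cat "$LOG"
```

A crontab entry such as "0 2 * * * /opt/etl/run_wf.sh" (path assumed) would schedule this wrapper nightly, matching the crontab-based load scheduling described in the bullets above.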

 

Environment: Informatica Power Center 8.1.1/8.6.1, Oracle 10g/9i, SQL, PL/SQL, XML, MS Access, web services, Java, UNIX, shell scripting

 

ACCENTURE INDIA SERVICES (India)                                                                       Jan 2004 to July 2005

Informatica Development

 

Roles and Responsibilities:

 

Ø      Involved in Dimensional modeling using Erwin to design Fact and Dimension Tables

Ø      Designed mappings using reusable Mapplets from various sources such as Oracle, SQL Server, and flat files, and loaded the data into Teradata targets.

Ø      Designed technical specifications for building the data mart.

Ø      Extensive experience with database languages such as SQL and PL/SQL which includes writing triggers, Stored Procedures, Functions, Views and Cursors.

Ø      Used Informatica partitioning for ETL performance tuning, including key-range, hash, and database partitioning.

Ø      Used complex Informatica transformations including normalizer, unconnected stored procedure, dynamic lookup, SQL, XML source.

Ø      Built an out-of-the-box ETL solution for identifying critical workflow failures and alerting/paging the Ops team via the helpdesk.

Ø      Performed ETL migrations from UAT to Prod using different approaches in a version-enabled environment, including XML export/import, direct component copy, and deployment groups.

Ø      Used the FastLoad external loader for Teradata targets in staging loads.

Ø      Used workflow variables for some incremental loads and for conditional flow execution.

Ø      Extensively used transformations such as Router, Aggregator, Normalizer, Joiner, Expression, Lookup, Update Strategy, Sequence Generator, and Stored Procedure. Knowledge of SQL and Java transformations.

Ø      Worked with Web services.

Ø      Created SSIS Packages to migrate slowly changing dimensions.

Ø      Created email notification tasks using post-session scripts.

Ø      Wrote SQL, PL/SQL, stored procedures & triggers, cursors for implementing business rules and transformations.

Ø      Created procedures to drop and recreate the indexes in the target Data warehouse before and after the sessions.

Ø      Created deployment groups, migrated the code into different environments.

Ø      Wrote documentation describing program development, logic, coding, testing, changes, and corrections.

Ø      Provided support to develop the entire warehouse architecture and plan the ETL process.
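The drop-and-recreate-index pattern used before and after sessions (mentioned in the bullets above) can be sketched as pre/post-session SQL; the schema, table, and index names are hypothetical, not from the actual warehouse.

```shell
#!/bin/sh
# Hypothetical pre/post-session sketch: drop an index before a bulk load so
# the load avoids index maintenance, then rebuild it once the load completes.
# The schema, table, and index names below are illustrative placeholders.
SQL=/tmp/rebuild_indexes.sql

cat > "$SQL" <<'EOF'
-- pre-session command: drop the index ahead of the bulk load
DROP INDEX dw.idx_fact_sales_dt;

-- (the Informatica session load runs between the two statements)

-- post-session command: recreate the index after the load
CREATE INDEX dw.idx_fact_sales_dt ON dw.fact_sales (sale_dt);
EOF

# sqlplus -s dw_user/pass @"$SQL"   # execute against a real Oracle instance
echo "index rebuild script written to $SQL"
```

In a real session the two statements would be wired into the session's pre-SQL and post-SQL properties rather than run from one file; the file form just shows the pairing.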


Environment: Informatica Power Center/Power Mart 8.1.1/7.1.1, Oracle 10g/9i, SQL, PL/SQL, XML, MS Access, web services, Java, UNIX, shell scripting, Teradata

 

 

 

 


 



Experience


 

Job Title: Informatica Consultant

Company: Apex It Services

Experience: - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Years of relevant work experience:

2+ to 5 Years

Work Status:

US - I am authorized to work in this country for my present employer only.

 

 

Target Job:

Target Job Title:

Informatica

 

Target Company:

Company Size:

 

Target Locations:

Selected Locations:

US-MD-Columbia

Relocate:

No

Willingness to travel:

Up to 25% travel

 

Languages:

English: Fluent