From: route@monster.com

Sent: Monday, September 28, 2015 1:03 PM

To: hg@apeironinc.com

Subject: Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Anisha Lays 

Last updated:  08/12/15

Job Title:  not specified

Company:  not specified

Rating:  Not Rated

Screening score:  not specified

Status:  Resume Received


New York City, NY  10001
US

anishadatamodelarlays@gmail.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Anisha Lays

Resume Value: mqyyc2hfvdgrh28u   

  

 

 

Anisha Lays

Modeling, designing and implementing OLTP, OLAP databases and data warehousing solutions

Technically sophisticated Business Analyst / Data Analyst / Data Modeler / Cloud Developer and Architect offering 14 years of experience translating the needs of non-technical clients into data storage and analytics solutions. Strong interpersonal skills; highly adept at diplomatically facilitating discussions and negotiations with stakeholders. Skilled in designing data mining and analysis solutions that provide long-lasting capacity and scalability for a solid business future.

PROFESSIONAL SUMMARY

·   14+ years of experience in the IT industry, with 12+ years of specialized practical experience in clarifying business requirements through business analysis, data modeling, dimensional modeling, data analytics, data quality, and design of process and system improvements to increase productivity and reduce costs.

·   Excellent analytical skills for understanding business processes and functionality. Experienced in developing functional specifications for business process refinement and automation, data modeling, system architecture, and conducting feasibility studies.

·   Proven experience in relational and dimensional data modeling, data management, data warehousing, data transformation, metadata and master data (reference data) management and business intelligence tools.

·   12+ years of dimensional data modeling experience using relational and dimensional modeling, ER/Studio, Erwin, Sybase PowerDesigner, star-join/snowflake schema modeling, fact and dimension tables, and conceptual, logical, and physical data models.

·   Experience in computing platforms that allow analysts to combine structured and complex data and build processing and analytical applications across data sources and types:

1.   Hadoop (HDFS), where advanced information management and new data processing techniques are applied to extract the value locked up in data, processing large data sets in parallel across a Hadoop cluster using the Hadoop MapReduce framework.

2.   Additionally experienced with the NameNode, where Hadoop stores all file location information in HDFS and tracks file data across the machines of the cluster.

3.   Predicting market trends and identifying market shifts, in other words allowing businesses to look through the windshield at what is coming.

·   Worked on background processes in the Oracle architecture, drilling down to the lowest levels of systems design and construction.

·   As an architect, built UML models and leveraged advanced executable code generators to target different domains.

·   Determined the physical architecture of Oracle (data files "*.dbf", control files "*.ctl", etc.).

·   Highly skilled in the use of ETL tools such as DataStage, Talend, Informatica, SSIS and SSRS, and in Teradata features (PDE, AMP, BYNET, PE, vDisk and Virtual Storage System (VSS)). Knowledge of the MultiLoad, TPump and BTEQ utilities.

·   Experienced in business analytics, providing comprehensive predictive analytics, financial performance and strategy management to improve business performance and predict future outcomes.

·   Experience in conducting Joint Application Development (JAD) sessions with SMEs, Stakeholders and other project team members for requirement gathering and analysis.

·   Responsible for interacting with business partners to identify information needs and business requirements for reports.

·   Strongly capable of handling VLRDBs (Very Large Relational Databases) of about 5 TB, with expert-level working knowledge of the architecture involved.

·   Experience in back-end programming including schema and table design, stored procedures, Triggers, Views and Indexes.

·   Expertise in Normalization/De-normalization techniques for optimum performance in relational and dimensional database environments.

·   Highly proficient in data modeling, retaining concepts of RDBMS, logical and physical data modeling up to Third Normal Form (3NF), and multidimensional data modeling schemas (star schema, snowflake modeling, facts and dimensions). Complete knowledge of data warehouse methodologies (Ralph Kimball, Inmon), ODS, EDW and metadata repositories.

·   Expert in the Data Analysis, Design, Development, Implementation and Testing using Data Conversions, Extraction, Transformation and Loading (ETL) and SQL Server, ORACLE and other relational and non-relational databases.

·   Experience in leading the offshore team, supporting team effort through sharing of technical knowledge to turnaround deliverables to meet aggressive deadlines.

·   Managed Full SDLC processes involving requirements management, Workflow analysis, source data analysis, data mapping, metadata management, data quality, testing strategy and maintenance of the model.

·   Consolidate and audit metadata from disparate tools and sources, including business intelligence (BI), extract, transform, and load (ETL), relational databases, modeling tools, and third-party metadata into a single repository.

·   Expert-level understanding of IBM Cognos SPSS, Cognos TM1 and the Business Analytics family of software.

·   Expert level understanding of using different databases in combinations for Data extraction and loading, joining data extracted from different databases and loading to a specific database.

·   Excellent understanding of the MDM approach of creating a data dictionary and using Informatica or other tools to map sources to the target MDM data model.

·   Excellent understanding of hub architecture styles for MDM hubs: the registry, repository and hybrid approaches.

·   Excellent understanding and working experience of industry standard methodologies like System Development Life Cycle (SDLC), as per Rational Unified Process (RUP), AGILE and Waterfall Methodologies.

·   Excellent knowledge of the ETL (Extract, Transform and Load) of data into a data warehouse/data mart, and of Business Intelligence (BI) tools such as the Business Objects modules (Reporter, Supervisor, Designer, and Web Intelligence).

·   Well versed with client server environment and tools like SQL*Loader, UNIX shell scripts and TOAD.

·   Used the NetCDF Java library, an implementation of the CDM that can read many file formats besides NetCDF.

·   Used the Common Data Model (CDM) for scientific datasets, which merges the NetCDF, OPeNDAP, and HDF5 data models.

·   Excellent team member with interpersonal and communication skills and trouble-shooting capabilities, highly motivated, result oriented with strong analytical, organizational, presentation and problem solving skills.

·   Expert in developing effective working relationships with client team to understand support requirements, develop tactical and strategic plans to implement technology solutions, and effectively manage client expectations.

·   Excellent knowledge of creating databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers and rules.

·   Trained in AWS cloud databases (Amazon RDS) and Amazon Virtual Private Cloud (VPC), which supports two EC2 platforms (EC2-VPC and EC2-Classic). Created new option groups and assigned them to DB instances (e.g. for Oracle TDE), and created Amazon RDS MySQL DB instances, covering MySQL versions, storage engine and security.
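Several of the bullets above mention star-join/snowflake schemas with fact and dimension tables. As a minimal sketch of what that means in practice, the following uses SQLite (via Python) with hypothetical table and column names; it is an illustration, not a schema from any project described here.

```python
import sqlite3

# Minimal star schema: one fact table referencing two dimension tables.
# All names (dim_date, dim_product, fact_sales) are illustrative only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_date (
    date_key   INTEGER PRIMARY KEY,
    full_date  TEXT NOT NULL
);
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,
    name        TEXT NOT NULL
);
CREATE TABLE fact_sales (
    date_key    INTEGER NOT NULL REFERENCES dim_date(date_key),
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    qty         INTEGER NOT NULL,
    amount      REAL NOT NULL
);
""")
cur.execute("INSERT INTO dim_date VALUES (20150101, '2015-01-01')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget')")
cur.execute("INSERT INTO fact_sales VALUES (20150101, 1, 3, 29.97)")

# A typical dimensional query: join the fact table to its dimensions.
rows = cur.execute("""
SELECT p.name, d.full_date, SUM(f.qty)
FROM fact_sales f
JOIN dim_date d ON d.date_key = f.date_key
JOIN dim_product p ON p.product_key = f.product_key
GROUP BY p.name, d.full_date
""").fetchall()
print(rows)  # [('Widget', '2015-01-01', 3)]
```

The point of the shape is that measures live in the central fact table while descriptive attributes live in the surrounding dimensions, so analytic queries are simple joins plus aggregation.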

TECHNICAL SKILL SET

Data Modeling Tools

Erwin r9.5/7x/6x/5x, ER/Studio 9.7/9.0/8.0/7.x, Toad 9.7.2/7.4/7.3.

BI Tools

IBM Cognos, IBM Cognos TM1, Birst, Jaspersoft, Pentaho, Sisense, MicroStrategy, Targit, Talend.

Databases

Oracle 11g/10g/9i/8i/7.x, Teradata, HP ALM, DB2 UDB 8.x/7.x, DB2 z/OS 9.x/8.2, SQL Server 2012 R2/2008/2005/2000, MySQL, MS Access, flat files, XML files.

Programming Skills

SQL, PL/SQL, Shell Scripting

Operating Systems

Win 95/NT/98/2000/XP, LINUX, Sun Solaris 2.X/5/7/8, IBM AIX 5.3/4.2, HP-UX, MS-DOS

Scheduling Tools

Autosys, Maestro (Tivoli)

Data-Warehousing/Big Data-Warehousing Platforms

Informatica PowerCenter, Informatica 9.5.1 Hotfix, SharePoint, IBM InfoSphere, Apache Hadoop 2.6.0, Talend Open Studio, Birst, NoSQL, MapReduce, Jaspersoft BI Suite, SQL Server Integration Services (SSIS), SAS, Oracle Data Integrator.

Other Tools

Teradata SQL Assistant, Oracle ATG, DB Visualizer 6.0, SAS, Microsoft Office, Microsoft Visio, Microsoft Excel, Microsoft Project, SQL Server 2005/2008/2008 R2/2012 R2 (SSIS, SSRS, SSAS), EMC PX12-450R Network Storage Array.

EXPERIENCE: 

Highmark Inc, Pittsburgh, PA  May 2015 - Present

Role: Information Architecture & Modeling

Highmark is a not-for-profit health care company based in Pittsburgh, Pennsylvania, United States. It is the largest health insurer in Pennsylvania and, through a purchase in 1996, the largest health insurer in West Virginia and later Delaware. As Highmark Blue Cross Blue Shield, it primarily serves the 29 counties of western Pennsylvania; as Highmark Blue Shield, it serves 21 counties of central Pennsylvania and the Lehigh Valley. There is also a presence in the border areas of eastern Ohio, and in all of West Virginia through its subsidiary Highmark Blue Cross Blue Shield West Virginia. Highmark also has a stake in the largest health insurer in northeastern Pennsylvania, Blue Cross of Northeastern Pennsylvania (BCNEPA). It is currently one of the largest not-for-profit health insurers in the United States and owns several for-profit subsidiaries.

Key Achievements:

 

·   Work with EDW Modeling team: Teradata DBAs

·   With the EDW team, share the model on SharePoint

·   Create Erwin subject area for the changes.

·   Name the subject area

·   DB work item Quote

·   Update the Service Manager phase to In Work

·   Complete model changes and save reports

·   Create a Physical Data Model report named: DB work item PDM.pdf

·   Create a Data Dictionary report named: DB work item DD.pdf

·   Send notification that model change is ready for review

·   Data Analysis, Design, Development, Implementation and Testing using Data Conversions, Extraction, Transformation and Loading (ETL) and SQL Server, ORACLE and other relational and non-relational databases

·   Turn over model to DBAs upon customer approval

·   Save model on LAN in ToDBA folder

·   Update the Service Manager phase: DBA Analysis

·   Save reports to Published folder

·   Email to requester and CC Team

Waddell & Reed, Shawnee Mission, KS  April 2015 / May 2015

Role:  Data Analyst

Waddell & Reed asset management and financial planning company provides customized financial planning and investment services to clients throughout the United States. It operates asset management and distribution subsidiaries, including Ivy Investment Management Company and Waddell & Reed Investment Management Company.

The project name is IDQ EST Business Information.

The project scope is to move the legacy data from the mainframe source systems (CERD, IIR) to a SQL Server target, transitioning to MDM using Informatica Data Quality, which empowers managing data quality across the enterprise. Used Informatica Hotfix to discover and define business logic and collaborate on business projects, delivering authoritative and trustworthy data to all stakeholders, projects, and business applications, on premises or in the cloud.

Key Achievements

·   Proactively monitor and cleanse data across the enterprise

·   Review approach definition and identify potential solutions

·   Ensure quality standards, criteria and targets are documented in the quality plan

·   Remove process and role ambiguity

·   Script tables as INSERT to a new query editor window

·   Enable business and IT collaboration in the governance of data

·   MSD SDLC Phase

·   Achieve better business outcomes and maximize your return on data

·   Repository / Domain Details

·   Connect IDQ Developer environment

·   Stage environment to migration and test

·   Work on subject Area

·   LDG Data Review

·   Spec Assigned

·   MDM Reviewed

·   Source to Target Matrix

Cogent Data Solutions, Hoffman Estates, IL April 2014 / April 2015

Role: Senior Data Modeler / Data Analyst

Developed data models for the Cook County, IL government. The data models contained data elements for patients and are used to provide those patients preventive health care. Some elements contained in these data models are: finding the best GP (primary care physician), hospital, emergency room (ER), test centers for diagnostics, drugs, etc. The scope of the project is to bring the cost of health care down from $850 per person to $90 to $150 per person. Used MapReduce and a distributed processing system to file and store data. The project was to design and develop data marts from the Oracle data warehouse and to migrate data from different (Teradata) sources into the databases using SSIS, generating customer profile reports, customer transaction reports, compliance reports and audit reports using SSRS. Used the data governor for data quality monitoring against production data in the golden source, to communicate errors in data entry back to the operations team members or to technology for corrective action. Involved in all the steps and scope of the project's reference-data approach to MDM: creating a data dictionary and mapping from sources to the target in the MDM data model.

Used Talend Open Studio for ETL design and data integration, making the development process simpler and faster. Birst's user-ready data store is designed and optimized for ROLAP-style analytics, with a Kimball-style star schema providing a multi-dimensional view of all data; it supports Type 1 and Type 2 slowly changing dimensions (SCD1/SCD2) and manages snapshots automatically. Data loading and updates are done through an incremental process with Birst's built-in change detection. Used Birst's unified business model to join data from Birst's user-ready data store with data queried in real time from other sources, cloud or on-premises. Used Birst to combine ad hoc analysis of data and banded report writing into a single user interface for reports or visualizations, exposed them in dashboards, and made them accessible on mobile devices via HTML5 or Birst's native iPad application, or scheduled for delivery by email.

Used a CDM containing elements that define PHI under HIPAA, including encounter dates and date of birth. Distributed analytic programs use the date fields for person-level analysis under an IRB, with all necessary data agreements in place; the "cross-walks" between the arbitrary identifiers included in the CDM and their originating data are not specified in the scope of the CDM and are maintained by each data partner. Locally maintained "mapping tables" are the tables necessary to implement this, so that each data partner has the ability to map arbitrary identifiers back to the originating data and patient; used mapping tables for the implementation of CDM v1.0. Designed the ENROLLMENT table to identify periods during which a person is expected to have complete care, so that data is captured. Members with medical coverage, drug coverage, or both should be included. A record is expected to represent a unique combination of PATID, ENR_START_DATE, and BASIS; a break in enrollment (of at least one day), or a change in the chart-abstraction flag, should generate a new record. The ENROLLMENT table provides an important analytic basis for identifying periods during which medical care should be observed, for calculating person-time, and for inferring the meaning of unobserved care.
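The ENROLLMENT design described above, one record per unique combination of PATID, ENR_START_DATE, and BASIS, could be sketched roughly as follows. The column types and the extra columns are assumptions for illustration, not the actual CDM v1.0 specification, and SQLite stands in for the real database.

```python
import sqlite3

# Sketch of an enrollment table in the spirit of the CDM description above.
# Exact CDM v1.0 column names/types are not reproduced; this is illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
CREATE TABLE ENROLLMENT (
    PATID          TEXT NOT NULL,   -- arbitrary person identifier
    ENR_START_DATE TEXT NOT NULL,   -- start of a continuous-enrollment period
    ENR_END_DATE   TEXT,            -- end of that period
    BASIS          TEXT NOT NULL,   -- basis of the enrollment record
    CHART          TEXT,            -- chart-abstraction flag
    UNIQUE (PATID, ENR_START_DATE, BASIS)
)
""")
cur.execute("INSERT INTO ENROLLMENT VALUES ('P1','2014-01-01','2014-06-30','I','Y')")
# A break in enrollment (>= 1 day) starts a new record, not an extension.
cur.execute("INSERT INTO ENROLLMENT VALUES ('P1','2014-08-01','2014-12-31','I','Y')")
try:
    # Duplicate (PATID, ENR_START_DATE, BASIS) combinations are rejected.
    cur.execute("INSERT INTO ENROLLMENT VALUES ('P1','2014-01-01','2014-03-31','I','Y')")
    duplicate_allowed = True
except sqlite3.IntegrityError:
    duplicate_allowed = False
n_records = cur.execute("SELECT COUNT(*) FROM ENROLLMENT").fetchone()[0]
print(duplicate_allowed, n_records)  # False 2
```

The UNIQUE constraint is what enforces the one-record-per-combination rule stated in the description.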

Key Achievements

·   Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.

·   Focus on integration overlap and Informatica’s newer commitment to MDM with the acquisition of Identity Systems.

·   Observe and solve the issue of MDM.

·   As an architect, implement an MDM hub to provide clean, consistent data for an SOA implementation.

·   Using Erwin to customize report display and export reports to HTML.

·   With Erwin tools, generated graphical diagrams that drill down to view detailed metadata.

·   Created logical and physical models in Erwin DM.

·   Used BTEQ scripts to create sample tables and redefine the partitioning of populated tables.

·   Used RDBMSs to implement simple CRUD (Create, Read, Update, and Delete) functionality.

·   Used cursor logic in Teradata to generate a sequence of dynamic SQL statements.

·   Developed complex mapping to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, Applications and Teradata.

·   Birst's user-ready data store sits on top of all data and combines different sources of corporate data, optimized for ROLAP-style analytics, providing a Kimball-style star schema with a multi-dimensional view of all data.

·   Involved in gathering user requirements and specifications.

·   Involved in stored procedures, functions, and database triggers and maintained referential integrity and implemented complex business logic.

·   Created Dimension types such as Standard dimension, Parent-Child dimension and Role Play dimension in SSAS.

·   Designed and created views for security purposes. Implemented rules, defaults, and user-defined data types. Tested queries to optimize procedures and triggers to be used in production.

·   Developed SSIS Templates which can be used to develop SSIS Packages such a way that they can be dynamically deployed into Development, Test and Production Environments.

·   Created SSIS packages for different data loading operations for many applications.

·   Performed data modeling for existing databases using Toad Data Modeler and ER Studio.

·   Designed VB forms, and created Crystal Reports for business reporting purposes.
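A minimal sketch of the CRUD (Create, Read, Update, Delete) functionality mentioned in the bullets above, using SQLite via Python so the example is self-contained; the table and column names are hypothetical.

```python
import sqlite3

# Minimal CRUD against an RDBMS; SQLite is used here only so the sketch is
# runnable without a server. Table/column names are illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE patient (patid TEXT PRIMARY KEY, name TEXT)")

cur.execute("INSERT INTO patient VALUES ('P1', 'Doe, J')")               # Create
cur.execute("UPDATE patient SET name = 'Doe, John' WHERE patid = 'P1'")  # Update
row = cur.execute("SELECT name FROM patient WHERE patid = 'P1'").fetchone()  # Read
cur.execute("DELETE FROM patient WHERE patid = 'P1'")                    # Delete
remaining = cur.execute("SELECT COUNT(*) FROM patient").fetchone()[0]
print(row, remaining)  # ('Doe, John',) 0
```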


Environment:  Oracle 11g, IBM InfoSphere, IBM Cognos, IBM Cognos TM1, Erwin r9.5, ER/Studio 9.7, HP Hadoop, Windows 7, XML, Excel, Access, Visio, SSIS, SSRS, SQL Server 2005, 2008, 2008 R2 and 2012, EMC PX12-450R Network Storage Array.

Volkswagen Group of America, Auburn Hills, MI March 2010 / March 2014

Role: Business Analyst / Senior Data Modeler

Re-factored the data sources and performed work based on requirement priority. Model-stormed data throughout the project's complete Software Development Life Cycle (SDLC) using just-in-time (JIT) modeling. Managed requirements with a single definition for each data element or business term, allowing business leaders to better understand their clients' needs and wants; performed data analysis, applied domain knowledge, and organized collected information into interrelated objects such as table spaces, tables, and indexes. Performed data profiling, designing the blueprint for improved data quality. Performed data modeling, system analysis, architecture and design, development, testing and deployment of business applications, using processes such as Disciplined Agile Delivery (DAD) and Unified Process. Performed Extract, Transform, Load (ETL) transactions using the database, and used an Agile approach to Master Data Management (MDM). Was also directly involved with the actual modeling effort itself. Attended JAD sessions for business requirements gathering, represented the requirements in a logical data model, created data mapping documents, and wrote functional specifications and queries. Created Erwin reports in HTML or RTF format depending on the requirement. Published the data model in the model mart and created naming-convention files. Coordinated with DAs to apply data model changes, including requirement analysis, design and implementation. Used Informatica PowerCenter to move MDM data into the hubs. Used the data governor for data quality monitoring against production data in the golden source, to communicate errors in data entry back to the operations team members or to technology for corrective action.

Key Achievements

·   Created complex Stored Procedures for data retrieval.

·   Created SSIS packages using Pivot transformation, Execute SQL Task, Data Flow Task, etc., to import data into the data warehouse.

·   Created user defined functions in SSRS using VB script.

·   Responsible for creating OLAP cubes for in-depth analysis using SSAS 2008.

·   Use the power of Birst’s unified business model to join data from Birst’s user-ready data store with data queried in real-time from another source, cloud or on-premises.

·   Used a data profiling tool to automate the discovery process.

·   Used data profiling automation to uncover the characteristics of the data and the relationships between data sources before any data-driven initiative.

·   Developed strategies for data warehouse implementations, data acquisitions, provided technical and strategic advice and guidance to senior managers and technical resources in the creation and implementation for data architecture and data modeling

·   Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.

·   Integrated crystal reports using Erwin Data Modeler.

·   Used Erwin's support for Teradata 13.0 and SSL.

·   Set parent-to-child identifying relationships in Erwin diagrams.

·   Performed transformation and reverse engineering using the Erwin tool.

·   Created logical and physical models in Erwin DM.

·   Used BTEQ scripts to create sample tables and redefine the partitioning of populated tables.

·   Using cursor logic to generate a sequence of dynamic SQL statements in Teradata.

·   Used RDBMSs to implement simple CRUD (Create, Read, Update, and Delete) functionality.

·   Developing complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files, XML files, Applications and Teradata.

·   Defined corporate Metadata definitions for the enterprise data supported databases including operational source systems, data stores and data marts developed logical and physical data models and documented source data from multiple sources, internal systems, external source systems, and third party data.

·   Used various transformations in SSIS Data Flow and Control Flow, using For Loop containers and Fuzzy Lookups. Created several staging databases. Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed actions in XML.
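The cursor-logic technique mentioned above, generating a sequence of dynamic SQL statements from metadata, can be sketched as follows. SQLite's sqlite_master catalog stands in for Teradata's DBC views here, and the staging-table names and the DELETE action are hypothetical.

```python
import sqlite3

# Sketch of generating dynamic SQL from catalog metadata, in the spirit of
# Teradata cursor logic; sqlite_master stands in for Teradata's DBC views.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stg_orders (id INTEGER);
CREATE TABLE stg_customers (id INTEGER);
""")

# "Cursor" over table metadata, emitting one statement per staging table.
statements = [
    f"DELETE FROM {name};"  # hypothetical nightly truncate of staging tables
    for (name,) in cur.execute(
        "SELECT name FROM sqlite_master "
        "WHERE type = 'table' AND name LIKE 'stg_%' ORDER BY name"
    )
]
for stmt in statements:
    cur.execute(stmt.rstrip(";"))
print(statements)  # ['DELETE FROM stg_customers;', 'DELETE FROM stg_orders;']
```

Generating the statements from the catalog means new staging tables are picked up automatically without editing the script.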

Environment: Oracle 11g, IBM Cognos TM1, Erwin 7.5.2, NoSQL, Hadoop, IBM Cognos, Talend, SSIS, SSRS, XML, PL/SQL, SQL Server 2005, 2008 and 2008 R2, Excel, Access, Visio, Windows XP.

Client: Chartis International, Parsippany, NJ  Aug 2009 / Feb 2010 

Role: Data modeler / Business Data Analyst

Attended JAD sessions for requirements gathering, created data mapping documents, and wrote functional specifications and queries. Created Erwin reports in HTML or RTF format depending on the requirement, allowing business leaders to better understand their clients' needs and wants. Published the data model in the model mart, created naming-convention files, and coordinated with DBAs to apply data model changes, including requirement analysis, design and implementation. Used MDM to maintain data quality initiatives and ensure integration with the organization's current standards (i.e. Informatica or Trillium), with Siperian providing the organization a full data management solution. Checked that the MDM data model can provide data for performance reports. Used the data governor for data quality monitoring against production data in the golden source, to communicate errors in data entry back to the operations team members or to technology for corrective action.

Description:

Chartis (Chartis, Inc.), is a division of AIG in the process of being spun off, formerly referred to as the 'Property Casualty Insurance' part of AIG. American International Group, Inc. (AIG) is a leading international insurance organization serving customers in more than 130 countries. AIG companies serve commercial, institutional and individual customers through one of the most extensive worldwide property casualty networks of any insurer. In addition, AIG companies are leading providers of life insurance and retirement services in the United States.

Key Achievements

·   Responsible for creating databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers and rules.

·   Created logical and physical models in Erwin DM.

·   Created a user-defined function to verify a date.

·   Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.

·   Used BTEQ scripts to create sample tables and redefine the partitioning of populated tables.

·   Ensure compliance of Data Quality in MDM.

·   Determine the Target in MDM.

·   Specify the Mapping between Sources and Target in MDM.

·   Developed SSIS packages to consolidate data from various data sources, loaded data from various types of source files (Excel, Access, flat files, CSVs), and converted Excel to SQL reporting.

·   Designed and developed complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files.

·   Used RDBMSs to implement simple CRUD (Create, Read, Update, and Delete) functionality.

·   Involved in daily loads (FULL & INCREMENTAL) into Staging and ODS areas, troubleshooting process issues and errors.

·   Created a stored procedure to update accrued interest.
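A user-defined function to verify a date, as in the bullet above, might look roughly like this. It is registered as a SQL-callable function via SQLite, and the function name is an illustrative assumption, not the actual function from the project.

```python
import sqlite3
from datetime import datetime

# Sketch of a user-defined function that verifies a date string, registered
# so it can be called from SQL. The name is_valid_date is illustrative.
def is_valid_date(text):
    try:
        datetime.strptime(text, "%Y-%m-%d")
        return 1
    except (TypeError, ValueError):
        return 0

conn = sqlite3.connect(":memory:")
conn.create_function("is_valid_date", 1, is_valid_date)
cur = conn.cursor()
results = [
    cur.execute("SELECT is_valid_date(?)", (s,)).fetchone()[0]
    for s in ("2015-09-28", "2015-02-30", "not a date")
]
print(results)  # [1, 0, 0]
```

Note that "2015-02-30" is rejected because the parser validates calendar ranges, not just the format.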

Environment:  Oracle 10g, Erwin 7.5.2, HP ALM, Talend, SQL Server 2005, 2008 and 2008 R2, Windows, IBM InfoSphere Data, SSIS, SSRS, XML, Excel, Access, Visio.

Client: PRA Internationals, Raleigh, NC   Jun 2008 / Jul 2009

Role: Data Modeler / Business Data Analyst

PRA Internationals is dedicated to providing a variety of high quality professional health services to the health communities. As a top five CRO, we have worked on 100+ marketed drugs across several therapeutic areas and conducted the pivotal or supportive trials that led to FDA and/or international regulatory approval of 50+ such drugs.

Project description:

The purpose of this project was to design an information system for patient health records in a hospital. Computer-based patient records (CPRs) give all the information about a person's health. The project used Health Level Seven (HL7) naming standards in its effort to attain integrity with government records. Development of the data model covered data elements such as the patient record (CPR), primary care physician (GP), emergency room (ER), test centers for diagnostics and reports, drugs available at the store, etc. The scope of the project was limited by ignoring the insurance plans and payment modes of the patient. Provided Logical Data Model (LDM) and Physical Data Model (PDM) reviews with data SMEs. Kept focus on the tight integration overlap and Informatica's newer commitment to MDM with the acquisition of Identity Systems. Used the data governor for data quality monitoring against production data in the golden source, to communicate errors in data entry back to the operations team members or to technology for corrective action.

Key Achievements

·   Responsible for creating databases, tables, clustered/non-clustered indexes, unique/check constraints, views, stored procedures, triggers and rules.

·   Perform administrative tasks, including creation of database objects such as database, tables, and views, using SQL DCL, DDL, and DML requests.

·   Used BTEQ scripts to create sample tables and redefine the partitioning of populated tables.

·   Created a user-defined function to verify a date.

·   Used various transformations in SSIS Data Flow and Control Flow, using For Loop containers and Fuzzy Lookups. Created several staging databases. Used SSIS and T-SQL stored procedures to transfer data from OLTP databases to the staging area and finally into data marts, and performed actions in XML.

·   Designed and developed complex mappings to extract data from diverse sources including flat files, RDBMS tables, legacy system files.

Environment:  Oracle 10g, Erwin 7.5.2, SQL Server 2005 and 2008, Windows XP, XML, Talend, Excel, Access, Visio.

Client: Circuit City Stores, Inc. Glen Allen, VA  Feb 2000 / Nov 2008

Role: Business data analyst / Data modeler

Created logical and physical data models (Erwin 3.0). Reviewed numerous use cases, activity diagrams, UML models, and ER diagrams. Created an EDW strategy/roadmap including Master Data Management planning. Designed a "Common Entities" LDM to consolidate master data from multiple sources and allow data sharing across the organization. Used Oracle Designer 9i and Rational RequisitePro. Also worked on e-commerce to provide a unique selection of products.

Key Achievements

·   Successfully created and managed a conversion testing effort which included a data quality review, two system test cycles, and user acceptance testing.

·   Used Erwin for logical and physical data modeling in SQL Server.

·   Set referential integrity rules and history options in the Erwin editor.

·   Used RDBMSs to implement simple CRUD (Create, Read, Update, and Delete) functionality.

·   Displayed logical and physical relationships in Erwin diagrams.

·   Set parent-to-child identifying relationships in Erwin diagrams.

·   Optimized the performance of queries by modifying the existing index system and rebuilding indexes.

·   Involved in SQL programming to implement stored procedures and functions for different tasks.

Environment: Windows 2000, Oracle 8i, ATG Merchandising UI, MS Access, SQL Server 2005 and 2008, Erwin 3.0.

Education: Master of Economics (Economics, Statistics and Civics) - UNIVERSITY OF KARACHI, PAKISTAN 1990

Branford Hall Career Institute, Connecticut USA

Diploma in Computer Network Management (SQL Server, Windows Server 2006/2012 R2, Oracle VM VirtualBox, switches, routers and security), graded with an A

 


 



Experience

Job Title: Information Architecture & Modeling

Company: Highmark Inc

Experience: - Present

 

Additional Info


 

Current Career Level:

Experienced (Non-Manager)

Date of Availability:

Immediately

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

Active Secret

US Military Service:

Citizenship:

US citizen

 

 

Target Job:

Target Job Title:

Data Modeler

Desired Job Type:

Employee
Intern
Temporary/Contract/Project
Seasonal

Desired Status:

Full-Time
Part-Time
Per Diem

 

Target Company:

Company Size:

Industry:

Computer/IT Services

Occupation:

IT/Software Development

·         General/Other: IT/Software Development

·         Usability/Information Architecture

 

Target Locations:

Selected Locations:

US-PA-Philadelphia
US-VA-Fairfax/Manassas/Reston

Relocate:

Yes

Willingness to travel:

Up to 100%

 

Languages:

English: Fluent