From:                              route@monster.com

Sent:                               Monday, September 28, 2015 1:01 PM

To:                                   hg@apeironinc.com

Subject:                          Please review this candidate for: Talend

 

This resume has been forwarded to you at the request of Monster User xapeix03

Vivek Shrivastava 

Last updated:  07/16/15

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Glen Allen, VA  23059
US


 

 

RESUME

  

Resume Headline: Vivek Shrivastava - Enterprise Architect

Resume Value: m332pbrb5m7caym5   

  

 

Vivek Shrivastava

 

(571) 274-0469 vivshrivastava@gmail.com

Dreaming with Big Data

 

PROFILE

Highly experienced software professional with a record of innovating, developing and supporting successful projects and solutions incorporating a wide range of applications and technologies. Consistently recognized and tasked by executive management to improve software efficacy and efficiency by adopting innovative strategies to analyze and solve problems with large datasets.

CORE COMPETENCIES

Analytics / Data Management • Self Starter • Entrepreneur • Open Source Contributor

Software Development Lifecycle Management • Process Improvement • Natural Troubleshooter

TECHNICAL EXPERTISE

Operating Systems: Linux, Solaris, Windows, Mac OS X

Tools: DMExpress by Syncsort, JMeter

Languages: Java, Python, Perl, Shell Script, PL/SQL, SQL, HQL, Pig, Scala

IO: Oracle, MySQL, Hive, Redis, HBase, Neo4j, Voldemort

Miscellaneous: Asterisk VoIP, Amazon AWS, Solr, Nutch, Flume, Gephi, Django, RapidMiner, Lucene, Hadoop, Maven, MapReduce, Rally, git, Pentaho, JasperSoft, SVN, Impala, NumPy, ActiveMQ, Control-M, HCatalog, Apertium, HTK, CMU Sphinx, Elasticsearch, ZeroMQ, RabbitMQ, Talend, Kettle, Cloudera, Hortonworks, strace, Spark, Storm, R, Bootstrap, SAS, Snort, OSSEC, Docker, Kafka

Machine Learning: Logistic Regression, SVD, K-Means, Labeled LDA, HMM

Speaker: Presented "Hadoop Security" at LA BigDataCamp 2013

Innovation: Developed "Wadoop", a data security management software, and presented it at Los Angeles BigDataCamp 2014

PROFESSIONAL EXPERIENCE

Calance

Director, Big Data                                  Dec 2014 – Present

Currently working as Director of Big Data in the Advanced Technology practice at Calance. In that capacity, my responsibilities include working with various customers, developing business, and expanding the customer base in the big data area. I am helping a pharmaceutical client expand their big data platform while ensuring security and compliance in the environment.

·  Implemented a NIST-based cybersecurity framework on Hadoop

·  Designed and implemented a solution for privacy analytics of healthcare data

·  Designed and implemented a data masking solution using Format Preserving Encryption (FPE)

·  Integrated SAS on the Windows platform with Hadoop

·  Established the company's credibility in big data and enhanced revenue sources
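The format-preserving idea behind the masking work above can be sketched in a few lines. This is a toy, not real FPE (NIST FF1/FF3 are the standard reversible constructions, and the actual solution's design is not described here): a keyed, deterministic digit substitution that keeps length and punctuation so masked values still pass downstream format checks. All names are hypothetical.

```python
import hashlib
import hmac

def mask_digits(value: str, key: bytes) -> str:
    """Deterministically replace each digit via a keyed HMAC of its
    position and value, preserving length and non-digit characters.
    Illustrative only: unlike true FPE, this is not reversible."""
    out = []
    for i, ch in enumerate(value):
        if ch.isdigit():
            mac = hmac.new(key, f"{i}:{ch}".encode(), hashlib.sha256).digest()
            out.append(str(mac[0] % 10))
        else:
            out.append(ch)
    return "".join(out)

# Same key + same input always yields the same masked value,
# and the SSN-style layout (ddd-dd-dddd) is preserved.
masked = mask_digits("123-45-6789", b"demo-key")
```

The position-dependent HMAC input means repeated digits mask to different values at different offsets, which avoids trivially leaking digit frequencies.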

Wipro Technologies                        Richmond, VA    Feb 2013 – Nov 2014

Architect, Analytical and Information Management

Worked with Wipro Technologies and designed big data solutions for Wipro clients such as AON, Allstate, Capital One, and Citi in banking and insurance. In this role I led a team of 12 and was responsible for understanding customer needs, designing appropriate solutions using big data technologies, advising on the development roadmap, overseeing implementation, and training users.

·  Designed and implemented a solution to convert mainframe EBCDIC data to a Hadoop-based platform

·  Integrated the Hive metastore with business policies for data management, data quality, tagging, etc.

·  Set up a cluster with proposed petabyte capacity, including hardware selection, LVM setup, and Kerberos/LDAP integration

·  Performed speech-to-text conversion using open source technology and set up sentiment analysis

·  Played a key role in integrating Bundle.com's merchant address matching algorithm with Capital One's enhanced transaction functionality on big data infrastructure

·  Reduced 36 hours of legacy data processing of multiline credit bureau records to under 15 minutes in Hadoop; the solution was also adopted by Datameer as a plugin

·  Developed an innovative solution for centralized user management in Hadoop, data lineage, and automatic feed restartability; debugged a wide variety of application and performance problems across all layers

 

 

Hughes Research Laboratory (HRL)       Malibu, CA    June 2012 – Feb 2013

Consultant

 

Played an important role in architecting and developing a project for IARPA (Intelligence Advanced Research Projects Activity) that involved collecting and processing terabyte-scale data in Hadoop and detecting complex event signals of national security significance.

·   Developed advanced language detection in unstructured messages

·   Performed machine translation of Spanish and Portuguese to English using Apertium and SRILM

·   Performed sentiment analysis, geolocation identification, and entity extraction on Twitter messages

·   Developed automatic date detection and extraction in unstructured data

·   Used Labeled LDA (Latent Dirichlet Allocation) to find topics in the messages

·   Developed a search solution using Elasticsearch and improved n-gram data retrieval

·   Improved Elasticsearch indexing speed by 10x with an index size of more than 10 TB and sub-second response time

·   Integrated NoSQL with Hadoop using a Redis master-slave configuration for faster data processing
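The n-gram retrieval work above rests on decomposing terms into overlapping character sequences, which is essentially what an Elasticsearch ngram tokenizer emits at index time. A minimal sketch of that decomposition (the function and its parameters are illustrative, not the project's actual code):

```python
def char_ngrams(text: str, n_min: int = 2, n_max: int = 3) -> list:
    """Return all overlapping character n-grams of length n_min..n_max,
    roughly what an ngram tokenizer produces for each term."""
    text = text.lower()
    return [text[i:i + n]
            for n in range(n_min, n_max + 1)
            for i in range(len(text) - n + 1)]

# "Hive" -> bigrams then trigrams
grams = char_ngrams("Hive", 2, 3)  # ['hi', 'iv', 've', 'hiv', 'ive']
```

Indexing these grams as terms is what makes substring and fuzzy matching fast, at the cost of a much larger index, which is why tuning gram lengths matters at the 10 TB scale mentioned above.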

 

Fulcrum Analytics         New York, NY         June 2012 – Oct 2012

Consultant

Architected and developed prototypes and big data solutions for Giant Eagle as a client of Fulcrum Analytics. Giant Eagle is a grocery store chain with a wide presence on the East Coast. The project was a pilot to explore Hadoop as a big data solution and to address existing complex use cases. Responsibilities included setting up a Hadoop cluster from the ground up, helping Fulcrum Analytics understand big data technology, interacting with business leaders, and developing solutions for Giant Eagle use cases.

 

·   Provided training to Fulcrum developers and operations staff for cluster setup and operational support

·   Developed solutions in Pig and Hive for Giant Eagle use cases, for efficient processing and as a model implementation

·   Developed a drill-down reporting solution using HBase, reducing response time to sub-second, versus the 3 to 30 minutes Giant Eagle was seeing with Exadata and Teradata

·   Evaluated Jaspersoft and Pentaho for a BI solution

·   Identified skew in the data and adjusted the implementation for faster processing

·   Developed Pig macros and UDFs (User Defined Functions) to enhance functionality

·   Developed a planogram generation solution, reducing 30 hours of processing time in Teradata to 5 hours in Hadoop

·   Used a Bloom filter to convert a reduce-side join to a map-side join

·   Prepared the project report and made recommendations for Giant Eagle

 

Shopzilla/BizRate.com         Los Angeles, CA    Dec 2008 – June 2012

Consultant

Worked with the Business Intelligence (BI), Data Warehouse (DWH), and Hadoop teams in various roles spanning architecture, development, data analysis, data quality assurance, and operational support.

 

·   Handled DevOps for weblog collection and the real-time billing system

·   Designed and developed a regular fact and dimension transfer process in Python to move data from Oracle to the new data warehouse and analytical platform (ZAP) on Hadoop and Hive

·   Developed an SDK (Software Development Kit) to migrate the existing Oracle data warehouse to the new Hadoop/Hive star schema

·   Developed Java applications for Hadoop

·   Responsible for development and maintenance of the real-time billing system in Java

·   Developed MapReduce programs in Python for ETL and for bringing data into HDFS

·   Developed a regression suite for the redirect logging service, cutting the testing process from 3 days to 15 minutes

·   Responsible for identifying bot requests and recommending improvements to traffic quality

·   Developed a reporting website to identify delays in data processing
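Python MapReduce jobs of that era typically ran through Hadoop Streaming: the mapper and reducer are ordinary programs that read lines on stdin and emit tab-separated key/value lines, with Hadoop sorting by key between the two phases. A minimal local sketch of the pattern (the field layout is hypothetical; the real ETL jobs' record formats are not given here):

```python
from itertools import groupby

def mapper(lines):
    """Map phase: emit (key, 1) for each record, keyed on the first
    tab-separated field (in Streaming this would be a stdout line)."""
    for line in lines:
        fields = line.rstrip("\n").split("\t")
        if fields and fields[0]:
            yield fields[0], 1

def reducer(pairs):
    """Reduce phase: sum counts per key. Hadoop delivers pairs grouped
    and sorted by key, which sorted() simulates locally."""
    for key, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield key, sum(v for _, v in group)

# Local end-to-end run over three fake tab-separated records.
counts = dict(reducer(mapper(["a\tx", "b\ty", "a\tz"])))  # {'a': 2, 'b': 1}
```

The same two functions, wrapped to read sys.stdin and print tab-separated output, are what Hadoop Streaming would invoke as the `-mapper` and `-reducer` commands.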

 

 

 

 

NBC Universal         Studio City, CA     2008

SAP SRM Software Implementation

Led the user management implementation in SAP SRM and its subsequent integration with the SAP FICO system. Certified the implementation and devised a simulation technique to certify user roles and ensure expected behavior. The technique produced a seamless, transparent migration for users.

·  Developed a BAdI in ABAP to add Shopping Cart functionality that was otherwise only available in SRM 5.0, avoiding the immediate need for an upgrade

·  Developed modification of existing POs in the extended classic scenario in SRM 4.0, functionality only available from SRM 5.0 onward

·   Debugged abnormal system behavior and developed solutions/processes to mitigate recurrences

Yahoo!         Burbank, CA     2006 to 2007

Consultant

Provided data quality certification for the search advertising project Panama. Certified the ad group and campaign creation algorithm using existing listings, and the overall data quality of the entire migration program. Collaborated with the five-member core data migration team on the new architecture. Responsible for data quality and data integrity of more than 700,000,000 search advertisements. Used JMeter to test the Yahoo! web service API for sponsored search and trained fellow team members in it.

·   Developed a data certification suite in shell script and Perl, including UTF-8 character support for international data

·   The thorough certification process resulted in a zero-defect implementation across the entire United States and international migration of more than 700 million search advertisements

·   Served as subject matter expert for the migration and its business rules

Comverse Technology         Mount Laurel, NJ    2004 to 2006

Team Lead, RAMP Group

Led and managed reliability, availability, maintainability, and performance (RAMP) of the NextGeneration application and system. Defined goals and monitored feedback from team members for inclusion in development. Collaborated with developers, verified the requirements specification coverage provided by the analysts, and identified discrepancies in the application. Constructed the testing strategy and test plan, as well as various scripts in Perl and C to automate administrative tasks for the disaster recovery plan.

 

·   Developed a web-based bulletin board communication method to reduce the volume of email between the onshore and offshore teams.

·   Developed "ProblemSolved", a concept and implementation using Google Desktop search to reduce development time and increase productivity across onshore and offshore teams.

 

·              As an Entrepreneur and Open Source Contributor

o       Integrated www.Cenegenics.com data collection with Salesforce

o       Self-taught Lucene and Solr and developed a PDF search engine for Cenegenicsfoundation.org

o       Implemented Asterisk, an open source VoIP system, to manage millions of telephone calls

o       Implemented Callweaver, an open source VoIP fax system using the T.38 protocol

o       Developed a location-aware search solution for whoiswhoelite.net

o       Open source committer for OpenHatch.org

o       Open source committer for Luigi

o       Active member of the local Perl Mongers group and the Southern California Python group

o       https://github.com/jledbetter/openhatch/commits/master?author=vivshri

o       https://github.com/spotify/luigi/pull/121#issuecomment-21087864

o       http://www.slideshare.net/sawjd/wadoop-vivek-shrivastava

 

Career Note: In addition to the aforementioned experience, served from 2000 to 2004 as a UNIX Administrator for IBM Global Services, India, and as a UNIX Administrator for the Parliament of India.

 

EDUCATION

Bachelor of Engineering, Electronics and Instrumentation

Shri G S Institute of Technology and Science, Indore, India, 1999



Experience


 

Job Title:  Enterprise Architect

Company:  HCL

Experience:  - Present

 

Additional Info


 

Current Career Level:

Executive (SVP, VP, Department Head, etc.)

Work Status:

US - I am authorized to work in this country for any employer.

 

 

Target Job:

Target Job Title:

Enterprise Architect

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·         Software/System Architecture

 

Target Locations:

Selected Locations:

US-VA-Richmond