From: route@monster.com
Sent: Monday, September 28, 2015 1:00 PM
To: hg@apeironinc.com
Subject: Please review this candidate for: Talend

This resume has been forwarded to you at the request of Monster User xapeix03

Michael Shevenell 

Last updated:  05/11/15

Job Title:  Not specified

Company:  Not specified

Rating:  Not Rated

Screening score:  Not specified

Status:  Resume Received


Darlington, MD  21034
US

Mobile: 410-533-0010
Home: 410-533-0010
shevenello@yahoo.com
Contact Preference:  Telephone


 

 

RESUME

  

Resume Headline: Michael Shevenell - Senior Software Engineer


  

 

Michael Shevenell

4528 Conowingo Road, Darlington, Maryland 21034

410-533-0010 shevenello@yahoo.com

______________________________________________________________________________

Team Lead and Senior Software Developer

Security Clearance: DoD SSBI/TS and SSA Cleared

Interested in part-time consulting for Hadoop or Splunk, on-site in Maryland or Delaware, or working remotely

 

A Senior Software Engineer with 26 years of experience developing software applications using C, Python, Java, J2EE, C++, Perl, PHP, and Ruby, and database development using Greenplum EMC, PostgreSQL, Oracle, MySQL, Splunk, and Hadoop/MapReduce. Over the past eight years, Mr. Shevenell has worked as team lead for a senior group of software developers, electrical engineers, and scientists responsible for designing and implementing large “big data” cybersecurity research projects. He is experienced in developing applications that take advantage of various types of Linux file systems, the Hadoop Distributed File System (HDFS), and cluster environments. He has extensive experience with deployments in the Amazon EC2 environment using CloudWatch and Hadoop, and possesses experience in Linux system administration, networking, and troubleshooting, a useful skill set in a research development environment.

Currently, he is a technical domain expert and project lead who contributes to all software project phases, from development through the implementation of advanced Hadoop and fraud detection solutions for the Social Security Administration (SSA). The projects include the design, architecture, and implementation of several 100-plus-node Cloudera Hadoop (CDH4 and CDH5) clusters. He performs analysis of fraud detection scenarios using Hadoop analytics tools such as Hive, MapReduce, HBase, Impala, Spark, Splunk, Pentaho, Hue, Tableau, Talend, Mahout, SAS, and Project R for Statistical Computing, and has substantial experience developing Splunk searches and designing Splunk dashboards. He is responsible for the Extract, Transform, and Load (ETL) of data into Hadoop HDFS from various formats such as DB2, EBCDIC, ASCII, CSV, XML, JSON, network flows, and binary PCAP data, with experience in HDFS loading tools ranging from the local Hadoop command “hadoop fs -put” to Flume, Sqoop, and ETL MapReduce methods. His cyber intrusion detection experience in machine learning, neural networks, and statistical computing has been successfully applied to SSA fraud detection scenarios. He is a key member of the SSA Advanced Data Analytics and Fraud Detection Lab, whose purpose is to leverage open source and commercial Hadoop technology to advance SSA fraud detection capabilities, and whose function is to evaluate new tools and methods and to model a framework for expert systems combining techniques such as classifiers, recommenders, and complex event processors.
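One recurring ETL step mentioned above is converting mainframe EBCDIC records to text before loading them into HDFS. A minimal sketch in Python, assuming IBM code page 037 (the code page choice and sample bytes are illustrative, not from the SSA project):

```python
def ebcdic_to_text(raw: bytes, codepage: str = "cp037") -> str:
    """Decode an EBCDIC byte string (IBM code page 037 here) to text.

    In an ETL MapReduce job this would run per record before the
    cleaned text is written out to HDFS as CSV or JSON.
    """
    return raw.decode(codepage)  # Python ships the EBCDIC code pages

# EBCDIC bytes for the word "Hello" under code page 037
record = b"\xc8\x85\x93\x93\x96"
print(ebcdic_to_text(record))  # -> Hello
```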

 

Qualifications Summary

Type  Description                                              Equivalent Years of Experience
Job   Northrop Grumman Contractor for SSA, 2014–Present        1.0
Job   ICF (formerly Jacob & Sundstrom), 2001–2014              13.0
Job   Cove Software Systems, Inc., 1996–2000                   4.0
Job   Universal HiTech Development, 1992–1996                  4.0
Job   Arinc Research Corporation, 1991–1992                    1.0
Job   Oracle Complex Systems Group, 1989–1991                  2.0
Job   Gould Electronics Company, 1987–1989                     2.0
Job   University of MD Plasma Physics Laboratory, 1984–1987    4.0
Cert  Infosec Institute Application Security, 5/2006           0.5
Cert  Security+, 6/2011                                        0.5
Cert  Linux+, 7/2011                                           0.5
Educ  B.A., Computer Science and Math, University of Maryland  2.0
Educ  A.A., Electronics, Prince George’s Community College     2.0

Total Equivalent Years of Experience: 34

Accomplishments and Strengths

Designed, built, and deployed an operational Hadoop-based network packet processing engine that provides scalability for large data sets by harnessing MapReduce, HDFS, Flume, MongoDB, Impala, and Spark. Apache Flume is used for ingesting data into Hadoop and consists of a set of input (source) and output (sink) plugins, as well as a transport (channel) between them. PCAP data collected with a modified tcpdump tool proved an ideal input for Flume ingestion, and additional data sources such as binary and NetFlow text are fed into Hadoop as well. Spark and Shark are used as the data analytics cluster computing framework to improve performance over MapReduce, with real-time streaming functionality added using Spark. Cloudera Impala provides SQL functionality for the application. Used Ruby and Python for development of the Hadoop application.
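The source/channel/sink arrangement described above maps directly onto a Flume agent properties file. A minimal sketch; the agent name, tcpdump invocation, and HDFS path are hypothetical, not the project's actual configuration:

```properties
# Illustrative Flume agent: stream tcpdump output into HDFS.
agent1.sources  = pcap-src
agent1.channels = mem-ch
agent1.sinks    = hdfs-sink

# Source: run tcpdump line-buffered and ingest each line as an event
agent1.sources.pcap-src.type = exec
agent1.sources.pcap-src.command = tcpdump -l -nn -tttt
agent1.sources.pcap-src.channels = mem-ch

# Channel: in-memory transport between source and sink
agent1.channels.mem-ch.type = memory
agent1.channels.mem-ch.capacity = 10000

# Sink: write events into date-partitioned HDFS directories
agent1.sinks.hdfs-sink.type = hdfs
agent1.sinks.hdfs-sink.hdfs.path = /flume/pcap/%Y-%m-%d
agent1.sinks.hdfs-sink.hdfs.useLocalTimeStamp = true
agent1.sinks.hdfs-sink.channel = mem-ch
```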

Ability to design, architect, and install all components of a large data center Hadoop cluster, ranging from the installation of Linux to the configuration of primary name nodes, secondary name nodes, data nodes, network switches, etc. The configuration includes the installation and setup of Hive, HBase, HDFS, MapReduce, Impala, Hue, Flume, Sqoop, Spark, YARN, and the Cloudera Hadoop Manager. Has experience using both open source and commercial versions of Cloudera (CDH4 and CDH5), as well as experience with system administration of Hadoop clusters. Installed and configured RHadoop onto Cloudera CDH5 using RStudio, and configured rmr, rhdfs, rhbase, etc.

Developed Hadoop Hue applications using the Cloudera Hue Django/Python SDK. The new applications provide a custom GUI to execute HiveServer2 and Impala queries, and the query output produces custom graphical analytics such as dashboards and visualizations of fraud detection data. The custom Hue applications use the Apache HiveServer2 Thrift API for executing Hive queries and listing tables; the same interface is used to communicate with the Cloudera Impala applications.
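Dashboard panels of the kind described above are driven by HiveServer2 or Impala queries. A hedged example of the sort of aggregate such a panel might run; the table and column names are hypothetical, not from the SSA project:

```sql
-- Illustrative fraud-screening aggregate (schema is hypothetical)
SELECT claim_state,
       COUNT(*)                  AS claims,
       SUM(CAST(flagged AS INT)) AS flagged_claims
FROM   disability_claims
WHERE  filing_date >= '2015-01-01'
GROUP  BY claim_state
ORDER  BY flagged_claims DESC
LIMIT  10;
```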

Developed an enterprise security framework using the Splunk cloud solution. The solution connects Splunk to a Hadoop back-end, which allows Splunk analytics access to machine-generated data. Custom modules were developed for IT management, virtualization monitoring, infrastructure monitoring, and enterprise security, using JavaScript for client-side development and Python with the Django framework for server-side development. Experienced in designing Splunk dashboards using both XML and HTML/JavaScript.
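Splunk dashboard panels of this kind are backed by SPL searches. An illustrative search for an enterprise-security panel; the index, sourcetype, and field names are assumptions, not the actual deployment's:

```text
index=os sourcetype=linux_secure "Failed password"
| stats count AS failures BY host
| sort - failures
| head 10
```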

Currently developing Application Performance Management (APM) using Splunk, Cassandra, Prelert, and Dynatrace. Using Dynatrace APM, transaction streams are captured in real time and sent into Splunk. Dynatrace APM’s PurePath captures and correlates all end-user actions, including requests, click paths, third-party services, and CDNs, providing insight into every individual transaction and access to critical information about user experiences and operational performance.

Develops and designs social networking Web interface applications using PostgreSQL, Oracle SQL, PHP, ASP.NET, CORBA, JDBC, Python, and Java. Familiar with current versions of the Eclipse and NetBeans integrated development environments (IDEs); used NetBeans and Eclipse to develop Java and PHP portlets, Ruby on Rails applications, and mobile Android phone applications. Has experience developing solutions using Drupal and Liferay and implementing JSR 168- or JSR 286-compliant portlets for the Liferay framework.

Experience working as a Cybersecurity Analyst; duties included performing static and dynamic analysis of malware and advanced persistent threats (APTs) and their delivery mechanisms (malicious documents, e.g., PDF, DOC, etc.). Analyzed high-level language constructs (branching statements, looping functions, network socket code, and more) of malware/APTs. Performed digital fingerprinting to determine the foreign adversary/actor behind malware/spear phishing, and correlated the data back with the intelligence community. Used malware (APT) analysis to develop IDS signatures (Snort), firewall rules, and antivirus signatures. Assisted in the writing and review of organizational security policies to support internal controls (access management, contingency planning and testing, security awareness, intrusion detection, patch management, antivirus, etc.). Implemented a risk management framework for organizations and developed an effective strategy for continuous monitoring. Developed secure guidelines for cloud computing and worked on projects integrating IT governance controls into cloud computing. As lead software developer for a new Intrusion Detection System (IDS), his role included monitoring active sites; software developers were required to participate in cyber analysis duties to promote a better understanding of the threat and potential vulnerabilities.

Develops and designs system software using Python, PHP, Java, Perl, Ruby, CORBA, and C within an Agile framework. Has daily working experience using Subversion (SVN) and also serves as the SVN site administrator; built and installed the current ARL local working version of SVN.

Strong understanding of relational database concepts, including stored procedures and triggers, and experience with SQL optimization techniques. Has programmed with several stored procedure implementations, including Oracle PL/SQL, PostgreSQL PL/pgSQL, MySQL stored procedures, and Microsoft Transact-SQL.
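As a concrete instance of the stored-procedure and trigger work mentioned, a minimal PL/pgSQL sketch for PostgreSQL; the table and column names are hypothetical:

```sql
-- Trigger function that stamps a row's last-update time
CREATE OR REPLACE FUNCTION audit_stamp() RETURNS trigger AS $$
BEGIN
    NEW.updated_at := now();
    RETURN NEW;
END;
$$ LANGUAGE plpgsql;

-- Fire the function before every UPDATE on a (hypothetical) orders table
CREATE TRIGGER orders_audit
BEFORE UPDATE ON orders
FOR EACH ROW EXECUTE PROCEDURE audit_stamp();
```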

Has working administration and programming knowledge of the Sun Grid Engine and of managing large, terabyte-scale Data Direct cluster file systems and Hadoop HDFS.

Designed and implemented a very fast full-text Google-like retrieval system to query IDS data using open source Sphinx indexing and the Greenplum-PostgreSQL database. The system was developed using C, Java and Python.

Customized and added new features to the Snorby IDS front-end, which is based on Ruby on Rails. The new features included real-time detection views of active Snort alerts as well as custom alerts from Suricata.

Tested, debugged, and maintained the UNIX cluster file systems GPFS and Lustre. Built and installed a custom Lustre file system from source code. Attended the Lustre User Group (LUG) conference in California.

Understands how to program customized modules for the Nagios infrastructure monitoring system

Designed and built mathematical algorithms to correlate network and host-based IDS alerts. Understands and practices mathematical techniques commonly found in security research papers. Has experience using MATLAB from Mathworks. Programmed MATLAB test scripts using Java and Python. Familiar with neural networks, machine learning and Bayesian network techniques and models. Programmed Cognimem neural network hardware to perform IDS streaming data learning. Developed neural network system to perform intrusion detection supervised learning system using C, C++ and Java languages.
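The alert-correlation idea above can be sketched in a few lines of Python: group alerts by source IP and flag addresses reported by more than one detector within a short time window. The alert data and the correlation rule here are illustrative, not the actual ARL algorithm:

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical alert records: (timestamp, source_ip, detector)
alerts = [
    (datetime(2015, 9, 28, 12, 0, 5), "10.0.0.7", "snort"),
    (datetime(2015, 9, 28, 12, 0, 9), "10.0.0.7", "ossec"),
    (datetime(2015, 9, 28, 12, 5, 0), "10.0.0.9", "snort"),
]

def correlate(alerts, window=timedelta(seconds=30)):
    """Flag source IPs seen by more than one detector (e.g. a network
    IDS and a host IDS) within the given time window."""
    by_ip = defaultdict(list)
    for ts, ip, det in alerts:
        by_ip[ip].append((ts, det))
    correlated = []
    for ip, events in by_ip.items():
        events.sort()
        detectors = {det for _, det in events}
        span = events[-1][0] - events[0][0]
        if len(detectors) > 1 and span <= window:
            correlated.append(ip)
    return correlated

print(correlate(alerts))  # -> ['10.0.0.7']
```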

Has expert knowledge of IDS Snort configuration tools and configuration rules. Understands the complete life cycle of a Snort alert from rule creation and detection on the sensor to the display of the alerts
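The Snort alert life cycle described above begins with a rule. An illustrative rule; the message, classtype, and sid are made up for the example:

```text
# Alert on inbound telnet connection attempts to the home network
alert tcp $EXTERNAL_NET any -> $HOME_NET 23 \
    (msg:"POLICY telnet connection attempt"; \
     flow:to_server,established; \
     classtype:policy-violation; sid:1000001; rev:1;)
```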

Has substantial hands-on experience with CISCO routers, including making changes to ACL rules and the installation and initial configurations of BGP

Compiles and enhances most of the popular public domain intrusion tools, including Snort, Shadow, Wireshark, Metasploit, and tcpdump

Experienced with configuration and customization of the Ruby-based Chef, Capistrano, and Puppet cloud infrastructure automation frameworks and tools

Compiled, configured, installed, and evaluated several network scanner programs used for security audits, including nmap and its GTK+ X Window front-end nmapFE; Nessus, which uses nmap but also includes an up-to-date security vulnerability database; and SAINT, a program that rapidly scans for Microsoft vulnerabilities undetected by nmap and Nessus

Developed software tools used by second-level threat cell analysts to track hacking attacks. Used Ruby on Rails to construct user interface.

Has strong applied understanding of database management and programming of a large distributed cluster database Greenplum EMC

Strong experience with CSS, Ajax, JavaScript, Ruby on Rails, Apache Velocity, XML, RESTful APIs, Spring, Hibernate, Apache HTTP, GlassFish, and Apache Tomcat

Has working administration and programming knowledge using the Linux Kernel Virtual Machine (KVM), VMware ESX/ESXi, and clustering/distributed computing

Experience with ORM software such as N/Hibernate, myBATIS etc.  Experience with build and testing frameworks such as N/ANT, N/JUnit, Maven, GIT

Experience with Apache web server configuration, tuning, and administration

Developed an instant messaging system using Ajax IM and the Node.js library.

Designs and programs Android phone applications using Eclipse and the Java Android SDK development toolkit

Experience with implementing web services using SAML

Experience developing Linux device drivers and compiling new kernels. Developed a Linux device driver for a custom network interface controller (NIC) and a device driver for a custom CD controller. Knowledgeable in the steps and options required to build a custom Linux kernel from source code; the most recent custom build was Linux 3.12

Lead for a team of software engineers using Scrum processes in an Agile software development framework. Led the team through the transition from traditional waterfall to Agile Scrum and understands iterative software development.

Accountable for ensuring project team members complete tasks and deliverables on time.

Applied a Test-driven development (TDD) cycle to several recent software projects

Manages project milestones, schedule, budget, and burn rate, and reports status

Understands the value of commitments to delivery made by a development team

Created appropriate tracking and performance metrics that encouraged teamwork

Familiar with project management techniques, SDLC processes and tools

Participates in technical proposal design, prototypes and writing

Experience with presentation software such as PowerPoint and OpenOffice

Capable and confident at providing technical presentations to new and existing clients

Experienced with the business of presenting technical papers at conferences and exhibits

Willing to travel based on client, team or project needs

 

Technical Skills

·   Programming Languages: C/C++, Fortran, Java, C#

·   Script Languages: ASP.NET, AWK, Bourne shell, Bash, Expect, Groovy, JavaScript, PHP, Perl, Python, YUI, Ruby, Tcl, Ajax, Rails, Groovy on Grails, Ruby on Rails, Django framework

·   Markup Languages: Ant, CSS, XML, HTML, HTML5, LaTeX, MathML, SAML, SOAP, XSLT

·   Scientific Languages: GNU Radio, MATLAB, LabVIEW, CUDA, OpenMP, OpenCL, Verilog, VHDL, Fortran 2008, SPICE, NGSPICE, Xilinx Vivado, Xilinx Platform Studio, Xilinx ISE WebPACK, Xilinx ISE Design Suite and ChipScope, Xilinx Software Development Kit (SDK)

·   Portal Development: Drupal, Egroupware, Liferay, SiteFinity

·   Database and Hadoop related technology: SQL, Oracle, PL/SQL, PostgreSQL, pgAdmin3, MySQL, MySQL Workbench, Greenplum EMC, Microsoft SQL Server 2008, Sphinx, Hadoop/MapReduce, MongoDB, NoSQL technology, Apache Cassandra, Apache Hive, Apache Spark, Shark, Flume, Hadoop YARN, Cloudera Impala SQL

·   eCommerce software: Magento, Bamboo, Activiti BPM Process Engine,  ActiveMQ , Single Sign On, SAML 

·   IDE: Eclipse, NetBeans, Microsoft Visual Studio 2013

·   Application Servers: Tomcat 7, GlassFish 3.1.2

·   Version Control: Subversion SVN, CVS, GIT, Team Foundation Server (TFS)

·   Software Systems: Apache, Mono ASP, Splunk, RabbitMQ, Nagios, Junit, GNU Radio, OMNeT++ , NS-3 network simulator, Common Open Research Emulator (CORE) Extendable Mobile Ad-hoc Network Emulator (EMANE), Microsoft LightSwitch, Dynatrace, Logi Analytics and Chef Deployment Tool

·   Virtual Machines: Linux Kernel Virtual Machine (KVM), VMware ESX/ESXi, VirtualBox, Capistrano, Clustering/Distributed Computing

·   Linux Services: DHCP, DNS, TFTP, LDAP, NFS, PXEBOOT, CRON, SENDMAIL, CUPS, IPTABLES, BIND, SYSLOG

·   Protocols: IEEE 802.11 a/b/g/n/ac/ad, IEEE 802.16, ARP, ATM, IPv4, IPv6, IPsec, RIP, OSPF, TCP, UDP, NFS, PGP, BOOTP, DNS, DHCP, FTP, Finger, HTTP, HTTPS, IMAP, IRC, LDAP, MIME, SSH, SMTP, SNMP, SOAP, SMB, WebDAV, wireless ad hoc routing (OLSR, MMRP, OSPF)

·   Network Security Tools: Metasploit, Snort, BackTrack, Aircrack, OSSEC HIDS, Nagios, Nessus, NMAP, Wireshark, TCPDUMP, Kismet, Shadow, OpenVAS

·   Intrusion Detection Systems: Security Onion, Prelude, Suricata, BRO, OSSEC HIDS, Snorby, NIDS, IPS, HIDS, Honeynet

·   File Systems: NFS, Lustre, GPFS, Apache Hadoop HDFS

·   Operating Systems: UNIX System V, BSD, Linux, Windows NT, Android, TinyOS, IBM AIX, Ubuntu, Debian, openSUSE and SUSE Enterprise, CentOS, Red Hat Fedora and Red Hat Enterprise

·   Office software: Microsoft Office and OpenOffice Productivity Suite, Microsoft Project, Team Foundation Server, Project Server

·   SAS Data Miner Tools including Base SAS, Enterprise Guide and Enterprise Miner 2014-2015  

 

 

 

Patents

 

R. Pino, M. Shevenell, “Method and Apparatus for Monitoring Network Traffic,” United States Patent Serial No. 13/658,513, Publication No. WO2014066166 A3, October 2012.

 

Journal and Conference Publications, Invited Talks and Book Publications

 

Participated as Technical Reviewer for a Small Business Innovation Research (SBIR) Program on the topic of Cybersecurity Tools for HPC Systems, U.S. Department of Energy, Gaithersburg, MD, December 3, 2014.

Pino, Robinson E., Kott, Alexander, Shevenell, Michael (Eds.), “Cybersecurity Systems for Human Cognition Augmentation,” Springer, November 2014.

Michael Shevenell, “A Reconfigurable Neuromorphic FPGA Network Intrusion Detection Architecture,” presented paper, CyberSci Summit 2013, Fairfax, VA, 2013.

Robinson E. Pino, Michael Shevenell, “Network Science and Cybersecurity,” Springer, June 2013.

Michael Shevenell, “An FPGA-based IDS architecture to improve intrusion detection performance while reducing size, weight and power compared to software based IDS,” Army Research Laboratory, September 2013.

M. Shevenell, R. Pino, H. Cam, “Computational Intelligence and Neuromorphic Computing Potential for Cybersecurity Applications,” SPIE Defense, Security, and Sensing, “Machine Intelligence and Bio-inspired Computation: Theory and Applications VII” conference, Baltimore, MD, April 29–May 3, 2013.

Michael Shevenell, “Computational Intelligence and Neuromorphic Computing Potential for Geospatial Research and Applications”, Invited Talk, The 3rd International Conference on Computing for Geospatial Research & Applications, Washington, DC, July 1-3, 2012.

Michael Shevenell, “Intrusion Detection Using a Lightweight Correlation of the IEEE 802.11 Physical and MAC Layers,” presented, Network Science and Reconfigurable Systems for Cybersecurity Conference, Beltsville, MD, August 2012.

 

Michael Shevenell, Conference Assistant Chair, The Design Automation Conference (DAC), FPGA Design and Embedded Systems for Wireless IDS Systems, June 3, 2012.

Michael Shevenell, “Design and Implementation of an Open Network and Host-Based Intrusion Detection Testbed with an Emphasis on Accuracy and Repeatability,” presented at the IEEE ITNG Conference, IEEE Publication, April 16, 2012.

Michael Shevenell, et al., “Design and Implementation of an Open Network and Host Intrusion Detection Testbed—Information Technology,” New Generations, 2012.

Professional Affiliations

- The Institute of Electrical and Electronics Engineers (IEEE), Member                                2009-Present

Education, Training and Certifications

Certification: Application Security, Info Sec Institute, 2006

Certification: Security+, CompTIA, 2012

Certification: Linux+ and LPIC-1, CompTIA and LPI, 2012

Technical Training: Network Traffic Analysis Using tcpdump and Intrusion Detection-Snort Style, ARL, 2001

Technical Training for Zynq Xilinx at Design Automation Conference (DAC) 2013, San Francisco, CA

Technical Training for Zynq Xilinx Vivado and FPGA Design, Xilinx Columbia, MD, 2013

DoD 8570 Compliant

Degree: B.S., Computer Science and Math, University of Maryland, 1995

Degree: A.A., Electronics, Prince George’s Community College, 1986

Employment History

Team Lead                                                            Izarinc contractor for Northrop Grumman

02/2014–Present, Baltimore, MD

 

Serves as technical domain expert and project lead, contributing to all software project phases from development through the implementation of advanced Hadoop and fraud detection solutions for the Social Security Administration (SSA). The projects include the design, architecture, and implementation of several 100-plus-node Cloudera Hadoop (CDH4 and CDH5) clusters. He performs analysis of fraud detection scenarios using Hadoop analytic tools such as Hive, MapReduce, HBase, Impala, Spark, Splunk, Pentaho, Hue, Tableau, Talend, Mahout, SAS, and Project R for Statistical Computing. He is responsible for the Extract, Transform, and Load (ETL) of data into Hadoop HDFS from various formats such as DB2, EBCDIC, ASCII, CSV, XML, JSON, network flows, and binary PCAP data, and has experience with HDFS loading tools ranging from the local Hadoop command “hadoop fs -put” to Flume, Sqoop, and ETL MapReduce methods. His cyber intrusion detection experience in machine learning, neural networks, and statistical computing has been successfully applied to SSA fraud detection scenarios. He is a key member of the SSA Advanced Data Analytics and Fraud Detection Lab, whose purpose is to leverage open source and commercial Hadoop technology to advance SSA fraud detection capabilities and whose function is to evaluate new tools and methods and to model a framework for expert systems combining techniques such as classifiers, recommenders, and complex event processors.

 

Currently developing new Hadoop Hue applications using the Cloudera Hue Django/Python SDK. The new applications provide a custom GUI to execute HiveServer2 and Impala queries, and the query output produces custom graphical analytics such as dashboards and visualizations of fraud detection data. The custom Hue applications use the Apache HiveServer2 Thrift API for executing Hive queries and listing tables; the same interface is also used to communicate with Cloudera Impala applications.

Researching and prototyping a custom Field Programmable Gate Array (FPGA) to parallelize real-time data compression of Hadoop HDFS bound data. 

Received outstanding team lead award on 9/2/2014 for leadership skills and technical capabilities as part of Task 752 Office of Information Security (OIS) Predictive Analysis for Disability Fraud Phase 2 Project.

 

Research and Development Program Manager                             ICF International

02/2001–01/2014, Baltimore, MD

Project Team Lead responsible for new research, experimentation, and software development for the Army Research Laboratory (ARL) Network Security Branch.  Responsible for the daily management of a team of Software Developers, Electrical Engineers and Scientists working on a number of research projects, ranging from theoretical IDS methodologies, IDS re-engineering, to experimentation and research. One of the tasks as Project Team Lead was managing and designing experiments for a wireless tactical IDS project. 

Developed an enterprise security framework using the Splunk cloud solution. The solution connects Splunk to a Hadoop back-end, which allows Splunk analytics access to machine-generated data. Custom modules were developed for IT management, virtualization monitoring, infrastructure monitoring, and enterprise security, using JavaScript for client-side development and Python with the Django framework for server-side development.

Used a customer-driven requirement to select a new IDS interface framework for a pilot project. After a six-month selection process testing various Java and PHP frameworks, we selected the Java Liferay framework as the new IDS interface standard. We demonstrated to the IDS development group that the Liferay framework has the capabilities and features required to meet the needs of a future IDS interface infrastructure, and the development group accepted the framework as the new IDS interface standard. Designed and implemented the selected IDS interface framework using the role-based and social network architecture built into the Liferay framework. The framework uses Java, PHP, PostgreSQL, and SQL stored procedures, which allows for low-cost, rapid software development while also providing a robust application to meet the customer’s needs.

Programmed and developed a Vulnerability Scanning System using Nessus, PostgreSQL, MySQL, and PHP. Drupal was used as the CMS framework and provided basic functionality such as login and administration; Drupal PHP modules were developed to provide the custom portal interface to the scanning application. The system is used to scan an 8,000+ node network, with the results of each scan stored in a database. The system provides scanning report display, scan approval or failure options, report generation, and scanning report vulnerability analysis. Using this database, network engineers can accurately know the vulnerabilities on every system and the active network ports; for example, the database is used to calculate the exact number of FTP and Web ports open and which IP addresses are using them.
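The kind of query described above, counting open FTP and Web ports across scanned hosts, might look like this in SQL; the table and column names are hypothetical:

```sql
-- Count hosts with FTP (21) or Web (80/443) ports open (schema assumed)
SELECT port,
       COUNT(DISTINCT ip_addr) AS hosts_open
FROM   scan_results
WHERE  port IN (21, 80, 443)
  AND  state = 'open'
GROUP  BY port;
```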

Responsible for the design, integration, and development of the tactical host testbed, used as an experimentation platform for an Army project. The testbed hardware architecture ranges from UNIX/Linux and PC systems to mobile phone devices. Clients on the network send host-based detects using OSSEC software, while an IDS sensor collects network traffic; both network and host-based alerts are correlated in the IDS cluster and displayed using the IDS Liferay interface. To create the testbed, we used the existing IDS infrastructure, various network testing tools, and vulnerability generators, integrating the components and deploying them in a scalable network environment. The testbed provides several benefits for current tactical and future wireless testing, and cost reduction resulted from building all of the testbed’s software components from open source software. The testbed included EMANE and CORE modules to perform physical and MAC layer intrusion detection experiments. Hadoop and MapReduce were used to store and process testbed data, creating a scalable Internet traffic measurement and analysis tool able to handle terabyte- or petabyte-scale network traffic.

Has a complete working understanding of intrusion detection architecture, from hardware and software installation, configuration, and design to the details of interface queries and post-processing activity. Involved in the general repair and maintenance of the Lustre and GPFS cluster file systems and the Hadoop HDFS filesystem, and has a working relationship with the hardware manufacturer of the Data-Direct cluster. He installs and maintains the Sun Grid Engine, which controls jobs on the IDS cluster. He also understands how process communications occur using the Python version of the CORBA ORB, which is used in all versions of the Intrusion Detection System; the back-end of the IDS uses Python and the CORBA ORB to communicate with the hundreds of IDS network data collection sensors. He selected and designed the IDS interface, a Java social networking role-based framework that uses Liferay and Quercus, a Java-based PHP engine. Using Quercus in Liferay allows the existing PHP applications to be transitioned to portlet applications using standard PHP code and JavaScript, rather than rewriting the portlets in Java. New portlet applications were programmed using Ruby on Rails, Java, Apache Struts, JSF, Ajax, JavaScript, and Groovy on Grails.

Owner/Network Programmer and Administrator                           Cove Software Systems, Inc.

1996–2000, Annapolis, MD

Mr. Shevenell was President and Founder of Cove Software Systems. The company designed and supported the operation of over 300 UNIX/Linux and NT 2000 Web servers and provided 24/7 operation and security of the servers. He was responsible for all daily operations, which included Cisco 7500 series routers setup and configuration. He designed the LAN and WAN using Cisco routers and switches and programmed Internet Web applications using Perl, C, C++, Oracle Version 8, and Java. He performed UNIX system security and installed and maintained firewalls. He located and scanned for vulnerable systems using whois, domain name system queries, ping sweeps, port scans, and OS detection. He developed extensive experience in finding back doors, Trojan horses, viruses, and buffer overflows on Sun and Linux systems.

Programmer Manager                                                   Universal Hi-Tech Development, Inc.

1992–1996, Adelphi, MD

He developed business database applications using Oracle, Perl, awk, X-Window (X11), Motif, C, and C++ for Army Research Laboratory. He was Program Manager for Universal Hi-Tech Development employees at Army Research Laboratory.

Network and Database Programmer                                      Arinc Research Corporation

1991–1992, Annapolis, MD

Developed UNIX-based imaging systems used by Special Forces Operations at Pope Air Force Base. The system was used to support Operation Desert Storm. He served as Team leader on a project that developed a UNIX image retrieval system using Oracle, C, and C++ languages. The systems were developed for Warner Robins Air Force Base.

Database and Application Programmer                                  Oracle Complex Systems Corporation

1989–1991, Arlington, VA

Developed Oracle database applications for RFPs for Government and commercial applications. He was team leader for the development of the Cataloging Tools Online System (CTOL) for the Defense Logistics Agency.

Electronic Design Programmer and UNIX Administrator                  Gould Electronics Corporation

1987–1989, Lanham, MD

Designed and developed several RFPs for DoD and other Government agencies and developed applications using C-language for optical jukebox systems. He developed Sun OS UNIX device drivers for optical platters and designed and developed UNIX barcode scanner applications using Lex and Yacc programming languages.

Electronic Engineer                                                  University of Maryland

1984–1987, College Park, MD

Developed and tested embedded signal processing applications and programmed programmable logic arrays and embedded C-language programs on a project for SDI and the Department of Energy.

 

 



Experience


 

Job Title

Company

Experience

Senior Software Engineer

ICF International

- Present

 

Additional Info


 

Current Career Level:

Manager (Manager/Supervisor of Staff)

Date of Availability:

Within 2 weeks

Work Status:

US - I am authorized to work in this country for any employer.

Active Security Clearance:

Inactive Clearance

US Military Service:

Citizenship:

US citizen

 

 

Target Job:

Target Job Title:

Senior Software Engineer

 

Target Company:

Company Size:

Occupation:

IT/Software Development

·   Computer/Network Security

 

Target Locations:

Selected Locations:

US-MD-Baltimore

Relocate:

No

Willingness to travel:

Up to 25% travel