Engagements listed for generically named clients, e.g. Energy Client, were performed under NDA, and the engagement descriptions are based solely on publicly available statements.
CLOUD ARCHITECT DEVOPS ENGINEER MOBILE DEVELOPER
As a DevOps Engineer, I help provision and manage AWS infrastructure. My goals are to achieve high availability and elasticity while minimizing costs.
As an iOS developer, I worked as part of a large team to deliver new features to an existing high-profile iOS application.
As both programmer and architect, I served as part of a large team to design, build, and deliver mathematics and ELA content in a next-generation educational application, one of the most ambitious applications yet targeted at a mobile platform.
Designed and constructed an award-winning iPad application to assist the client with real-time energy pricing. The application is heavily integrated with backend web service APIs.
Designed and constructed an iPad application to deliver client’s product catalog in an innovative new way.
Built several iPhone applications for a variety of top-tier media clients. The applications allowed capture and minor editing of live video, with real-time encoding and distribution of videos via client web services.
I moved to the DESC (Deloitte Entity Search and Compliance) project in 2005. DESC is a critical risk management system for DTT: it is where new opportunities are vetted for compliance against a variety of strict standards.
Primarily an ASP.NET application with a clustered SQL Server persistent store, DESC is the perfect example of a modern web application with heavy demands across many architectural dimensions:
Performance. Used thousands of times daily by partners and other practitioners, the DESC application had to be fast enough that these valuable employees were not left waiting on slow page loads or inefficient queries.
Availability. DESC is a mission critical application. Unexpected downtime was not acceptable. High availability was provided through robust handling of error scenarios at the code level as well as through provisioning redundant hardware to eliminate single points of failure.
Scalability. DESC was designed to be installed on as many web servers as needed in order to scale out and meet the performance requirements of the application.
Securability. DESC contains an incredible amount of sensitive data and, thus, security was a paramount concern. Beyond normal authentication requirements, the data and the actions within the application required proper authorization. For example, for a given entity, only authorized parties were allowed to view certain details about the entity or to perform various actions on it. This authorization was provided through a complex data-driven rules engine that analyzed the entity itself, its relationships to other entities, and inbound changes dynamically applied to the entity structure.
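The shape of a data-driven authorization check like this can be sketched as follows. This is an illustrative sketch only, not the DESC implementation: the roles, fields, and rules are invented, and the real engine also evaluated entity relationships and inbound changes.

```python
# Illustrative sketch of a data-driven authorization rules engine.
# All roles, fields, and rules here are hypothetical.

def make_rule(action, predicate):
    """A rule grants an action when its predicate holds for a user and entity."""
    return {"action": action, "predicate": predicate}

RULES = [
    make_rule("view_financials",
              lambda user, entity: user["role"] in ("partner", "auditor")),
    make_rule("approve",
              lambda user, entity: user["role"] == "partner"
              and entity["status"] == "pending"),
]

def is_authorized(user, entity, action):
    """Authorized if any rule for this action is satisfied."""
    return any(rule["predicate"](user, entity)
               for rule in RULES if rule["action"] == action)

partner = {"role": "partner"}
staff = {"role": "staff"}
entity = {"status": "pending"}

print(is_authorized(partner, entity, "approve"))        # True
print(is_authorized(staff, entity, "view_financials"))  # False
```

Because the rules are data rather than hard-coded branches, new rules can be added or modified without touching the evaluation logic, which is the property that made the real engine extensible.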
Maintainability. DESC is a very complex application but was designed in a structured way such that each significant component is maintainable on its own without having to understand the rest of the system. Appropriate unit test coverage was provided so that components could be refactored by new developers without jeopardizing the correctness of existing code. New developers would only have to become familiar with a single component before delivering useful code.
Deployability. The DESC application was packaged as a set of assemblies which could be deployed automatically by IT staff to servers throughout the data center with minimal external dependencies and configuration.
Extensibility. Entity compliance is an ever-changing domain and, as such, any application in the space must be easily extended so that new rules can be injected into the various approval workflows. The DESC application was written such that these rules, as well as other significant components, were distinct, testable pieces of the system. Rules and components could be created and added on demand while others could be modified in place or removed, all with the system dynamically updating accordingly.
I was brought onto the Deloitte Platform project at the end of July 2002. The Deloitte Platform is a global initiative within the DTT Global Office of Information Management to create a Web Services-based reusable architecture for deploying future globally-accessible web applications.
Initially my role was as software engineer in a Proof of Concept project where it was demonstrated that a number of existing applications and data stores could be exposed via a Web Services interface and hosted inside of two separate enterprise portals, SAP Enterprise Portal and Microsoft SharePoint Portal Server. I built a Web Service using C# in Microsoft Visual Studio.NET which encapsulated two Lotus Notes databases and made them searchable in the two portals. The Web Service was published in the UDDI directory hosted on our Microsoft Windows.NET Server. The Proof of Concept was a success and resulted in a change of strategy.
Instead of building a number of custom Web Services to provide portal functionality such as single sign on, search, and collaborative workspaces, the decision was made to partner with a number of 3rd party vendors and expose their existing functionality through Web Services interfaces.
My role changed to technical architect and I began a long process of evaluating 3rd party products for use in the portal. DTT uses both SAP Enterprise Portal and Microsoft's SharePoint, so all proposed solutions had to work seamlessly with both portals. The products evaluated included the following: Verity K2 Enterprise, Vignette Content Management Suite, eRoom v5 and v6, Groove, Epicentric Modular Web Services, Netegrity SiteMinder, and Documentum. For each of these products I wrote an in-depth technical evaluation.
Upon final selection of the key products, which included all of the above except for Epicentric and Documentum, I was tasked with creating a global infrastructure to support a full lifecycle deployment of the Deloitte Platform. Working with in-house and external administrators, I devised an infrastructure supporting development, QA, staging, and production for deployment in the United States, United Kingdom, and a to-be-decided Asia / Pacific installation. The development environment consisted of 9 servers and scaled up to an initial 20-server installation in production, supporting 15,000 users. The architecture is scalable and will eventually support 150,000 internal and external users and will host the three major web sites for DTT: Deloitte.com, Deloitte Resources (intranet), and Deloitte Online (extranet).
My responsibilities as technical architect for DTT were as follows:
The assessment I wrote for Kemper in November 2000 was reviewed and a migration plan was approved to improve the architecture of the APLUS application. I was contracted to assist in this effort. The majority of my time on this project has been split between development and production support. As part of the APLUS team I am assigned various production support issues as they arise; sometimes these are simple data changes, sometimes complicated bug fixes in the application code. The development time is spent adding new features required by the business customer and cleaning up the existing code by separating logic into presentation and business tiers. A data tier was also proposed but has not been approved to date. The eventual goal is a thin client holding the majority of the presentation logic, with a server-based business tier. When I joined in July 2001, APLUS was one large client application.
The effort has been largely successful. New features have been added on time and on budget while significant improvements have been made to the structure and stability of the code base. The overall footprint has decreased despite additional features because of increased code reuse between areas of the application.
My responsibilities as architect and developer for Kemper were as follows:
The FIS Clearance web site was created to allow United Kingdom producers to clear business for FIS in the United States. After ruling out modifications to the APLUS client application, a web application was chosen as the best solution. Over a 12-week period, the site was created from scratch, including about 3 dozen ASP files with associated includes and 3 middle-tier business objects. The ASP pages were written in Visual InterDev with integrated SourceSafe. The business objects were written in Visual Basic 6 and returned data to the ASPs in a custom XML format. The MSXML component was used to create and parse these XML data streams. Photoshop 6 was used to create the images for the site.
As with other Kemper web applications, the site is served on IIS 4 and goes through the Webthority proxy server which handles SSL and LDAP-based authentication. Traditionally, a significant amount of work has gone into working with the Webthority team to test the security of the site and to ensure maximum availability.
My responsibilities as architect and developer on this project were as follows:
One of the most time-consuming tasks in using APLUS was the generation and printing of various forms and letters. Before this project it took between 40 seconds and 2 minutes to generate a given document in the field; the same document took only about 10 seconds when generated local to the database. The overhead was due to the slow connections between field offices and the home office and the massive number of database queries required to generate a given document.
I proposed offloading the document generation task to a powerful server in the home office. A request would be initiated by APLUS in the field, the server would receive the request, generate the required document, and return a completion result. The documents were and continue to be stored on a file server at the home office so the document itself did not need to be sent back across the network.
The solution was implemented as a web service using SOAP and the accompanying components. The document request was sent via SOAP, collected by the SOAP Listener on the server, and then the documents were generated. A significant amount of componentization of the APLUS logic was required to make this work. The document generation logic was eventually hosted in MTS and was separated between several objects.
This was one of the bigger successes for APLUS, as the average time for document generation in the field went down to 3-4 seconds. The generation time was better than expected because the server actually generated the document much faster than the field machines could. The server was a dual-processor machine with 1 GB of RAM, whereas the average field machine was an older Pentium with 32 MB of RAM, usually with Lotus Notes loaded simultaneously, i.e., a very slow machine.
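The offload pattern described above can be sketched roughly as follows. This is a simplified illustration, not the actual implementation: the real system sent the request via SOAP to a SOAP Listener and ran MTS-hosted COM components, and all names and paths below are invented.

```python
# Sketch of the document-generation offload pattern: the field client
# sends one small request; the home-office server does the database-heavy
# generation near the data and returns only a completion result. The
# document itself stays on the home-office file server and is never
# sent back across the slow WAN link. All names/paths are illustrative.

def generate_document(request):
    """Server side: assemble the document local to the database."""
    path = f"//fileserver/docs/{request['claim_id']}-{request['doc_type']}.doc"
    # ... many local database queries and document assembly here ...
    return {"status": "complete", "path": path}

def request_document(claim_id, doc_type):
    """Client side: one small round trip replaces thousands of
    cross-WAN database queries."""
    return generate_document({"claim_id": claim_id, "doc_type": doc_type})

result = request_document("C-1042", "coverage-letter")
print(result["status"])  # complete
```

The key design choice is that only the small request and result cross the network; the expensive work moves to where the data lives.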
My responsibilities as architect and developer for Kemper were as follows:
AON was in the process of writing a web-based application to allow agents in the field to capture customer information at the customer site, get online, submit the information, and retrieve a quote for the proposed coverages. The application was to be run on the agent's laptop using Personal Web Server. A local SQL Server database contained the customer and quote data and was replicated with the master SQL Server database in the office.
I was brought in to assist in meeting the proposed deadline and became responsible for the web application. The architect directed me to create an XML document definition to contain the variables captured in the customer coverage application interview. Each page would collect the data, insert it into the XML document, and pass the document to the next page to persist the data. All of this was done using ASP written in JScript and JScript classes.
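The carry-forward pattern, where each page appends its captured fields to one running XML document, can be sketched as follows. This is an assumption-laden illustration: the element and field names are invented, and the real application used ASP written in JScript with MSXML rather than Python.

```python
# Sketch of carrying interview data forward in a single XML document:
# each page appends its captured variables, and the final page persists
# the whole document. Element and field names are hypothetical.
import xml.etree.ElementTree as ET

def new_interview():
    """Start an empty interview document."""
    return ET.Element("interview")

def capture_page(doc, page_name, fields):
    """Append one page's captured variables to the running document."""
    page = ET.SubElement(doc, "page", name=page_name)
    for key, value in fields.items():
        ET.SubElement(page, "field", name=key).text = str(value)
    return doc

doc = new_interview()
capture_page(doc, "customer", {"name": "Acme Co", "state": "IL"})
capture_page(doc, "coverage", {"type": "liability", "limit": 1000000})

print(ET.tostring(doc, encoding="unicode"))
```

Passing one document between pages keeps each page stateless: no page needs database access until the final persist step.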
My responsibilities as developer for AON were as follows:
TelePlus is the online system that claims operators use to take claim information over the phone. TelePlus is mostly a COBOL application with a semi-GUI front end. Kemper is in the process of replacing the front end with a web-based front end served in the Internet Explorer browser. My responsibility was to review all front end screens and document the conversion of those screens to web page equivalents. I was also required to review the COBOL code and identify key business logic which could be moved to an NT middle tier.
My responsibilities as business analyst for TelePlus were as follows:
Kemper receives over $200 million in legal bills from the more than 700 law firms with which it does business. Ideally, each legal bill is audited for compliance with the services agreement between Kemper and the law firm. The average legal bill is generally 10-20% overstated; in other words, the majority of law firms charge Kemper more than their agreements allow. Unfortunately, only a handful of internal auditors are on staff to perform these audits, so the vast majority (over 80%) of legal bills go unaudited. If these legal bills could be audited, Kemper would save millions per year in legal fees.
The JPI system allows law firms to submit legal bills to Kemper electronically for payment. The system performs an automatic audit on the invoices in the electronic document based on a set of threshold rules and math checks. Some reductions are made automatically. Other invoice line items are flagged for manual review by internal auditors. The system then provides tools for the auditor to do the audit more quickly. After an invoice is complete, an Explanation of Payment (EOP) is sent to the claim adjuster and made available to the law firm via an online “inbox” provided by the external site. Emails are sent to the adjuster using AspQMail and Lotus Notes.
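A minimal sketch of the automatic audit step might look like the following. The threshold and rules here are made up for illustration; the real JPI rules came from each firm's services agreement and were considerably richer.

```python
# Simplified automatic audit of one invoice line: a math check
# (hours * rate must equal the billed amount) plus a threshold rule
# (rate capped at the agreed maximum). Lines that fail the math check
# are flagged for manual review; over-threshold lines are reduced
# automatically. The threshold value is hypothetical.

MAX_HOURLY_RATE = 250.00  # hypothetical agreement threshold

def audit_line(line):
    """Return (approved_amount, flag_for_manual_review)."""
    expected = round(line["hours"] * line["rate"], 2)
    if abs(expected - line["amount"]) > 0.01:
        return expected, True  # math check failed: send to an auditor
    if line["rate"] > MAX_HOURLY_RATE:
        # automatic reduction down to the agreed rate
        return round(line["hours"] * MAX_HOURLY_RATE, 2), False
    return line["amount"], False

line = {"hours": 2.0, "rate": 300.0, "amount": 600.0}
print(audit_line(line))  # (500.0, False): reduced to the agreed rate
```

Automating the cheap, mechanical checks is what let the small auditing staff concentrate on the flagged lines that actually needed judgment.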
As architect and developer on this project, I was responsible for designing and building the manual review interface and for designing and building an interface which allows admin personnel to set up law firms, agreements, roles, timekeepers, and other key items in the database. I was also responsible for overseeing the design and construction of the law firm interface as well as implementation of certain business rules in the overall system. I was responsible for maintaining the DB2 and SQL Server 7 versions of the ERD. DB2 was the production database but both Access and SQL Server 7 were used in prototyping.
I was responsible for the architecture of the system overall and its integration with existing Kemper facilities. I was responsible for selecting and configuring the web server and application server hardware. External user logon is performed by Webthority against the enterprise LDAP server. The proxy server maintains the 128-bit SSL certificates. The web sites are hosted on two web servers with business components residing on an application server. The web servers have hardware-based load balancing using Network Dispatcher. The web servers are running NT 4 with IIS 4. The application server is running NT 4 and ADO 2.5. Data access is restricted to the application server for security reasons. All business and data access components are located on the application server and referenced on the web servers via DCOM.
The internal web site comprises almost 9,000 lines of ASP code across more than 30 web documents. A single COM object comprises over 3,000 lines of Visual Basic 6 code, and the invoice loader application is an additional 2,000 lines of VB code. I was responsible for the code reviews of the external site and the EOP object.
My responsibilities as architect and developer for Kemper were as follows:
This was a short, two week engagement. I was contracted by Kemper to analyze a small, 2-tier application written in VB with an IBM DB2 Universal Database backend. The application was originally intended only for departmental use. The application was very successful and popular, though, and had been rewritten and / or extended for three additional lines of business. I examined two versions of the application and documented a migration path to convert them from the 2-tier architecture to a scalable 3-tier architecture using VB and COM, ASP, MTS, and the existing DB2 database. I analyzed each form and code module in the VB application and documented the complexities of the forms and the functions in the code. I itemized the tasks for separating the application into 3 tiers. I assessed the current IIS infrastructure and made recommendations for maximizing performance, scalability, and availability. I also assessed the possibility of deploying the application using CITRIX nFuse as a quick method of getting the application out to users on the web. Finally, I assessed the challenges in interfacing the application with existing or pending CORBA Java components using a COM / CORBA bridge or SOAP.
My responsibilities as enterprise architect for Kemper were as follows:
Comro.com’s original Internet site was built using a 2-tier architecture. When I started at Comro my first responsibilities were to help them finish moving the original site over to a 3-tier architecture using ASP, VB COM, and SQL Server 7. I was responsible for the RFP functionality, the region maps, modifications to property searches, and the Comro Direct functionality. Each area required five to a dozen functions in a COM object and usually a half dozen stored procedures. Once the site had been migrated to the new architecture, it was moved to the production servers.
The Comro Direct and RFP features both used CDO and scheduled NT processes to automate sending of user email messages.
Comro did not have a replication architecture in place prior to my arrival, nor separate development and staging environments or solid code promotion procedures. I contributed significantly to planning how to implement replication and how to design and promote code through a multi-stage server environment.
My responsibilities as a web developer and SQL programmer for Comro.com were as follows:
HTS contracted me as an interim practice manager for their new e-commerce practice. They had an immediate opportunity with Nalco that required quick business requirements gathering and analysis. I attended several client meetings with the HTS account manager to interview key client personnel. During these interviews I documented the requirements of the applications needed and other details which I would use to assess the complexity of the project and estimate an accurate work effort. After the client interviews I went back and broke the project down into several stages for each application. For each application I identified technologies which I believed would yield the best cost-benefit ratio. For each stage I identified the resources needed and assigned tasks to each resource. When the project plan was complete I created a formal project proposal which documented the overall scope of the project and detailed costs and schedule estimates. Once HTS approved the project proposal, we submitted it to the client.
My responsibilities as interim practice manager for HTS were as follows:
When I first started at Arthur Andersen, my responsibility was to learn the EOR application and to begin debugging the existing version. When a bug is reported, a System Investigation Request (SIR) is created by the tester and assigned by management to one of the developers. I completed several dozen major SIRs, the application was promoted to production, and the EOR code base was used to generate additional applications: QARM, Fraud, BPO, and BRCA. Each application was then customized to the specific requirements of the application sponsor.
One of the other responsibilities I had was to manage a daily data migration between application data stores on several SQL Servers, some V6.5 and some V7.0. The migration code was encapsulated inside of several DTS packages and scheduled with the SQL Server Agent. The packages moved approximately 600,000 records each day, some requiring BCP due to translation of code page specific characters to Unicode (nvarchar). Replication was investigated, proven, and is due to be implemented in the environment soon.
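The character-translation step that forced the use of BCP can be illustrated as below. This is only a sketch of the idea: the source code page is assumed to be cp1252 for illustration, and the real work was done inside scheduled DTS packages rather than application code.

```python
# Sketch of the translation concern in the daily migration: legacy
# code-page bytes must be decoded to Unicode before loading into
# nvarchar columns, or accented characters are corrupted. The cp1252
# code page and the sample rows are assumptions for illustration.

def translate_record(raw_bytes, codepage="cp1252"):
    """Decode legacy code-page bytes so the target store receives Unicode."""
    return raw_bytes.decode(codepage)

legacy_rows = [b"Caf\xe9 invoice", b"Se\xf1or Garcia"]
unicode_rows = [translate_record(row) for row in legacy_rows]
print(unicode_rows)  # ['Café invoice', 'Señor Garcia']
```

Skipping this decode step and copying bytes straight across is exactly how mojibake ends up in nvarchar columns, which is why the affected tables needed special handling.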
Several application components and modules were written in Visual Basic, compiled, and promoted to production to run when needed or on a scheduled basis. One of the applications used CDO and SMTP to automate emailing statistical reports to management.
Another task required me to use the Wise Installation system to create installation executables to install EOR-based applications on multiple web servers. The installation was required to set up ODBC connections, install COM objects into MTS, and install all ASP code and HTML pages.
My responsibilities as a web developer and SQL programmer for the various Arthur Andersen applications were as follows:
PROJECT MANAGER, ARCHITECT
Carlson Wagonlit Travel (CWT) hired Proxicom to help it create a digital strategy for competing against Expedia, Travelocity, American Express, and others.
My responsibilities as the project manager and technical lead were as follows:
General Motors Corporation (GM) hired Proxicom as general project management and architecture contractors for their gmbuypower.com site. I was one of 3 primary project managers for the effort. I was responsible for the site rollout in Mexico, Taiwan, Brazil, and Australia. The entire gmbuypower.com team was made up of over 300 contractors from 9 different vendors, including IBM Global Services, EDS, CompuWare, and others.
I worked with the architecture team (30+ architects) in planning the Version 1 architecture and the Version 2 architecture. The majority of my contributions on the architecture team were in the area of configurator selection and implementation. The configurator software, provided by a third party, allows a customer to select various options on a car. This is a hugely complex piece of software as it has to keep track of all the option combinations, the cost of each, and various rules regarding inclusion and exclusion of additional options.
Additionally, I participated heavily in the construction of a single UI to be used in all 40 initial rollout countries and of the underlying components which allowed local variables, such as postal code vs. zip code, to be resolved. I spent several weeks in Australia leading a team of 18 developers, architects, UI designers, and several business strategists working directly with GM's Holden company to implement the Holden version of gmbuypower.com. There were many technical and political challenges to resolve in Australia, including differences in calculating distance to dealerships (by latitude and longitude versus postal code) and legal issues regarding storage of personal data. The Holden site ran on NT with IIS and ASP, and we spent considerable time revising the ASP to match the functionality of the JSP used in the United States version of buypower.
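The latitude/longitude branch of that distance problem is essentially a great-circle calculation. The sketch below uses the standard haversine formula with illustrative coordinates; it is not the gmbuypower.com code, and markets without reliable geocoding fell back to postal-code lookups instead.

```python
# Great-circle (haversine) distance between a customer and a dealership.
# The coordinates below (Melbourne CBD to a hypothetical suburban
# dealership) are illustrative only.
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Distance in kilometres along the Earth's surface."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = (math.sin(dphi / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

d = haversine_km(-37.8136, 144.9631, -37.8840, 145.0000)
print(round(d, 1))  # roughly 8.5 km
```

A postal-code approach instead maps each code to a centroid and looks up precomputed distances, which trades accuracy for independence from per-address geocoding, one reason the two markets diverged.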
My responsibilities as a project manager and technical architect were as follows:
PROJECT MANAGER, ARCHITECT
McDonald’s hired USWeb to lead the effort in building their corporate Intranet, later called “Archie”. The Intranet was to be rolled out to the 3,000 or so Oak Brook employees with a view toward rolling out to all 8,000 or so US domestic employees. Access to some Intranet applications would be given to McDonald’s suppliers via extranet and to the Owner / Operator stores. The Intranet would then be rolled out internationally after that. When I left they were beginning to roll out to Sweden using Domino R5.
I was the project manager, lead architect, and LotusScript and ASP coder. The system was hosted in various point releases of R4.6.x of Lotus Domino and IIS 4 on NT 4. All of the Intranet data – articles, press releases, etc. – were stored in Notes databases for security reasons. The home page of the Intranet was hosted on NT, as was the content editor. The home page was made up of modules which were periodically updated by an NT service. The NT service would connect to Notes through ODBC (not easy) and pull headlines and article summaries out of the database to populate these modules. In addition, there was a polling / survey module which had its data stored in an Access 97 database. The content management interface was used by almost 100 content managers throughout the organization to post their articles and other content. The content management interface was in Notes, but the forms where the actual content editing was done were ASP, because of a dependence on DHTML and IFRAMEs.
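The module-refresh idea, a scheduled job pulling headline rows out of the content store and rendering each homepage module as a static fragment, can be sketched as below. The field names and markup are invented; the real NT service read the Notes databases over ODBC rather than taking in-memory rows.

```python
# Sketch of a periodic homepage-module refresh: rows pulled from a
# content store are rendered into an HTML fragment that the home page
# includes. Row fields and markup are hypothetical.

def render_module(title, rows):
    """Render one homepage module as an HTML fragment."""
    items = "\n".join(
        f'  <li><a href="{row["url"]}">{row["headline"]}</a></li>'
        for row in rows
    )
    return (f"<div class='module'><h3>{title}</h3>\n"
            f"<ul>\n{items}\n</ul></div>")

rows = [
    {"headline": "Q3 results posted", "url": "/news/q3"},
    {"headline": "New benefits portal", "url": "/news/benefits"},
]
print(render_module("Company News", rows))
```

Rendering the modules on a schedule, rather than querying Notes on every page view, kept the slow ODBC path off the request's critical path.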
PRACTICE MANAGER, ECOMMERCE & GROUPWARE
Designed and constructed a marketing presentation printing application. Users selected presentation inserts and the application created a customized proposal / presentation for customers. The entire application was hosted on the web.
Designed and constructed a Lotus Notes database for tracking engineering programming requests.
Duplicated functionality of an existing Visual Basic application in the Lotus Notes database.
Converted an existing knowledge repository into a Lotus Domino hosted web-based knowledge repository.
Designed and constructed an updated version of the Parts Maintenance Planner application which tracked and estimated machine maintenance schedules and costs.
Designed and constructed an application used for tracking and distributing career goals and progress.
Designed and constructed a knowledge repository for internal use. The repository held documents as attachments in the Lotus Notes databases but also used SQL to access financial data held in Platinum and Proamics application databases. All content was delivered dynamically to a browser-based client.
Designed and constructed a web-based marketing materials catalog. The existing paper-based catalog had approximately 2,000 items available for dealers to order. An Adobe Acrobat document was created for each item in the marketing materials catalog and then placed as an attachment in a document in the online catalog. A comprehensive search engine was created to allow users to easily search the online catalog.
Designed and constructed internal application for tracking differences in actual and estimated consultant billings.
SEPT 1992 – DEC 1997
Revision of word processing control to handle Japanese Kanji character set and other Asian character sets. Required conversion of internal data format to Win32 Unicode (double byte) character set.
Revision of word processing control to handle Hebrew character set and right-to-left typing to support Middle Eastern typing direction.