Technical Portfolio:

Grant – Portfolio Summary and Samples

Grant Anderson
12/15/2013

Here’s a brief (ok, not so brief, but it’s only a small part of what I’ve accomplished in my career) summary and sample of some of the areas of my accomplishments in software and database/data warehouse/OLAP cube development.

I’ve put my experience and accomplishments into the following categories:

  • Project Management
  • Windows C# Software Development
  • Web ASP.NET C# Development and Applications
  • Database and Data Warehouse Architecture and Development
  • Production Systems Architecture
  • OLAP Cube Design and Development
  • Technical Documents
  • Patents
  • Technical Publications
  • Video Training Courses

Project Management

I have been a software development architect, senior manager, DBA, database developer, software developer, and BI consultant, among other roles.  I’ve seen a lot of project management approaches and techniques, including “Scrum Madness”, and have found them all rather inadequate and problematic.  What if there were a new project management technique that was simple, lightweight, and developer friendly, yet didn’t neglect users or managers?  Could such a project management system be developed?

Yes, and I’m working on it.  I call it the T3 software development project management methodology.  T3 stands for “on Time, on Target, and on Track”, a modification of the old military artillery objectives of “on time and on target”.  I take a somewhat military approach to software development and project management.  I believe that “super star” developers (I call them “Commando Developers” because they can do it all quite readily and most effectively) can get more projects done more successfully using a “Commando” type approach than your standard waterfall or even Scrum approach.  This approach is inherently agile, but it has several salient aspects that are often missing from typical software development methodologies.

One aspect is the generation and use of standardized documents to show status and results throughout the agile software development process.

The main document elements are:

  • Software Vision Document – What is the vision for the software or software system?
    • What exactly is it supposed to do?
    • What form(s) will it take?
    • What will it look like?
    • What does a prototype look like?  (You’ve done a prototype, right?)
  • Software Component Design Document – What are the components that need to be built?
    •  You gotta have a diagram or set of diagrams here!
  • Version Release Plan – Summary – What versions are you going to release during the development process with what major features when?
    • This is a plan for agile releases with dates.
  • Version Release Plan – Details – What are detailed items that will be contained in each of the version releases in your Version Release Plan – Summary?
    • You should be able to have a good and fairly accurate list if you have a solid software vision, you’ve done a good component design, and you really know what you are doing.

And please note that the version plans are dynamic documents not static documents.  They are updated as they pass through the software development process. They are living and working documents and function much like status and development reports.

And here are some screen shots of some genericized Version Plans that I actually use in practice.  They are great because you always know what’s in a release and what’s expected to be in a release, and you always have a version plan to follow (even if you have to add and move some things so as to deal with shifting priorities).  And you know how good your Software Vision and Component Design (and you) are by how close you stay to the original plan.

 

Windows C# Software Development:

I have been developing Windows software in C# for quite a few years.  (Before that I developed in C, C++, and VB on Windows and various Unix variants.)  I extensively use the DevExpress (www.DevExpress.com) libraries in the GUIs that I build in order to provide easy to use yet advanced GUIs with a high level of functionality. 

I can’t show you the complex software tools and systems that I’ve built, as that would not be in conformance with my intellectual property agreements with the companies I am involved with, but I can show you some of the many Windows tools and utilities that I’ve produced recently and over the last several years.

Most of the corporate tools, apps, and GUIs that I produce are actually built as Windows applications and distributed internally via ClickOnce to users’ desktops.  Why is this?  It is simply because developing Windows applications can be much faster (often 5x faster, I estimate), more complex and functional, and easier to maintain than developing web apps.  And by using Microsoft ClickOnce you can deploy to a web server, and the users can simply click on a link in their browsers and the complete app is quickly and easily downloaded and runs on their own PC.  Minimal web server resources are used.  And the apps and tools can be much more feature-rich and functional than web apps.  I do web apps too, and we’ll cover that in the next section.

Grant’s Access Test Tool

So what do you do when you need to see, and sometimes troubleshoot, whether your users have network connectivity to your web and database servers, because the corporate network guys are continually closing up network port forwarding due to security concerns (or their own whims)?  Well, you send them a long email, or series of emails, instructing them to run ping on the server names and, if those don’t resolve, then on the IP addresses.  And then you’d like to see if they can access the database servers, but you can’t give them the database logins and passwords you use in your apps (because they shouldn’t have them), so what do you do?

Well, you go back to the beginning of this tedious and time wasting exercise and give them the web link to Grant’s XYZ Access Test Tool. And ask them to click on it and then click the “Go!” button when the app comes up.  They wait a few seconds for the network connectivity tests to complete and then click “Copy to Clipboard” and paste it in an email to you.  And you can both see that the network connectivity is clear or exactly where problems are.  
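To give a flavor of the checks involved (this is just a minimal sketch, not the actual tool’s code, and the host names and ports are placeholders), the core of such a connectivity test in C# might look something like this:

// Minimal sketch of a connectivity check in the spirit of the Access Test Tool.
// Host names and ports below are placeholders; the real tool tests the actual
// web and database servers and formats the results for "Copy to Clipboard".
using System;
using System.Net.NetworkInformation;
using System.Net.Sockets;
using System.Text;

static class AccessCheck
{
    public static string Run()
    {
        var report = new StringBuilder();
        var targets = new[]
        {
            ("webserver.corp.example.com", 443),   // web server (hypothetical)
            ("dbserver.corp.example.com", 1433)    // SQL Server default port
        };

        foreach (var (host, port) in targets)
        {
            // Step 1: can the name be resolved and pinged?
            try
            {
                using var ping = new Ping();
                PingReply reply = ping.Send(host, 2000);
                report.AppendLine($"{host}: ping {reply.Status} ({reply.RoundtripTime} ms)");
            }
            catch (Exception ex)
            {
                report.AppendLine($"{host}: ping failed - {ex.Message}");
            }

            // Step 2: is the service port reachable?  No database credentials needed.
            try
            {
                using var client = new TcpClient();
                bool ok = client.ConnectAsync(host, port).Wait(3000);
                report.AppendLine($"{host}:{port} TCP connect {(ok ? "OK" : "timed out")}");
            }
            catch (Exception ex)
            {
                report.AppendLine($"{host}:{port} TCP connect failed - {ex.Message}");
            }
        }
        return report.ToString();   // ready to paste into an email
    }
}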

It’s very convenient and useful.  Here’s a screen shot of this handy dandy tool:

Grant’s Cube Processing SSIS and Tool

So I got tired of individually selecting the SSAS (Analysis Services) dimensions in BIDS (Business Intelligence Development Studio) and processing them one by one.  Which I needed to do much too often because Microsoft’s “Process All” does not always process all of the dimensions before it processes the measures.  This bug/quirk results in very strange and weird behavior in your cubes and can drive you crazy until you figure it out.  Microsoft hasn’t fixed this, and some other SSAS idiosyncrasies, nor do they let you know about it in advance.  So, tiring of this one by one selection in BIDS 2005 and 2008 (since fixed in 2008 R2), I wrote an SSIS dimension processing package that automatically reads via script all of the dimensions in a cube database and processes them (full process), so you don’t need to specify each dimension individually in SSIS (or forget to add one and then wonder why your cube dimension doesn’t work).  I used this quite a bit, configuring it for each cube database I built.  (A cube database can hold multiple cubes, by the way.)

Then I got tired of continually having to run SSIS and wait for the script objects to turn green.  Besides, I needed a better tool, and a command line version too, for some new ETL I was building.  So I built the Cube Processing Tool, where I can connect to any cube database and selectively build the cube or cubes and one or more or all dimensions.  Much faster and more flexible than an SSIS script package.
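As a rough illustration of the approach (a sketch only, not the actual tool’s code; the server and database names are placeholders), enumerating and processing all the dimensions of a cube database can be done in C# with AMO (Analysis Management Objects) along these lines:

// Sketch of processing every dimension, then every cube, in a cube database
// via AMO (Microsoft.AnalysisServices).  Server/database names are placeholders.
using System;
using Microsoft.AnalysisServices;

static class CubeProcessor
{
    public static void ProcessAll(string serverName, string cubeDatabaseName)
    {
        var server = new Server();
        server.Connect($"Data Source={serverName}");
        try
        {
            Database db = server.Databases.GetByName(cubeDatabaseName);

            // Full-process every dimension first...
            foreach (Dimension dim in db.Dimensions)
            {
                Console.WriteLine($"Processing dimension: {dim.Name}");
                dim.Process(ProcessType.ProcessFull);
            }

            // ...then process the cubes themselves.
            foreach (Cube cube in db.Cubes)
            {
                Console.WriteLine($"Processing cube: {cube.Name}");
                cube.Process(ProcessType.ProcessFull);
            }
        }
        finally
        {
            server.Disconnect();
        }
    }
}

Processing the dimensions explicitly, and before the measures, is exactly the ordering issue described above.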

Here’s a screen shot.

Reporting Tools

Since I build data warehouses and OLAP cubes I deal often with reporting systems (but thankfully don’t have to build reports very often!) and I’ve built a number of report tools, systems, and utilities.  I’ve found that reports are often rather scattered across multiple tools, technologies and sources.  It’s complicated.  The user has to go here and then there and then there and use this program or this web page or drill down from this dashboard and it can get confusing for them.  And they often just don’t know what reports are available to them and thus miss out on what an integrated reporting system can do for them.

So I started out by building a simple reporting Main Menu program that coalesced all the different types of reports into a simple tree type menu.  Users could simply explore the report options available to them and click their way to any report that they would need.  (Note that I didn’t have access to DevExpress controls for these following GUIs.  Nowadays, I have my own licensed copy!)

This morphed over the years into me building the Trinity Report Manager and Reporting System for AT&T in 2009.  This was a C# Windows ClickOnce app using DevExpress for color and functionality.  It allowed users to see a tree of all the reports available to them, click on them, and see them in an MDI form or a tabbed or detached window view.  It supported SSRS reports, web based reports, any report program, a built-in OLAP cube browser, and a special set of SQL and MDX cube report capabilities in what I called a “Dynamic Report Viewer”.  This Dynamic Report Viewer made use of stored report definitions in the reporting database to automatically prompt users for report parameters and then execute the SQL or MDX necessary for the report and show a grid or chart or both of the results.  It was quite flexible, and I had 87 beta users and lots of rave reviews at the time that management changes ended the development program.  I would show you some screen shots here but I can’t due to the IP restrictions.

I very well may build an independent commercial and freeware version in the future and then you’ll be able to see some screen shots here (and download and actually use it).

Data File Loading

One very persistent and annoying problem area that I’ve encountered is loading data from files sent in by various customers.  They had different structures and often very strange lines in the files (such as repeating headers and footers from the exports of different reporting systems).

Ever tried to import a million row data file with a couple of null characters sprinkled somewhere in the file?

What happens is that you try and try to load it via SSMS (SQL Server Management Studio) data import functionality (or via SSIS) and it craps out time and time again SOMEWHERE in the file but you can’t tell exactly where the data import is crapping out.  Or what caused the problem.  I watched a database developer struggle for a week on this type of problem.  So in a couple of hours I wrote a working prototype tool that scans a file looking for problem characters, reports where and what they are, and optionally strips them out of the file.  Run time on a million+ row file was less than 20 minutes I recall and it produced a file that could then be imported without errors.  Problem solved.  (I guess I build tools to solve problems in the environments I find myself in.)
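A minimal sketch of the scan-and-strip idea (illustrative only; the real tool reports more detail, and the file paths and character rules here are simplified assumptions):

// Rough sketch: scan a data file for null (and other stray control) characters,
// report where they are, and write a cleaned copy.  Paths are just examples.
using System;
using System.IO;

static class FileCleaner
{
    public static void ScanAndClean(string inputPath, string cleanedPath)
    {
        long position = 0;
        int problems = 0;

        using var reader = new StreamReader(inputPath);
        using var writer = new StreamWriter(cleanedPath);

        int c;
        while ((c = reader.Read()) != -1)
        {
            position++;
            char ch = (char)c;

            // Null characters (and control characters other than tab/CR/LF)
            // are what typically break SSMS/SSIS imports.
            if (ch == '\0' || (char.IsControl(ch) && ch != '\t' && ch != '\r' && ch != '\n'))
            {
                problems++;
                Console.WriteLine($"Problem character 0x{c:X2} at position {position}");
                continue;   // strip it from the cleaned output
            }
            writer.Write(ch);
        }

        Console.WriteLine($"Scan complete: {problems} problem character(s) found and removed.");
    }
}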

So this morphed into me building a full suite of data file import analysis, cleaning, and loading tools.

Here are some screen shots:

Instead of struggling with dev and commercial tools that don’t handle the situation or problems, I tend to make tools to get around, or through, the problem.  (I also build complete apps, of course!)

Difficult problems I tackle and solve.

Here’s a Business Rule Engine GUI for creating process rules for a backend business rules engine that I also built.  Yes, I also build back end process engines.  I’ve developed a nice IPC (Inter-Process Communication) system to control Windows services beyond what is available with the simple start-pause-stop default Windows service control.  Generally, you want to (and need to) be able to monitor and control back end processes, but I find that most people just use the “start it and pray” and “customer will just scream when it stops working so I’ll know then” approaches.  I like a much more proactive approach, and I like to see at any time what is happening with my databases, cubes, and processes.  So when a user (or a manager) calls me and says, “The XYZ system is down!”, I can say, “I already know and I’m working on the cause and solution right now”.  (I write good and reliable code, so when that happens it’s most often a network event as the cause.  I think that my code is much less buggy than most other developers’ because I get few bug reports.  Mostly just requests for new features.)
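As one small illustration of going beyond start-pause-stop (this is just the basic custom command channel that Windows services already expose, not my IPC system itself; the service and command names are hypothetical):

// Windows services accept custom command IDs in the range 128-255, which a
// monitoring/control layer can build on.  Names here are hypothetical.
using System.ServiceProcess;

public class XyzProcessingService : ServiceBase
{
    public const int CmdReportStatus = 200;   // must be 128-255

    protected override void OnCustomCommand(int command)
    {
        if (command == CmdReportStatus)
        {
            // e.g. write current queue depth and last-run times to a status table or log
        }
        base.OnCustomCommand(command);
    }
}

// From a monitoring tool or admin GUI:
//   using var sc = new ServiceController("XyzProcessingService");
//   sc.ExecuteCommand(XyzProcessingService.CmdReportStatus);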

Web ASP.NET C# Development and Applications:

Last year I decided to make a list of all the software, both Windows and Web apps and tools, that I have developed or have in the prototype stage over the last 10 years or so.  I came up with a list of over 100 pieces of software.  The above are just a few of them.  I’ve thought about cataloging them and putting them into a database with an image screen shot and description and then doing a web interface so that you can browse them, but that’s a Phase 2 project.  For now, Phase 1, you get this long list with screen shots sprinkled in for a small sampling of the many software apps that I have built.  I am rather prolific.  And creative.  When I see the need for a tool or an app to do something that should be automated or created I will tend to build it.  Because, I guess, I have the “Builder” mentality.

“I Think…And see the potential for something Better…And thus…I Build!”

Here are some of the recent web application systems that I’ve built:

  • ERD – Engineering Rules Database.
    • Developed a KPI/KCI data dictionary for network engineers.
    • C# in ASP.NET with DevExpress components.
  • EBT – Engineering Budget Tool.
    • C# in ASP.NET with DevExpress components.
    • Financial tracking tool replacing Excel spreadsheets to track network component budgeting and financial planning.
    • Built additional components in Windows Forms:
      • Excel spreadsheet data loader to SQL Server database.
        • Automated, with extensive data cleaning functions.
      • Excel-like data editing tool with data export capabilities.
  • Aspire Web Application Framework.
    • C# in ASP.NET with DevExpress components.
    • Modular components including a custom dashboard.
    • Database driven.
    • Used to power the EBT application.
  • DnF Charts Tool Application System.
    • Provides semi-automated construction of complex performance charts via a wizard interface for network engineers and managers.
    • C# in Windows Forms with DevExpress components.
    • ClickOnce web deployment for the Builder/Editor component.
    • Web ASP.NET chart viewer.
    • Chart Manager provides management of hundreds of charts in a database repository.

I cannot show you screen shots here due to IP agreement considerations.

However, I can show you a nice screen shot of my web dashboard system (a work in progress):

Users can add, delete, move, and re-arrange the widgets and save the custom widget layouts or restore the original default layout.

Using the menu at the top of the page users can gain access to all the other screens (pages) in the application.  The dashboard page can be used as a customizable user landing page or as a feature for advanced users.  Widgets are essentially standard ASP.NET ASCX controls, so anything that can be done in C# ASP.NET can be done in a widget, including charts, gauges, lists, images, and so on.
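As a rough sketch of how the host page pulls this off (illustrative only; the widget paths and the WidgetZone placeholder are hypothetical, and the real system reads each user’s saved layout from the database):

// Sketch of a dashboard page loading widget user controls (ASCX) dynamically.
using System;
using System.Web.UI;

public partial class Dashboard : Page
{
    protected void Page_Init(object sender, EventArgs e)
    {
        // In practice the list of widgets, their zones, and their order come
        // from the user's saved layout (or the default layout) in the database.
        string[] widgetPaths =
        {
            "~/Widgets/SalesChartWidget.ascx",
            "~/Widgets/AlertListWidget.ascx"
        };

        foreach (string path in widgetPaths)
        {
            Control widget = LoadControl(path);   // any ASCX control can be a widget
            WidgetZone.Controls.Add(widget);      // WidgetZone: a PlaceHolder on the page
        }
    }
}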

What the world (and your web app project) needs is a nicely functional and affordable web dashboard!

Database and Data Warehouse Architecture and Development:

I have developed many databases and quite a few small to medium size data warehouses (often called “data marts”) as well as OLAP cube staging databases.  This section will highlight some facets of my data warehouse experience.

ETL.  What is ETL?  ETL = Extract, Transform, and Load.  ETL is the SQL (or other means) that you use in your database(s) to load data into a data warehouse or OLAP cube data staging database.  It typically consists of a number of stored procedures which are scheduled to run nightly or daily.

Here’s a simple list of a set of ETL stored procedures that I have developed for loading an OLAP cube staging database:

While the prefix of “uf_sf_” isn’t the most useful (it was the standard for the company involved), you will notice that the rest of the naming convention is quite nicely useful, consisting of:

  1. A two digit number sequence that indicates exactly where the script falls in the run sequence.  (This is very useful as they have to be run in this exact order.)
  2. My name, as to who wrote the procs.
  3. The name of the OLAP cube these are for.
  4. The type of operation that the procedure performs.
  5. And the class of the database object that the proc operates on.

This makes for a nicely organized and maintainable set of ETL procs.  

Here’s a brief sample of my SQL for an ETL proc:

And About SSIS:

 

I also do SSIS – SQL Server Integration Services.  I generally use SSIS as a wrapper for ETL SQL.  I don’t usually do a lot of complex data transformations in SSIS.  The reason is quite simple:  Using SQL natively is quite a bit faster performing.  Why extract millions of rows of data to another server, do some work on it, then put it back?  Inserts and updates (and every transaction) are logged in SQL Server, which massively slows things down.  Better to do as much as possible right on the SQL Server rather than slinging data around to different servers.  And a lot of complexity is avoided as well.  Just about anything that can be done in SSIS can be done better directly in SQL, often using some cleverness with CTEs (Common Table Expressions) and SQL CLR functions.  Thus the folks who will wind up maintaining the ETL some time in the future don’t have to be SSIS gurus just to figure out relatively simple data transformations.

And there’s another thing about SSIS – you can’t tell which SSIS package is taking what CPU time and RAM.  In other words, with SSIS packages you can’t tell how they are performing because Microsoft didn’t see fit (after all these years of SSIS) to differentiate how the threads for the packages are performing in the system performance counters.  You can only see how SSIS as a whole is doing.  So you have no real idea of how well or badly your SSIS processes are performing.  So jumping on the “do everything in SSIS” bandwagon is not necessarily a good idea.  (Unless you’re a moral-less developer who wants to keep everything as complex as possible and undocumented for your own guru-hood and job security.  I’ve run into far too many of those types of people!)

I also document my procs.  Here are some screen shots:

  

Production Systems Architecture:

Complex software, as opposed to fairly simple tools, is often a complex system that needs to be properly architected, designed, implemented, deployed, documented, and maintained.  Often there are multiple servers and multiple pieces involved.  Proper planning for a production environment is often overlooked by developers who are really just programmers or coders and not true developers of complete production systems.

Here is a nice diagram from my past work in developing software solutions that utilize multiple servers and user groups.

A complex software system needs to be architected not just developed.

Here are some screen shots of some architecture and design diagrams that I’ve produced.

A software system processes data.  So there needs to be a Data Processing System diagram and architecture.

Data goes into a software system…

And is processed and output.

And there’s another often overlooked dimension to a software system…Process Control.  How do you control and monitor the process?  You have an architecture and a system set up for that, right?

OLAP Cube Design and Development:

I have built a lot of OLAP cubes.  I am quite good at them.  I have built cubes much faster than others because I know exactly what I am doing, I can quickly determine how the existing database or data warehouse data needs to be cubed, and I do the entire cube system – specs, data staging databases, ETL, OLAP cube design and development, and documentation and training.

So how exactly do you show someone the cubes that you have built?  Well, that is a challenge because, unlike software GUIs, OLAP cubes don’t really have a visual face.  And they are highly proprietary, so I can’t show you any with corporate data.  Instead, what I will show you here are some screen shots of some prior cube documentation that shows some of the cubes I’ve built, the numbers of their measures and dimensions so you can get an idea of their complexity, and some purposely grainy screen shots of before and after cube data staging schemas.

Here’s a list of some cubes that I have done with their size and number of measures and dimensions.

Virtually every OLAP cube has a time dimension.  Here’s a screen shot of my Date Hour dimension, which provides hourly granularity for my production cubes.
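As a rough sketch of how an hourly-grain date dimension like this might be populated (illustrative only; the DimDateHour table and column names are hypothetical, and in practice this is often done in straight SQL):

// Sketch: generate hourly-grain rows for a DimDateHour table and bulk-load them.
using System;
using System.Data;
using System.Data.SqlClient;

static class DateHourDimensionBuilder
{
    public static void Build(string connectionString, DateTime start, DateTime end)
    {
        var table = new DataTable("DimDateHour");
        table.Columns.Add("DateHourKey", typeof(int));   // e.g. 2013121508 = 2013-12-15, hour 08
        table.Columns.Add("Date", typeof(DateTime));
        table.Columns.Add("Hour", typeof(int));
        table.Columns.Add("DayOfWeek", typeof(string));

        for (DateTime dt = start; dt <= end; dt = dt.AddHours(1))
        {
            int key = int.Parse(dt.ToString("yyyyMMddHH"));
            table.Rows.Add(key, dt.Date, dt.Hour, dt.DayOfWeek.ToString());
        }

        using var connection = new SqlConnection(connectionString);
        connection.Open();
        using var bulk = new SqlBulkCopy(connection) { DestinationTableName = "dbo.DimDateHour" };
        bulk.WriteToServer(table);
    }
}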

And virtually every OLAP cube starts out with an existing database with an existing data model that is often a mess.  Here’s one that I had to start with (purposely out of focus here):

And here’s the “after” screen shot, after I have developed a clean data staging database with multiple fact tables and dimension tables in a proper dimensional data model with good dimensional integrity.  There’s a lot of work that goes on behind the scenes in preparing the data to be in a format that can be cubed.  And that can make a good cube.

There are a bunch of techniques that I’ve developed over my years of cubing that allow me to develop cubes quickly and easily.  And more than a few “traps” that I avoid.

I’ve thought of writing a book on real world OLAP cube development but other topics and adventures have occupied my time instead.

OLAP cubes are a lot of work and should only be undertaken after a good cost versus benefits review or analysis.

Dashboard Design and Development:

I have developed a number of dashboards using a number of products and technologies.

For the CareCentric Company (now out of business) I developed a Windows desktop dashboard that used SQL Server Reporting Services (SSRS) reports and charts encapsulated into SSRS report viewer viewports to make a nice dashboard for their health care equipment product suite.  This approach required that the SSRS charts be small enough to display properly in the dashboard, and it had the big advantage that anyone who could make SSRS charts could make report grids and charts for the dashboard.  This could easily have been carried over to a web version of the dashboard.  The dashboard was well received by the users and potential customers at trade shows.

 

For AT&T I built 17 prototype and production dashboards using the Dundas Dashboard product.  This is a good dashboard product, although it is a little complex and was, at the time, based on Silverlight technology.  You had to continually write scripts to power features in the dashboards, and this required C# code snippets in Silverlight to extend their object model.  I did the POC (Proof Of Concept) requirements specifications, facilitated training sessions, and got Dundas to significantly reduce the price on their licenses ($35K versus $80K) for my department.  All in all, it was good technology for that time period and it performed well with both SQL databases and OLAP cubes.

I can’t show you screen shots of the dashboards I built with Dundas Dashboards here as the information in the screen shots is proprietary, of course.

The complexity of always having to write C# scripts deterred others in my department from developing dashboards, and the powers that be wanted an easier, “one tool that does it all” product that could handle their multiple types of GUI needs (beyond C# development).  They tried Iron Speed and Report Portal with varying success and finally decided on the LogiXML dashboard product.

To make a long story short, LogiXML became the standard dashboard product, I trained in the product, and they eventually hired a very experienced and talented (and certified and nice) LogiXML developer to do the dashboard work.  They have even creatively “adapted” the product to some generic GUI needs.  It’s a JavaScript and server-side scripting based product, and they just used some JavaScript to start building GUIs.  So it’s a good dashboard tool for them.

But I really want a much better dashboard product.  One that runs on both desktops and on the web.  One that doesn’t require developers and developer experience to customize, develop, and run.  One that is, well, better and easier to use.  So I have the architecture and prototype work complete and plan (hope) to complete these super dashboard products in the coming year so I can bundle them with my Practical Dashboard Book.  Stay tuned!

While I can’t show you the dashboards that I’ve built, here are some pages from some of my writings on dashboards so you can get a bit of a feel that I know what I’m talking about.  And one of the first things to learn is that dashboards are a system, not just a customized product.  What you see is the dashboard, but underneath it, supporting and driving it, is most often a complex system of ETL, SQL, and staging databases….

Technical Documents:

Yes, I develop and write technical documentation for the projects and products that I develop.  (But please don’t tell anybody, because we all know that real developers don’t write documentation, right?)  I think that this is a very good practice that any reasonably competent developer can follow (as long as they are not lazy and are confident enough that they don’t have to keep their software babies secret so as to ensure job longevity).

Generally, when I develop a software system or product I will want to develop the following set of minimal documentation:

  • Quick Start Guide – A short quick read on how to get started using the software system.
    • Remember:  Software is really a SYSTEM not just a thing.  And it should be developed and taught as such.
  • User’s Guide – This is a much more detailed guide for users, from the user’s perspective, on what the software does and how to use it and its many features.
    • Typically I will develop this as a PowerPoint for training and outline purposes then merge that data into a Word document for the formal document.

Often I will want to (but am often constrained by time and priorities) develop the following additional documentation guides for the software system:

  •  Installation Guide – How to install the software.  Useful when there’s a database involved.
  • Administrator’s Guide – What a DBA and web server administrator needs to know on how to administer the software system.
  • Technical Guide – Contains important technical information on the architecture, design, and implementation of the software system for future developers.

Obviously, these aren’t needed for tool applications, but they are invaluable for complex database-based software systems with a lot of moving parts and typically multiple people to take care of them after I release the system to production.

I’ve developed quite a bit of technical documentation for my projects over the years.  Most places don’t have a tech writer on staff so I write them up myself.  Not as well as a high grade tech writer but definitely good enough to read and to use. 

Following are some snapshot images of documentation that I have developed for my projects.

Patents:

I have two U.S. patents for electronic metering devices from back in the old days when I was a hardware and microcontroller engineer:

Both patents were invented by me alone, and I wrote and processed the patent applications through the patent application process myself, with occasional advice from a patent attorney.  Because I wrote them up myself I saved about $20K per patent at the time and received much better written patents than if I had had an attorney do them.  In fact, in one of the patents I slipped in an additional new claim while the patent was being processed, which is something that is supposed to be impossible.  Well, I did it, but don’t let the PTO (Patent and Trademark Office) know that I slipped one past them!

Note that both of these patents are listed with my old middle initial of “R” (for Rowen), which I formally changed in 1994 to a new middle initial of “G” for “Grant”, which is what everyone has called me for many years.  I don’t really use my first name because its spelling is continually confused and it gets misinterpreted as “Gary”, etc.

Regarding more patents, I could easily have had dozens of patents in many areas by now except for the fact that they have gone from very expensive to horribly expensive.  Back when I did mine it was about $20K for just a patent application and processing.  Edison had 1,093 U.S. patents, but back then patents were handwritten on a single sheet of paper, cost less than $50.00 to file, and were granted in a couple of weeks.  Nowadays, it has risen to $50-150K for an averagely complex patent.  So it would take about $1.2 million to $3.6 million to file the two dozen patentable ideas that I’ve had since then, and that’s a bit much.  And to defend a patent (didn’t I mention that you also have to DEFEND it in court?) costs about $1 million per case nowadays.  So patents are a wee bit on the expensive side these days.  So I only have two, but I may, depending upon commercial plans and possibilities, apply for more in the future.

Technical Publications:

I have written many technical papers ranging from tech notes to architectural specifications to complete documentation sets for projects and products that I have developed.  I have a large backlog of potential technical articles to publish and have recently started to look for and find venues for technical articles.

My first formal technical article beyond the corporate environment is on the SubStringInstance SQL Server CLR function that I wrote several years ago to solve a data extraction problem I encountered while developing ETL for a data warehouse.  I published this topic as an article on the Code Project website, which is well known as an excellent site for development articles and information.
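To show the general pattern of a SQL CLR scalar function (a hedged, generic illustration only; this is not the published SubStringInstance code, whose exact signature and behavior may differ):

// Generic illustration of a SQL CLR scalar function of this kind.  This is NOT the
// published SubStringInstance code; it is a sketch of the pattern only.
using System.Data.SqlTypes;
using Microsoft.SqlServer.Server;

public static class StringFunctions
{
    [SqlFunction(IsDeterministic = true, IsPrecise = true)]
    public static SqlInt32 NthIndexOf(SqlString text, SqlString search, SqlInt32 instance)
    {
        if (text.IsNull || search.IsNull || instance.IsNull || instance.Value < 1)
            return SqlInt32.Null;

        int index = -1;
        for (int found = 0; found < instance.Value; found++)
        {
            index = text.Value.IndexOf(search.Value, index + 1);
            if (index < 0)
                return SqlInt32.Null;   // fewer than N occurrences
        }
        return new SqlInt32(index + 1);  // 1-based, like SQL Server's CHARINDEX
    }
}

Once the assembly is deployed (CREATE ASSEMBLY, then CREATE FUNCTION ... EXTERNAL NAME), a function like this can be called inline in ETL SQL right alongside CHARINDEX and SUBSTRING.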

I have more technical articles to format and publish and am currently looking for technical websites where the information would be useful and well viewed by people who could use the information and techniques that I’ve developed over many years of experience.

I have a book that I have been working on irregularly over the last several years.  It’s an overview of how to effectively use OLAP cubes in the corporate environment.

  • The Manager’s Guide to OLAP Cubes (work in progress).

Throughout my journey in building OLAP cubes for small to medium to very large organizations I have found, consistently, that there are many not quite well formed and often incorrect ideas and concepts about just what exactly OLAP cubes are and how to effectively and profitably apply them in various corporate and non-profit environments.  This book is meant to provide that insight and turn a very technical subject into one that can be understood and effectively managed by semi-technical managers and also users.  Rather than being a dry, step-by-agonizing-step technical rendition of how to technically build OLAP cubes, this book addresses many real world concerns that are totally missed by the many technical books on OLAP cubes.  I have a real world sample OLAP cube and OLAP cube browser software to go with the book so readers can actually have hands-on learning with OLAP cubes.  I am planning to complete the book within the next year and perhaps create a short video course based on the book.

I have many more technical books that I could write and publish including:

  • The Practical Dashboard Book – Learn all about dashboards by building your own dashboard in a few hours.
    • This book will be unlike any current dashboard book as it will include a desktop dashboard program which will allow non-technical users to easily construct a dashboard in a few hours (instead of having to spend $50K+ on a commercial dashboard product and weeks to months to get a dashboard online).
  •  Tech Check – How to technically check out job candidates to find the good ones.
    • Includes Tech Check software and a database of C#, database, OLAP cube, and other technical questions and answers (as well as the Developer Sushi Test).
    • I am outstanding at technically checking out developer candidates over the phone, using a few simple principles and rules that most managers and recruiters overlook but could learn in a few hours.
  • Software Psychology – So why do some (few) software development projects succeed while the majority of others fail often spectacularly?  Why do some software (and hardware) products succeed where others don’t?  Why are most software developers actually not very good at producing good, usable software on time and on target while a few “super star” developers can do it in their sleep?  The answer is the psychology of software development. What I call “Software Psychology”.  Read this book and find out the answers on how to use and develop “super star” developers, free them from “swimming with cinderblocks”, and make your software development projects a success instead of a fizzle or dud.
  • The T3 Silver Bullet Project Management System for Software Development – We have many software development methodologies and they sometimes work but most of the time don’t really work.  Scrum is the latest fad/craze and it’s fading because it doesn’t really work with projects that need more than 1 or 2 days to add a feature or functionality.  Which is really about 93% of all the software projects out there at some point or another.  So what’s the real, usable, working answer?  It all starts out with 3 simple objectives – On Time, On Target, and On Track, the three “T’s” in the T3 Software Development Project Methodology…

But my time is limited and I am unsure when I will have (or make) time to finish writing and publish these books.  I may create instead video training courses as they have a higher profit and are easier to produce in many ways.  (And easier and faster for many people to view than to read a book.)

Video Training Courses:

I have created two training videos for AT&T (2012):

  • #60460502 – Introduction to OLAP Cubes.
  • #60468003 – Excel Pivot Tables with OLAP Cubes.

Both video courses were developed by me to address the learning challenges involved with the deployment and use of OLAP cubes in the corporate environment.  I developed both courses’ curricula and training plans to directly address the needs of users experienced with Excel but not experienced with Excel Pivot Tables or OLAP cubes.  I found in my background research that there weren’t any technical courses or web tutorials addressing these two subjects.  So I built both lesson sets from the ground up, focused them on just what the user audience needed to know, and created the two courses around PowerPoint-based presentations mixed with regular periods of guided hands-on learning so as to get the practical use of OLAP cubes and Excel Pivot Tables with OLAP cubes “locked in” for the users.

I recorded and edited both video courses using Camtasia Pro.  Both courses were accepted and added to the AT&T corporate learning course system (LSO), which provides the courses to all 282,720 AT&T employees.  I also taught both courses to several audiences of users via the internal AT&T Connect video conferencing service (which is like WebEx).
