
How Lessons Learned Are Managed

GMAT lessons learned include practices that worked well and should continue, as well as larger-scale changes we should make to improve the software or the project. Each lesson learned is discussed by the team, and if we decide there is a real issue, we require a plan for improvement. To make sure we handle lessons learned efficiently, here are some high-level guidelines for creating them.

What is a Lesson Learned

Lessons learned are issues that caused, or could have caused, significant problems, or cases where something we did significantly improved the software or the project. Lessons learned warrant group discussion and usually a change in team habits, processes, or strategy.

Lessons learned satisfy one of the following criteria:

  • Issue that is putting the project at greater risk than necessary
  • Issue that is causing significant inefficiency
  • Issue that is significantly lowering quality

What is Not a Lesson Learned

A lesson learned is not a minor annoyance, a tweak to an existing process, or something that can be resolved between team members in the everyday process of getting work done. Team members should raise these types of issues at meetings, or resolve them among the team members involved.

A minor issue (i.e., not a lesson learned) satisfies one of these criteria:

  • Tweak to an existing process
  • Minor annoyance or gripe
  • Can be resolved by picking up the phone, discussing via email, or raising at a weekly meeting
  • Does not require significant change in habits or processes

Things We Should Keep Doing

Things We Should Change

Do Better

  • Linux script test system runs are currently performed from a folder named "linuxBin" in the test/script folder to avoid conflicts with the test/script/bin folder. The scripts used to test on Linux should be merged into the bin folder to simplify repository updates. (This has been tested with no conflicts, but it is too close to release to make the change until R2018a is out the door.)
  • There were still many test configuration changes after QA Complete, including nav tests we should have been running. We need to formalize how we verify that the test system configuration is nailed down by QA Complete: which folders should be run, review of the profiles spreadsheet, etc. We made many changes just two days before Code Freeze.
  • Scrub the system for deprecated features and remove at least the low-hanging fruit, in both code and tests:
    • Save
    • LibCInterface
    • TrackingSystem
    • StatisticsAcceptFilter
    • StatisticsRejectFilter
  • Always build the system and run smoke tests before committing and pushing code. (This includes when merging code from one branch into another.) This is particularly critical near system freezes (QA Complete, Visual Freeze, and especially Code Freeze).
  • Check sample missions on all platforms
    • Currently, running the samples is assigned to one person. We should modify the release process to assign one person per platform.
  • Seriously consider adding the bundling shell scripts to the daily build script so that we create the zip/installer every day and test against those artifacts.
  • Include all of the plugin components when running smoke tests and system tests.  We are looking for side effects as well as failures in the component being worked, so we need to watch for those everywhere.
  • Raise and resolve test case failures as early as possible.  The recent issue with Ephem_GMAT_Code500_ACE_OpsPrototype_v13 was thought to be a config issue, but there were other signs of a problem that we neglected.
  • Some pushes to the central repository do not compile, which makes it difficult to track down when a change occurred. When this is unavoidable, the commit message should state that the code does not compile, and why.
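
The "build and smoke test before pushing" rule above can be enforced locally with a git pre-push hook. The following is a minimal sketch, not part of the actual GMAT build: the Makefile target and the run_smoke_tests.sh driver script are placeholder names to be replaced with the real build and smoke-test commands.

```shell
#!/bin/sh
# Hypothetical .git/hooks/pre-push sketch: build and smoke-test before any
# push reaches the central repository. "make all" and run_smoke_tests.sh are
# placeholders for the real GMAT build and smoke-test commands.

run_checks() {
    # Build first; a broken build should never reach the central repository.
    if ! make all; then
        echo "pre-push: build failed -- push aborted" >&2
        return 1
    fi
    # Then run the quick smoke suite (not the full system tests).
    if ! ./run_smoke_tests.sh; then
        echo "pre-push: smoke tests failed -- push aborted" >&2
        return 1
    fi
    return 0
}

# Only run the checks when invoked as a git hook (git sets GIT_DIR for hooks).
if [ -n "${GIT_DIR:-}" ]; then
    run_checks || exit 1
fi
```

The hook runs on every push, including branch merges, so the freeze-period rule is applied automatically rather than by memory.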
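The daily-bundling suggestion above could look roughly like this as an addition to the daily build script. This is a sketch only: make_bundles.sh, its flags, and the nightly/ output directory are hypothetical names standing in for the real bundling scripts and release layout.

```shell
#!/bin/sh
# Hypothetical extension to the daily build script: after the nightly build
# succeeds, also produce the zip/installer bundles so that testing runs
# against the same artifacts we ship. make_bundles.sh and nightly/ are
# placeholder names, not the real GMAT release layout.

STAMP=$(date +%Y-%m-%d)
OUTDIR="nightly/$STAMP"

bundle_nightly() {
    mkdir -p "$OUTDIR" || return 1
    # Reuse the existing release bundling scripts rather than duplicating
    # them; the flags below are illustrative, not a real interface.
    # ./make_bundles.sh --zip --installer --output "$OUTDIR"
    echo "bundles for $STAMP written to $OUTDIR"
}

bundle_nightly
```

Dating the output directory keeps each night's bundles separate, so a regression found in testing can be traced to the exact daily artifact that introduced it.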

Stop Doing
