Ken Larsen's web site - How to improve decision making


We live in a complicated world.  There is a seemingly endless number of problems and no shortage of opinions on how those problems should be addressed.  Making decisions can be difficult.


I assert that there are three simple approaches that can dramatically improve decision making:



1.0 Ready, Fire, Aim

  1. At the beginning of a meeting, state the problem that needs to be solved.

  2. Ask for proposed solutions, but specify this ground rule:  No one is to criticize any suggestion until a lengthy list has been created.  Reason:  What normally happens otherwise is that people criticize each suggestion as it is put up, the moderator erases it, and after a couple of hours you have nothing ... except for a room full of disgruntled people ... particularly those whose suggestions were immediately criticized.

  3. Go back and list the pros and cons for each suggestion and give each suggestion a net weight.

  4. Sort the list by the net weight.

An alternative way of stating "Ready, Fire, Aim" is:  "The best way to generate good ideas is to generate lots of ideas and then throw away the bad ones."
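The scoring in steps 3 and 4 can be sketched in a few lines of Python.  The suggestions and their pro/con counts below are invented for illustration; the only substance is the net weight (pros minus cons) and the sort:

```python
# Ready, Fire, Aim, steps 3-4: give each suggestion a net weight
# (pros minus cons), then sort by that weight.  The suggestions and
# counts here are made up for illustration.

suggestions = {
    "Extend library hours": {"pros": 5, "cons": 2},
    "Hire more tutors":     {"pros": 7, "cons": 3},
    "Buy new textbooks":    {"pros": 3, "cons": 3},
}

# Sort by net weight, highest first.
ranked = sorted(
    suggestions.items(),
    key=lambda item: item[1]["pros"] - item[1]["cons"],
    reverse=True,
)

for name, w in ranked:
    print(f"{w['pros'] - w['cons']:+d}  {name}")
```

The point of the sort is only to surface the strongest candidates; the group still has to judge the top of the list.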


Here are two examples:


2.0 Kepner-Tregoe Decision Analysis

  1. At the beginning of a meeting, state the problem that needs to be solved.

  2. List "musts" that a solution must satisfy.

  3. List the objectives.

  4. Weight the objectives from 1 to 10 with 10 being the highest weight.

  5. Invite the group to propose solutions (see Ready, Fire, Aim).  Note:  Each solution must comply with the musts.

  6. Weight the solutions based on how well they meet each of the objectives.

  7. Multiply each solution weight by the corresponding objective weight and total the products for each solution.

  8. Re-sort the table by the totals.

  9. Reexamine the top solutions and consider side effects.
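Steps 6 through 8 above amount to a weighted sum per solution.  Here is a minimal sketch; the objectives, weights, and solution names are all invented, and any solution failing a "must" is assumed to have been dropped before this point:

```python
# Kepner-Tregoe style scoring: objective weights (1-10) multiplied by
# each solution's score against that objective, summed per solution,
# then sorted by total.  All names and numbers are illustrative.

objectives = {"low cost": 9, "fast to implement": 6, "easy to maintain": 4}

# How well each candidate meets each objective (1-10).
scores = {
    "Solution A": {"low cost": 8, "fast to implement": 5, "easy to maintain": 7},
    "Solution B": {"low cost": 4, "fast to implement": 9, "easy to maintain": 6},
}

# Step 7: multiply and total.
totals = {
    name: sum(objectives[obj] * s for obj, s in objscores.items())
    for name, objscores in scores.items()
}

# Step 8: re-sort by the totals, highest first.
for name, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(name, total)
```

Per step 9, the totals are advisory: the top one or two solutions still deserve a second look for side effects before a final choice is made.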


For an example, see


I learned about Kepner-Tregoe back in 1979 when I took a week-long class on it.  To this day, I regard it as the best class that I've ever taken.


3.0 Microsoft Word's table sort capability


This is not really a decision-making tool by itself, but it is a way to close out a brainstorming session once a lot of solutions have been proposed.  You simply place all the proposed solutions in a Microsoft Word table and then add a column or columns that record the group's weight for each solution.  Then you use the sort facility to sort the table by the weights.


For an example, see section 2.0 in the linked page.  That section has a list of 45 recommendations that were proposed in a document written by the local chapter of the NAACP.  I added three columns to the right and assigned my own weights; the table can then be sorted by those weights.  It would be best if the School Board assigned the weights, because they are the ones who are responsible for school issues.  I'm not on the School Board.


4.0 Example of Problem Solving using metrics


(Pat Heinrich, January 4, 2016) I looked at the Ready, Fire, Aim information on your website. It makes tremendous sense. It reminds me of a tool I used early in my career called Failure Mode and Effects Analysis (FMEA). I was working in chemical plants at the time, and in production, nothing is more important than maintaining quality production. We would brainstorm issues that would upset or stop production and rank them along three axes: severity, detectability, and frequency. Each of these would be multiplied by a weighting factor, and the weighted axes would be summed to get a total weight or rank. This would inform the team on which issues would be most important to focus on.

This tool was very helpful in production environments. Without it, emotions often weighed into which problems were prioritized. For example, spectacular large issues that might shut down the production process were often prioritized because they were at the forefront of everyone's mind. However, these issues might be virtually undetectable and might happen only once every 12 months or so. But because they had high severity and high visibility, they were often prioritized as something that needed to be addressed. More frequent but less spectacular issues were often deprioritized, even though they were more easily detectable and, because of their frequency, impacted the production process more over a whole year. Using this FMEA framework, we were able to definitively rank issues and set aside the emotional factors that would sometimes drive us to address the more spectacular issues that had less overall impact on the production process.


Ready, Fire, Aim looks like a similar framework to me that is focused on potential solutions rather than problems or issues. And it looks more general than the FMEA. Personally I think that it makes a lot of sense to use Ready, Fire, Aim as a tool to prioritize solutions to equity and achievement issues in our schools. When you used this tool in the past, who were the participants? When we used the FMEA process, we usually had a broad cross section of technical experts and higher level stakeholders (like production supervisors and the plant manager). When I imagine using Ready, Fire, Aim for equity and achievement solutions in the schools, I see the School Board as stakeholders, but the real experts are likely to be administrators and teachers or even outside experts that may have used some of these techniques at other school systems.
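The FMEA-style ranking Pat Heinrich describes can be sketched as follows.  Note that classic FMEA multiplies the three scores together (a "risk priority number"); the version below follows the letter's description instead, multiplying each axis by a weighting factor and summing.  The axis weights and example issues are invented:

```python
# FMEA-style ranking as described above: score each issue on severity,
# detectability, and frequency; multiply each axis by a weighting
# factor; sum to get a total rank.  (Classic FMEA multiplies the three
# scores instead.)  All weights and issues here are illustrative.

axis_weights = {"severity": 3, "detectability": 2, "frequency": 2}

issues = {
    "reactor trip":    {"severity": 9, "detectability": 2, "frequency": 1},
    "valve drift":     {"severity": 4, "detectability": 8, "frequency": 7},
    "filter clogging": {"severity": 3, "detectability": 9, "frequency": 8},
}

def rank(issue):
    """Weighted sum of the three axis scores."""
    return sum(axis_weights[axis] * issue[axis] for axis in axis_weights)

for name, scores in sorted(issues.items(), key=lambda kv: rank(kv[1]), reverse=True):
    print(name, rank(scores))
```

With these made-up numbers, the frequent and easily detected issues outrank the rare, spectacular one, which is exactly the emotional bias the letter says the framework corrects.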


5.0 Story told by one of David Schwartz's professors


A story one of my professors used to tell:

A man told his psychologist he was distressed because he was having difficulty making an important life decision. The psychologist advised him to go home, write down all the pros and cons associated with each alternative, assign weights to the different outcomes, crunch the numbers, and then choose the option with the greatest expected utility.

A week later, the man again visited the psychologist, who asked him, “So, did you use the procedure I suggested?”

"I sure did,” the man replied. “I did it 20 times.” 

"20 times?" the psychologist asked incredulously. “Why so many?” 

"Because,” the man replied, “the answer kept coming out wrong.”


