
Decision Tree Diagrams
Posted on March 21, 2016 @ 08:20:00 AM by Paul Meagher

I've been reading quite a bit about decision making lately and one common recommendation for making better decisions is to diagram your decision using decision trees.

In Reid Hastie and Robyn M. Dawes's textbook Rational Choice in an Uncertain World: The Psychology of Judgment and Decision Making (2nd Edition, 2010), the convention is that decision alternatives are indicated by a small square and event alternatives by a small circle. I used the online diagramming tool draw.io to create the skeleton of a basic decision tree diagram:

In a fully specified decision tree diagram I would also have labels along each horizontal branch, probabilities along each event branch, and consequence values at the terminal end of each event branch (see this previous blog for an example). I won't be going into such details in this blog because I want to focus on the first step in diagramming a decision tree; namely, drawing the skeletal structure of a decision tree.

If you want to diagram decisions, it is useful to have a tool that makes it easy to do so. A piece of paper is a good option, but I wanted software that could create decision tree diagrams exactly as they appear in the textbook mentioned above. The draw.io tool fits these requirements. In a previous blog I demonstrated a more complex approach to creating decision trees. In this blog I wanted to find a more accessible approach that anyone could use without installing a bunch of software.

Decision trees have lots of symmetrical branches, and drawing these branches was the main challenge in making decision tree skeletons. Duplicating and dragging horizontal or vertical lines were the main actions I used to create the symmetrical decision tree diagram above. The left column in the figure shows the "General" and "Misc" symbol libraries that I used to create the square, circle, and line shapes. Both libraries are available by default in the draw.io online application; you just click a library name to open it and access the shapes you want to use.

You could print off a decision tree template like this and write labels and numbers onto it. It would then be a ready-made template for a common form of decision problem: a decision with 2 possible choices and 2 possible event outcomes that determine the expected consequences for each choice.

In this blog my goal was to suggest some software you can use to create decision tree diagrams (draw.io) and to show how you can create nice symmetrical tree diagrams with it. There are probably more efficient techniques than the ones I'm suggesting, as I have just started playing around with draw.io for this purpose and the duplicate-and-drag approach was the first one that worked reasonably well. We can use decision tree diagrams to explore a wide range of issues in decision making, and in future blogs we'll explore some of these issues.

Here are some useful tips that will get you up to speed quickly using draw.io.


Devil is in the Details
Posted on September 10, 2013 @ 06:25:00 AM by Paul Meagher

In my previous blog, I showed how to construct a nice decision tree for a decision about how much nitrogen to apply to a crop. In this blog, I want to advance our thinking about decision trees in two ways:

  1. Show how expected returns can be calculated using PHP.
  2. Discuss the issue of how detailed we should get when constructing a decision tree.

Computing Expected Return

In my blog titled Computing Expected Values I referred you to a video tutorial on how to calculate expected values. In this blog, I will implement that calculation in a PHP script. Implementing the calculation programmatically lets us see what types of data structures need to be defined and how they are looped over in order to compute expected returns. We need data structures to represent our actions (i.e., a $nitrogen array), our events (i.e., a $weather array), our outcomes (i.e., a $payoffs matrix), and the expected returns computed for each action option (i.e., an $EV array). With these basic elements in place we can compute our expected values in a straightforward manner, as illustrated in the code below:
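What follows is a minimal sketch of such a script. The probability and payoff values are illustrative placeholders (not the figures from the earlier decision tree), chosen so that the script reproduces the output shown below:

<?php

// Actions: nitrogen application levels
$nitrogen = array("lo", "med", "hi");

// Events: weather outcomes and their probabilities
// (illustrative values; substitute the probabilities from your decision tree)
$weather = array("poor" => 0.2, "average" => 0.5, "good" => 0.3);

// Outcomes: payoff for each action/event combination
// (illustrative values; substitute the payoffs from your decision tree)
$payoffs = array(
  "lo"  => array("poor" => 5000, "average" => 7000, "good" => 8000),
  "med" => array("poor" => 6000, "average" => 8000, "good" => 9000),
  "hi"  => array("poor" => 7000, "average" => 9000, "good" => 10000)
);

// Expected return for each action = sum over events of probability x payoff
$EV = array();
foreach ($nitrogen as $action) {
  $EV[$action] = 0;
  foreach ($weather as $event => $probability) {
    $EV[$action] += $probability * $payoffs[$action][$event];
  }
}

print_r($EV);

?>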

The output of this script looks like this:

Array
(
    [lo] => 6900
    [med] => 7900
    [hi] => 8900
)

These are the expected returns for low, medium, and high levels of nitrogen application and correspond to the expected returns that appeared in the decision tree in my last blog.

Levels of Detail

The decision tree we have constructed to represent a nitrogen application decision is vague in many of its details and, as such, would be difficult to use for the purposes of making an actual decision about whether to apply nitrogen or not.

Our biggest omission is to just talk about an "expected return" without talking specifically about whether this is expected revenue, expected profit, or expected utility. If our payoffs are expected revenue amounts then our decision tree is not going to be that useful because it hides the costs involved. For this reason, the expected profit would be a better value to compute as our "payoffs" rather than expected revenues. Theoretically, an even better value to compute would be the expected utility associated with each action option but that is a tricky value to compute because it depends on subjective factors and more complex formulas. For this reason, we can be satisfied if we can at least compute expected profits for each decision option.

Another omission in our decision tree is any discussion of the costs associated with each proposed action. In order to compute such costs we must get detailed about the when, where, and how of applying nitrogen. We also need to estimate the price of nitrogen at the time of application. If we have already purchased our nitrogen then this would simplify our calculations. Other costs include the cost of fuel to apply our nitrogen. We also need to be specific about what crop we are applying our nitrogen to. In order to compute expected profits we would need to compute some other costs associated with planting, cultivating, and harvesting the crop per acre so that these can be subtracted from the overall revenue generated to compute our expected profits.
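Because the cost of an action is incurred no matter which weather event occurs, one simple first cut at moving from expected revenue to expected profit is to lump all of those costs into a single per-acre figure for each action and subtract it from that action's expected revenue. Continuing the sketch above, with made-up cost figures:

// Illustrative per-acre costs for each nitrogen level (made-up values)
$costs = array("lo" => 40, "med" => 55, "hi" => 70);

// Expected profit = expected revenue - cost of the action,
// since the cost does not depend on the weather event
$profit = array();
foreach ($EV as $action => $revenue) {
  $profit[$action] = $revenue - $costs[$action];
}

print_r($profit);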

Our nitrogen application decision is impacted by weather which we have characterized as poor, average, or good. This is also not very precise and would need to be specified in more detail. Weather could specifically mean rainfall amounts in the spring phase of the year.

Once we get very specific about costs and what our variables specifically refer to, then our decision tree would provide better guidance on how to act. The visual depiction of a decision as a decision tree helps to organize our research efforts but it omits much of the research work that will have gone into making the decision tree useful and realistic.


Decision Tree Subgraphs
Posted on September 5, 2013 @ 06:55:00 AM by Paul Meagher

I experimented with two Graphviz features that I thought might improve the appearance of my decision trees:

  1. I wanted the connecting lines to be rectilinear rather than curvilinear. Unfortunately, I was not able to achieve this effect when I use labelled edges; rectilinear connections only work when labels are applied to nodes, presumably because line placement is more difficult to calculate when labels are applied to edges. Rectilinear connections might have looked better, but I'll have to live with curvilinear connections because I prefer labelled edges for the decision trees I'm exploring right now.
  2. I wanted to separately highlight the action, event, and outcome sections of the decision tree. I found a way to do this for the outcome nodes but have not found a way to apply further labelling or highlighting to labelled edges.

Today's blog shows how to use the "subgraph" feature of the dot language to highlight nodes that are related in some way. I used the subgraph feature to better highlight the payoffs for each separate action. The possible payoffs associated with each action option are distinguished by giving each set of payoffs its own color and bounding box.

Notice that the payoffs associated with each action are separately highlighted and that I also add up the payoffs in each set and report the sum as the expected value (EV) for that action. Here I have to calculate the expected values manually because the dot language is not a general-purpose programming language. In my next blog I'll use PHP to compute expected values and supply them to the dot file that is generated.

Here is the dot file that was used to generate a decision tree that uses subgraphs to highlight sections of it.
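As a minimal sketch of how Graphviz cluster subgraphs can be used for this kind of highlighting, the fragment below groups each action's payoff nodes into its own colored bounding box. The node names, labels, and colors are placeholders rather than the settings from the original file:

digraph DecisionTree {

  rankdir=LR;

  // A subgraph whose name begins with "cluster" gets a bounding box
  // drawn around its nodes; declare the payoff nodes inside it so
  // they belong to the cluster. The EV for each action can be
  // reported in the cluster label.
  subgraph cluster_90 {
    label="90 lb N payoffs";
    style=filled;
    color=lightgrey;
    P1 [label="Low Rainfall Payoff"];
    P2 [label="High Rainfall Payoff"];
  }

  subgraph cluster_110 {
    label="110 lb N payoffs";
    style=filled;
    color=lightyellow;
    P3 [label="Low Rainfall Payoff"];
    P4 [label="High Rainfall Payoff"];
  }

  // Action edges lead from the decision node to small connector nodes,
  // and event edges lead from the connectors to the clustered payoffs.
  Decision -> A90  [label="Apply 90 lb N"];
  Decision -> A110 [label="Apply 110 lb N"];
  A90  -> P1 [label="Low Rainfall 0.4"];
  A90  -> P2 [label="High Rainfall 0.6"];
  A110 -> P3 [label="Low Rainfall 0.4"];
  A110 -> P4 [label="High Rainfall 0.6"];

}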


Improved Decision Tree Layout
Posted on September 3, 2013 @ 07:34:00 AM by Paul Meagher

To date I have not been fully satisfied with how my decision trees have looked. They did not use space efficiently and they were not as easy to read as I would have liked. Today we will make some improvements to a Graphviz recipe for constructing decision trees. These improvements will make for a more space-efficient decision tree that is also easier to read.

The main improvements I have made are:

  • All of the labelling for actions and events appears on the edges instead of the nodes. In previous examples, most labelling was done at the nodes.
  • There is no labelling at all between actions and events, just a small connector shape.

The combination of these two improvements means that 1) the graph is easier to read, as all the edges are labelled and the flow is oriented in a left-to-right fashion, and 2) the layout is more space efficient, as the nodes connecting actions to events take up a lot of space when they include labelling. Now we have only small connector shapes with no labelling.

I'll illustrate the improved decision tree in the context of a decision about how much nitrogen to apply to a crop per acre that involves calculating the payoffs you might expect if you get Good, Average, or Poor weather during the growing season. This is what such a decision tree looks like with the improvements mentioned above:

Nitrogen application decision tree

There are a couple of other aspects of this decision tree that are also noteworthy. First, the labels for each possible action (e.g., nitrogen application amounts) include the cost per acre of applying that amount of nitrogen. One aspect of constructing a decision tree is computing the cost of each course of action. Second, the terminal nodes of our decision tree are often payoffs that involve multiplying an estimate of revenue by the probability of some event that significantly affects the payoff (e.g., the quality of the weather during the growing season).

I created the visualization for the nitrogen application decision using Graphviz. To do so I piped the recipe below into the Graphviz program "dot". The recipe illustrates how to add comments to your dot file to make your recipe for rendering a graph easy to follow. The recipe is also organized into logical sections to make it easier to read.
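Here is a minimal sketch of what such a recipe might look like, with comments marking its logical sections. The costs, probabilities, and payoff figures are placeholders rather than the numbers from the original diagram:

digraph NitrogenDecision {

  // Section 1: overall layout, read the tree left to right
  rankdir=LR;

  // Section 2: small connector shapes between actions and events
  // (unlabelled points instead of labelled nodes)
  A90  [shape=point];
  A110 [shape=point];

  // Section 3: action edges, labelled with the cost per acre (placeholder values)
  Decision -> A90  [label="Apply 90 lb N ($40/acre)"];
  Decision -> A110 [label="Apply 110 lb N ($50/acre)"];

  // Section 4: event edges, labelled with weather and its probability (placeholder values)
  A90  -> O1 [label="Poor 0.2"];
  A90  -> O2 [label="Average 0.5"];
  A90  -> O3 [label="Good 0.3"];
  A110 -> O4 [label="Poor 0.2"];
  A110 -> O5 [label="Average 0.5"];
  A110 -> O6 [label="Good 0.3"];

  // Section 5: terminal payoff nodes (placeholder values)
  O1 [label="$5000"]; O2 [label="$7000"]; O3 [label="$8000"];
  O4 [label="$6000"]; O5 [label="$8000"]; O6 [label="$9000"];

}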

Figuring out how to render decision trees with labelled intermediate edges instead of labelled intermediate nodes was a big step in creating a decision tree format that I find more workable. I'm not done yet, however, as I want to explore some other features of Graphviz to add some more tweaks to my decision tree.


Complex Decision Trees
Posted on August 21, 2013 @ 01:13:00 PM by Paul Meagher

In my last blog I offered up a video tutorial on how to construct simple decision trees and analyze them using expected values. It is easy to object that these binary decision trees are too simple to represent the complex decision problems that we are confronted with each day. Before you object too loudly, you should examine what a more complex decision tree might look like and the issues that arise when we add more complexity to our decision trees.

A decision tree can become more complex in two basic ways. We can add more intermediate acts or we can add more intermediate events. In simple decision trees we have a binary set of Actions (apply 90 lb nitrogen, apply 110 lb nitrogen) leading to a binary set of Events (e.g., probability of low rainfall, probability of high rainfall), and each combination of Actions and Events leads to an Outcome. See my blog, Representing Decisions with Graphviz, for more details.

One way we can add complexity to a decision tree, beyond just adding more than 2 branches for each action and event node, is to add intermediate actions and/or events. For example, our decision problem might involve the act of applying either 90 lbs or 110 lbs of nitrogen per acre to our wheat crop. We might also have to choose between the actions of applying the nitrogen at time X or at time Y. The combination of these actions can then lead into a season with either a low summer rainfall event or a high summer rainfall event. We can represent a fragment of this decision tree generically with the following diagram:

The diagram was constructed using Graphviz and the dot file I used to construct it looks like this:
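Here is a minimal sketch of what such a fragment might look like in the dot language; the node names and labels are placeholders:

digraph MultistepFragment {

  rankdir=LR;

  // First action stage: how much nitrogen to apply
  Decision -> N90  [label="Apply 90 lb N"];
  Decision -> N110 [label="Apply 110 lb N"];

  // Intermediate action stage: when to apply it
  N90  -> N90X  [label="Apply at time X"];
  N90  -> N90Y  [label="Apply at time Y"];
  N110 -> N110X [label="Apply at time X"];
  N110 -> N110Y [label="Apply at time Y"];

  // Event stage: summer rainfall (shown for only one branch of the fragment)
  N90X -> O1 [label="Low Rainfall"];
  N90X -> O2 [label="High Rainfall"];

}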

This is just a fragment of a multistep decision problem. As you can see, the number of terminal branches in this decision problem explodes as we add more intermediate action or event nodes. This does not prevent us from using decision trees to help us make better decisions, but it does give us advance warning that we should be very sure intermediate actions or events are necessary before we introduce them, as they add considerable complexity to the decision tree. Decision trees are not meant to capture the minute details of a decision problem, just the high-level actions and events that impact the decision. The choice of action and event nodes, just like the assignment of probabilities to event nodes, involves a lot of subjective judgement. The process of formalizing it all into a decision tree, however, brings the whole exercise out of subjective reality into consensus reality where others can comment, disagree, or agree with the manner in which you have framed the decision problem.


Computing Expected Values
Posted on August 20, 2013 @ 12:47:00 PM by Paul Meagher

So you have a simple decision tree leading from actions, to events, to outcomes. You have labelled the probabilities of your events and the costs and payoffs associated with actions and outcomes, and you are wondering how you can use all this information to pick a course of action. One answer is that you can compute the expected value associated with each course of action and make your decision based on the course of action that yields the highest expected value (e.g., the highest average profit).
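As a quick illustration with made-up numbers: if applying 90 lbs of nitrogen pays $5,000 under low rainfall (probability 0.4) and $8,000 under high rainfall (probability 0.6), its expected value is 0.4 × 5000 + 0.6 × 8000 = $6,800, which you would then compare against the expected value of the other application options.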

Fortunately, I do not have to explain what an expected value is or how to compute it because there is an excellent tutorial available that explains it all. So, sit back, and learn from MBA Bullshit how to use a decision tree to compute expected values and how you can use expected values to help you decide on a course of action.


Layout, Weights, and Highlighting with Graphviz
Posted on August 8, 2013 @ 03:56:00 AM by Paul Meagher

I introduced the Graphviz program in my last blog. In today's blog I want to go a little deeper into the DOT language to show how you can achieve three useful effects with it. The three effects are:

  • Change the overall layout of the graph. Instead of starting our decision tree from the top, I would prefer to start it from the left side of the canvas and expand it towards the right side of the canvas (i.e., left-to-right reading order). I can do this by adding the command rankdir=LR; to my dot file.
  • It would be nice to show probability values on the links going into event nodes, for example, the probability of high rainfall this season. We do this by adding brackets after a link command and specifying a value for the "label" attribute (e.g., Action -> HighRainFall[label="0.6"];).
  • If you are trying to highlight a path through a decision tree, Graphviz gives you a few ways to do it. One way is to thicken the line and add red coloration to each link in the path (e.g., Action -> LowRainFall[label="0.4",color=red,penwidth=3.0];).

If we put all these elements together in one dot program file, it would look like this:

digraph { 
  
  rankdir=LR;

  Action -> LowRainFall[label="0.4",color=red,penwidth=3.0]; 
  Action -> HighRainFall[label="0.6"]; 

}

If we load this dot file into the graphviz program "dot", it will generate this graph:

What we have here is a fragment of a graph. A fragment like this might appear in your decision tree leading from an action node to an event node. This is how we can get probabilities to appear on our graphical representations of a decision problem. Also, I like to orient the tree from left to right because a large, branchy tree can then be printed off more easily, whereas top-to-bottom trees are hard to print off and involve a lot of horizontal scrolling to view. Finally, when you make a decision to pursue a particular course of action, you can highlight that course of action graphically with a thick red pen effect.


Representing Decisions with Graphviz
Posted on July 29, 2013 @ 06:14:00 AM by Paul Meagher

Previously I discussed a simple risk management framework that involves precisely specifying the actions, events, and outcomes involved in a decision problem.

One aspect of "precisely specifying" a decision is representing the overall decision making problem in the form of a graph. We can use a pad of paper to draw lines representing possible actions, which lead to events, which lead to outcomes. Or, we can use a tool like Graphviz to construct much prettier graphs and make us feel more professional.

GraphViz was developed by AT&T Research and is considered a top-tier tool for creating/visualizing graph structures.

In today's blog, I want to give you a basic idea of how Graphviz works so you can judge for yourself whether you want to invest time into learning it.

To generate a graph using Graphviz you need to write some commands into a "dot" file that ends with the extension ".dot". The term "dot" is also used to denote one of the main Graphviz programs used to generate graphs from dot files. Also, the term "DOT" is used to refer to the command language you enter into your dot files.

Without further ado, here is a dot file called DecisionTree.dot that depicts a decision in terms of an action, event, outcome framework for managing risk.
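Here is a minimal sketch of what such a file might contain, using labelled nodes for the actions, events, and outcomes; the node names and bushel estimates are placeholders:

digraph DecisionTree {

  // Action nodes
  Apply90  [label="Apply 90 lbs N"];
  Apply110 [label="Apply 110 lbs N"];

  // Event nodes (one per action branch)
  LowRain90   [label="Low Rainfall"];
  HighRain90  [label="High Rainfall"];
  LowRain110  [label="Low Rainfall"];
  HighRain110 [label="High Rainfall"];

  // Outcome nodes with placeholder bushel estimates
  Out1 [label="40 bushels/acre"];
  Out2 [label="60 bushels/acre"];
  Out3 [label="45 bushels/acre"];
  Out4 [label="65 bushels/acre"];

  // Wire the tree together: actions -> events -> outcomes
  Decision -> Apply90;
  Decision -> Apply110;
  Apply90  -> LowRain90;
  Apply90  -> HighRain90;
  Apply110 -> LowRain110;
  Apply110 -> HighRain110;
  LowRain90   -> Out1;
  HighRain90  -> Out2;
  LowRain110  -> Out3;
  HighRain110 -> Out4;

}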

If I run this command from my Linux command prompt:

dot -Tpng DecisionTree.dot > DecisionTree.png

The dot interpreter will read the file and generate the graph in png format. This is what that output looks like.

As you can see, it is not difficult to go from entering commands into a dot file and generating a decent looking graph. The DOT language has much more powerful features for drawing graphs than what I am showing you here; however, in the initial stages of sketching out the actions, events, and outcomes involved in your decision problem, you may want to keep things simple and just focus on drawing out all the nodes and the lines between them.

The graph below visualizes a nitrogen application decision. Do I apply 90 lbs of nitrogen per acre or 110 lbs? The effect of each action on the crop I want to grow is jointly determined by the amount of rainfall I'm likely to receive over the growing season. The action forks (i.e., application amounts) lead to the event forks (i.e., rainfall amounts), which lead to the terminal outcomes (i.e., expected number of bushels).

