User:Shir21/Decision Functions


Decision Functions


A decision function is a tool in social network analysis that helps compute the decision matrix. To build the decision matrix, we select the two opposing characters in a story (for example, the protagonist and the antagonist) as anchors; we can then find the decision vector of each node in the graph (each character in the story).

Definition

The decision function takes a party member function pm together with a social network that represents the logic of the narrative, and generates a decision matrix. The matrix contains a decision vector for each node. The decision vector gives a “grade” to each option available to that node, and the “grade” is assigned according to a total order with a direction.[1]

With decision functions we would like to find the decision matrix, but there can be different ways to find it. In Gridland, for example, the decision matrix is built from the distances between each node and the anchors. In a graph the case is similar: we use the distances between vertices to build a distance matrix, and then use that matrix to find the decision matrix. A weighted graph gives a better picture of the distances between vertices than an unweighted graph, where all adjacent vertices are at the same distance. In an AB graph, for example, it is safe to assume that two characters who talk a lot, and therefore share an edge with a larger weight, are closer than characters who barely speak.
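As a rough illustration of this distance-based construction, the sketch below builds a decision matrix from shortest-path distances to the anchors. The toy network, the character names, the helper name decision_matrix and the 1/weight conversion of conversation counts into distances are assumptions made for the example, not details taken from the source.

```python
import networkx as nx

def decision_matrix(G, anchors, weight=None):
    """Distance-based decision matrix: entry (v, a) is the shortest-path
    distance from node v to anchor a.  One possible construction,
    following the idea described above."""
    nodes = sorted(G.nodes())
    return nodes, [
        [nx.shortest_path_length(G, source=v, target=a, weight=weight)
         for a in anchors]
        for v in nodes
    ]

# Hypothetical narrative network: edge weights count conversations, so a
# heavier edge means the characters are "closer".  To use shortest paths we
# convert counts to distances as 1 / weight.
talk = nx.Graph()
talk.add_weighted_edges_from([
    ("Protagonist", "Sidekick", 10),
    ("Sidekick", "Messenger", 2),
    ("Messenger", "Antagonist", 8),
])
for u, v, data in talk.edges(data=True):
    data["distance"] = 1 / data["weight"]

nodes, D = decision_matrix(talk, anchors=["Protagonist", "Antagonist"],
                           weight="distance")
for node, row in zip(nodes, D):
    print(node, row)
```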

Formally: let n be the number of nodes in the social network G, let k be the number of parties, and let A be the set of known party leaders, or anchors. The decision function maps the network G, the party member function pm and the anchor set A to an n × k decision matrix D, whose i-th row is the decision vector of node i.

Anchors

Anchors are characters in a narrative whose opinions we already know well, so we can deduce the opinions of the other characters from their relation to the anchors.

A node is an anchor if we know its opinion (which party it supports) before we compute the decision matrix. We denote the set of all anchors by A. The anchors are used to compute the decision matrix.[1]

Decision Matrix and Vectors

A matrix is called a decision matrix when its rows are decision vectors, which “grade” or “rate” the various options available to each node according to a total order.

Decision vectors record each node's rating of its relation to every anchor. For example, if there are k anchors, the decision vector of a node is a 1 × k row vector: column 1 holds its rating in relation to anchor 1, column 2 its rating in relation to anchor 2, and so on.

The decision matrix is the combination of all of these vectors: row 1 is the decision vector of node 1, row 2 is the decision vector of node 2, and so on, so with n nodes and k anchors it is an n × k matrix.
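As a small symbolic sketch, where d(v, a) stands for whatever rating rule is used (for example, the distance from node v to anchor a, as assumed here for illustration), a network with three nodes and two anchors gives the 3 × 2 decision matrix

D =
\begin{pmatrix}
d(v_1, a_1) & d(v_1, a_2) \\
d(v_2, a_1) & d(v_2, a_2) \\
d(v_3, a_1) & d(v_3, a_2)
\end{pmatrix}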

Example: Gridland[1]

Let's look at Gridland: a 3 by 3 world with 9 nodes, each pair of nodes separated by a certain distance. The nodes hold an election to decide where to place a nuclear plant. There are two parties: the blue party, which wants the nuclear plant near node 1, and the red party, which wants the nuclear plant near node 9. Each node in Gridland can vote, so there are 9 voters.

We will find that the decision matrix is defined according to the distances between the nodes.

Figure: different examples of elections in Gridland with different anchors.


In this example, each node's distance from itself is 0, and its distance from each neighboring node that is not diagonal is 1, so the distance between any two nodes is the number of horizontal and vertical steps between them.
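A minimal sketch that recomputes this distance matrix; the row-major numbering of the nodes (1, 2, 3 on the top row, and so on) is an assumption, since the original figure is not reproduced here.

```python
# Gridland: 9 nodes on a 3x3 grid, numbered 1..9 in row-major order (assumed).
# Non-diagonal neighbors are at distance 1, so the distance between two nodes
# is the Manhattan distance between their grid positions.

def position(node):
    """(row, column) of a node on the 3x3 grid."""
    return divmod(node - 1, 3)

def distance(a, b):
    (r1, c1), (r2, c2) = position(a), position(b)
    return abs(r1 - r2) + abs(c1 - c2)

# 9x9 matrix of distances between every pair of nodes
dist = [[distance(a, b) for b in range(1, 10)] for a in range(1, 10)]
for row in dist:
    print(row)
```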

Using this distance matrix, we can find the decision matrix. Each node wants the nuclear plant to be as far away from itself as possible, so naturally node 1 would be on the red team and node 9 would be on the blue team. They are the anchors in this example, because their opinions are the most obvious.

The distance matrix above, with the two anchors defined this way, creates the following decision matrix:

The first column shows how much a node would rate having the nuclear plant at node 1, and the second column shows how much it would rate having the nuclear plant at node 9. Each row represents a node, so the first row is node 1, the second row is node 2, and so on.
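Under the distance-based rule described above, this decision matrix can be recomputed directly; a minimal sketch (again assuming row-major numbering of the nodes):

```python
def manhattan(a, b):
    # nodes 1..9 on the 3x3 grid, row-major numbering (assumed)
    (r1, c1), (r2, c2) = divmod(a - 1, 3), divmod(b - 1, 3)
    return abs(r1 - r2) + abs(c1 - c2)

anchors = [1, 9]  # column 1: plant at node 1, column 2: plant at node 9
decision = [[manhattan(v, a) for a in anchors] for v in range(1, 10)]
for node, row in enumerate(decision, start=1):
    print(node, row)
# node 1 -> [0, 4], node 5 -> [2, 2], node 9 -> [4, 0]
```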

As expected, node 1 rates the option of having the nuclear plant at node 1 as 0, but rates the option of having it at node 9 as 4. Because the graph in this example is symmetric, the decision matrix is symmetric as well: reading the rows in reverse order swaps the two columns. Nodes that rate both options as 2 are at the same distance from both anchors (nodes 1 and 9), so they mind both options equally.

Further reading

  1. Lotker, Zvi (2021). Analyzing Narratives in Social Networks. Springer. pp. 69–70.