Signal Detection and Information Theory


Consider the following scenario:

You are the only person guarding a sensitive perimeter. It is a foggy night and you can't tell whether the dark shadow 10 feet away from you is a person or something else. Your task is to alert the military if you see any human being. Your call will put all operations on hold and the entire battalion in a state of alert.

There are four possible scenarios:

You press the siren and it turns out to be an enemy.

You press the siren and it turns out to be just a shadow.

You don't press the siren and it turns out to be an enemy.

You don't press the siren and it turns out to be just a shadow at a weird angle.

Out of the four, in two cases you will be praised and hailed, but in the other two you will have screwed up big time. Can you tell which?

Signal Detection Theory deals with such problems.

SDT assumes that a signal or stimulus may be perceived differently by different observers. This depends on mental capacity, sensitivity of the sense organs, prior experience, attention span, and so on. The difference in experience is also caused by the presence of noise. So we basically have four possible situations, summarized in the table below.

                           Signal Present       Signal Absent
Positive to signal         Hit                  False Alarm
Negative to signal         Miss                 Correct Rejection

Signal Detection Theory involves running experiments in which observers try to detect a signal in the presence of noise, and then analyzing how well they perform.

Perfect detection would mean a 100% hit rate, 100% correct rejections, 0% false alarms, and 0% misses.
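As a quick illustration, here is a minimal sketch (the trial data and function name are made up for this example) of how the hit rate and false alarm rate could be tallied from a set of observation trials:

```python
def sdt_rates(trials):
    """Tally the four SDT outcomes and return (hit rate, false alarm rate)."""
    hits = misses = false_alarms = correct_rejections = 0
    for signal_present, responded_yes in trials:
        if signal_present and responded_yes:
            hits += 1                      # Hit
        elif signal_present and not responded_yes:
            misses += 1                    # Miss
        elif responded_yes:
            false_alarms += 1              # False Alarm
        else:
            correct_rejections += 1        # Correct Rejection

    hit_rate = hits / (hits + misses)                              # P("yes" | signal)
    fa_rate = false_alarms / (false_alarms + correct_rejections)   # P("yes" | noise)
    return hit_rate, fa_rate

# Example: each trial is (signal_present, responded_yes)
trials = [(True, True), (True, True), (True, False), (True, True),
          (False, False), (False, True), (False, False), (False, False)]
print(sdt_rates(trials))  # (0.75, 0.25)
```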

Observers who are bad at detecting signals show response patterns like the following:

[Figure: response matrices for four observers who are poor at detecting signals]

There is no real signal detection happening in any of these four cases. Observer 1 says "Yes" to both signals and noise. Observer 2 reports "Yes" 40% of the time no matter what he is seeing. Observer 3 never reports a signal at all, and Observer 4 says "Yes" or "No" based on the flip of a coin.

Based on the hit rate and false alarm rate we can draw a plot called the Receiver Operating Characteristic (ROC). If we put hit rate on the Y-axis and false alarm rate on the X-axis, each experiment gives us a single point. If we repeat the experiment at different decision criteria, or with observers who have different response biases, the points trace out a curve. This curve is called the ROC curve.

[Figure: example ROC curve, with hit rate on the Y-axis and false alarm rate on the X-axis]

The dotted diagonal line is what you get from completely useless observers: they cannot discern signal from noise at all. We want the curve to bow as far toward the top left as possible.
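To make the curve concrete, here is a small sketch assuming the classic equal-variance Gaussian model of SDT (noise and signal-plus-noise are normal distributions one d' apart; the numbers are illustrative). Sweeping the decision criterion produces one (false alarm, hit) point per criterion:

```python
from statistics import NormalDist

def roc_points(d_prime, criteria):
    """ROC points for an equal-variance Gaussian observer."""
    noise = NormalDist(0.0, 1.0)          # noise-only distribution
    signal = NormalDist(d_prime, 1.0)     # signal-plus-noise distribution
    points = []
    for c in criteria:
        hit_rate = 1.0 - signal.cdf(c)    # P(respond "yes" | signal present)
        fa_rate = 1.0 - noise.cdf(c)      # P(respond "yes" | signal absent)
        points.append((fa_rate, hit_rate))
    return points

criteria = [x / 2 for x in range(-4, 9)]  # decision criteria from -2.0 to 4.0
for fa, hit in roc_points(d_prime=1.5, criteria=criteria):
    print(f"FA = {fa:.2f}, Hit = {hit:.2f}")
# With d_prime = 0 the points fall on the diagonal (the useless-observer line);
# a larger d_prime bows the curve toward the top-left corner.
```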

FIG XIII ROC curves for tests A and B

In the graph above, Test A is preferred over Test B because its curve lies closer to the top-left corner (a higher hit rate for the same false alarm rate).
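One common way to quantify "closer to the top-left corner" is the area under the ROC curve (AUC): the bigger the area, the better the test. The (false alarm, hit) points below are made-up values for two hypothetical tests, purely for illustration:

```python
def auc(points):
    """Trapezoidal area under an ROC curve given (false alarm, hit) points."""
    pts = sorted(points)
    area = 0.0
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        area += (x1 - x0) * (y0 + y1) / 2.0
    return area

test_a = [(0.0, 0.0), (0.1, 0.6), (0.3, 0.85), (0.6, 0.95), (1.0, 1.0)]
test_b = [(0.0, 0.0), (0.2, 0.4), (0.5, 0.7), (0.8, 0.9), (1.0, 1.0)]
print(auc(test_a))  # 0.835 -- the larger area, so the preferred test
print(auc(test_b))  # 0.635
```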

Signal Detection Theory has many applications in machine learning, statistics, psychology, medicine, and the military.

This brings us to Information Theory. Information Theory deals with how much surprise is contained in a piece of information. For example, if someone tells you the sun will rise tomorrow, you won't care much, because you already know that; there is no surprise. But if someone tells you there will be a meteor shower visible from your house, you will probably be excited (if you get excited about meteors), because meteor showers don't happen frequently.

Basically, information theory tells us that more surprise means more information. The amount of information is measured in bits.

[Equation: amount of information in bits]

If you toss a single coin, being told that it came up heads carries 1 bit of information. In contrast, if you toss three coins and all three come up heads, that result carries 3 bits. It is a more surprising, and therefore more informative, outcome than getting heads on a single coin.
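As a rough sketch of that arithmetic, the surprise of an outcome with probability p can be computed as log2(1/p):

```python
import math

def information_bits(p):
    """Information (surprise) carried by an event with probability p, in bits."""
    return math.log2(1 / p)

print(information_bits(1 / 2))  # one coin coming up heads        -> 1.0 bit
print(information_bits(1 / 8))  # three coins all coming up heads -> 3.0 bits
print(information_bits(1.0))    # "the sun will rise tomorrow"    -> 0.0 bits, no surprise
```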

Entropy is the average amount of information received per sample. We care about it because it tells us something crucial when designing a product: TIME.

If we are designing a system or service, the amount of information involved in the decision-making process determines how long it will take to make that decision.
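For illustration, here is a small sketch of how the entropy of a choice could be computed; the option probabilities are made up. The fewer bits per decision, the less time the decision should take:

```python
import math

def entropy_bits(probabilities):
    """Average information (entropy) per choice, in bits."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(entropy_bits([0.25, 0.25, 0.25, 0.25]))  # 4 equally likely options -> 2.0 bits
print(entropy_bits([0.7, 0.1, 0.1, 0.1]))      # one dominant option      -> ~1.36 bits
```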

When we are designing a keyboard, a button, or a display (all forms of UI), we must take into consideration the index of difficulty for a user to move from rest to any button. This is given by Fitts's law.

Fitts’s law

From Wikipedia.org

Fitts’s law (often cited as Fitts’ law) is a predictive model of human movement primarily used in human–computer interaction and ergonomics. This scientific law predicts that the time required to rapidly move to a target area is a function of the ratio between the distance to the target and the width of the target.

Fitts’s index of difficulty (ID, in bits):

ID = log₂(2D / W)

Fitts also proposed an Index of Performance (IP, in bits per second) as a measure of human performance. The metric combines a task's index of difficulty (ID) with the movement time (MT, in seconds) in selecting the target. In Fitts's words, "The average rate of information generated by a series of movements is the average information per movement divided by the time per movement" (1954, p. 390). Thus,

IP = ID / MT

where,

D is the distance from the starting point to the center of the target, W is the width of the target, and MT is the movement time in seconds.
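Putting the two formulas together, here is a small sketch with made-up distance, width, and movement-time values for a hypothetical button:

```python
import math

def index_of_difficulty(distance, width):
    """Fitts's index of difficulty: ID = log2(2D / W), in bits."""
    return math.log2(2 * distance / width)

def index_of_performance(difficulty_bits, movement_time_s):
    """Fitts's index of performance: IP = ID / MT, in bits per second."""
    return difficulty_bits / movement_time_s

D, W = 160.0, 20.0                  # distance to target and target width (e.g. pixels)
MT = 0.8                            # observed movement time in seconds
ID = index_of_difficulty(D, W)      # log2(16) = 4.0 bits
IP = index_of_performance(ID, MT)   # 4.0 / 0.8 = 5.0 bits per second
print(ID, IP)
```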

 

To summarize, there is a high possibility of error when there is a lot of noise or when the decision-making criteria in the system are not clear. Errors, and wasted time, are also likely when a large amount of information must be processed. As Human Factors Engineers, our effort must be to minimize errors, make the signal easily discernible from the noise, and reduce the index of difficulty, and therefore the movement time, for users.

 


Responses to “Signal Detection and Information Theory”

  1. Katrine Tsoris Zymnis

    I really enjoyed reading your post. I think you did an excellent job in explaining SDT and Information Theory. These concepts can be hard to grasp from a human factors point of view and you made sure to explain them clearly and not leave any important information aside. I like the fact that you started the post with a real life example. Finally, I appreciate the fact that the structure/appearance of the text aligned the description and visual depiction (graphs & tables) of the theories.


  2. Erin Hsu

    Agreeing with Katrine on how your post really explained the theories we’ve gone over in a brief, informative way. Particularly with the SDT matrix, which works better for me visually with everything written out. I really liked the point you brought up at the end, because it is truly integral to Human Factors and as designers/engineers. So much of the way we approach a situation is to be considerate of future scenarios, rather than addressing things in hindsight.

