Entropy, Elevator Pitch and getting teams to work!

Vinodh Kumar Ravindranath, CTO @ Belong

17 March 2019 • 6 mins

One of the more interesting interview rounds we do at our company is a culture round, typically conducted by a select set of “culture” interviewers. I know most companies do some sort of HR round, but this one stands out: not only is there a lot of real probing of each of the culture values we have listed for our organization, to see how aligned the candidate is with them, but there is also a non-negligible rejection rate! So, in one of these culture rounds, the interviewer was not gung-ho about a candidate who had actually done well in all the previous rounds. The reasoning the interviewer gave was that the candidate was “all over the place”, particularly on the question of why join our company.

At the outset, that seemed like the haziest, most subjective statement to use, particularly when entrusted with the objective task of evaluating a candidate. So does a culture round interviewer simply have the luxury of such evaluations(!), or is there more here than meets the eye, almost some latent objectivity in the whole statement? For the moment, though, I would like to divert your attention to perhaps one of the most intriguing concepts in science, entropy, while running the risk of seemingly ‘going all over the place’ myself.


Entropy, simply put, is a measure of how random a system is, or of the amount of uncertainty in it. Before you decide that this article is for statisticians or physicists, let me quickly explain in simple terms what entropy means and why it ties into something even as fuzzy as evaluating a culture interview round!

Figure 1

Imagine there are gas molecules within a container, marked by gridlines as shown in Figure 1. Picture in your mind how the figure would look if the gas molecules were plotted.

Figure 2

Figure 3

Would that picture be closer to Figure 2 or Figure 3?

I think the picture in your mind is more likely to be Figure 3 than Figure 2. The answer is not because you know the thermodynamics of gas molecules, but because, in the absence of prior information, your mind is likely to be non-committal about specific patterns and will simply imagine the most uncertain state, which is Figure 3.

Well, entropy captures exactly this, with the following intriguing formula developed by Claude Shannon, who was a juggler, unicyclist and mathematician.

H = -Σᵢ pᵢ · log₂(pᵢ)

Note that in this context, pᵢ is the probability of a gas molecule being present in a particular grid cell i.

If you bother to do the math, H ≈ 3.14 for Figure 3, since the probability of a gas molecule being in any of the grid cells is roughly equal. For Figure 2, H ≈ 0.57, since the two overcrowded grid cells account for almost 70% of the gas molecules between themselves.
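If you want to play with these numbers yourself, here is a minimal sketch (mine, not taken from the original figures; the nine-cell grid and the cell probabilities are assumptions purely for illustration) of how one might compute H for an evenly spread state versus a crowded one:

```python
import math

def shannon_entropy(probs):
    # H = -sum(p_i * log2(p_i)); cells with zero probability contribute nothing.
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A Figure-3-like state: molecules spread evenly over nine assumed grid cells.
uniform = [1 / 9] * 9

# A Figure-2-like state: most molecules crowded into two cells
# (the exact split here is made up purely for illustration).
clustered = [0.45, 0.45, 0.04, 0.02, 0.01, 0.01, 0.01, 0.005, 0.005]

print(f"uniform:   {shannon_entropy(uniform):.2f} bits")    # ~3.17 bits, the maximum for 9 cells
print(f"clustered: {shannon_entropy(clustered):.2f} bits")  # noticeably lower
```

The exact values depend on the grid and the actual molecule counts, but the pattern is the same: the evenly spread, Figure-3-like state maximizes H, while any crowding pulls it down.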

What is clear is that the greater the entropy, the greater the randomness, or the greater the uncertainty. Without prior information, one assumes utmost randomness about any system; there is nothing interesting in it. So entropy can also be interpreted as a measure of how uninteresting a system is: the greater the entropy, the more boring things seem!

There is yet another definition of entropy, from the information theory world, and as you will soon realize, this is the very formula at the heart of the Internet. H represents the minimum number of bits (binary digits) required to express the system. So if a system is totally random, it is going to take a long, winding description to explain it. Figure 2, for instance, can be described roughly 5x more succinctly than Figure 3 (3.14 / 0.57 ≈ 5.5). As you can see, there is not much of a purpose in Figure 3; it seems ‘random’. Figure 2, on the other hand, seems to have some sort of purpose to it: the molecules seem to be drawing closer to the origin, staying in the bottom grid cells, if we attempt to explain it.

So entropy seems to be a measure of purpose (or rather, the lack thereof) in a system. Pause for a moment and try to explain the purpose of something you are currently pursuing: it could be a job you are doing at work, an activity, or even a company you are running.


Did you answer it in a long essay, a paragraph, or a single line? Did you find yourself going ‘all over the place’? If you were able to explain it in a single line, that is probably what folks call ‘the elevator pitch’. Congratulations: you have achieved quite a low entropy for what you are up to. And the lower the entropy, the greater the purpose and the more interesting it is. Perhaps that is the reason investors in the startup world keep asking you for elevator pitches! And perhaps, as the dots now connect, the candidate unfortunately had a maximal-entropy answer to the culture round question! It now almost seems like evaluating entropy is actually quite an objective way to approach these kinds of subjective tasks.

The principle of maximum entropy states that a system is most likely to be in its most uncertain state, i.e. the state of maximum entropy, subject to the given constraints. Suppose this system were the organization you are trying to build, or a team you are leading or working in. How do you think the individuals in the team or organization behave? At this juncture, does Figure 3 come to mind, or Figure 2, or something else?
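Before you answer, a small numerical aside (my own sketch, not from the original post, and again assuming the nine-cell grid from earlier): if you sample thousands of arbitrary probability assignments over the cells, none of them exceeds the uniform, Figure-3-like state in entropy, which is exactly what the principle says the unconstrained default looks like.

```python
import math
import random

def shannon_entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

random.seed(0)
n = 9  # same assumed nine-cell grid as before
best = 0.0
for _ in range(10_000):
    # Draw an arbitrary probability assignment over the cells and normalize it.
    weights = [random.random() for _ in range(n)]
    total = sum(weights)
    best = max(best, shannon_entropy([w / total for w in weights]))

print(f"best sampled entropy: {best:.3f} bits")
print(f"uniform entropy:      {math.log2(n):.3f} bits")  # the theoretical maximum
```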

Well, that is your clue: picture the ideal state and work towards reducing the entropy. If your teams are aligned, the entropy is reduced and there is less uncertainty. Now, the trick is to do that amidst team dynamics, the conflicting motivations of individuals, market factors and what not. But minimizing the entropy does reduce uncertainty and gets more things done. So if you are part of a team, your job is not only to do your work, but also to participate in reducing the overall entropy of your team. In fact, I would go one step further and say that the folks who reduce entropy in an organization are its actual leaders: it is their role and responsibility to align, bring clarity to the chaos and simplify the state of the system.


It is quite amazing to realize that the key, yet again, seems to lie in simplification. By the way, in the field of thermodynamics, entropy is defined as a measure of the energy in a system that cannot be used to do work, so as leaders and team players, you have your work on this front cut out for you. There are also statistical techniques to estimate entropy in the context of teams and business, which I will not go into here (to minimize the entropy of this article itself), but ping me if you are interested!

To end, there is not only a touch of (cosmetic) elegance in brevity and simplicity, but at times also greater purpose, certainty and output. So says entropy!

PS: For the mathematically inquisitive who are curious about how Shannon stumbled upon this notion, here is a great thread on the topic, which ended up with me going to some lengths to understand the derivation of this formula over here.

Originally posted by me on Towards Data Science.