-
Entropy (Information Theory)
Initially written as a tweet, but to be expanded on in this blog post. When the data source produces a low-probability value (i.e., when a low-probability event occurs), that event carries more “information” than when the data source produces a high-probability value. Wiki on Entropy. Information Theory by Claude Shannon. In the context of relationships, this translates to…
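As a quick sketch of why rarer events carry more information, Shannon's self-information of an outcome $x$ with probability $p(x)$, measured in bits (base-2 logs), and the entropy of the source $X$ are:

$$
I(x) = -\log_2 p(x), \qquad H(X) = \mathbb{E}\big[I(X)\big] = -\sum_x p(x)\,\log_2 p(x).
$$

So a fair-coin flip ($p = 1/2$) carries $1$ bit, while an event with $p = 1/100$ carries $\log_2 100 \approx 6.64$ bits; entropy is just the average information content over the source's distribution.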
-
Growth
Building things as an engineer excites me and draws out my passion for so many things, but building myself is my most treasured project. April 11, 2019
-
Embracing confusion
I've been listening to the Farnam Street blog, and I am very pleased with their content. In Embracing Confusion, one of their recent podcast episodes, they discuss how to embrace confusion, learning behaviors, leadership, and a lot more. I like being around Shane (author of the Farnam Street blog and The Knowledge Project podcast) and the guests that…
-