# Research & Development

## The Mother of All Demos, presented by Douglas Engelbart (1968)

Speaking of intelligence and foresight....

The Mother of All Demos is a name given retrospectively to Douglas Engelbart's December 9, 1968, demonstration of experimental computer technologies that are now commonplace. The live demonstration featured the introduction of the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor.

## On Entropy Depletion & Related Links

I had to dig these up in the context of a conversation around the (in)security of currency regimes such as Bitcoin, where presumed ownership of currency rests solely on asymmetric cryptography. You may find some of these links to be of interest as well.

- **Textbook RSA is insecure** (and other interesting observations): http://crypto.stanford.edu/~dabo/courses/cs255_winter00/RSA.pdf
- **Hardware Security for FPGAs using Cryptography**: contains a great overview of different kinds of side-channel attacks on cryptography.
- **Acoustic cryptanalysis: on nosy people and noisy machines**: seeing through The Matrix isn't really that hard if you know how to look at it.
- **Disk encryption may not be secure enough**: ye olde standard cold boot attack.
- **On Entropy Depletion**: running out of randomness can hurt, bigtime (see the short sketch after this list): http://www.educatedguesswork.org/2008/10/on_entropy_depletion.html
- **Researchers Crack RSA Encryption Via Power Supply**: an invasive side-channel attack.
- **Blue Pill - Machine Virtualization for Fun, Profit, and Security**: virtualization attacks. Epic turtles.

via David Lazar.
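To make the entropy depletion point concrete, here is a minimal sketch (my illustration, not taken from the linked article), assuming a Linux system: older kernels block reads from /dev/random when the entropy pool estimate is exhausted, while /dev/urandom never blocks, which is exactly how running out of randomness can stall an application.

```python
# Minimal sketch, assuming a Linux box exposing /dev/random and /dev/urandom.
# Illustrates the entropy-depletion concern: on older kernels, reads from
# /dev/random block when the entropy estimate runs out; /dev/urandom does not.
import time

def timed_read(path, nbytes=32):
    """Read nbytes of randomness from path; return elapsed seconds."""
    start = time.monotonic()
    with open(path, "rb") as f:
        f.read(nbytes)
    return time.monotonic() - start

print("urandom:", timed_read("/dev/urandom"))  # effectively instant
print("random: ", timed_read("/dev/random"))   # may stall under entropy pressure
```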

## Slides from the 11th Annual SecureIT Conference - “OWASP Web Services Security - Securing your Service Oriented Architecture”

I recently spoke at the 11th SecureIT Conference on "OWASP Web Services Security - Securing your Service Oriented Architecture". This annual event was hosted by UC San Bernardino at the Sheraton Fairplex Hotel.

The SecureIT Conference provides focus and opportunities for higher education staff meeting the challenges of providing a secure information technology environment for campus communities. The event was well attended, with distinguished speakers including Pradeep Khosla, UC San Diego’s chancellor; Michael Montecillo, IBM Security Services Threat Research and Intelligence Principal; and Eric Skinner, VP of Mobile Security for Trend Micro.

The slides of my presentation can be found below.

## Quantum Computing & Entanglement with Dr. John Preskill @ Caltech

Last night I had the privilege of listening to Dr. John Preskill in Beckman Auditorium here at Caltech, together with fellow quantum aficionado David Lazar. John Preskill is the Richard P. Feynman Professor of Theoretical Physics at Caltech. This was one of the most accessible lectures on this topic for a general audience, and it was very well received. Dr. Preskill is definitely a teacher and a communicator; as Feynman chair, he effectively summarized 50+ years of quantum research and development into a one-hour lecture. Quantum Frontiers has some of the recorded lectures, which readers may find interesting.

Dr. Preskill is also involved with IQIM, the Institute for Quantum Information and Matter at Caltech. Here is an IQIM promotional video which was shown towards the end of the session.

The lecture addressed the opportunities and challenges in quantum computing and entanglement, quantum error correction, quantum information science, and speculation about future trends.

A couple of his more detailed lectures can be seen below.

## The Clairvoyant Load Balancing Algorithm for Highly Available Service Oriented Architectures

Abstract: Load balancing allows network devices to distribute workload across multiple processing resources, including server clusters and storage devices. This distribution helps maximize throughput, achieve optimal resource utilization, minimize response time, and use hardware effectively across multiple data-center locations. As a meta-heuristic enhancement to Psychic Routing [1], the researchers present early work on Clairvoyant, a novel algorithm for optimal yet unrealizable distribution of traffic.

Among many earlier works, including [5, 4], the main inspiration for this algorithm is RFC 1149, the standard for the Transmission of IP Datagrams on Avian Carriers. A study of the literature suggests that earlier work by [7, 2] on Internet Protocol over Xylophone Players (IPoXP) has also had a huge impact on the classical OSI network model. Typical application load balancing is based on techniques including round robin, weighted round robin, least connections, shortest response, SNMP, weighted balance, priority, overflow, persistence, least used, lowest latency, and enforced traffic flow [6]. The researchers propose that Clairvoyant, by utilizing an ensemble of anomalous cognition, ESP, remote viewing and psychometry, can provide a high-performance yet irreproducible load balancing approach. The Clairvoyant load balancing algorithm helps the system administrator fine-tune how traffic is distributed across connections in a psychic manner. Backed by parapsychological research [1], each load balancer is equipped with an enterprise-grade channeling medium with features to fulfill potential special deployment requirements. Building upon the techniques proposed in RFC 5984, which uses extrasensory perception to achieve "infinite bandwidth" in IP networks, Clairvoyant can achieve negative latency as well as a negative transmission time difference with appropriate parameters, unachievable by traditional methods [6, 3]. The algorithm uses claircognizance to redirect traffic to one of the unused, or even nonexistent, nodes. Clairaudience allows setting up the connection priority order; however, early experiments suggest that using 0x8 spherical surfaces achieves the same level of performance when compared using ROC/AUC.

Although irreproducible in most non-REM environments, the researchers see the potential of using this load balancing algorithm in most high-performing service oriented architectures, allowing packet forwarding that provides unsurpassed end-user performance regardless of link capacity, distance, and number of hops. The detailed algorithm and findings will be published in The Journal of Irreproducible Results by 4/1/2014.
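For contrast with the paranormal approach, here is a minimal sketch of one of the conventional baselines the abstract names, weighted round robin; the node names and weights are invented for illustration.

```python
# Weighted round robin: a conventional baseline named in the abstract.
# Node names and weights below are invented for illustration.
from itertools import cycle

def weighted_round_robin(weights):
    """Yield node names in proportion to their integer weights, forever."""
    expanded = [node for node, w in weights.items() for _ in range(w)]
    return cycle(expanded)

balancer = weighted_round_robin({"node-a": 3, "node-b": 1})
print([next(balancer) for _ in range(8)])
# ['node-a', 'node-a', 'node-a', 'node-b', 'node-a', 'node-a', 'node-a', 'node-b']
```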

## References

[1] Jonathan Anderson and Frank Stajano. Psychic Routing: Upper Bounds on Routing in Private DTNs. 2011.

[2] R. Stuart Geiger, Yoon Jung Jeong, and Emily Manders. Black-boxing the user: Internet Protocol over Xylophone Players (IPoXP). In Proceedings of the 2012 ACM Annual Conference Extended Abstracts on Human Factors in Computing Systems, pages 71–80, 2012.

[3] David R. Karger and Matthias Ruhl. Simple efficient load balancing algorithms for peer-to-peer systems. In Proceedings of the Sixteenth Annual ACM Symposium on Parallelism in Algorithms and Architectures, pages 36–43, 2004.

[4] K. M. Moller. Increasing Throughput in IP Networks with ESP-Based Forwarding: ESPBasedForwarding. 2011.

[5] C. Pignataro, G. Salgueiro, and J. Clarke. Service Undiscovery Using Hide-and-Go-Seek for the Domain Pseudonym System (DPS). 2012.

[6] Sandeep Sharma, Sarabjit Singh, and Meenakshi Sharma. Performance analysis of load balancing algorithms. World Academy of Science, Engineering and Technology, 38:269–272, 2008.

[7] Emily Wagner, Yoon Jeong, and R. Stuart Geiger. IPoXP: Internet Protocol over Xylophone Players.

## Causality, Probability, and Time - A Temporo-Philosophical Primer to Causal Inference with Case Studies

Causality, Probability and Time by Dr. Samantha Kleinberg is a whirlwind yet original journey through the interdisciplinary study of probabilistic temporal logic and causal inference. Probabilistic causation is a fairly demanding area of study which examines the relationship between cause and effect using the tools of probability theory. Judea Pearl, in his seminal text "Causality: Models, Reasoning, and Inference", refers to this quandary by stating that

(causality) connotes lawlike necessity, whereas probabilities connote exceptionality, doubt, and lack of regularity.

Dr. Kleinberg's work provides a balanced introduction to the background work on this topic while breaking new ground with a well-positioned approach to causality based on temporal logic. The envisioning problem, that of deducing the set of facts that may hold, possibly as the result of our actions, leads to the decision problem. This is compounded by the need to find a timely and useful way to represent our knowledge about time, change, and chance.

In this ~260-page book, Dr. Kleinberg begins with a brief history of causality, leading into probability, logic and probabilistic temporal logic. The author then defines causality from various facets, proceeding to causal inference, token causality and, finally, the case studies. With practical examples and algorithms, the author devises simple mathematical tools for analyzing the relationships between causal connections, inference, causal significance, model complexity, statistical associations, actions and observations.
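As a toy illustration of the probability-raising intuition underlying this line of work (my sketch, not an excerpt from the book): a candidate cause C is prima facie significant for an effect E when P(E|C) > P(E|¬C), which is straightforward to check against an event log.

```python
# Toy event log of (cause_occurred, effect_occurred) pairs; values are invented.
log = [(True, True), (True, True), (True, False),
       (False, False), (False, True), (False, False)]

with_c = [e for c, e in log if c]          # effect outcomes when C occurred
without_c = [e for c, e in log if not c]   # effect outcomes when C did not

p_e_given_c = sum(with_c) / len(with_c)            # P(E | C)  = 2/3
p_e_given_not_c = sum(without_c) / len(without_c)  # P(E | ~C) = 1/3
print(p_e_given_c > p_e_given_not_c)  # True: C raises the probability of E
```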

Exploiting the temporal nature of probabilistic events, Dr. Kleinberg's research is a thought-provoking and valuable addition for the scientific community interested in learning causal effects and inference with respect to time. Building upon the works of the likes of Heckerman, Breese, Santos and Young, this book will shape the way probabilistic-reasoning researchers think about temporal effects on causality for years to come.

David Hume believed that causes are invariably followed by their effects: "We may define a cause to be an object, followed by another, and where all the objects similar to the first, are followed by objects similar to the second." So, would you like a well-written, margin-annotation-laden text which provides a formal and practical, case-study-driven approach to this somewhat abstract concept of causality? Then look no further!

## Bayesian Network Repositories Collections

A #NoteToSelf-style post regarding a collection of Bayesian network repositories, including but not limited to .bnet, .net, .bif, .dsc and .rda files.
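As a minimal sketch of putting such a repository to use, assuming the pgmpy library and a hypothetical network file asia.bif downloaded from one of these collections:

```python
# Load a Bayesian network stored in .bif format; "asia.bif" is a placeholder
# for any network file obtained from such a repository. Assumes pgmpy is installed.
from pgmpy.readwrite import BIFReader

reader = BIFReader("asia.bif")  # parse the Bayesian Interchange Format file
model = reader.get_model()      # construct the network object
print(model.nodes())            # variables defined in the file
print(model.edges())            # directed dependencies between them
```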

## The Theory That Would Not Die - An Engaging History of Bayesian Philosophy

As the statistician Dennis Lindley famously said, "Inside every non-Bayesian there is a Bayesian struggling to get out"; it would be safe to say that Sharon McGrayne's interesting tale of the trials and triumphs of Bayes' rule, or more accurately the Bayes-Laplace-Price rule, is an excellent historical journey which may help get your inner Bayesian out of the closet.

The Theory That Would Not Die: How Bayes' Rule Cracked the Enigma Code, Hunted Down Russian Submarines, and Emerged Triumphant from Two Centuries of Controversy makes for an interesting and captivating read, especially considering that writing about the history of mathematics and statistics for a general audience is a daunting task compared with relatively popular topics like astronomy or physics. In this easy read for a popular-science audience, the author covers over three hundred years of the history behind Bayes' rule, with its applications and engrossing stories of mathematical luminaries, some of whom thought it was a brilliant way to model real-life scenarios, while others considered it unscientific, an exercise in futility, and vehemently fought against the idea of incorporating prior beliefs. Aside from providing thorough research on the subject matter, the text also delves into significant detail about the lives and works of important scientists, mathematicians and statisticians including but not limited to Turing, Von Neumann, Price, Shannon, Bailey, Laplace, Fisher and Feynman. Regarding modern times, I was delighted to see the work of Daphne Koller and Heckerman mentioned, as well as the role Bayesian techniques have played in the contemporary discipline of machine learning.

Starting with the compelling statement

When the facts change, I change my opinion. What do you do, sir?
—John Maynard Keynes

the ups and downs of the adoption of Bayes' rule are presented as distinct eras, separated out into the different parts of the book. The 17 chapters are divided into five parts, namely: Enlightenment and the anti-Bayesian reaction; the Second World War era; the glorious revival; to prove its worth; and, finally, victory. Whether the author did a good job of explaining Bayes' rule is a point of contention among earlier reviews. I agree that a few more concrete examples with algebraic expressions might have helped better explain how Bayesian priors, and their mathematical formulation by the early luminaries in the field, make it possible to work without complex integrals. However, it should be noted that this book is not a course in the antiquity of causality and inference, but rather a study of Bayesian thought through the centuries and its profound impact on science and technology. The book covers very well the advances made by the 'Bayesian revolution' in a variety of fields including medical diagnosis, ecology, geology, computer science, artificial intelligence, machine learning, genetics, astrophysics, archaeology, education performance, sports modeling, and more.

$P(A|B)=\frac{P(B|A) P(A)}{P(B)}$
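To make the formula concrete, here is a small worked instance with invented numbers, in the spirit of the diagnostic examples the book recounts: A is "has the condition", B is "tests positive".

```python
# Worked instance of Bayes' rule with invented numbers: a diagnostic test.
p_a = 0.01              # prior P(A): 1% prevalence
p_b_given_a = 0.95      # likelihood P(B|A): test sensitivity
p_b_given_not_a = 0.05  # false positive rate P(B|~A)

# Total probability of a positive test: P(B) = P(B|A)P(A) + P(B|~A)P(~A)
p_b = p_b_given_a * p_a + p_b_given_not_a * (1 - p_a)

p_a_given_b = p_b_given_a * p_a / p_b  # Bayes' rule
print(f"P(A|B) = {p_a_given_b:.3f}")   # ~0.161: a positive test far from settles it
```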

Sharon McGrayne has picked a very relevant topic for a contemporary audience interested in the mathematical and computational sciences, making this ~350-page book a very informative, absorbing and pleasurable read. Although light on technical details, proofs, mathematical equations and problems, the book delivers what it sets out to accomplish: to tell the story of Bayes' theory. "The Theory That Would Not Die" tells the story of a robust idea, one that is simple, intuitive and unsettling to the establishment, and yet so resilient that, despite all the criticism from mainstream frequentists, it has stayed alive and well. To quote from the book:

"Bayes is still young. Probability did not have any mathematics in it until 1700. Bayes grew up in data- poor and computationally poor circumstances. It hasn't settled down yet. We can give it time. We're just starting."

## A Deep Dive into Causality with Judea Pearl

For most researchers in the ever-growing fields of probabilistic graphical models, belief networks, causal inference and probabilistic inference, ACM Turing Award winner Dr. Judea Pearl and his seminal papers on causality are well known and acknowledged. The representation and determination of causality, the relationship between an event (the cause) and a second event (the effect), where the second event is understood as a consequence of the first, is a challenging problem. Over the years, Dr. Pearl has written significantly on both the art and the science of cause and effect. In this book, “Causality: Models, Reasoning and Inference”, the inventor of Bayesian belief networks discusses and elaborates on his earlier work, including but not limited to reasoning with cause and effect, causal inference in statistics, Simpson's paradox, causal diagrams for empirical research, robustness of causal claims, causes and explanations, and probabilities of causation (bounds and identification).

In its eleven chapters followed by an epilogue, Dr. Pearl’s manuscript postulates a representational and computational foundation for the processing of information under uncertainty. It commences with an introduction to the simpler concepts in Bayesian inference and causality, with corresponding proofs. However, as the text progresses into causal vs. statistical concepts along with the theory of inferred causation, the theorems get arduous and somewhat counter-intuitive, and the text becomes demanding to keep up with. Chapter 3 is an interesting read, where causality is discussed in the context of philosophy and history. As Dr. Liu states, Judea Pearl’s thesis is that statistics deals with quantitative constructs like mean, variance, correlation, regression, dependence, conditional independence, association, likelihood, collapsibility, risk ratio, odds ratio, marginalization, conditionalization, etc., while causal analysis deals with the topics of randomization, influence, effect, confounding, disturbance, correlation, intervention, explanation and attribution. One of the challenges in following Dr. Pearl’s work is that it abstracts causation, discussing it in a mathematical and philosophical manner without providing a concrete mathematical and computational model for applied research. I believe the book provides a great foundation for the formal representation of causal analysis and its components, such as do(x) to represent intervention.
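As one concrete instance of that formalism (my illustration here; the book develops it in full), the back-door adjustment expresses an interventional distribution in purely observational terms when a variable set $Z$ satisfies the back-door criterion relative to $(X, Y)$:

$$P(y \mid do(x)) = \sum_{z} P(y \mid x, z)\, P(z)$$

Note that this differs from plain conditioning, $P(y \mid x) = \sum_{z} P(y \mid x, z)\, P(z \mid x)$: intervening on $X$ cuts the dependence of $Z$ on $X$, which is exactly what do(x) captures.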

The Automated Reasoning Group at UCLA has made some strides in this area; however, the applied-research aspects of this formalism still need to be ‘tightly bound’, owing to the scarcity of empirical evidence for the algorithms in practice.
