# Research & Development

## P≠NP - A Definitive Proof by Contradiction

Following the great scholarly acceptance and outstanding academic success of "The Clairvoyant Load Balancing Algorithm for Highly Available Service Oriented Architectures", this year I present "P Not Equal to NP - A Definitive Proof by Contradiction".

Click here to read the entire paper in PDF: P Not Equal to NP - A Definitive Proof by Contradiction.

## LyX/LaTeX formatting for the C# code

If you are googling for a good way to insert C# code into LyX, this is probably where you'd end up. MaPePer has provided a very good solution; I have modified it slightly (hiding tabs and removing comments), and below is an illustration of how to use it in LyX.

The first thing you'll need is a LyX document (LyxC#CodeListing.lyx). An empty one works well.

Add the following to the preamble (Document -> Settings -> LaTeX Preamble):

\usepackage{color}
\usepackage{listings}

\lstloadlanguages{% check the documentation for further languages ...
C,
C++,
csh,
Java
}

\definecolor{red}{rgb}{0.6,0,0} % used for identifiers below
\definecolor{blue}{rgb}{0,0,0.6}
\definecolor{green}{rgb}{0,0.8,0}
\definecolor{cyan}{rgb}{0.0,0.6,0.6}

\lstset{
language=csh,
basicstyle=\footnotesize\ttfamily,
numbers=left,
numberstyle=\tiny,
numbersep=5pt,
tabsize=2,
extendedchars=true,
breaklines=true,
frame=b,
stringstyle=\color{blue}\ttfamily,
showspaces=false,
showtabs=false,
xleftmargin=17pt,
framexleftmargin=17pt,
framexrightmargin=5pt,
framexbottommargin=4pt,
commentstyle=\color{green},
morecomment=[l]{//}, %use comment-line-style!
morecomment=[s]{/*}{*/}, %for multiline comments
showstringspaces=false,
morekeywords={ abstract, event, new, struct,
as, explicit, null, switch,
base, extern, object, this,
bool, false, operator, throw,
break, finally, out, true,
byte, fixed, override, try,
case, float, params, typeof,
catch, for, private, uint,
char, foreach, protected, ulong,
checked, goto, public, unchecked,
class, if, readonly, unsafe,
const, implicit, ref, ushort,
continue, in, return, using,
decimal, int, sbyte, virtual,
default, interface, sealed, volatile,
delegate, internal, short, void,
do, is, sizeof, while,
double, lock, stackalloc,
else, long, static,
enum, namespace, string},
keywordstyle=\color{cyan},
identifierstyle=\color{red},
}
\usepackage{caption}
\DeclareCaptionFont{white}{\color{white}}
\DeclareCaptionFormat{listing}{\colorbox{blue}{\parbox{\textwidth}{\hspace{15pt}#1#2#3}}}
\captionsetup[lstlisting]{format=listing,labelfont=white,textfont=white, singlelinecheck=false, margin=0pt, font={bf,footnotesize}}




Now add a program listing block. Hopefully you have the listings package installed; otherwise you can always install it through MiKTeX's package manager.

Now add the code to the listing block.
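For readers working in raw LaTeX rather than LyX, the document LyX generates corresponds roughly to the following (the document class and the C# snippet here are illustrative, not part of the original recipe):

```latex
\documentclass{article}
% ... paste the preamble from above here ...

\begin{document}

\begin{lstlisting}[caption={A C\# example}, label=lst:greeter]
public class Greeter
{
    public static void Main(string[] args)
    {
        // Printed with line numbers and keyword coloring per \lstset above
        System.Console.WriteLine("Hello, LyX!");
    }
}
\end{lstlisting}

\end{document}
```

The `caption` key is what the `\captionsetup[lstlisting]` block in the preamble styles with the white-on-blue box.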

Then press Ctrl-R to preview the output.

Tada!

Happy LyXing!

References & download: LyxC#CodeListing.lyx

## Machine Learning - On the Art and Science of Algorithms with Peter Flach

Over a decade ago, Peter Flach of Bristol University wrote a paper, "On the state of the art in machine learning: A personal review", in which he reviewed several then-recent books on developments in machine learning. These included Pat Langley's Elements of Machine Learning (Morgan Kaufmann), Tom Mitchell's Machine Learning (McGraw-Hill), and Data Mining: Practical Machine Learning Tools and Techniques with Java Implementations by Ian Witten and Eibe Frank (Morgan Kaufmann), among many others. Dr. Flach singled out Michael Berry and Gordon Linoff's Data Mining Techniques for Marketing, Sales, and Customer Support (John Wiley) for its excellent writing style, citing the paragraph below and commenting, "I wish that all computer science textbooks were written like this."

“People often find it hard to understand why the training set and test set are “tainted” once they have been used to build a model. An analogy may help: Imagine yourself back in the 5th grade. The class is taking a spelling test. Suppose that, at the end of the test period, the teacher asks you to estimate your own grade on the quiz by marking the words you got wrong. You will give yourself a very good grade, but your spelling will not improve. If, at the beginning of the period, you thought there should be an ‘e’ at the end of “tomato”, nothing will have happened to change your mind when you grade your paper. No new data has entered the system. You need a test set!

Now, imagine that at the end of the test the teacher allows you to look at the papers of several neighbors before grading your own. If they all agree that “tomato” has no final ‘e’, you may decide to mark your own answer wrong. If the teacher gives the same quiz tomorrow, you will do better. But how much better? If you use the papers of the very same neighbors to evaluate your performance tomorrow, you may still be fooling yourself. If they all agree that “potatoes” has no more need of an ‘e’ than “tomato”, and you have changed your own guess to agree with theirs, then you will overestimate your actual grade on the second quiz as well. That is why the evaluation set should be different from the test set.” [3, pp. 76–77]
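The analogy can be sketched in a few lines of self-contained Python (the data and the "model" here are toys invented purely for illustration): a model that simply memorizes its training data grades its own quiz perfectly, while only a held-out test set gives an honest estimate.

```python
import random

random.seed(0)

# Toy data: the feature is an integer and the "true" label is whether it
# is even, with 20% of labels flipped as noise.
data = [(x, (x % 2 == 0) != (random.random() < 0.2)) for x in range(200)]

random.shuffle(data)
train, test = data[:150], data[150:]

# A "memorizing" model: it looks up the exact answer it saw during
# training, like grading your own spelling quiz with your own answers.
memory = {x: y for x, y in train}
majority = sum(y for _, y in train) > len(train) / 2  # fallback for unseen x

def predict(x):
    return memory.get(x, majority)

def accuracy(split):
    return sum(predict(x) == y for x, y in split) / len(split)

print("train accuracy:", accuracy(train))  # always 1.0 -- the "tainted" estimate
print("test accuracy:", accuracy(test))    # the honest, held-out estimate
```

No new information entered the system between memorizing and self-grading, so the training accuracy is vacuously perfect; only the unseen split reveals how the model actually generalizes.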

That is why, when I recently came across "Machine Learning: The Art and Science of Algorithms that Make Sense of Data", I decided to check it out, and I wasn't disappointed. Dr. Flach is Professor of Artificial Intelligence at the University of Bristol, and in this "future classic" he leaves no stone unturned when it comes to clarity and explainability. The book starts with a machine learning sampler, introduces the ingredients of machine learning, and progresses quickly to binary classification and beyond. Written as a textbook, riddled with examples, footnotes, and figures, the text works through concept learning, tree models, rule models, linear models, distance-based models, and probabilistic models, on to features and ensembles, concluding with machine learning experiments. I really enjoyed the "Important points to remember" section of the book as a quick refresher on machine-learning commandments.

The concept learning section seems to have been influenced by the author's own research interests and is not discussed in as much detail in contemporary machine learning texts. I also found the frequent summarization of concepts quite helpful. Contrary to its subtitle, and compared to its counterparts, the book is light on algorithms and code, possibly on purpose. While it explains the concepts with examples, the number of formal algorithms is kept to a minimum. This may aid clarity and help avoid recipe-book syndrome, while making the book potentially less accessible to practitioners. Great at the basics, the text also falls short on elaborating intermediate to advanced topics such as LDA, kernel methods, PCA, RKHS, and convex optimization. For instance, the "Matrix transformations and decompositions" material in chapter 10 could have been moved to an appendix, making room for meaningful topics like LSA and use cases of sparse matrices (p. 327). That is not really the book's fault, but rather this reader expecting too much from an introductory text just because the author explains everything so well!

As a textbook on the art and science of algorithms, Peter Flach's work definitely delivers on the promise of clarity, with well-chosen illustrations and an example-based approach. Highly recommended reading for all who would like to understand the principles behind machine learning techniques.

Materials can be downloaded from here; they generously include excerpts with background material and literature references, plus the full set of 540 lecture slides in PDF, including all figures in the book, with the LaTeX Beamer source of the above.

## Cyber security for service oriented architectures in a Web 2.0 world: An overview of SOA vulnerabilities in financial services

My recently published IEEE paper:

Cyber security for service oriented architectures in a Web 2.0 world: An overview of SOA vulnerabilities

Service oriented architecture is fast becoming ubiquitous enterprise software architecture standard in public and private sector alike. Study of literature and current attacks suggests that with the proliferation of Web API and RESTFul services, the attack vectors prioritized by OWASP top 10, including but not limited to cross site scripting (XSS), cross site request forgery (CSRF), injection, direct object reference, broken authentication and session management now equally apply to web services. In addition service oriented architecture relies heavily on XML/RESTFul web services which are vulnerable to XML Signature Wrapping Attack, Oversize Payload, Coercive parsing, SOAP Action Spoofing, XML Injection, WSDL Scanning, Metadata Spoofing, Oversized Cryptography, BPEL State Deviation, Instantiation Flooding, Indirect Flooding, WS-Addressing spoofing and Middleware Hijacking to name a few. In this paper, we review various such security issues pertaining to service oriented architecture. These and similar techniques, have been employed by Anonymous and other hacktivists, resulting in denial of service attacks on financial applications. While discussing the national security perils of hacktivism, there is an excessive focus on network layer security, and the application layer perspective is not always part of the discussion. In this research, we provide background information and rationale for securing application layer vulnerabilities to facilitate true defense in depth approach for cyber security.

Date of Conference: 12-14 Nov. 2013

@INPROCEEDINGS{6698966,
author={Masood, Adnan},
booktitle={Technologies for Homeland Security (HST), 2013 IEEE International Conference on},
title={Cyber security for service oriented architectures in a Web 2.0 world: An overview of SOA vulnerabilities in financial services},
year={2013},
pages={1-6},
keywords={Availability;Data security;Information security;Information systems;SOA;Service oriented architecture;Web services;cyber security;secure design;secure software development;security assessment;security awareness},
doi={10.1109/THS.2013.6698966},}

## The Mother of All Demos, presented by Douglas Engelbart (1968)

Speaking of intelligence and foresight....

The Mother of All Demos is a name given retrospectively to Douglas Engelbart's December 9, 1968, demonstration of experimental computer technologies that are now commonplace. The live demonstration featured the introduction of the computer mouse, video conferencing, teleconferencing, hypertext, word processing, hypermedia, object addressing and dynamic file linking, bootstrapping, and a collaborative real-time editor.

## On Entropy Depletion & Related Links

I had to dig these up in the context of a conversation about the (in)security of currency regimes such as Bitcoin, where presumed ownership of currency is built solely upon asymmetric cryptography. You may find some of these links to be of interest as well.

- Textbook RSA is insecure, and other interesting observations: http://crypto.stanford.edu/~dabo/courses/cs255_winter00/RSA.pdf
- Hardware Security for FPGAs using Cryptography: contains a great overview of different kinds of side-channel attacks on cryptography. https://www.escrypt.com/fileadmin/escrypt/pdf/Hardware_Security_for_FPGAs_using_Cryptography_Microsemi_Huettemann.pdf
- Acoustic cryptanalysis: on nosy people and noisy machines. Seeing through The Matrix isn't really that hard if you know how to look at it.
- Disk encryption may not be secure enough: ye olde standard cold boot attack.
- On Entropy Depletion: running out of randomness can hurt, big time. http://www.educatedguesswork.org/2008/10/on_entropy_depletion.html
- Researchers Crack RSA Encryption Via Power Supply: an invasive side-channel attack.
- Blue Pill - Machine Virtualization for Fun, Profit, and Security: virtualization attacks. Epic turtles.

via David Lazar.
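On the first link: textbook (unpadded) RSA encryption is deterministic, so an attacker who can enumerate a small message space can recover a plaintext just by re-encrypting guesses under the public key and comparing ciphertexts. A toy sketch (the primes and the "PIN" are illustrative and far too small for real use):

```python
# Toy textbook RSA parameters -- illustrative only, nowhere near real key sizes.
p, q = 1009, 1013
n = p * q
e = 65537

def encrypt(m):
    # Textbook RSA: c = m^e mod n. No padding, no randomness, deterministic.
    return pow(m, e, n)

# An eavesdropper intercepts the ciphertext of a low-entropy secret (a 4-digit PIN).
intercepted = encrypt(4271)

# Because encryption is deterministic, the attacker re-encrypts every
# candidate PIN with the public key and compares against the intercepted value.
recovered = next(pin for pin in range(10000) if encrypt(pin) == intercepted)
print("recovered PIN:", recovered)
```

This is exactly why real systems use randomized padding (e.g. OAEP): it makes two encryptions of the same message produce different ciphertexts, defeating this guess-and-compare attack.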

## Slides from 11th Annual SecureIT conference- “OWASP Web Services Security - Securing your Service Oriented Architecture”

I recently spoke at the 11th SecureIT conference on "OWASP Web Services Security - Securing your Service Oriented Architecture". This annual event was hosted by UC San Bernardino at the Sheraton Fairplex Hotel.

The SecureIT conference provides focus and opportunities for higher education staff meeting the challenges of providing a secure information technology environment for campus communities. The event was well attended, with distinguished speakers including Pradeep Khosla, UC San Diego's chancellor; Michael Montecillo, IBM Security Services Threat Research and Intelligence Principal; and Eric Skinner, VP of Mobile Security for Trend Micro.

The slides of my presentation can be found below.

## Quantum Computing & Entanglement with Dr. John Preskill @ Caltech

Last night I had the privilege of listening to Dr. John Preskill in Beckman Auditorium here at Caltech with fellow quantum aficionado David Lazar. John Preskill is the Richard P. Feynman Professor of Theoretical Physics at Caltech. This was definitely one of the most accessible lectures on the topic for a general audience, and it was very well received. Dr. Preskill is definitely a teacher and a communicator; as Feynman chair, he effectively summarized 50+ years of quantum research and development into a one-hour lecture. Quantum Frontiers has some of the recorded lectures, which readers may find interesting.

Dr. Preskill is also involved with IQIM, the Institute for Quantum Information and Matter at Caltech. Here is an IQIM promotional video which was shown towards the end of the session.

The lecture addressed the opportunities and challenges in quantum computing, entanglement, speculation about future trends, quantum error correction, and quantum information science.

A couple of his more detailed lectures can be seen below.

## The Clairvoyant Load Balancing Algorithm for Highly Available Service Oriented Architectures

Abstract: Load balancing allows network devices to distribute workload across multiple processing resources, including server clusters and storage devices. This distribution helps maximize throughput, achieve optimal resource utilization, minimize response time, and use hardware effectively across multiple data-center locations. As a meta-heuristic enhancement to Psychic Routing [1], the researchers present early work on Clairvoyant, a novel algorithm for optimal yet unrealizable distribution of traffic.
Among many earlier works, including [5, 4], the main inspiration for this algorithm is RFC 1149, i.e., a standard for the Transmission of IP Datagrams on Avian Carriers. A study of the literature suggests that earlier work by [7, 2] on internet protocol over xylophone players (IPoXP) also had a huge impact on the classical OSI network model. Typical application load balancing is based on techniques including round robin, weighted round robin, least connections, shortest response, SNMP, weighted balance, priority, overflow, persistence, least used, lowest latency, and enforced traffic flow [6]. The researchers propose that Clairvoyant, by utilizing an ensemble of anomalous cognition, ESP, remote viewing, and psychometry, can provide a high-performance yet irreproducible load balancing approach. The Clairvoyant load balancing algorithm helps the system administrator fine-tune how traffic is distributed across connections in a psychic manner. Backed by parapsychological research [1], each load balancer is equipped with an enterprise-grade channeling medium with features to fulfill potential special deployment requirements. Building upon the techniques proposed in RFC 5984, using extrasensory perception to achieve "infinite bandwidth" in IP networks, Clairvoyant can achieve negative latency as well as a negative transmission time difference with appropriate parameters, unachievable by traditional methods [6, 3]. The algorithm uses claircognizance to redirect traffic to one of the unused, or even non-existent, nodes. Clairaudience allows setting up the connection priority order; however, early experiments suggest that using 0x8 spherical surfaces achieves the same level of performance when compared using ROC/AUC.
Although irreproducible in most non-REM environments, the researchers see the potential of using this load balancing algorithm in most high-performing service oriented architectures, allowing packet forwarding that provides unsurpassed end-user performance regardless of link capacity, distance, and number of hops. The detailed algorithm and findings will be published in The Journal of Irreproducible Results by 4/1/2014.

# References

[1] Jonathan Anderson, Frank Stajano. Psychic Routing: Upper Bounds on Routing in Private DTNs. , 2011.

[2] R Stuart Geiger, Yoon Jung Jeong, Emily Manders. Black-boxing the user: internet protocol over xylophone players (IPoXP). Proceedings of the 2012 ACM annual conference extended abstracts on Human Factors in Computing Systems Extended Abstracts: 71–80, 2012.

[3] David R Karger, Matthias Ruhl. Simple efficient load balancing algorithms for peer-to-peer systems. Proceedings of the sixteenth annual ACM symposium on Parallelism in algorithms and architectures: 36–43, 2004.

[4] KM Moller. Increasing Throughput in IP Networks with ESP-Based Forwarding: ESPBasedForwarding. , 2011.

[5] C Pignataro, G Salgueiro, J Clarke. Service Undiscovery Using Hide-and-Go-Seek for the Domain Pseudonym System (DPS). , 2012.

[6] Sandeep Sharma, Sarabjit Singh, Meenakshi Sharma. Performance analysis of load balancing algorithms. World Academy of Science, Engineering and Technology, 38: 269–272, 2008.

[7] Emily Wagner, Yoon Jeong, R Stuart Geiger. IPoXP: Internet Protocol over Xylophone Players.
