Typology of disclosure: overt, sequestered, covert, invisible—ethical considerations—manipulating the aleatory pool.
There is one aspect of aleatography that we have yet to address. We need a means of hiding the software that we use to extract information from within an aleation. We also need a means to create and distribute pure aleations without giving the impression that we have any interest in privacy. These objectives can be achieved through the use of dual-purpose software.
Dual-purpose software performs two very distinct functions: the primary function can be anything whatsoever, as long as it is not associated with privacy and is not likely to cause offence to any Big Brother; the secondary function is one that is associated with privacy, either because it helps to develop the aleatographic infrastructure or because it implements some information hiding technique.
The great advantage of dual-purpose software is the defence of plausible deniability that it affords to anyone who possesses it. Since most people will use the software for its primary purpose alone, and may well not even know of the software’s secondary purpose, there is no reason to suspect anyone who possesses it of having an “unhealthy” interest in privacy.
Typology of Disclosure
We can divide dual-purpose software into two categories according to whether or not it discloses its secondary purpose: disclosing and non-disclosing.
We can divide disclosing dual-purpose software into four categories according to the manner in which information about its secondary function is disclosed to potential users: overt, sequestered, covert, and invisible.
If disclosure is “overt” then both the primary and secondary purposes of the software are proclaimed for all the world to hear. The home page might start off by saying, “This software has two distinct purposes. You can use it to create crossword puzzles, or you can use it to hide information within images.” With overt dual-purpose software it is very likely that most users will understand that it can perform two unrelated functions.
If disclosure is “sequestered” then information about the secondary purpose of the software is made available to its users, but in such a manner that the average user is unlikely to find it. For example, the information may be buried in the depths of the documentation under an obscure sub-heading; and to initiate the secondary function it may be necessary to click on a button with some enigmatic label, having first ticked a certain check-box that lies buried within a collection of option tabs. With sequestered disclosure the vast majority of people using the software will be unaware of its secondary purpose.
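The activation logic behind sequestered disclosure can be sketched in a few lines. This is purely illustrative: the option name, the button label, and the settings layout are all invented, and a real application would attach this dispatch to its actual user-interface toolkit.

```python
# Hypothetical sketch of sequestered activation: the secondary function
# runs only when an obscurely named option (buried in an "advanced"
# settings tab) has been ticked AND a button with an enigmatic label is
# pressed.  Neither condition alone reveals anything.
settings = {"advanced": {"retain_entropy_buffers": False}}  # invented option name

def on_button_press(label: str) -> str:
    """Dispatch a button press; "Recalibrate" is the enigmatic label."""
    if label == "Recalibrate" and settings["advanced"]["retain_entropy_buffers"]:
        return "secondary function"
    return "primary function"
```

With the option left at its default, pressing the button does nothing unusual, so a casual user who stumbles onto the button sees only ordinary behaviour.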
If disclosure is “covert” then information about the secondary function of the software will not be found within the software itself; and to initiate the secondary function it will be necessary, for example, to enter a specific code into a specific field that, as far as the primary function is concerned, serves some other purpose. The documentation needed to initiate and make use of the secondary function will not be available on the site from which the software is downloaded but will be distributed amongst privacy forums or, perhaps, only to select groups of individuals. No ordinary user will be aware of, or able to initiate, the secondary function. In the absence of documentation, it would be necessary to disassemble the executable code in order to determine that a secondary function exists.
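One way the trigger-code mechanism might work is sketched below, assuming a hypothetical crossword program whose pattern-search field doubles as the covert entry point. The code word and function names are invented. Storing only a hash of the code word means a casual inspection of strings in the binary reveals nothing, though, as noted above, disassembly would still expose the comparison branch.

```python
import hashlib

# Assumed code word, distributed out of band (e.g. via privacy forums).
# Only its SHA-256 digest appears in the program.
TRIGGER_DIGEST = hashlib.sha256(b"zugzwang17").hexdigest()

def handle_pattern_field(text: str) -> str:
    """Dispatch on the contents of the pattern-search field."""
    if hashlib.sha256(text.encode()).hexdigest() == TRIGGER_DIGEST:
        return secondary_function()
    return primary_search(text)

def primary_search(pattern: str) -> str:
    # Placeholder for the real crossword pattern search.
    return f"searching for {pattern!r}"

def secondary_function() -> str:
    # Placeholder for the hidden feature, e.g. an extraction dialog.
    return "secondary function activated"
```

Any ordinary input simply performs a search, so the field behaves exactly as the primary function requires.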
If disclosure is “invisible” then information about the secondary function of the software is never explicitly documented. Instead, the reader can infer from a description of how the software works that it could be used to support some secondary function. Unlike the other three methods of disclosure, this method protects the author of the software from accusations that he is writing software to support aleatography or information hiding.
These disclosure mechanisms serve different purposes. In regimes that are merely restrictive, overt dual-purpose software is the best choice: no one can prove that a user who possesses the software is making use of its secondary function, yet the existence of that secondary function will be widely known.
Within proscriptive regimes, sequestered and covert dual-purpose software are far more useful. It is entirely plausible that an individual who possesses such software has no knowledge of its secondary function. On the other hand, fewer individuals are likely to discover that secondary function.
Of course, it’s possible to incorporate the same secondary function into different software products that are made available from different websites, where one product makes overt disclosure and the other does not. The website offering the product with overt disclosure could then mention the existence of its counterpart, and where to obtain it, for the benefit of those individuals living under proscriptive regimes.
Of particular interest is non-disclosing dual-purpose software. If the secondary function can be automated, then there is not even a need for documentation from which the existence of a secondary function might be inferred. If some automatic, non-disclosing dual-purpose software tool became popular, then its secondary function would be executed very frequently. We see this type of software as playing a very useful role in the creation and dissemination of pure aleations.
Now, just because Big Brother is entirely lacking in morals doesn’t mean that we have to follow suit! Certain kinds of dual-purpose software could put some users at risk. With dual-purpose aleatory software there should be no problem, but with dual-purpose information hiding software there might well be. What if the software is unwittingly downloaded by someone living under a proscriptive regime, and is subsequently found by Big Brother? If disclosure is sequestered or covert then its discovery would not in itself arouse suspicion, so there should be no difficulty. However, if disclosure is overt, then Big Brother may well conclude that the person who downloaded the software was aware of its secondary purpose. So if you’re making software with overt disclosure available for download, then we suggest you succinctly display the information about its secondary function alongside a check box that the user is required to tick before the download starts.
The second issue concerns non-disclosing dual-purpose software that performs its secondary function automatically. The secondary function should not do anything that would compromise the user through the use of information that the software may gain in carrying out its primary function, and its use of computer and network resources should not be excessive.
Manipulating the Aleatory Pool
The single most valuable secondary function that dual-purpose software can perform is to create and maintain an “aleatory pool”. An aleatory pool is a collection of one or more aleations. Typically these will be stored in some working directory on the hard disk. The aleatory pool is created and manipulated by the software as part of its primary function, so the primary function needs to be one that can make use of random data. The user can insert ciphertext that masquerades as an aleation into the pool, or remove it again, using the standard file-copy functions provided by the operating system. The aleatory pool can be used for (1) storage; (2) transformation; and (3) communication.
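A minimal pool-maintenance routine might look as follows. Everything specific here is an assumption for the sake of the sketch: the directory name, the file-naming scheme, and the fixed aleation size.

```python
import os
from pathlib import Path

POOL_DIR = Path("pool")   # assumed working directory for the pool
ALEATION_SIZE = 4096      # assumed fixed aleation size, in bytes

def top_up_pool(count: int = 8) -> None:
    """Keep `count` aleations on hand, creating fresh ones as needed."""
    POOL_DIR.mkdir(exist_ok=True)
    existing = list(POOL_DIR.glob("*.rnd"))
    for i in range(count - len(existing)):
        # os.urandom yields cryptographic-quality random bytes, so a
        # genuine aleation is indistinguishable from good ciphertext.
        path = POOL_DIR / f"a{len(existing) + i:04d}.rnd"
        path.write_bytes(os.urandom(ALEATION_SIZE))

def draw_aleation() -> bytes:
    """Consume one aleation on behalf of the primary function."""
    path = sorted(POOL_DIR.glob("*.rnd"))[0]
    data = path.read_bytes()
    path.unlink()
    return data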
A user can copy ciphertext obtained from some other source into the aleatory pool. As the ciphertext will be indistinguishable from the aleations produced by the software, the user has the perfect storage location for encrypted material.
If the software uses the aleations to modify some other data, such as an image, in a reversible manner, then by substituting ciphertext for an aleation a user would be able to insert the ciphertext into, and later retrieve it from, the data. The modified data would provide an alternative means of storage and could possibly act as a useful carrier of the ciphertext for the purposes of communication.
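A reversible modification of this kind could be as simple as XOR, which is its own inverse: XORing the aleation into the carrier embeds it, and XORing the unmodified carrier back out recovers the aleation, or whatever ciphertext was substituted for it. The carrier bytes and the specific scheme below are illustrative assumptions.

```python
import os

def embed(carrier: bytes, aleation: bytes) -> bytes:
    """XOR an aleation into carrier data; the same call reverses it."""
    assert len(aleation) >= len(carrier)
    return bytes(c ^ a for c, a in zip(carrier, aleation))

carrier = b"raw image sample bytes"       # stands in for, e.g., pixel data
ciphertext = os.urandom(len(carrier))     # stands in for real ciphertext

modified = embed(carrier, ciphertext)     # ciphertext hidden in the carrier
recovered = embed(modified, carrier)      # XOR with the original recovers it
assert recovered == ciphertext
```

The modified carrier looks like the original with random noise applied, which is exactly what applying a genuine aleation would produce, so the presence of ciphertext cannot be detected from the modified data alone.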
Communication software that creates and manipulates an aleatory pool performs some of the basic functions of a janionic network. Even users who have no interest in privacy are still creating and exchanging aleations. If the software became popular then it might well generate a nascent janionic network consisting of millions of users. Users with an interest in privacy could then replace aleations with ciphertext and have it shipped to a recipient’s aleatory pool as part of the software’s primary function.
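The whole communication cycle can be sketched end to end. This is a toy model, not a protocol: the hash-derived keystream stands in for a real cipher, the shared key is assumed to have been agreed out of band, and the "network" is just a list of byte strings shipped by the primary function.

```python
import hashlib
import os

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Encrypt/decrypt by XOR with a SHA-256 counter-mode keystream
    (illustrative only; a real tool would use an established cipher)."""
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(d ^ s for d, s in zip(data, stream))

# The sender's pool: pure aleations, routinely shipped by the software.
pool = [os.urandom(64) for _ in range(4)]

key = b"pre-shared key"                       # assumed out-of-band secret
marker = b"MSG:"                              # lets the recipient spot real messages
payload = (marker + b"meet at the usual place").ljust(64, b"\0")
pool[2] = xor_cipher(key, payload)            # substitute ciphertext for an aleation

# The recipient tries the key against every file that arrives; genuine
# aleations decrypt to noise, real ciphertext reveals the marker.
received = []
for blob in pool:
    plain = xor_cipher(key, blob)
    if plain.startswith(marker):
        received.append(plain[len(marker):].rstrip(b"\0"))
```

From the network's point of view nothing has changed: the same number of random-looking files travels from sender to recipient either way.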