An Important Mathematical Oversight

The original intention for this website was to encourage public awareness of an historical medical crime, one that has remained a tightly-kept British state secret now for more than five decades. The matter is of enormous public interest, not least because the motivation behind the crime itself was that of advancing scientific research into areas that would come to provide the seminal knowledge behind much of the technological progress of the last half-century. My investigation into the matter inspired a parallel enquiry into some of the fundamental principles that underpin that scientific and technological impulse.

There are therefore two principal concerns of this website, and if there is acknowledged to be a substantive connection between them, it has inevitably to do with late 20th Century developments in science and information technologies, and more broadly with the idea of a burgeoning technocracy – the suggestion of a growing alliance between corporate technology and state power – one that might be judged to have atrophied the powers conventionally assigned to liberal-democratic institutions. This link therefore serves as a segue to emphasise the equal importance, to my mind, of what is going on in the X.cetera section of the site, so that that section should not appear, from the point of view of the other, as some kind of afterthought.

X.cetera is concerned with a problem in mathematics and science to do with the way we think about numbers. As a subset of the category defined as integers, elements in the series of the natural numbers are generally held to represent quantities as their absolute, or ‘integral’, properties. It is argued that this conventional understanding of integers, which is the one widely held amongst mathematicians and scientists adopting mathematical principles, is the cause of a significant oversight with regard to changes in the relations of proportion between numerical values that occur when those values are transposed out of the decimal rational schema into alternative numerical radices such as binary, octal, and hexadecimal.

On the page: The Limits of Rationality it is argued that the relations of proportion between integers are dictated principally by their membership of the restricted group of characters (0-9) as defined by the decimal rational schema; and that corresponding ratios of proportion cannot be assumed to apply between otherwise numerically equal values when transposed into alternative numerical radices having either reduced (as in binary or octal, for instance) or extended (as in hexadecimal) member-ranges.

This is shown to be objectively the case by the results published at: Radical Affinity and Variant Proportion in Natural Numbers, which show that for a series of exponential values in decimal, where the logarithmic ratios between those values are consistently equal to 1, the corresponding series of values when transposed into any radix from binary to nonary (base-9) results in logarithmic ratios having no consistent value at all, in each case producing a graph showing a series of variegated peaks and troughs displaying proportional inconsistency.
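The published procedure is not reproduced here; but the kind of computation described can be sketched as follows, on the assumption that ‘transposition’ means writing each power of ten in the target radix and reading the resulting digit string as a decimal numeral before taking its common logarithm (the function names are mine, chosen for illustration):

```python
import math

def to_base(n, base):
    """Return the digit string of a positive integer n written in the given base."""
    digits = []
    while n:
        digits.append(str(n % base))
        n //= base
    return ''.join(reversed(digits))

def transposed_log(n, base):
    """Write n in the given base, read that digit string as a decimal
    numeral, and return its base-10 logarithm."""
    return math.log10(int(to_base(n, base)))

# Control case: in decimal, the logarithms of successive powers of ten
# step by exactly 1.
decimal_logs = [transposed_log(10 ** k, 10) for k in range(1, 6)]

# Transposed case (octal here): the steps between successive values are
# no longer constant, producing the peaks and troughs described above.
octal_logs = [transposed_log(10 ** k, 8) for k in range(1, 6)]
octal_steps = [b - a for a, b in zip(octal_logs, octal_logs[1:])]
```

Under this reading, the decimal series yields uniform unit steps, while the octal series yields steps that drift irregularly – the proportional inconsistency at issue.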

These findings have so far gone unacknowledged by mathematicians and information scientists alike. Their import is that, while the discrete values of individual integers transposed into alternative radices will be ostensibly equal across those radices, the ratios of proportion between those values will not be preserved, as these ratios must be determined uniquely according to the range of available digits within any respective radix (0-9 in decimal, 0-7 in octal, for instance); one consequence of which is the variable relative frequency (or ‘potentiality’) of specific individual digits when compared across radices. This observation has serious implications for the logical consistency of data produced within digital information systems, as the logic of those systems generally relies upon the seamless correspondence, not only of ‘integral’ values when transcribed between decimal and the aforementioned radices, but ultimately upon the relations of proportion between those values.
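The point about variable relative digit frequency can at least be made concrete. As a minimal sketch (the sampling range and function name are my own choices, not taken from the published results): count how often each digit value appears when the integers up to some limit are written out in a given radix.

```python
from collections import Counter

def digit_frequencies(limit, base):
    """Relative frequency of each digit value across the base-`base`
    numerals of the integers 1..limit."""
    counts = Counter()
    for n in range(1, limit + 1):
        while n:
            counts[n % base] += 1
            n //= base
    total = sum(counts.values())
    return {d: counts[d] / total for d in sorted(counts)}

# The digit '1' carries a far greater share of the written digits in
# binary than in decimal: binary has only two digit values, and every
# binary numeral's leading digit is 1.
freq_binary = digit_frequencies(1000, 2)
freq_decimal = digit_frequencies(1000, 10)
```

The sketch shows only that digit frequencies are radix-dependent; the interpretation placed on that fact is developed in the pages linked above.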

Information Science tends to treat the translation and recording of conventional analogue information into digital format unproblematically. The digital encoding of written, spoken, or visual information is seen to have little effect on the representational content of the message. The process is taken to be neutral, faithful, transparent. While the assessment of quantitative and qualitative differences at the level of the observable world necessarily entails assessments of proportion, the digital encoding of those assessments ultimately involves a reduction, at the level of machine code, to the form of a series of simple binary (or ‘logical’) distinctions between ‘1’ and ‘0’ – positive and negative. The process relies upon a tacit assumption that there exists such a level of fine-grained logical simplicity as the basis of a hierarchy of logical relationships, and which transcends all systems of conventional analogue (or indeed sensory) representation (be they linguistic, visual, sonic, or whatever); and that therefore we may break down these systems of representation to this level – the digital level – and then re-assemble them, as it were, without corruption. Logic is assumed to operate consistently without limits, as a sort of ‘ambient’ condition of information systems.
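For purely digital data the assumed decompose-and-reassemble operation is, at least, demonstrable; the question raised here concerns what that assumption elides at the boundary with analogue representation. A minimal sketch of the digital case (variable names are mine):

```python
# Break a piece of text down to its bit-level (UTF-8) binary form –
# a bare series of '1'/'0' distinctions – then reassemble it.
text = "analogue signal"
bits = ''.join(f'{byte:08b}' for byte in text.encode('utf-8'))
restored = bytes(
    int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)
).decode('utf-8')
```

Within the digital domain the round trip is exact; the tacit assumption discussed above is that the same fidelity extends to the encoding of analogue and sensory material in the first place.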

In the X.cetera section I am concerned to point out however that the logical relationship between ‘1’ and ‘0’ in a binary system (which equates in quantitative terms with what we understand as their proportional relationship) is derived specifically from their membership of a uniquely defined group of digits limited to two members. It does not derive from a set of transcendent logical principles arising elsewhere and having universal applicability (a proposition that, despite its apparent simplicity, may well come as a surprise to many mathematicians and information scientists alike).

As the proportional relationships affecting quantitative expressions within binary are uniquely and restrictively determined, they cannot be assumed to apply (with proportional consistency) to translations of the same expressions into decimal (or into any other number radix, such as octal, or hexadecimal). By extension therefore, the logical relationships within a binary system of codes, being subject to the same restrictive determinations, cannot be applied with logical consistency to conventional analogue representations of the observable world, as this would be to invest binary code with a transcendent logical potential that it simply cannot possess – such relationships may be applied to those representations, and the results may appear to be internally consistent, but they will certainly not be logically consistent with the world of objects.

The issue of a failure of logical consistency is one that concerns the relationships between data objects – it does not concern the specific accuracy or internal content of data objects themselves (just as the variation in proportion across radices concerns the dynamic relations between integers, rather than their specific ‘integral’ numerical values). This means that, from a conventional scientific-positivist perspective, which generally relies for its raw data upon information derived from discrete acts of measurement, the problem will be difficult to recognise or detect (as the data might well appear to possess internal consistency). One will however experience the effects of the failure (while being rather mystified as to its causes) in the lack of a reliable correspondence between expectations derived from data analyses, and real-world events.

So that’s some of what X.cetera is all about… if you think you’re ‘ard enough!


Download my 165-page report
[pdf – 1.8MB]:

Download my Open Letter to the British Prime Minister & Health Secretary
[pdf – 464KB]:

The Limits of Rationality
(an important mathematical oversight)


Radical Affinity and Variant Proportion in Natural Numbers

Mind: Before & Beyond Computation

Dawkins' Theory of Memetics – A Biological Assault on the Cultural

Randomness, Non-Randomness, & Structural Selectivity


Measure & Rule

In terms of Euclidean geometry, the world of geometrical relations is a constructed one, rather than simply a ‘reflected’ one. Geometry arises out of the application of abstract formal logic onto concrete spatial relations. Euclid’s straight line expresses a distance between two points, but it has no thickness – therefore no material substance; and if we regress to the single point, the same ethereality applies. The line may also theoretically extend to infinity, though its actual straightness in this projection has never been proved mathematically. If we progress as far as the third or the fourth point in space, this affords us the reassurance of enclosed space, and the means for transforming the concrete and the visible in representational space, although the certainties that this may imply remain arbitrary, purely formal ones. The theory of measured and constructed space, the sanctity of the right angle, which was fundamental to mathematics at least until the 19th century, assumes a fundamental consistency and proportionality – a ‘flatness’ – to space, and to the numerical divisions which theory superimposes on it. For Euclidean geometry, proportions in space may be defined diagrammatically in terms of continuous magnitudes;1 but the idea that space intrinsically behaves according to proportional rules has been criticised since the development of non-Euclidean geometry in the 19th century, and more recently in terms of General Relativity. The latter relates the ‘shape’ of space, or its proportionality, to macroscopic factors of time and the density and masses of objects, and suggests the formal logic of Euclidean geometry to be an approximation to actual conditions, accurate only within the terms of a limited perceptual scale. Importantly however, neither non-Euclidean geometry nor 20th-century relativistic physics has raised any questions over the relativism of the units of measurement themselves.
While proportionality can no longer be assumed as an absolute property of objects in space, it remains the fundamental assumption informing the numerical divisions by which physical properties are measured and calculated.2

Reality Requires a Frame

As a principle of geometry (literally: ‘to measure the world’), the dominance of the rectangle becomes all-inclusive. With advances in the development of optics in the 17th century there is no part of the world too distant, or too small, that it is not possible in theory to represent or to measure it. The confluence of two major technological developments: optical imaging and perspectival representation, enhances human vision and perception with a novel objectivity, which in turn gives birth to a new form of scientific reality based upon the idea of ‘reflection’. While the subdivision and measuring out of geometrical reality according to a system of rectangular proportion can only ever be a highly constructive affair, the means or the labour of this construction is somehow elided, or ‘sanitised’, from the products of representation, and what remains is the illusion of objective unity – that is, the identity of a thing with its representation. Scientific imagery and the description of Nature in terms of its proportional quantities appear as the direct reflection of a pre-existing reality, as a series of divided and divisible facts; but this appearance ignores the construction of the frame of reality itself – i.e., the superimposition by artifice of abstract rules of geometry and proportion.

The power of Reason is that it can discuss and compare the contents of representations as if they were not artificial, but directly continuous with the world, and so it confers ‘transparency’ upon scientific description, which implies a kind of moral supremacy over its objects (and also supports the illusory promise of knowledge without limits). This is an inverted logic – the world or Nature (or indeed the awkwardly singular concept of ‘the Universe’), in their entireties, will always exceed our powers to represent or to measure them, in some degree.

The ‘real’ then is that which we know through the artifice of representation (including mental representations). What is not known (or presumed) through representation is also not yet real. ‘Reality’ is therefore contingent on the limits of representation and the limits of knowledge. But the inversion of logic implied in the notion of transparency is that the frame of reality acts as a ‘window on the world’, which suggests that the real also exists as a continuum beyond the frame. In this way ‘reality’ is mistaken for an absolute property of the world and of objects3, rather than, as it emerges in this discussion, a contingent product of a system of representational signs.

Measured Facts and Signifying Practice

The literal, or scientific, description of natural events in language runs the risk of qualitative distraction. Scientific description frequently employs figurative and metaphorical language in its efforts to objectivise Nature. This is only to be expected – it makes science more readable. If verbal description were all we had on which to base our judgements, the situation would be fairly problematic. We can tolerate the distraction, as science reassures us that its judgements are ultimately based upon sound quantitative criteria. However, the effect is generally to underemphasise the degree of interpretation which is implicit in any verbal description, and also the degree to which the focus on selective quantitative data tends to circumscribe expectations and results.

In The Structure of Scientific Revolutions, Thomas Kuhn makes the point that tacitly held scientific knowledge relies upon the assumption of a 1:1 relationship between perceptual stimuli on the one hand and the interpretative sensations which follow from those perceptions on the other. But while raw perceptual stimuli may remain consistent between diverse individuals and groups, the cognitive sensations formed on the basis of those stimuli may be “systematically different”, according to their professional environment and according to the educative principles with which they are informed. This makes problematic the assertion of neutrality in scientific knowledge, and implies that descriptions of ‘the world’ between diverse individuals or groups (or between different scientific fields, or competing theories) refer to quite different, perhaps even incompatible, sets of objective schemata.4

A comparable critique to Kuhn’s phenomenological approach follows from an application of structural linguistics to scientific discourse. In the linguistic model of the verbal sign as originally proposed by Ferdinand de Saussure5, the sign is a dynamic medium, a reflex between formal (phonic or graphic) elements, and semantic or representational ones. Saussure identifies a binary model of signification composed of material signifiers and conceptual signifieds, where the relationship between these two elements is an arbitrary, rather than a naturalistic, one:

“The word arbitrary also calls for comment. The term should not imply that the choice of the signifier is left entirely to the speaker [...] I mean it is unmotivated, i.e. arbitrary in that it actually has no natural connection with the signified.”6

The signifier is the formal content of a representation – in terms of verbal language, the material form of the letters that compose words. The signified refers to the ideational component of a representation – the mental correlate I form upon reading the word ‘house’ or ‘dog’, for instance. Structural linguistics considers the relationship between these two items to be arbitrary as there is no requirement for formal similarity between words and the things they represent (except, for instance, in isolated cases of onomatopoeia). The fact that the combination of letters swine corresponds to that particular animal in English, while the same animal is represented as cochon in French, is not determined on rational principles (there is no proportional or ‘naturalistic’ correspondence between the form of the letters and the physical form of the animal, as there would be, say, in a hieroglyph) and the signifying relationship can only be understood in terms of linguistic convention.7

Words may also shift their frames of reference, and one signifier may stand in for another by substitution, as in metonymy. The field of signification is distinct from objective reality insofar as it is possible for signification to take place without any reference to real-world objects, in a work of fiction for instance. A factual representation requires the introduction of a third term – the referent – which proposes the definite object of reference, according to identifying details – date, place, time, proper name, etc. The stability of a representation of fact therefore depends on the assumption of a naturalistic association between signifier, signified, and referent. But, in terms of structural linguistics, this association is dependent on the linguistic conventions of a shared culture, or of the shared discourse of well-defined linguistic and academic communities. The assumption of a naturalistic association between a verbal description and the objects under discussion emphatically disregards the role of linguistic communities in establishing discrete regimes of signification, according to their particular discursive requirements.

To speak ‘naturalistically’, or ‘with transparency’, entails a fusion of the components of the sign: the sign appears as a unity – as a token for the thing in itself, which allows signs to partake in the construction of the sense of reality. Without this effect, the representation of reality would be impossible. But the terms upon which representation takes place – the discipline which the signifier exerts over the signified – are the terms which reflect the interests and affiliations of discrete linguistic communities and groups, and therefore the particular language shared by any number of such groups becomes an arena of contention in which those interests may conflict or coincide. In its simplest expression, the location of this struggle is that of the revolving causality between signifier and signified within the linguistic sign, in a process which may proceed indefinitely, as these elements are ultimately constructed in an arbitrary relationship with one another.

A further characteristic of verbal signifiers (upon which “the whole mechanism of language depends”) is that, in practical terms, they form a linear horizontal chain, which unfolds in time.8 The direction and momentum of its syntax is therefore arranged orthogonally in relation to the dimension of its referential property, and with a degree of formal independence from that referential function – a point emphasised in structuralist approaches, and largely ignored in functionalist analyses of language.

The supposition of neutrality in scientific description is dependent on the reality-effect as constructed through the interplay of signs, and essentially ignores the fact that signs are always value-laden; for an arbitrary relationship has no meaning unless at least one of its terms is invested with value. Conversely, the quantitative assessments and measurements which underpin scientific observation appear to be value-free, so that the subjection of such assessments to discursive interpretation, aligned with the reality-effect of the governing discourse, appears harmless, or innocent.

Scientists, by and large, are not well-versed in the techniques of linguistic analysis, and so, for the most part, will tend to underestimate the extent to which their own narrative judgements and conclusions, and especially, the conversion of those judgements into publicly accessible discourse, become invested with value. The construction of scientific language and the historical developments that language undergoes are, as with all linguistic phenomena, resistant to change by design and the conscious control of individuals – they occur largely unconsciously and imperceptibly. This is a further characteristic that follows from the arbitrary nature of signs.9 There is no inherent relationship of transparency between a signifier and a signified as these elements occupy discrete operational fields: for the signifier, that of grammar and syntax; for the signified, that of the association of ideas. Linguistic change occurs in part due to shifts in the signified (for a particular signifier) according to the vicissitudes of a particular linguistic community, in a given historical context, according to the current state of knowledge.

Literal scientific description depends upon a set of theoretical relationships which are good for a particular specialist discourse for a limited historical period, and the transparency of its statements requires, therefore, a certain ‘anchoring’ of the correspondence between signifying elements of its language; which is to say that mental and ideational components (signifieds) are subjected to a form of discipline that is not inherently characteristic of them. Scientific judgement, in its reduction of the world to a set of measurable mechanical-physical properties, tends to bracket out the field of the signified from its evaluative process, asserting instead a form of sanitised, direct transparency between signifier and referent, by which the very process of signification is occluded.

Meaning however does not derive through transparency, but chiefly from the context in which statements are delivered. Institutional discourse acquires legitimacy through a set of consensual rules, affiliations, and expectations. While raw quantitative information (albeit selectively derived and dependent on references to qualitative categories) may pass for factual data, any interpretation of such data subjects it to a value consensus which undermines its unitary relationship to material objects. The relationship between a quantitative measurement and its interpretation involves a double axis – the vertical axis of reference, interceded by a horizontal axis which situates the quotation in a shared purposive discourse. This intentionality has a circumscribing effect too on the initial selectivity of the observational process. The tendency is for scientific description to disregard its institutional affiliations and to present interpretations of quantitative information as if they were naturally derived – that is, to employ complex, reflexive, and binary signifying material as hypothetically unitary.


June 2013



  1. Euclid, Elements, Thomas L. Heath (ed.), 1908, (Dover Publications, 1956). [back]
  2. Although, as noted by Kragh, the unifying of general relativity with quantum theory in the loop quantum gravity (LQG) program requires a “quantizing” of the structures of spacetime, with the consequence that: “the spectrum of a quantity (i.e., the numerical values it can possess) can become discretized where it was previously continuous, or bounded where it was previously unbounded, and quantities can be forced to obey a Heisenberg uncertainty principle.” Kragh, H., Cosmology and Theology, Oct. 24 2011: [back]
  3. That is, for all intents and purposes, as something akin to a modern version of the metaphysical concept of substance, as a universal and essentially indivisible property of objects and of matter (also pertaining, somewhat archaically, to non-material objects such as the Soul), and which for pre-Enlightenment philosophers and scientists expressed the Divine origin of the Universe. [back]
  4. For a discussion of the relationship of perception and interpretation, from a phenomenological perspective, in the formation of scientific knowledge, see the Postscript to Thomas S. Kuhn’s The Structure of Scientific Revolutions, pp.191-198: Tacit Knowledge and Intuition, Chicago UP, 1996. [back]
  5. Saussure, F., Course in General Linguistics (1916), McGraw-Hill Paperback Edition, 1966. [back]
  6. Ibid., pp.68-9. [back]
  7. Saussure makes the distinction between this definition of the linguistic sign and the more conventional term ‘symbol’. The sign is a term of general applicability and its syntactic properties depend on the principle of arbitrariness, i.e., of a non-naturalistic (or non-rational) connection between signifier and signified. Symbols are a special case in which there is an implied rational connection between a symbol and its concept:

    “One characteristic of the symbol is that it is never wholly arbitrary; it is not empty, for there is the rudiment of a natural bond between the signifier and the signified. The symbol of justice, a pair of scales, could not be replaced by just any other symbol, such as a chariot.” (ibid., p.68). [back]

  8. Ibid., p.70. [back]
  9. Ibid., pp.73-6. [back]
