26 February 2007

THE CLUB OF ROME: INCUBATOR FOR THE "SCIENCE" OF THE APHENOMENAL MODEL

As the following excerpt from an obituary in today's editions of The New York Times discloses, the Club of Rome and its work mark the starting-point of the scientific research problematic in which the modern aphenomenal model has been incubated. By ‘modern’, we refer to the version of the model based on theories developed from the field of systems and operations research. These theories conceal or downplay the roles of human planning and intervention in the contemporary technological disaster. Their modus operandi consists in ascribing outcomes not to class interests and intentions — including their human authors — but to the automaticity of systems and their operation. This approach shifted the focus away from the contradictions engendered by centuries of foreign domination and the local rebellions and resistance that this fuelled. Instead, the focus became one of fine-tuning the automaticity of such systems with more comprehensive planning. In practice, disguised by gobbledygook about “better feedback loops”, the content of this comprehensiveness included continuing ever-greater intrusion by North American and European governments, and their tools inside and outside the governments of developing countries, to keep public right co-opted by monopoly right in these countries, making peoples and governments in the developing world kowtow to the whims of North American and European corporate molochs.

Starting in 1960, the Rostow brothers (Eugene Rostow in the CIA, Walt Whitman Rostow as the author of The Stages of Economic Growth) unfolded their theories and practices for “modernising” developing countries in a manner that would integrate them more closely, and subjugate them, within a U.S.-dominated imperium (later dubbed the Pax Americana). This was applied with special vigour in Central and South America and the Latin Caribbean. With the sole exception of Cuba, by the mid-1960s there was no government in Latin America whose armed forces and police were not compromised and embroiled in programs of the CIA and the U.S. armed forces to train and equip death squads. The theories and practices engendered by this line of political economy were what informed, and provided the warp and woof of, the Club of Rome's entire outlook and approach. Among other things, this was intended to further the aims of foreign colonial domination in general without openly reasserting the discredited racism of the Eurocentric outlook in particular.

With the installation of the Kennedy administration, imperial “help”, i.e., foreign intervention and domination developed largely under the U.S. Agency for International Development (USAID), was mandated for modernising developing countries. With its modelling of overpopulation as a condition endemic to economies that fail to achieve critical mass and take off, etc., the Club of Rome produced many “scientific” justifications ascribing the economic difficulties of developing countries to their unfitness to handle properly the “complexity of modern systems and their requirements” — a logic resonating with the toxic British imperialist noises spawned in the 19th century about “races that are best fitted for ‘self-government’”. No blame for these countries’ failure to thrive was ever attached to any of the destructive consequences of centuries-long colonial intervention and dictate, or to the massive violence unleashed by colonial masters to prop up local ruling tools against rebellion from below.

The Club of Rome ensured that its own actual consciousness would never be asserted openly — that “we”, your overlords, possessing the powerful magic of systems analysis, are white and know what’s what, and what’s best, whereas “you”, our lowly subjects, are lesser breeds without the law, in need of continual whiffs of grapeshot combined with the occasional good extermination. It focused instead on effecting some cosmetic surgery on the ugly and hated visage of European colonialism, enough to ensure the return of this imperialist agenda — newly decked out as “fresh American thinking” — to the commanding centre of world affairs.



=======

Hasan Ozbekhan, 86, Economist Who Helped Found Global Group, Dies

By JEREMY PEARCE
The New York Times, Mon 26 February 2007, Page B6

[EXCERPTS]

Hasan Ozbekhan, a Turkish-born economist and management expert who helped found the Club of Rome, a group of thinkers who came together to examine unwieldy global problems like food shortages and overpopulation, died on Feb. 12 in Philadelphia. He was 86.

...

In the early 1970s, Mr. Ozbekhan (pronounced UHZ-beh-kahn), who taught at the University of Pennsylvania and applied the field of systems theory to global problems, helped inspire the group of planners, diplomats, scientists and academics who came together as the Club of Rome. He wrote a paper, “The Predicament of Mankind,” that became an influential core document of the group, addressing issues of energy, overpopulation, depletion of resources and environmental degradation.

Alexander N. Christakis, a former colleague in the Club of Rome, said Mr. Ozbekhan’s writings constituted “a forward-looking document” and argued that global problems were “strongly interconnected and that any attempts to deal with them independently would simply not work.”

Mr. Ozbekhan, who was the club’s director of research and a member of its executive committee, later resigned, but the organization continues and now operates from Hamburg.

In 1975, while working as a consultant for the French government, Mr. Ozbekhan published a report about long-range planning in Paris. The report reviewed land use, cultural issues and the city’s economy and tried to provide the French with an avenue toward developing the city “within the context of a globalizing world.” In 1977, he gave a lecture on the future of Paris before the Royal Society in London.

Hasan Ziya Ozbekhan was born in Istanbul. He earned an undergraduate degree from the London School of Economics. He became an American citizen in the 1950s.

From 1963 to 1969, Mr. Ozbekhan was principal scientist and director of planning at the System Development Corporation, a military research group and software development company in Santa Monica, Calif.

In 1970, he was named a professor of management at the University of Pennsylvania, where he also taught statistics, operations research and social systems sciences. He retired in 1992.

...

08 February 2007

YOU DON'T NEED TO BE EINSTEIN TO GRASP THE APHENOMENAL MODEL UNDERLYING THE PROFITABLE BUSINESS OF SMOKING-CESSATION MEDICATIONS

The graphic accompanying Kevin Helliker's front-page article in today's Wall Street Journal, entitled NICOTINE FIX: Behind Antismoking Policy, Influence of Drug Industry; Government Guidelines Don't Push Cold Turkey; Advisers' Company Ties [1], compares the success rates in quitting smoking of those who quit cold turkey versus those who relied on medications. It shows, first, that almost as many ex-smokers who quit "cold turkey" as ex-smokers who relied on anti-smoking medications are still off cigarettes 3 months after quitting. Additionally, more "cold turkey" ex-smokers than users of anti-smoking medications are still off cigarettes after 9 months. These two facts form a profound and powerful indictment of the entire smoking-cessation sector of Big Pharma.



The aphenomenality of smoking-cessation pharmaceuticals lies in the assumption that smoking is a habit requiring external intervention to break. This is the assumption that becomes the premise for product development and marketing.

Smoking as a mass habit is widely blamed on nicotine addiction. A more complete statement of the case runs something like the following:

From the mid-1950s, after the Khrushchev group replaced Stalin and his circle, the U.S. ruling circles were consumed with intensifying all forms of competition --- military and otherwise --- with the Soviet Union. The aim was to force the latter into a crisis that would bring it crashing down. The U.S. ruling circles had numerous and extensive ambitions beyond U.S. territory which they wanted to pursue without hindrance. The main hindrance these ambitions faced on the home front, among U.S. citizens, was the existing level of participation of the American public in political affairs. This was much higher than the authorities liked. Maximum consumerism was thus developed to serve both the anti-Soviet and the pro-U.S. imperial aims. Middle-class standards of high consumption were encouraged with the aim of displacing and eventually wiping out mass participation in the political process at any level within the United States, and of reinforcing propaganda representations of the Soviet bloc as uniformly dingy, stodgy, unmodern, without household appliances, etc.

Meeting these standards, however, imposed ever-increasing levels of personal anxiety on individuals trying to handle the accompanying stresses at the place of work, at home and in social intercourse. One index of this increased general level of societal and individual stress could be seen in the rapid rise in sales of drugs to manage mood and appetite -- tranquilisers, vitamins, etc. Noting this trend, Big Tobacco consciously set out to transform the cigarette into a nicotine delivery system, ensuring profits from generation to generation.

Einstein pointed out that the thinking that got one into a problem was not going to get one out. Sure enough, as the evidence served up by the Wall Street Journal exposé confirms: the notion that the medical science bought and paid for by one massive concentration of capital, to hook people on nicotine delivered in cancer-causing cigarettes, could profitably be redeployed by some other concentration of capital to "help" people "stop" smoking has not gotten society out of the underlying problem: massive concentrations of capital usurping the public's right to good health in the first place.

~~~~

Note

1. Only Wall Street Journal subscribers may access this article online.

06 February 2007

TRANSPARENCY AND THE APHENOMENAL MODEL

The aphenomenal modelling of "transparency" starts from placing maximum transparency at one end of a spectrum and total opacity at the other, and then discussing degrees in between these poles. Since this approach renders it impossible to assert, first, what transparency itself is, and then what transparency is in its relationships to its internal components and to the external world, aphenomenality is inherent in it.

There are four kinds of transparency that can be readily distinguished:

1 - no screen between subject & object, and mutual awareness of each other;
2 - subject knows there is a screen but object does not, e.g., one-way glass;
3 - neither the subject nor the object knows there is a screen through which other third parties observe them; and
4 - both subject and object know there is a screen, yet they can still observe one another.

As far as an external third-party observer is concerned, the first, third and fourth look substantially similar, yet each of these scenarios would obviously look and be experienced entirely differently by the subject and object involved. The first, second and fourth are transparent for the subject and a third-party. In the second case, nothing is transparent for the object, and in the third something is hidden from both the subject and object even though they experience transparency with respect to each other.

Natural transparency - the first case - involves no third party regulating or modulating relations between a subject and an object. Aphenomenality arises in each of the other cases as an outgrowth of such third-party involvement.
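
The four-way taxonomy above is mechanical enough to be captured in a few lines of code. The following is a minimal sketch (ours, purely illustrative; every name in it is an assumption, not drawn from any source), reducing each scenario to three facts -- whether a screen exists, who knows about it, and whether subject and object can observe one another -- from which the claims of the preceding paragraphs fall out as simple predicates:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Scenario:
        name: str
        screen_exists: bool
        subject_knows: bool
        object_knows: bool
        mutual_observation: bool

    SCENARIOS = [
        Scenario("1: no screen",          False, False, False, True),
        Scenario("2: one-way glass",      True,  True,  False, False),
        Scenario("3: hidden third party", True,  False, False, True),
        Scenario("4: known screen",       True,  True,  True,  True),
    ]

    def transparent_for_subject(s: Scenario) -> bool:
        # Transparent for the subject: either no screen, or the subject knows of it.
        return (not s.screen_exists) or s.subject_knows

    def transparent_for_object(s: Scenario) -> bool:
        # Transparent for the object, on the same criterion.
        return (not s.screen_exists) or s.object_knows

    def looks_transparent_to_third_party(s: Scenario) -> bool:
        # To an outside observer, mutual observation is all that shows.
        return s.mutual_observation

    for s in SCENARIOS:
        print(s.name,
              "| subject:", transparent_for_subject(s),
              "| object:", transparent_for_object(s),
              "| third party:", looks_transparent_to_third_party(s))

Running this reproduces the claims made above: cases 1, 2 and 4 come out transparent for the subject; only 1 and 4 for the object; and cases 1, 3 and 4 look substantially similar to a third-party observer.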

04 February 2007

Political Economy of the Aphenomenal Model:
THE HONEY → SUGAR → SACCHARIN® → ASPARTAME®, OR HSS®A®, PRINCIPLE


Introduction

We are compelled to live out our lives in an aphenomenal universe. One of its features is the “Honey → Sugar → Saccharin® → Aspartame®” principle [1].

The central idea captured in the form of this principle concerns the degradation of the natural qualities of goods and services through intensified refining. Materials taken from the natural environment (including even the living thoughts of human beings), used originally without further external processing, become degraded step by step as a result of adding ever-further stages of external processing. This can carry on potentially indefinitely.

It is the aphenomenal quality of this principle that renders it so useful to the status-quo. This aphenomenality resides in its unquestioned underlying assumption that further refinement (or refining) must in the end improve quality [2].

Here we propose to discuss some aspects of the political economy of this principle.


I. The HSS®A® Principle & Monopoly Right

Greater refining is not a function of the increased application of creative human labouring power by the actual producers. On the contrary, it is a function of the owners and managers of production investing more in fixed capital, to replace creative living labour with more advanced machines. These machines are actually accumulated labouring power congealed in machine form, i.e., dead labour. In effect, dead labour is being employed to replace living labour [3].

The refining displaces natural elements of the original source material. Often the replacements are components in molecular forms that would not be found in nature at all, or not in that particular physical phase, or only in the presence of so-called catalysts comprised of laboratory-created combinations not normally present in, or characteristic of, nature. This displacement marks one of the main yet intangible manifestations of the degradation of quality after upgraded refining [4]. On the one hand, regardless of the relative increase in capital-intensiveness, an absolute increase in “total” labour content, i.e., of living labour plus dead labour, is indeed possible and is effected in practice by these means. At the same time, however, that absolute increase in “labour” content incidental to the additional “refining” effort neither prevents nor arrests the tendency for natural components to decrease relatively or absolutely compared to synthesised components in the end-product. What else is taking place within this transition? Exercising power on behalf of a monopoly, oligopoly or cartel, managers and owners of production are dictating the use-value of the commodities that they produce and market [5]. The ability to dictate how commodities will be used, and what value they have in use, derives from the exercise of monopoly right [6].

The capacity to dictate use-value by “over-engineering” at the point of production marks a major break with the situation that prevailed before that point. With the emergence of monopolies, oligopolies and cartels, major efforts were undertaken to deploy mass advertising campaigns to invent new social wants which monopolies then fulfilled with products [7]. The growth in markets during this period of the 1950s and 1960s was based in the Third World, and mostly among the previously very impoverished at that. American levels of consumption were not going to become available in these areas any time soon. The need and ability of the American system to expand on a global scale was also blocked by the presence of the Soviet camp. In the bipolar world of that time, monopolies could secure superprofits “at the margin” by inventing new needs for new products as opportunities arose. However, monopoly right was not in a position to trump public right, and a wide array of product standards was indeed regulated by various government agencies, thus posing a potentially punishing financial risk to monopolies that attempted to bully the markets with products that failed to acquire approval from these bodies [8].

Pre-monopoly capitalism had developed and spread across the world by focusing entirely and exclusively on the generation and capture of the exchange-value of commodities. Commodities were a vector, a delivery-vehicle, for this exchange value, in the same way that the modern cigarette was redefined by the U.S. tobacco giants in the 1950s as a nicotine delivery vehicle, or that the anopheles mosquito is described as a vector for the spread of malaria. Capitalists produced commodities to piggyback on these uses and needs which already existed among the populace in order to generate and capture the exchange-value congealed in these commodities. The uses and needs were labelled “demand”, the piggy-backing operation was labelled “supply” -- and voilà! neo-classical economics was born.

In the global economy of the post-bipolar world, on the other hand, monopoly right asserts its priority over public right. It does this by redefining many of the terms and conditions of the economics of scale that are developed or applied in determining how far, in each case, to redefine and manipulate use-value. For example, honey is perceptibly “sugar”-y to taste. We want the sugar, but honey is also anti-bacterial, and cannot rot. Therefore, the rate at which customers will have to return for the next supply is much slower than the rate at which customers would have to return to resupply themselves with, say, refined sugar. In other cases – in Bangladesh, for example – the amount of honey available in the market is extended by adding refined sugar. The content of this “economic” logic then takes over and drives what happens to honey and sugar as commodities. While there are natural limits to how far honey as a natural product can actually be commodified, sugar is refined to become addictive, so that the consumer becomes hooked and the producer’s profit is secured. What never enters into considerations of these economics of scale is the matter of intention. Here lies the heart of darkness in a world dictated by monopoly right [9].
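
The repurchase logic of the honey/sugar example can be put in miniature numerical form. The figures below are entirely hypothetical, chosen only to show the direction of the incentive, not its magnitude:

    def annual_revenue(price_per_unit: float, repurchases_per_year: int) -> float:
        # Revenue per customer per year: price times repurchase frequency.
        return price_per_unit * repurchases_per_year

    # Honey keeps (anti-bacterial, cannot rot), so the repurchase cycle is slow.
    print("honey:", annual_revenue(6.0, 4))     # 24.0 per customer per year
    # Refined sugar is consumed quickly -- and refined to be habit-forming.
    print("sugar:", annual_revenue(2.0, 26))    # 52.0 per customer per year

On such a calculation, the cheaper, more degraded commodity is worth more to its producer than the natural one, and the incentive to engineer the repurchase rate upward never appears on the books as an intention.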


II. Economics of Scale under Monopoly Right

There are two especially crucial premises of the economics-of-scale under monopoly right that lie hidden within the notion of “upgrading by refining”.

The first premise of economics of scale under monopoly right is that unit costs of production can be lowered (and unit profit therefore expanded) by increasing output Q per unit time t, i.e., by driving ∂Q/∂t unconditionally in a positive direction. If relatively free competition still prevailed, this would not arise even as a passing consideration. In an economy lacking monopolies, oligopolies and-or cartels dictating effective demand by manipulating supply, unit costs of production remain mainly a function of some given level of technology. Once a certain proportion of investment in fixed capital (equipment and ground-rent for the production facility) becomes the norm generally among the various producers competing for customers in the same market, the unit costs of production cannot fall or be driven arbitrarily below a certain floor level without risking business loss. The unit cost thus becomes downwardly inelastic.
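
This first premise can be illustrated with a minimal numerical sketch, under the standard fixed-plus-variable decomposition of unit cost (our assumption here, not spelled out in the text): unit_cost(Q) = F/Q + v, where F is the outlay on fixed capital and v the variable cost per unit. Driving output Q per unit time upward pushes unit cost toward the floor v, but never below it:

    def unit_cost(Q: float, F: float = 1_000_000.0, v: float = 2.0) -> float:
        # Fixed capital F is spread over output Q; v is the variable cost per unit.
        return F / Q + v

    for Q in (10_000, 100_000, 1_000_000, 10_000_000):
        print(f"Q = {Q:>10,}   unit cost = {unit_cost(Q):.4f}")
    # Unit cost falls towards the floor v = 2.0 as Q grows. Only a change in
    # technology (a lower F or v) can move the floor itself -- hence the
    # downward inelasticity described above.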

The unit cost of production can become downwardly elastic, i.e., capable of falling readily below any asserted floor price, under two conditions:

1. during moments of technological transformation of the industry, in which producers who are first to lower their unit costs by using more advanced machinery will gain market shares, temporarily, at the expense of competitors; or

2. in conditions where financially stronger producers absorb financially weakened competitors. In neoclassical models, which assume a competitive economy, this second circumstance is associated with the temporary cyclical crisis. This is the crisis that breaks out from time to time in periods of extended oversupply or weakening of demand. In reality, contrary to the assumptions of the neoclassical economic models, the impacts of monopolies, oligopolies and cartels have entirely displaced those of free competition and have become the norm rather than the exception. Under such conditions, the lowering of unit costs of production (and the expansion thereby of unit profit) by increasing output Q per unit time t, i.e., by driving ∂Q/∂t unconditionally in a positive direction, is no longer an occasional, and exceptional, tactical opportunity. It is a permanent policy option: monopolies, oligopolies and cartels manipulate supply and demand because they can.

The second premise of the economics of scale under monopoly right is that only the desired portion of the end-product Q is accounted as having tangible economic, and therefore also intangible social, “value”, while any unwanted consequences – e.g., degradation of, or risks to, public health, damage(s) to the environment, etc. – are discounted and dismissed as “faux frais” [false, incidental costs] of production. Here it becomes possible to glimpse a node around which resistance could begin to form, because popular rejection of this position gives rise to consciousness about the unsustainability of the present order. These methods of continuing indefinitely to refine nature out, by substituting ever more elaborate chemical “equivalents” hitherto unknown in the natural environment, have started to take their toll. The narrow concerns of the owners and managers of production are seen to be at odds with the needs of society. Irrespective of the private character of their appropriation of the fruits of production, based on concentrating so much power in so few hands, production itself has become far more social. The industrial-scale production of all goods and services as commodities has spread everywhere, from the metropolises of Europe and North America to the remotest Asian countryside, the deserts of Africa and the jungle regions of South America. This economy is not only global in scope, but social in its essential character. Regardless of the readiness of the owners and managers to dismiss and abdicate responsibility for the environmental and human health costs of their unsustainable approach, these costs have become an increasingly urgent concern to societies in general. In this regard, the HSS®A® principle becomes a key and most useful guideline for sorting out what is truly sustainable for the long term from what is undoubtedly unsustainable.
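
The accounting selectivity of this second premise can be shown with a toy calculation (all numbers hypothetical): the same unit of output is profitable on the producer's books, where unwanted consequences are dismissed as faux frais, and loss-making the moment those consequences are carried on the books:

    def unit_profit(price: float, unit_cost: float,
                    externality_per_unit: float = 0.0,
                    internalise: bool = False) -> float:
        # Under the second premise, externalities are "faux frais": left out
        # of the books entirely unless explicitly internalised.
        cost = unit_cost + (externality_per_unit if internalise else 0.0)
        return price - cost

    # As booked by the producer, with damages dismissed as faux frais:
    print(unit_profit(5.0, 2.0, externality_per_unit=4.0))                    # 3.0
    # The same unit, with the social and environmental cost carried:
    print(unit_profit(5.0, 2.0, externality_per_unit=4.0, internalise=True))  # -1.0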

As monopoly right attempts further to trump public right, the human being, already transformed ever further into the merest consumer of products, is marginalised from most of the possibilities and potentialities of his/her own existence. This marginalisation is a further important feature of the HSS®A® principle. There are numerous things individuals can do that modulate or otherwise affect the intake of honey and its impacts, but there’s precious little – indeed: nothing – that one can do about Aspartame® except drink it.

~~~~

Notes

1. This principle illuminates much that is obscure about the laws of motion in this universe. It is elaborated explicitly in a Note to be published soon in a forthcoming volume of the Journal of Nature Science and Sustainable Technology (available through Nova Science Publishers).

2. The correct principle was that the increased application to materials and products of creative human labouring power by the actual producers should lead to a general improvement in product quality. The aphenomenal version is a distortion.

3. At the level of financial accounting, this “greater refining” reflects an increase, either relative or absolute, in the expenditure on such components of fixed capital compared to the outlay in wages for the human workforce. This transformation is what conventional economics discourse and financial press reports refer to as “capital intensive” industrial development, which they contrast favourably to “labour intensive” forms that only Third World economies can afford.

4. To our knowledge, this peculiar feature of the political economy of refining has been remarked nowhere to date.

5. This is another peculiarly intangible transformation within the political economy of such “refining”, one that also remains unremarked anywhere.

6. Monopoly right is often only partially described or encapsulated in such terms as “privatisation” and “outsourcing”. It is essentially a usurpation of public, social rights, and this usurpation within the present stage of global economic development is something that has emerged with a vengeance and ruthlessness since the disappearance of the Soviet bloc and its COMECON and related semi-barter structures within international trade.

7. For example, the popular American writer Vance Packard was already documenting many aspects of this phenomenon in his bestselling book of 1957, The Hidden Persuaders. With the rise of these monopolies, advertising became a big part of mass marketing and the aphenomenal creation of new “needs”. The conditions in which these developments took flight, however, included a certain stagnation or “plateau”-ing in market growth for established necessities like household appliances, cars, etc. Further constraining the extent of this expansion was what happened to the weight of the consumption level -- disposable income -- available in the largest home markets of the capitalist world, in the United States and western Europe, relative to other parts of the world.

8. The works of Vance Packard and other social liberals in fact became rallying cries for more such regulation.

9. As a result of discarding any consideration of intentions, certain questions go unasked. No one asks whether any degree of external processing of what began as a natural sugar source can or will improve its quality as a sweetener. Exactly what that process, or those processes, would be also goes unasked. No sugar refiner is worried about how the marketing of his product in excess is contributing to a diabetes epidemic. The advertising that is crucial to marketing this product certainly won’t raise this question. Guided by the “logic” of the economies of scale, and the marketing effort that must accompany it, greater processing is assumed to be, and accepted as being, ipso facto good, or better. As a consequence of the selectivity inherent in such “logic”, any other possibility within the overall picture – such as the possibility that, as we go from honey to sugar to saccharin to aspartame, we go from something entirely safe for human consumption to something cancerously toxic – does not even enter the frame. Such considerations could prove very threatening to the health of some group’s big business in the short term. All this is especially devastatingly clear when it comes to, say, the economics of crude oil as an energy source. Crude oil is widely and falsely believed to be toxic before it is touched by a refiner; refined petroleum products, by contrast, are utterly toxic, yet they are not to be questioned since they provide the economy’s lifeblood.

Edible natural products in their natural state are already good enough for humans to consume at some safe level and to process further internally in ways useful to the organism. We are not normally likely to overconsume any unrefined natural food source. However, the refining that accompanies the transformation of natural food sources into processed-food commodities also introduces components that interfere with the normal ability we would have to push a natural food source aside after some definite point. Additionally, with externally processed “refinements” of some natural source, the chances increase that the form in which the product is eventually consumed must include compounds that are not characteristic of nature anywhere and that the human organism cannot usefully process without excessively stressing the digestive system and-or other parts of the organism.

03 February 2007

That latest IPCC Report:
THE APHENOMENAL MODEL ON CLIMATE CHANGE TRIES TO WRESTLE RATIONAL CRITICAL THOUGHT TO THE GROUND

Discussing the latest report by the Intergovernmental Panel on Climate Change (IPCC) at the BBC Online website --- one of the busiest in the world --- Dr Vicky Pope, head of the Climate Programme at the UK Met Office's Hadley Centre, writes: "The only way to predict the day-to-day weather and changes to the climate over longer timescales is to use computer models".

If the underlying assumptions are in error, however, longer timescales may not help. If we "know" -- as educated Europeans did for about 2000 years, from 350 BCE to 1650 CE -- that heavier objects fall to the ground faster than lighter objects, computer models of falling objects over longer timescales would enable us to compute differences in the precise rates at which objects of different weights reach the earth. Enough data would then have accumulated to enable us to predict various things more accurately. But the predictions would all be hogwash, even if this or that group of them turned out to correspond with physically-measured cases, because... differing weights of objects freely falling towards the earth in fact have no effect whatsoever on the rate at which they fall!
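
The point can be made concrete with a small sketch (ours; the "Aristotelian" model below is deliberately false, and its tuning constant k is arbitrary). A model built on the wrong assumption yields precise, internally consistent numbers over any timescale, while actual vacuum physics makes fall time independent of mass: t = sqrt(2h/g).

    import math

    G = 9.81  # gravitational acceleration, m/s^2

    def fall_time_galilean(height_m: float) -> float:
        # Actual (vacuum) physics: fall time is independent of mass.
        return math.sqrt(2.0 * height_m / G)

    def fall_time_aristotelian(height_m: float, mass_kg: float, k: float = 1.0) -> float:
        # Deliberately false model: speed taken as proportional to weight,
        # so heavier objects "fall faster". k is an arbitrary fitted constant.
        return height_m / (k * mass_kg * G)

    for m in (1.0, 10.0, 100.0):
        print(f"mass {m:>6.1f} kg: galilean {fall_time_galilean(45.0):.2f} s, "
              f"aristotelian {fall_time_aristotelian(45.0, m):.2f} s")
    # The false model produces numbers to any precision demanded of it.
    # Precision is no defence against a false underlying assumption.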

The problem with modelling climate change is that there is as yet no integrating hypothesis about what, if anything, actually changed for the long term in the atmosphere with the coming of the Industrial Revolution; what previous changes were redirected or distorted in their effects; and what new processes appeared that had not appeared before. There is neither baseline data nor a hypothesis supported by extensively collected observations to suggest what fundamental change or shifts in climate took place, if any, or what their dynamics were or still are. In this area, we remain as lost as physicists and students of motion were before Galileo and Newton figured gravity out. Solving systems of equations that produce answers consistent with observed data isn't going to get us out of this one... but this needs to be demonstrated with some actual systems of equations, preferably non-linear to begin with, showing what happens to their solutions when certain fundamental assumptions are relaxed, especially if the relaxation is in fact a linearisation.
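
The demonstration called for here can at least be sketched in miniature (our illustration, not drawn from any climate model). Take a genuinely non-linear system, the logistic map x' = rx(1-x), and "relax" it by linearising about its fixed point. At a parameter value where the original is chaotic but bounded, the linearised version diverges without limit: the relaxation does not simplify the behaviour, it abolishes it.

    def logistic(x: float, r: float) -> float:
        # The non-linear original: x' = r * x * (1 - x).
        return r * x * (1.0 - x)

    def linearised(x: float, r: float) -> float:
        # First-order (linear) relaxation about the fixed point x* = 1 - 1/r,
        # where the map's slope is f'(x*) = 2 - r.
        x_star = 1.0 - 1.0 / r
        return x_star + (2.0 - r) * (x - x_star)

    r = 3.9                 # a parameter value in the chaotic regime
    x_nl = x_lin = 0.7      # identical starting points for both models
    for step in range(1, 41):
        x_nl, x_lin = logistic(x_nl, r), linearised(x_lin, r)
        if step % 10 == 0:
            print(f"step {step:2d}: non-linear {x_nl: .4g}   linearised {x_lin: .4g}")
    # The non-linear trajectory stays bounded in (0, 1) but never settles;
    # the linearised one explodes. Early agreement with "observed data"
    # says nothing about which set of equations is the right one.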


Link to the BBC's informed commentary by Dr Pope on the IPCC's 2007 update of their climate-change findings

Link to the IPCC's 2007 paper

02 February 2007

HOW, AND UNDER WHAT CONDITIONS, IS USING MODERN RULES IN OLD CASES "UNFAIR", JUDGE DOHERTY?

The remarks of Justice David Doherty in the Ontario Court of Appeal's review hearing of the Truscott conviction provide an excellent example of the aphenomenal model at work.

According to the description of a session of the review hearing, on Thursday, 1 February 2007, reported in The Globe and Mail, Judge Doherty said:

"If something is done in accordance with the accepted rules of the day, I don't understand how it can turn out to be unfair... Before 1896, an accused couldn't testify at his trial. Does that mean that we declare every trial before 1896 to be unfair?"

Judge Doherty made the remarks after hearing James Lockyer, a lawyer for Mr. Truscott, argue the appeal judge must be prepared to evaluate a vast amount of evidence that wasn't disclosed to Mr. Truscott's defence team at his 1959 trial or at a 1966 review of his case by the Supreme Court of Canada.

What Judge Doherty's remark betrays is the most serious limitation of the entire mindset of Anglo-American criminal justice. According to this mindset, the conduct of the investigation prepared before the summoning of the trial is secondary to the conduct of the trial. The other side of this coin is that the rules of trial procedure, and their maintenance by the judge, the prosecution and the defence, are decisive. Any other notion of priority is slighted or dismissed. And why? Because, in practice, the manner in which any particular criminal investigation has been conducted becomes the subject of review only extremely infrequently. This lack of frequency is then itself assumed to prove the lack of a need for such review in general. From this, people are expected to infer that full and proper use of trial procedure by all parties -- the prosecution, the defence and the judge -- can generally make up for, or even overcome, malice or bias in the investigation. Of course, as lawyers and others in the justice system well know, "absence of evidence is not evidence of absence." Infrequent hauling on the carpet discloses nothing about how necessary it may be to call responsible officials to account.

There is nothing new in the police gathering evidence for a criminal prosecution without the details of the investigation being disclosed, before trial, to an accused and-or his/her defence counsel. For centuries, details of investigative procedure could be withheld even during trial. In the United States, beginning in the 19th century, it was illegal for the prosecution to withhold evidence from the defence before trial. (An exception was maintained for espionage, which has been broadened since 9-11 to include "terrorism" interpreted as a suspicion without the standard of "reasonable and probable grounds" of ordinary criminal jurisprudence.) However, even in ordinary criminal jurisprudence, this was frequently manipulated to mean: "before introduction of the evidence during trial". Thus could surprises still be sprung on an accused after the trial had begun and the defence prepared along definite lines and assumptions about the state of the prosecution's case.

In Anglo-American criminal jurisdictions, the most important feature of criminal investigations is the large discretionary power prosecutors enjoy in their dealings with the investigative arm. Prosecutors could either shut down, or instigate, police pursuit of certain lines of investigation. The use of this power may be reviewable after the success or failure of the appeal of a criminal conviction. The key thing, however, was (and remains to date) that the use of this power could not be examined during or as part of the original trial itself.

This problem of the criminal justice system, and its unique potential to unleash great mischief and supreme injustice in stripping people of their liberty, is discussed far less than the problems of plea-bargaining and the plea-bargaining strategies of prosecutors and defence counsel. It remains the key, however, both to why the judge in the Truscott case did so little during and at trial to protect the accused from prosecutorial misconduct, or to assist his defence counsel in protecting him from it, and to why he persisted in obstructing efforts, proposed independently and outside of the appeals court, to review any aspect of the case, including the police investigation.

In Anglo-American systems of criminal jurisprudence, the presumption of innocence of an accused, combined with the strictly maintained barriers between the prosecution and the defence counsel of an accused, serves to insulate the investigative process from scrutiny. In criminal jurisprudence developed in other jurisdictions based on the model of France, there is no presumption of the innocence of an accused. As a consequence, no rules have been developed that are based on an adversarial arrangement between the prosecution and the accused. Investigation is the responsibility of an examining magistrate, i.e., a member of the judiciary. Any suspected police misconduct is reviewable by a special administrative body of the judiciary empowered to sanction investigatory misconduct.

In the Anglo-American system, on the other hand, as part of the adversarial principle of its criminal jurisprudence, the onus is on the prosecution to prove guilt beyond a reasonable doubt. Within this arrangement and on this basis, it was long accepted that the prosecutorial arm had no particular duty of care to exercise in relation to the ability of an accused to conduct an adequate defence. In the adversarial system, the judge in the case carried this responsibility. However, the judge also lacked any power to order, conduct or direct the investigation. Thus, while charged with a duty of care for the rights of an accused under a system that assumes innocence until guilt is proven, the judge lacked any access to the institutions or individuals bringing the prosecution's case. Such a duty of care can obviously be exercised only formally, with regard only to details of procedure, not to any of the substance of the intentions informing the Crown's choices as to how to proceed.

The discretionary power of the prosecuting arm of the criminal justice system is defended in our own day from two directions. It is said to be either something flowing naturally from the adversarial principle, or something that aids in maintaining the most efficient functioning of judicial services. In fact, however, this discretionary power is part and parcel of the Royal Prerogative of the State, based on the notion of the Divine Right of Kings. This prerogative, which asserts that the Sovereign is always right, stands in stark opposition to securing a just result or uncovering and correcting an earlier unjust result. (For centuries, it was high treason punishable by execution even to think, let alone whisper, that the Sovereign could be wrong.)

The prerogative character inherent in the exercise of prosecutorial discretion has consequences. For example, apart from the penalties attaching to perjury, there is almost no systematic way to uncover any order by the prosecution to the police to deep-six certain incriminating evidence and-or the methods by which, and sources from whom, it was obtained.

How the prosecution assembled its case against Truscott exposes the great dangers inherent in such discretion. Over the decades following his conviction, it dribbled out into the public prints that police at the time had collected evidence suggesting the possible involvement in the victim's murder of a disturbed individual serving in the Canadian armed forces. This man was potentially implicated in other contemporaneous but still unsolved rape-murders. (He died while Truscott was in jail. This strengthened the later argument that a review of the original investigation was moot, since Truscott's appeals were exhausted and a new trial could not be held to prosecute a dead man.) At the time of this particular murder, a case could be assembled and despatched more quickly against a juvenile acquainted with the victim and known to be one of the last people to see her alive. Accordingly, the Crown proceeded against Stephen Truscott. The discretion exercised by the police and the Crown prosecutor in so proceeding still awaits comprehensive investigation and exposure.

Truscott was convicted in 1959 as a 14-year-old and sentenced to the gallows. However, his death sentence was soon commuted to life imprisonment. He was released after serving two-thirds of a life sentence. The judgment in the case was controversial almost as soon as it was rendered. In the wake of the popular reception of a book about the case by journalist Isabel LeBourdais, which questioned the entire conduct of the prosecution, the presiding judge in the Truscott case became incensed enough to write the Prime Minister of Canada demanding that she be charged with bringing the administration of justice into contempt. This demonstrates how the concern of the presiding judge to protect the reputation of the conduct of all in his court -- including himself -- could operate to reinforce pressures from elsewhere in the system against any kind of judicial review of the original investigation.

The strong premonition of an unjust result in a particular case arises from two circumstances. First, there is the continued assertion of the innocence of the convicted person. Sometimes this may become combined -- as in the Truscott review -- with a reasonable reconstruction of how an unjust result could have been produced despite, and to some extent because of, the fact that both sides in a particular criminal case, tried under the adversarial norms, more or less followed the rules. The key to uncovering possible pathways to an unjust result in the Truscott case was to apply retrospectively certain investigative and procedural standards developed since 1959 to those aspects of the case where such standards did not yet apply.

The resistance to such retrospection brings us back, finally, to the true and deeper significance of Judge Doherty's outburst. Clearly, wherever this exercise may lead in any particular case has validity for that case, and possibly for certain other cases whose outcome turned on certain procedures not being available to, let alone insisted upon by, the defence. However, from this it does not follow that the judgment rendered in every trial completed before certain rules of investigative as well as trial procedure changed must fall under a cloud. It does suggest that, if certain cases were re-examined by applying such a retrospective analysis, much light might be thrown on the actual pathways to the final result.

This is exactly the process we have been proposing as the only way to defeat and undo the noxious consequences of the aphenomenal model everywhere -- throughout the physical, natural and biological sciences and their engineering applications, as well as throughout the social sciences. What was assumed to be true at "Knowledge-state(time=yesterday)" can hardly be assumed still to be true in the same way, if at all, after our knowledge has moved on to "Knowledge-state(time=today)".

What could be the source(s) of the pressure against such retrospection, reflected in Justice Doherty's outburst?

One possibility is that material considerations, tied to the short-term self-interest of various individuals and their institutional bodies, are deeply vested in resisting any updating of the fairness and transparency of rules of procedure, or any widening of their application outside already-defined boundaries. As we have noted in many other examples, this investment in the status-quo appears to be the guarantor of the Aphenomenal Model. Without such a guarantee, the model would collapse from its own self-evidently top-heavy instability, an instability well illustrated by our graphical representation of it as an inverted triangle.