Phys. Chem. Earth (A), Vol. 25, No. 8, pp. 613-618, 2000
© 2000 Elsevier Science Ltd. All rights reserved
1464-1895/00/$ - see front matter
Pergamon
PII: S1464-1895(00)00093-4
Scientific Data Between Validation Imperatives, Oblivion and Fraud

G. K. Hartmann

Max-Planck-Institut für Aeronomie, Katlenburg-Lindau, Germany

Received 4 September 2000; accepted 25 September 2000

Correspondence to: G. K. Hartmann
Abstract. The causes and consequences of the three processes data filtering and validation, data oblivion, and (fraudulent) data manipulation are very different. They will be discussed in more detail in this paper. Qualifying filtering and validation are necessary because of the colossal data growth rates and the fact that as the quality of data decreases, the legal, economic, ecologic, and scientific-technical risks incurred in using them increase. The optimal validation is possible with the so-called "joint retrieval" using calibrated raw data from different experiments together with "assimilated" model data. The consequences of data oblivion through technical amnesia ("forgetting") reduce the (governmental and non-governmental) resources for research and development and are thus anti-innovative. From a scientific point of view the negative consequences of fraudulent manipulation are both the most detrimental (materially and in terms of credibility) and the most unpredictable. Thus a closer look at the causes will be taken. Remark: Boldface terms are explained in an alphabetic glossary at the end of the text.
1 Introduction

The causes and consequences of the three processes data filtering and validation, data oblivion, and data manipulation are very different indeed. As, however, from a scientific point of view the negative consequences of fraudulent manipulation are both the most detrimental (materially and in terms of credibility) and the most unpredictable, it is worthwhile taking a closer look at the causes.

1st cause: the growing competitive pressure generated by the reduction of (government) funding, plus the fact (largely attributable to the large data growth rates) that an insufficiently clear distinction is made between a) the necessary, user-friendly, qualifying filtering of data (including data calibration, verification and validation) and b) fraudulent data manipulation. We are dealing here with a highly dangerous, self-reinforcing factor which has been operative in exacerbating the public distrust of "objective" science, a distrust reinforced by the increasing bureaucracy encountered at all turns. We must also bear in mind that for their indispensable verification and validation "objective" data require the intersubjective approval ("evaluation") of the scientific community; this, in its turn, means that they cannot be entirely objective in the true sense of the term.

2nd cause: the negative effects of the growing number of technologically complex large-scale systems. At present these effects are responsible for an increasing hostility vis-à-vis science in general; at the same time, they are thrusting the scientists and engineers working in research and development (R&D) into a hitherto unaccustomed culprit or scapegoat role. So far, legal protection (source responsibility etc.) in this connection has been inadequate (I refrain from touching on the moral aspect). The extent to which administrations require scientists and engineers to "operate" (as opposed to "co-operate") is proportional to the degree to which necessary and sensible administration turns into counter-productive over-administration. By this we mean the kind of senseless "red tape" which (in the long term) has an (economically) counter-productive, foot-dragging effect (which is truly "destructive" through the negative synergies it generates) as opposed to the supportive effect (positive synergies) that any form of administration worthy of the name may rightly be expected to have.

3rd cause: the (today) unavoidable mixture of public/private interests in the use and marketing of the results of basic research for economic/industrial purposes; the inadequately established awareness of the complementarity principle, in conjunction with an insufficiently scrupulous and judicious use of language.

4th cause: the fact that in many nation states and societies there appears to be (more or less tacit) agreement that added value (e.g. money value) is the real value (is this added value really the greater value?). The fact that the principle of "more and quicker" is an eminently marketable rationale
has led to a situation in which a) the “velociferic” trend can be expected to gain ever greater momentum (this development is bound up with an ill-advised rationalization drive curtailing the indispensable areas of “tolerance” in the interplay between humans and machines) and b) there is also insufficient perception of the ethical obligations inherent in science as a profession. Accordingly, the (legal and financial) risks go on mushrooming in the (highly) developed nation states, notably because there is less and less time 1.) for the necessary optimizing of the man-machine adaptation process, and 2.) for achieving better understanding of, and skills in dealing with, (non-linear) complex hybrid systems at the software and hardware levels. Hence we face the incessantly ballooning costs for the prior prevention (prophylaxis) and subsequent elimination of the damage attendant on these risks (Hartmann, 2000a).
2 Data inflation problems
Humans are non-specialized beings driven in what they do by curiosity. As such they need information. For some years the industrialized nations have been observing a phenomenon operative not only in the earth sciences (e.g. research on the earth's atmosphere) but in many other areas as well: the constant increase in the plethora of primary information, over and against a signal dearth of secondary information. By primary information we mean such things as the raw data generated by measuring processes and also all the knowledge needed in order to be able to "make something" (what Aristotle calls techne). Secondary information refers to data that have been subjected to qualifying filtering (selection, screening), verification, validation and interpretation, together with the knowledge needed in order to be able to "do something with something" (what Aristotle calls phronesis, literally: reasoned action or practical wisdom). In many areas the tensions between these two poles have become so extreme that we are fully justified in talking of an information crisis and an incipient "information explosion" in the domain of primary information.

Atmospheric research is a case in point, illustrating both the glut of primary information and the rate at which it is growing. In 1990 we had over 2.5 x 10^14 bits of information, growing at a rate of about 10% annually; with the advent of satellites like ENVISAT, at about 50%. If we assume a page thickness of 0.1 mm, this corresponds to a row of books 362 km long. Current estimates suggest that at present humanity generates about 10^… bits of information each year. The physical ceiling for the generation of bits in the sun-earth-space system is 2.5 orders of magnitude higher (10^43 bits per year), so in that respect there is plenty of leeway yet. But the same is quite definitely not true of the biological limits imposed on the human capacity for retaining and processing information. Those limits are in fact very close at hand (Hartmann, 1997).
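As a rough check on the bookshelf comparison above, the short Python sketch below recomputes the shelf length from the quoted 1990 data volume and page thickness; the bits-per-page figure is an assumption (it is not stated in the text), chosen at a plausible value for a densely printed page.

# Back-of-the-envelope check of the "row of books" comparison.
# BITS_1990 and PAGE_THICKNESS_MM are the figures quoted in the text;
# BITS_PER_PAGE is an assumed value (roughly 8-9 kB of text per printed page).
BITS_1990 = 2.5e14
PAGE_THICKNESS_MM = 0.1
BITS_PER_PAGE = 7.0e4

pages = BITS_1990 / BITS_PER_PAGE
shelf_length_km = pages * PAGE_THICKNESS_MM / 1.0e6   # mm -> km

print(f"{pages:.2e} pages, about {shelf_length_km:.0f} km of books")
# prints roughly 360 km, i.e. the order of magnitude quoted above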
The headlong development of computer systems has encouraged this drastic increase in sheer information volume because the restraints imposed on electronic processing and storage are relatively minor. But it has become abundantly clear that coping with this deluge of information poses entirely new problems for the purveyors and users of such data, and hence notably for those institutions whose job it is to ensure their bibliographic and/or numeric archivization or to document them in a more general sense, i.e. "capture" them, store them and make them available (in a user-friendly, interactive way). It would appear that progress in using bibliographic data has outstripped advances in the management of numeric data, especially where large volumes are involved.

Today about 90% of Data Processing (DP) expenditure goes on the development of suitable software and only about 10% on hardware. About 20 years back the situation was precisely the opposite. Over the last 20 years the software-hardware gap has turned into a full-blown software crisis highlighting the increasingly crucial problem of ensuring quality control and the regular, painstaking overhaul (maintenance) and updating of information systems such as large databases and the thesauri used. The celerity of technical progress has made it increasingly difficult to process older data. In some cases it is impossible today to process data going back further than 10 years. This has given rise to the term "technical amnesia" ("forgetting"). This new form of "knowledge death" not only generates a staggering cost spiral, it also confronts us with some other very unpleasant problems. So far, and for reasons difficult to fathom, very little has been done to obviate the causes, so that little improvement can be expected in the near future. This is a crucial hazard for science because the costs thus arising will probably devolve to a very high degree on science itself, i.e. they will have to be met from available funding, quite simply because scientists, technologists and engineers are regarded as the main culprits.

The new media flood us with global, regional and local information. On the one hand, this leads to the rejection and deformation of information; on the other, of course, it offers at least in theory an opportunity for a broader and more in-depth "view of the world", for more democracy, responsibility and greater participation in decision-making processes, albeit usually via indirect (technically filtered) channels rather than immediate experience via the senses. In general, then, the new media contribute to a deepening of the cleft between mind and body, a kind of latter-day Platonism. The computer "recycles" classical Platonism; this, at least, is the view taken by the American philosophy professor Michael Heim. If it is true, then it flies in the face of one of the central declared tenets of present-day thinking - heightened synergetic interplay between mind, body and spirit, in other words, an acceptance and espousal of complementarity. Only the future will tell us whether Bill Gates is right in his conviction that the data highways will play a major role in solving the great global problems or whether his adversary, the American media critic Neil Postman, will be borne out in his warnings about the negative consequences of over-information, which he calls "information overkill". If we look at the most urgent global problems assailing
us at present - over-population, hunger, destruction of the natural environment - the trend appears to be moving more in Postman's direction. It seems as if we will be in much greater need of data to mitigate the negative after-effects of disasters and to recover more quickly and completely from them than to genuinely solve the great global problems, directly or indirectly. But even this far less ambitious project of minimizing catastrophic aftermaths can only be realized if we do indeed contrive to offset Postman's overkill effect. What is most urgently required for this to happen is a considerably faster and more genuinely "qualifying" method for filtering primary information. A spin-off requirement from that is faster interactive access to the secondary information thus generated, with corresponding representation possibilities and opportunities for fast linking ("cross-correlation") with data from other sources. The DUST-2 concept was conceived for this purpose. Multi-media technology is an important tool. But even with such modern hi-tech implements human users will still invariably select the information which is relevant to them, i.e. corresponds most closely to their expectations. It is fair to say, in general terms, that primary information (possible, potential information) is hardly understood by the (broad mass of the) public; this is not true of (qualifying filtered) secondary information.

We are confronted at present with a strangely vague and woolly concept of what information actually means. The term dates back well before Shannon's information theory, not to mention the extended versions of it central to communications theory or the relatively young information science, the science of computers and the basics of their application. Originally the word informatio meant precisely what it says: that which gives form or shape to something. Hence the idea that God's creative will "in-formed" all being. In the Middle Ages the accepted meaning of "information" was the essential form of something, which, in accordance with its nature, then in-formed its extensions and ramifications. Since science has given up inquiring into anything remotely resembling the "substance" of things, we now obviously feel free to apply the term "information" to any kind of formalized or formulated communication, transmission, signal or impulse as something which delimits, makes perceptible, determines, influences, etc. The more the term is used to relate to mere content and the greater the sophistication applied to studying the "objective" givens of the constantly "self-differentiating" sciences, the more variegated the definitions of "information" become. At present we can readily marshal over 160 such definitions. Confusingly, however, they appear on closer inspection to have little or nothing in common. In the face of the constantly growing information problems bearing in on us, this cannot but be a source of disquiet and should make reflection on what is actually meant by the term an absolute must. And this both specifically - at the level of the premises and methodologies of the scientific subjects themselves - and above all, as a matter of fundamental principle, in the all-encompassing context. The author of this article admits to having a definition of his own: "Information is the product of a filtering process". If we accept this,
then the logical rider is: “Information contains provisional certainties marking themselves off against determinable uncertainty.” Whether and to what extent this actually takes place depends on at least two different time intervals: observation time and filter time constants. Thus the determinable uncertainty (e.g. noise) has become just as significant a factor as that which can be adduced as (provisional) certainty (e.g. the signal). In the force field generated by the question and answer process the two are inextricably interlinked.
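To make this dependence on observation time and filter time constant concrete, here is a minimal Python sketch (an illustration added here, not taken from the paper): a noisy measurement is split by a first-order low-pass filter into a provisional certainty (the filtered estimate) and a determinable uncertainty (the residual); the signal shape, noise level and time constants are all assumed values.

# Illustration of "information as the product of a filtering process":
# the filter splits a noisy measurement into a provisional certainty (estimate)
# and a determinable uncertainty (residual). All numbers are assumed.
import numpy as np

rng = np.random.default_rng(0)

def filter_information(observation_time=100.0, dt=0.1, tau=2.0):
    t = np.arange(0.0, observation_time, dt)
    true_signal = np.sin(2.0 * np.pi * t / 20.0)                 # underlying signal
    measured = true_signal + 0.5 * rng.standard_normal(t.size)   # signal plus noise

    # first-order (exponential) low-pass filter with time constant tau
    alpha = dt / (tau + dt)
    estimate = np.empty_like(measured)
    estimate[0] = measured[0]
    for i in range(1, measured.size):
        estimate[i] = estimate[i - 1] + alpha * (measured[i] - estimate[i - 1])

    residual = measured - estimate        # the determinable uncertainty ("noise")
    return estimate, residual

for tau in (0.5, 2.0, 10.0):
    estimate, residual = filter_information(tau=tau)
    print(f"tau = {tau:4.1f}  residual std = {residual.std():.3f}")
# A longer time constant removes more noise from the estimate but also blurs fast
# changes in the signal: the "sharper" the one becomes, the "fuzzier" the other.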
3 Empirical science and intersubjectivity
Looking at empirical science as a "form of knowing" we soon find that here too metaphysics and ethics play a constitutive role over and above the fundamental links obtaining between them. We find historical instances of this in such cases as Johannes Kepler pivoting a new view of the world squarely on a form of solar mysticism and postulating, in keeping with the spirit of the Renaissance, that the principles underlying the design of the Universe must be recognizable to Man. But Kepler was content with approximate mathematical equations. One of the aims of modern empirical science is to give a more realistic shape to our imperfect notion of our "environment", in other words to get nearer the truth. The instrument it uses for this purpose is measurement and the data derived from measuring processes.

Now the empirical sciences are by no means as empirical as they are often made out to be. Both the verification method (logical empiricism) and the falsification method (critical rationalism) appeal to pure facts as the final authority determining the validity of theories. But in reality there is no such thing as a pure fact, not even in theoretical physics. Anyone setting out to measure something is espousing (tacitly or explicitly) a number of theories: a theory of measuring, a theory of the things to be measured, a theory of the measuring instruments used. But given the inevitably finite temporal and spatial measuring intervals and the characteristics of the measuring instruments, measurement accuracy is invariably limited, i.e. there will always remain a finite indeterminacy or uncertainty. This is frequently referred to unthinkingly as "error", although in many such cases there is in fact no way of knowing what it is that is "wrong". The selection of representative measurements takes place with the aid of a theory of "error" whose application normally masks the very problems that can arise precisely from the unthinking use of the term. For instance, how far can the value measured stray from the value expected before the theory is deemed to have been falsified by the data? The value (or standard) is determined intersubjectively by the scientific community. This joint consensus, and this alone, is what we are really referring to when we speak of "objective" empirical facts.

So-called measurement errors (uncertainties) are easier to determine in connection with measurements or observations taking place at the location of the parameter in question (in situ) than with measurements that have to be made via remote sensing because the location in question is inaccessible.
A complicating factor here is the fact that most remote sensing data are also time-series data, which means they cannot be repeated under the same conditions. Most of the data scandals publicized in the recent past stem from the domain of "in situ" data. But in principle data manipulation of whatever kind is much easier to undertake and conceal in the field of remote sensing. The large number of different experiments and the huge volumes of data which this field generates both facilitate and necessitate data comparison and data combination, and accordingly verification and validation. The optimal validation is possible with the so-called "joint retrieval" using calibrated raw data from different experiments together with "assimilated" model data in one algorithm. This is of course essential if we want to (or have to) enhance the reliability and accuracy of the "data products". And it is a conditio sine qua non when putting in for new (government and non-government) funding. As data comparisons increase and improve, fraudulent manipulation, though not absolutely impossible, is much easier to detect, not least because of the effects of the self-regulation called for on all sides.

Note: The more radical the cutbacks in (government) funding for research and development, the greater the competitive pressure becomes, thus increasing the likelihood of fraudulent data manipulation. In many areas of research there is also a decline in the number of competent staff, thus reducing in its turn the possibilities of effective (self-)regulation and facilitating fraudulent data manipulation.

In the sphere of "in situ" data independent repetition of the measurement (a second measurement) has a verifying and validating effect. What militates against this is not only the costs involved but also the inherent unattractiveness of such an activity. In today's western thinking you have to be the "first" in order to count; coming second and being the runner-up means being relegated to the status of an also-ran, a nobody hardly eligible for funding and simply nowhere in the career stakes, not least because such findings, if they get published at all, only get aired after a substantial time lag. As the "publish or perish" principle has lost none of its validity, very few people are prepared to let themselves be bundled into the category of the runner-up or mere "replicator", although this role is in fact absolutely essential for the verification - including re-calibration - and validation of data - including value-added validation, which has hitherto been considered different from the joint retrieval. Nature proceeds on exactly the opposite principle, providing for all, not only the winners (although the one(s) bringing up the rear admittedly get the least encouragement). In this way nature has kept evolution going successfully for millions of years.
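The Python sketch below is only an illustration of these ideas, not the paper's (or DUST-2's) actual algorithms: calibrated values from two hypothetical instruments and an "assimilated" model value are combined by inverse-variance weighting as a toy stand-in for a joint retrieval, and precision (scatter of a repeated measurement, verification) and accuracy (offset from an independent reference, validation) are computed in the sense used above. All numbers are invented.

# Toy stand-in for the "joint retrieval" idea and for the verification/validation
# quantities discussed above. All values and variances are invented examples.
import numpy as np

def joint_estimate(values, variances):
    """Inverse-variance weighted combination of independent estimates of one quantity."""
    values = np.asarray(values, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    combined = np.sum(weights * values) / np.sum(weights)
    combined_variance = 1.0 / np.sum(weights)
    return combined, combined_variance

# instrument A, instrument B (different measuring principle), "assimilated" model value
value, variance = joint_estimate(values=[10.3, 9.8, 10.1], variances=[0.20, 0.05, 0.10])
print(f"joint estimate = {value:.2f} +/- {variance ** 0.5:.2f}")

# verification: independent repetition with the same equipment -> precision (relative error)
repeats = np.array([10.28, 10.31, 10.35, 10.30])
precision = repeats.std(ddof=1)

# validation: comparison with a measurement based on a different principle -> accuracy (absolute error)
independent_reference = 10.10
accuracy = abs(repeats.mean() - independent_reference)
print(f"precision = {precision:.3f}, accuracy = {accuracy:.3f}")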
4 Inadequate "tolerance" between humans and machines
As science (which is somewhere between a profession and a vocation) turns more and more into a job like any other (i.e. has less and less to do with vocation and Bildung, although becoming a scientist is much more than a mere matter
of "vocational" (!) training), and as social solidarity declines within societies, we can confidently predict a growth in the number of cases of fraud in the scientific field. This means that society will be increasingly unable to "reap" the fruits of its investment in science, research and development (R&D).

The greater the potentially hazardous consequences of large-scale complex technological systems, the greater the necessity of learning to quantify and minimize the probability of accidents, e.g. via so-called early warning systems and the corresponding risk model calculations required for preventive measures (a toy numerical sketch of such a risk estimate follows at the end of this section). The "in-valuation" (say, of the results of these risk model calculations) into the given cultural background (the complement to the e-valuation of data) and the readiness of insurance companies to provide insurance cover for these risks under certain conditions are the basis for the assessment of culturally conditioned qualitative risk-acceptance thresholds. The indispensable prerequisite for insurance companies is the availability of corresponding qualifying filtered data allowing for a quantitative assessment of the risk probability and the support of "co-operative learning and teaching processes", which then slowly alters the risk-acceptance threshold.

The more unacceptable the disastrous potential effects of large-scale complex technical systems become, the greater is the necessity not only to minimize possible sources of error (faults) via human and machine "activities", i.e. via increasing automation, but also to create possibilities for human intervention in cases of emergency. Automation brings with it a substantial reduction in the potential sources of human error, notably those arising from the collision between "linear (clock-)time" and "non-linear, rhythmic time" at the points of contact or interfaces between the two. Linear time determines the functioning of machines, non-linear rhythmic (cyclical) time the functioning of natural (human) life. Providing for "overruling" human intervention will lessen the probability of disasters caused by machines, because humans are better at recognizing complex, unexpected patterns than computers are, the latter having major problems with the fundamental and unavoidable uncertainty (inaccuracy) of measurement data. The discussion on how to improve the synergistic interplay between humans and machines is greatly hampered by reluctance to discuss a problem epitomized by the ambiguous phrase "computer responsibility" (Hartmann, 2000b).

One of the great challenges facing empirical science (or, more properly, the scientists) is to determine what can be achieved via the complementary rivalry/symbiosis situation between humans and machines. But it is equally essential to determine what is not possible. Accordingly, and despite the inevitable time delay before new knowledge can be implemented on a political plane, it is one of the tasks of genuine Bildung to define what is desirable or undesirable and what can responsibly be done in this connection. It follows from this that, in the face of the rediscovery of complementarity, the responsibility of modern science and computer responsibility is probably the biggest single challenge facing education (in the sense of Bildung), not least - in fact precisely because - complementarity calls into
question an essential premise underlying what the German philosopher Hans Jonas (1984) called the "Principle of Responsibility" in his book of the same name. That premise is nothing other than the absolute priority of Being over Nothingness. It is a premise that no Buddhist brought up to think in complementary dimensions could ever subscribe to. A genuinely intercultural dialogue is hence more imperative now than it has ever been.

Although one would expect the opposite to be the case, most of the cases of fraudulent manipulation of data detected in the Federal Republic of Germany (and elsewhere) in the last two years stem from the sphere of "in situ" data rather than from the complementary area of "remote sensing" data. These "data scandals" are a welcome excuse for politicians to decree further funding cutbacks for research and development, especially in those sectors which promise little or nothing in the way of short-term disposable knowledge (i.e. power) but a great deal of long-term in-depth knowledge. These sectors serve basic research in the first instance and as such represent the essential foundations for future innovations or new research and development activities. Competition within the scientific community is hotting up all the time. But largely due to bureaucratic overkill this rivalry leads to remarkably little innovation activity or constructive criticism. Instead it generates ever greater efforts to comply with the demands and expectations of short-range policy-making (for fear of losing the status quo - the so-called "page syndrome") or even more (intentional and detected) instances of fraud. But this is precisely the opposite of what politicians can truly be aiming for if they want to improve the economic complexion of their country and above all the situation on the labor market.

The proposed (and now implemented) external and internal controls and self-monitoring (the latter promises to be much more effective than the former) are essential steps towards an improvement of the situation. But they can only be successful if the following necessities are given sufficient heed:

1. A clearer definition of what we mean by science and technology and how the responsibility for them is to be apportioned.

2. More reliable knowledge about the accuracy, the (unavoidable) inaccuracy and the underlying error theory of empirically established data. (Uncertainty is greater, for example, the smaller the period of measurement/observation is.) This implies learning to be more accurate about, and heedful of, the concurrences and differences between the terms error and uncertainty, indeterminacy and inaccuracy, in short being more scrupulous in our use of language, not only in science.

3. Learning to be more accurate about, and heedful of, the adaptation/interface problem between humans and machines.

It is however difficult to avoid the impression that the number of people prepared to address these questions is dwindling (or that the number willing to suppress or wantonly ignore them is increasing). The upshot of this is that we spend more and more of our time conserving the "ashes" rather than the "fire". Necessary (but unfortunately not sufficient) for a successful discussion (or better: dialogue) and
above all for the practical implementation of findings and countermeasures is the growth of a “sound middle class” in most nation states and direct support from the elites operative in all cultures. (Sound stands here for a synergistic blend of specialist knowledge, understanding of the self, motivation, commitment and willingness to accept responsibility).
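Purely as an illustration of the risk quantification mentioned earlier in this section, the following Python sketch estimates an annual accident probability from hypothetical, qualifying filtered event data, converts it into an expected loss and compares that with an assumed acceptance threshold; every number in it is invented.

# Toy risk-model sketch (illustrative only; all numbers are invented).
import math

def accident_probability(events_observed, observation_years):
    """Poisson estimate of the probability of at least one accident per year."""
    rate_per_year = events_observed / observation_years
    return 1.0 - math.exp(-rate_per_year)

p_accident = accident_probability(events_observed=3, observation_years=40)
loss_per_accident = 5.0e8          # assumed damage per accident (arbitrary currency unit)
expected_annual_loss = p_accident * loss_per_accident

acceptance_threshold = 2.0e7       # assumed, culturally conditioned acceptance level
print(f"P(accident per year) = {p_accident:.3f}, expected loss = {expected_annual_loss:.2e}")
print("acceptable" if expected_annual_loss <= acceptance_threshold else "not acceptable")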
Glossary

Bildung: The German term Bildung has so far defied adequate translation into other languages. Unlike the one-sided terms "formation" in French and "education" in English it stands for a two-sided model that extends from the central term Bild (form, image) to encompass "Vor-bild" (literally "pre-form", but with the operative meanings of example, ideal, form to be emulated) and "Nach-bild" ("post-form", the emulation of the "pre-form"). In this context W. von Humboldt spoke (1804) about Bildung through science. Today we no longer speak of the progress of science but rather of progress through science. The term Bildung appears to have been stripped of all its original connotations besides the canon of factual knowledge (disposable knowledge) and the technical skills inculcated by modern European-style education systems. Ideas such as the concepts of "ordering knowledge" and "life-knowledge" so central to the thinking of philosophers like Eric Voegelin and Hans-Georg Gadamer have obviously been relegated to a very minor position. A sea-change indeed!

Complementarity: The present author understands Niels Bohr's (1928) term to mean:
- that beings/things (appearances) manifest themselves in two different forms which are (logically) incompatible;
- that the nearer one approaches one of these forms, the further one moves away from the other (more simply: the "sharper" the one is, the "fuzzier" the other becomes);
- that the two forms cannot be completely "unmixed" (a consequence of temporality and the finitude of observation time).
Complementarity is a given which needs to be firmly established in our philosophy of things and in many cases replaces Either/Or by Both/And (As well as).

Data calibration: The calibration of the (technically) recorded data, the so-called raw data - with data set processing level 0 - relates the outputs of the measuring equipment, which might be given in voltages, counts etc., to the physical quantity of interest, after the equipment characteristics have been removed by deconvolution and the "errors" have been determined. The often-used term "calibration" has different meanings for different types of measurements. (Unless there is support by the experimenters, it is unreasonable to make uncalibrated data accessible in public information systems!)

Data set processing levels: A widely used definition taken from the EOS Reference Handbook shows data set ("product") levels ranging from 0 to 4. See also Hartmann (1997), page 22.

Data validation: Systematic errors can only be measured if the same physical object is measured at least once more with a measuring configuration based on a different principle from
the first. The comparison of these two (or more) sets of data is called validation. The degree of agreement between the two sets of data (a standard established intersubjectively by the scientific community) determines the "value" (quality) of the data, i.e. the degree of reliability the data can claim. The value thus determined is called "accuracy" (indeterminacy), sometimes also denoted as absolute error. Data validation is unthinkable without this participation by the community. The optimal validation is possible with the so-called "joint retrieval" using calibrated raw data from different experiments together with "assimilated" model data.

Data verification means repeating the measurement independently with the same equipment, using the same hypotheses and references, and then comparing the results. The cross-correlation between two sets of data obtained independently of one another is a gauge of the verification, whose standard (quantitative value) has to be agreed on intersubjectively by the scientific community (see the sketch after this glossary). The value thus determined is called "precision" (indeterminacy), sometimes also denoted as relative error.

Qualifying filtering is necessary because of the colossal data growth rates and the fact that as the quality of data decreases the legal, economic, ecologic, and scientific-technical risks incurred in using them increase.

The present author sees empirically grounded Science as a contribution to a better understanding of ourselves in relation to the cosmos, as a complement to transcendence; it makes techn(olog)ical activity possible and for the scientist represents a challenging and rewarding opportunity for self-presentation. Science thus conceived must not only live with provisional certainties standing out from the determinable
(complementary) uncertainty around them but also with (and between) Newton's and Goethe's sorcerer's apprentices.

The Scientific process in a crude (circular) representation: Observation - Interpretation, Inductive logic - Laws - Creativity - Theory - Deductive logic - Predictions - Ingenuity - Experimentation - Material Reality (comprehensible, recognizable) - Observation. Theory is the equivalent of myth if the process does not go beyond theory to experimentation, which brings science into touch with material reality. Similar, though complementary, considerations can be made for the religious process.

The term "Velociferic" was coined by J. W. von Goethe in 1825 from "velocitas" and "Lucifer". It describes first and foremost the negative consequences of acceleration. Our present age - sometimes referred to as the Age of Acceleration - is outstandingly "velociferic".
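As a minimal illustration of the cross-correlation gauge mentioned in the verification entry above (not taken from the paper; the series, their length and the noise level are assumed):

# Minimal illustration of the verification gauge: the cross-correlation between two
# independently repeated measurement series of the same quantity, obtained with the
# same equipment, hypotheses and references. All values are invented.
import numpy as np

rng = np.random.default_rng(1)
truth = np.sin(np.linspace(0.0, 6.0, 200))           # the quantity being measured
series_1 = truth + 0.1 * rng.standard_normal(200)    # first measurement run
series_2 = truth + 0.1 * rng.standard_normal(200)    # independent repetition

# Normalized cross-correlation at zero lag (Pearson correlation coefficient);
# how close to 1 it must be is a standard agreed intersubjectively by the community.
r = np.corrcoef(series_1, series_2)[0, 1]
print(f"verification gauge r = {r:.3f}")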
Acknowledgements. The author of this paper and Principal Investigator (PI) of the DUST-2 CD research project thanks the Max-Planck-Institut für Aeronomie (MPAE) and the DLR - under Fkz. 50 EE 98038 - for their support. He thanks his colleagues and friends from the international DUST-2 team, from the international MAS team, from the Institute of Intercultural Research (IIR), and from the Copernicus association e.V. for the friendly, excellent, and successful cooperation.
References

Hartmann, G. K., Facts about data from the Earth's atmosphere, MPAE-L-015-97-24, 1997.
Hartmann, G. K., Filtered, Fiddled, Forgotten, MPAE-L-015-00-04, 2000a.
Hartmann, G. K., Computer responsibility: A challenge for Bildung and Science, MPAE-L-051-00-03, 2000b.
(The three texts are available in English and German
on the DUST-2 CD.)