many, very many, and healthy: health big data is yet another gold mine

Health of Data

Big data have changed the way we manage, analyze and leverage data in every industry. One of the most promising areas where they can be applied to bring about change is healthcare. Healthcare data analysis has the potential to reduce healthcare costs, predict epidemic outbreaks, prevent avoidable illnesses, and improve overall quality of life. Global life expectancy is increasing, bringing new challenges to current therapeutic methods. Healthcare professionals, just like entrepreneurs, can collect large amounts of data and seek the best strategies to use these numbers. In this article we want to emphasize the need for big data in healthcare: Why and how can they help? What are the barriers to their adoption?

The application of big data analysis in healthcare has numerous positive and life-saving results. Big data means the large amounts of information created by the digitization of everything, stored and analyzed by specialized technologies. Their application in healthcare means using specific health data at population scale (or for a specific individual), and can potentially help prevent epidemics, treat illnesses, reduce costs, and so on.

Now that we live longer, treatment models have changed, and many of these changes have emerged from data. Doctors want to understand as much as possible about a patient, and as early in their life as possible, in order to identify warning signs of serious disease as they arise: treating an illness at an early stage is much simpler and less expensive. With healthcare data analysis, prevention becomes better than treatment, and assembling a more comprehensive picture of each patient allows insurance companies to offer a “custom” package for each one. This is the industry’s effort to overcome the fragmentation of each patient’s data: everything is collected in bits and bytes and archived in hospitals, clinics, surgeries, etc., but these archives cannot communicate properly with each other.

Thus begins a promotional article on how “the analysis of healthcare big data will save people,” published in the media in July 2018.1 Like every techno-promotion that does not feel threatened by criticism from anywhere, this one also speaks relatively clearly when identifying those interested in the digitization of individual and collective health. Cost reduction (certainly regarding the expenses of healthcare systems…); individually “tailored” health insurance contracts (from private insurance companies, so that they never incur expenses not fully covered by the payments from their customers’ contracts…); and ever more effective disease management on the side of healthcare professionals (who tend to resemble healthcare entrepreneurs)…

In the immediately following section, titled “why we need big data analysis in healthcare,” the same article clarifies (the emphasis is again ours):

There is a huge need for big data in healthcare, due to the rising costs in countries such as the USA. As a McKinsey study shows, “After more than 20 years of steady increases, healthcare expenditures represent 17.6% of GDP – almost $600 billion more than the expected limit for a country the size and wealth of the USA.”
In other words, costs are much higher than they should be, and have been increasing for 20 years. Clearly, we need some smart, data-driven ideas in this area. And current incentives are also changing: many insurance companies are moving from “fee-for-service” schemes (which reward the use of expensive and sometimes unnecessary treatments and the quick processing of large numbers of patients) to schemes that prioritize treatment outcomes.
As has been written in the popular Freakonomics books, financial incentives matter. And incentives that prioritize patient health over caring for large numbers of patients are a good thing…

One of the biggest obstacles to using big data in medicine is that medical data is scattered across many places: different countries, hospitals, and public administration departments. Integrating these data sources would require developing new infrastructures in which all health data providers collaborate with each other.
Equally important is the establishment of new online record-keeping software and a new business intelligence strategy. Healthcare must reach the level of other industries that have already moved from traditional retrospective methods to methods that look to the future, such as predictive analytics, machine learning, and graph analytics…

If your idea of doctors (and therefore of medicine as a science) is that of people trained to recognize symptoms and diagnose disease or health, then you have a long-outdated view. The reality is the one briefly described (as is well known) in the same article:

… Doctors are increasingly relying on data, which means they rely on large packages of clinical observation and research data, as opposed to relying solely on their education and professional opinion. As is happening in many other industries, here too the concentration of data and its management is growing, and professionals need help in this area. This new approach to treatment implies that there is greater demand for big data analysis in healthcare facilities than ever before….
Do these seem obvious, even self-evident? They are not. We should turn to medical history to see how they came to become commonplace…

Genealogy (clinical, laboratory, statistical: if medicine changes, does health change with it?)

Traditionally (for centuries) the diagnosis of an illness was based on the recognition of certain symptoms. Fever is a symptom, both subjective (someone feels they are “burning up”) and objective (a touch on the forehead immediately shows the rise in body temperature). Cough, pain, fatigue, sudden loss of appetite, difficulty breathing, dizziness, fainting, skin rashes, blood in urine… all these and many similar ones are symptoms.
In this symptomatic approach things seem simple. If an organism has a problem, it shows it. The symptom is first detected by the organism itself, then quickly by its environment. The doctor, trained in recognizing and interpreting symptoms, comes third in line; and it is he who, by virtue of education, knowledge, experience, and authority, will make the scientific diagnosis. Nevertheless, where there is social experience (e.g. with common childhood illnesses), even the social environment can make a correct diagnosis. Even the (human) organism itself: it is not difficult for anyone to understand where a toothache comes from, or that “something I ate upset me” if they suddenly vomit.
In the end, the trained doctor is the one most widely recognized as competent to recommend any treatment.

This symptom-based medicine has been named clinical. The stethoscope, a simple medical device that still hangs around doctors’ necks and has been established, along with the white coat, as the symbol of medicine, is among the first instruments/tools/technical means of clinical medicine. It dates back to the early 19th century, specifically to 1816, when the French doctor René Laennec fashioned a long thin tube, placed one end on the patient’s chest and the other at his ear, and thus amplified the sound of breathing.2

Parallel to this dominant practice of the (medical) interpretation of symptoms, surgical treatment (which was by no means unknown to much earlier cultures) began to develop in Europe, and more generally in the Western urban world, from the 19th century onward; and so, above all, did microbiology. Names such as the Frenchman Pasteur and the German Koch have rightfully remained in the history of medicine. The most famous among many more researchers, they utilized the microscope (which could only evolve thereafter) and began to discover the sources, the causes, of certain diseases (such as cholera) that plagued various regions, whether of Europe or of the colonial conquests of European states in various parts of the planet.

Microbiology is indeed a milestone in the evolution of medicine. Not only because it complemented the symptomatology of clinical medicine by identifying causes invisible to the naked eye (microbes, various kinds of microorganisms), but also because it pioneered a new form of treatment through the elimination of these microscopic pathogens. Vaccines and chemotherapy originate precisely from the discovery and methodical approach to combating these “invisible” disease-causing agents.
However, microbiology, with its own instruments (microscopes, microbial “culture” plates, etc.) and its distinct body of knowledge, added a new dimension alongside the symptom-based clinical medicine, which was closely tied to the patient’s body: laboratory medicine. Laboratory medicine was not practiced directly on or in physical continuity with the body. It was conducted elsewhere, in isolated spaces and times, for both operational and safety reasons. Laboratory medicine also made diagnoses: it discovered—or claimed to discover—the hidden cause of the symptom. Consequently, it identified the type of illness, at least in cases where the disease was caused by microorganisms.

The next breakthrough in medicine (as regards our topic) was the introduction of statistics into the measurement of disease; of those diseases that were notoriously infectious/contagious, or due to obviously poor public hygiene conditions. Here, credit rightfully goes to Florence Nightingale. Her personality and contribution are fascinating. The British Nightingale (born in Florence, hence her first name) was a lady with a solid general education by the standards of her time. Contrary to family advice, her social position, and her qualifications, she decided to become a nurse (something that became a must for young ladies of the bourgeois class for several decades afterward, but was bold at the time). Nightingale worked as a nurse in military hospitals during the Crimean War (mid-19th century) and became legendary by enforcing the observance of simple but fundamental hygiene rules in those places that were formally called “hospitals” but were practically death chambers for war casualties: she reduced the death rate, perhaps by as much as 90%, by ensuring good ventilation of the spaces and constant handwashing by doctors and nurses…
However, there was another field in which Nightingale was hailed as a pioneer. Having acquired solid mathematical knowledge as a teenager, she began inventing methods of visually representing statistical data on illness and death. The tables and diagrams that have become fashionable in the 21st century as visualizations of the risk (within or outside quotation marks…) of various viruses descend directly from the tables and diagrams of Florence Nightingale in the mid-19th century… In 1859, she was elected a member (the first woman…) of the Royal Statistical Society. And in 1874, she became an honorary member of the American Statistical Association.
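Her best-known invention of this kind, the polar-area or “rose” diagram, still underlies much of today’s epidemic visualization. Here is a minimal sketch in Python, with hypothetical monthly figures rather than Nightingale’s actual Crimean data, of how such a diagram encodes counts as wedge areas:

```python
import numpy as np
import matplotlib.pyplot as plt

# Hypothetical monthly death counts (illustrative only, not Nightingale's data)
months = ["Apr", "May", "Jun", "Jul", "Aug", "Sep", "Oct", "Nov"]
deaths = [83, 95, 202, 350, 828, 788, 503, 844]

# In a polar-area ("rose") diagram each wedge's AREA encodes the count,
# so the radius grows with the square root of the value.
theta = np.linspace(0, 2 * np.pi, len(months), endpoint=False)
radii = np.sqrt(deaths)

ax = plt.subplot(projection="polar")
ax.bar(theta, radii, width=2 * np.pi / len(months), bottom=0.0, alpha=0.6)
ax.set_xticks(theta)
ax.set_xticklabels(months)
ax.set_yticklabels([])  # radial ticks are not meaningful once areas encode counts
plt.show()
```

The square-root scaling is the whole trick: it makes area, not radius, proportional to the count, which is what lets the eye compare months honestly.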

If health data need some kind of ancestor, it is to be found there, in the combination of mathematics and medical care in the person of a nurse in the middle of the 19th century… There, however, also lies the origin of a novel kind of medicine. One that is practiced neither beside the patient’s body, interpreting symptoms, nor even in a distant space/time still housed within the clinic/hospital, seeking the generally invisible causes of symptoms. It is a kind of medicine that could thereafter be practiced anywhere there was room for a large table, and the appropriate knowledge for constructing diagrams on a map.

Simply a different (medical) topology? Yes, but not only. A different conception both of illness and health. Different relationships… Illness is bio-symptomatic: as such it can be identified, and as such it can be addressed by clinical medicine (or social experience)… Illness is micro-biological: as such it can be interpreted, and its treatment may be chemical (drugs, vaccines) – yet there may still be room for social knowledge (herbs, etc.)… Illness is statistical: while the micro-scope of science develops, delving into increasingly smaller “components” of the body, there also begins to develop (supportively, for several decades) the macro-scope. The panoramic view of science: just as the micro-scope dives into cells, so the macro-scope ascends toward a kind of numerical geography, concerning ideas about illness (and health) that are very far from the direct, individual, and social experience of bodies.

Of course, one would be entitled to wonder whether this dual movement, toward the “depth” and toward the “height” of illness and health, was imposed by the condition of the bodies themselves. It was not, and that is why it did not occur everywhere on the planet. Rather, these were movements of knowledge methods with a specific ideological origin: the European (and North American) Protestant, primarily bourgeois, class. Division into ever smaller components and synthesis into ever larger magnitudes are identity elements of the “mother” of all sciences, physics. While in their early stages these methods had very good results (in favor of the health and hygiene of individuals and populations), from some point in the middle of the 20th century, with the development/evolution of the industrial phase of capitalism, their limits of effectiveness began to show. And, furthermore, their ideological origin was revealed: the fact that various approaches to health and treatment of Eastern (Asian) origin began to gain influence in Western societies, against Western medicine, can be attributed to this not particularly marginal doubt as to whether what is defined as “health” or “treatment” is adequately such…

Prevention: medicine or something more?

The above is a historical account that has more than just encyclopedic value. Although in its development Western medicine began early on to acquire additional fields, and although specific illnesses acquired special social significance in different periods (tuberculosis, cancer), generally speaking practiced medicine (and correspondingly, the average social idea of illness and health) remained clinical/symptomatic at least until the 1960s, 1970s, perhaps a little later. The basic social idea remained until then that if you feel well, you are well. Certainly, the large family of “psychological problems/conditions” entered the daily agenda in a more massive way in the second half of the 20th century. Yet again, there should be symptoms. Health was the (expected) norm and illness the (undesirable) exception, which in most cases could be identified symptomatically.3

The situation began to change radically (and for a time somewhat subterraneanly and silently) from the mid-1970s. In 1974, Marc Lalonde, Canada’s Minister of National Health and Welfare, published a study prepared by specialists, titled “A New Perspective on the Health of Canadians,” which became a landmark and a “model for international emulation” thereafter. The “Lalonde Report” (as it went down in history) established a new (though not unimaginable) concept: the health field. To achieve “field health” (or “herd health,” as one might say…), two pillars were necessary: on one hand, the public health care system; on the other, prevention and the “promotion” of individual health care. While the first was already in place (clinical medicine, with support from laboratory and statistical medicine), the second—the idea of prevention and of individual/family responsibility for prevention/health care—was something new. The “Lalonde Report” is considered the first modern governmental document in the Western world to acknowledge that our emphasis on the biomedical care system is wrong, and that we must look beyond the traditional health care system (patient care) if we want to improve public health.4

It was indeed a rupture, and its consequences, and the ways it could be exploited, began to show relatively quickly. Medicine, with all its branches, would no longer be the only (or even the main) factor ensuring health for individuals and populations. Healthcare was not an entirely new concept as far as governmental responsibilities were concerned; vaccinations already existed in medical/state practice.
However, now
– preventive healthcare was about to expand and be personalized for everyone;
– preventive healthcare emerged as a generalized behavioral paradigm/model, whose directions would be shaped by “experts” (and, why not, by mass media of exhaustion);
– clinical medicine and its supplements would be the “last resort” that one should ideally never reach;
– feeling well would no longer be sufficient as a health certificate, but should be supplemented by what I constantly do to stay healthy…
The “Lalonde report” of 1974 could be considered the birth act of hygienism from above. Even more so since it was enriched and adopted first in 1978 (with the Alma Ata declaration) and later more systematically in 1984 by the “World Health Organization” (which then had authority…) as a set of guidelines and directions for its member states. It is interesting to note that the “Lalonde report,” besides indicating lifestyles and daily behaviors as basic fields of “healthcare,” also introduced the state of the environment as an equally important factor. One could say that Lalonde’s intentions were good and suited to issues that had started to emerge dynamically around the same time.
But the “composition” of the causes leading to illness remained open. And, consequently, manageable in terms of power. Inspired by the “Lalonde report,” the U.S. Public Health Service Commissioned Corps (PHSCC) prepared in 1979 a ten-year action plan to improve the health of Americans. The plan was named Healthy People 1979, and it had a very clear verdict on where the responsibility for the population’s poor health and diseases lies:

… 50% of mortality in 1976 was due to unhealthy behaviors or lifestyles, 20% to environmental factors, 20% to human biology, and 10% to inequalities in healthcare…

Healthy People 1979 became the manual of state measures for good health in the US, significantly changing the funding goals for prevention. It was renewed in 1990, 2000, 2010, and 2020, each version serving as the new baseline relative to the previous one.5

Hygienism from above found easy and widespread resonance in the emergence and spread, in the first (capitalistically developed) world, of hygienism from below, from the late 1970s onwards. A structural element of the rise of neo-liberalism as an “ideology for social use” was the idea of the Self-as-Capital. And a structural element of the Self-as-Capital was Health-as-Individual-Capital: the sector (along with education) in which everyone should invest…

From that point in time onwards (roughly: the 1980s), healthcare began to move out of hospitals and clinics. The idea of being healthy, based on the absence of symptoms, started to be relativized, as it became a common social belief that “even if you’re fine now, you might not be tomorrow if you don’t take care of yourself starting today.” As the concept of healthcare as a daily social duty began descending from the upper social classes downward, healthcare itself started to become an ever-expanding industry beyond and around the traditional center of illness management: from healthy foods and waters to popular “health and beauty” tips; from incorporating exercise into daily behaviors to denouncing industrial capitalism as unhealthy…
But how could one know that their health care was effective without any worrying symptoms? Through routine preventive examinations. Tests began entering the field of health forcefully, personalized as a matter of course. I am well if the tests show that I am well! This reorganization of the concept of health is a relatively recent development; and, we would add, it has proven fatal! Since, inevitably, any self-awareness of the living being regarding its condition (and, in some cases, the causes of its troubles) was quickly exiled, in the name of “objective” and “scientific” health measurements and indicators… Yet when self-sense and self-knowledge of something so serious are lost, countless evils follow.
Meanwhile, and this was inevitable, laboratory medicine, which had been the helper of clinical medicine, began to occupy a central place in the conception of health, illness, and prevention. Tests multiplied, testing technology developed, and many tests that had been exclusively for hospital use entered the diagnostics market… Over time, from the 1990s onwards, clinical diagnosis, the recognition of symptoms, began losing its diagnostic value even among health professionals, the clinical doctors. Referral for tests (and more tests, and more tests) became the routine of health systems; much to the delight of the private diagnostic companies and of the chemical and pharmaceutical companies that manufacture the testing machines and the chemical reagents they use, shaping (and reshaping, according to their interests) the “boundaries” of health/illness, and naturally the various drugs or quasi-drugs (including the entire range of “organism boosters”…) required by the by-now manic, constant self-care.

“Prevention” ceased to be the responsibility of “specialists.” It became something like a divine commandment for every faithful of health, in all its systematic ambiguity. It became, of course, a massive field of consumption. What should each person protect themselves from? Medical research (or pseudo-research) became “popular reading,” permanently feeding a health-conscious public hungry for preventive care with contradictions. The matter, however, would not stop there.

The real outbreak of precaution as an expression of hygienic responsibility proved to be the emergence of AIDS/HIV. The issue has many sides that we cannot address here. One of the most significant aspects of AIDS/HIV is not merely that it emerged within the general climate in which one must take care of oneself, nor solely within the general climate in which tests show whether I am healthy; it also brought forth an unprecedented notion: a person without symptoms was considered ill if they were a “carrier” of a virus which weakens the immune system so that the body becomes more vulnerable to common infectious diseases. In a sudden, violent, and historically unprecedented manner, societies jumped from the belief that if you don’t take care of your health you can get sick (i.e., display symptoms of some illness) to the medical diagnosis/certainty that you are ill even if you haven’t gotten sick yet, because you will definitely fall ill soon; and you will die!!!
Today, after so many years, we know that there are thousands of HIV-positive individuals who are indeed “carriers” but have neither fallen ill nor died… However, during the ’80s and ’90s, this strange angel of death, this microscopic retrovirus that was so insidious as to cause no symptoms but to “wait” by reducing normal immunity, and which was moreover associated with sexual relations and practices—this “bad news” was a shock! A shock that not only strengthened the already existing social tendency toward preventive examinations and tests; but also substantially shaped hygienic behaviors in a field as difficult and private as erotic relationships. Adding something to the late-modern capitalist conditions that had not existed earlier with other diseases: the guilt of transmission… And, something equally important: the idea of the asymptomatic person who transmits the virus, the asymptomatic person who infects—from lack of responsibility, care, knowledge (within or outside quotation marks…)

Michel Foucault presented the concept of biopolitics coherently in his well-known series of lectures at the Collège de France in 1978 and 1979. In them, he condensed the state (and capitalist) “rationality” of the management and control of populations from the 17th century onward. In biopolitics, or more accurately in biopolitics in the plural (since we are talking about such a long and, above all, such a dense period of time), the issue of health/illness is certainly included, in all its aspects.
From the 1980s onwards, the hygienic aspect of biopolitics visibly changed, not only in relation to the past but also within its own accelerating centralization. And not just once. With alternative practices or without them, with different treatments or without them, within a stable ideological bed (the continuous care of the Health of the Self-as-Capital), a cosmogonic, ontological reversal took place. Before, health was the usual, the expected, interrupted by conditions of illness, more or less serious, more or less painful, sometimes fatal. Now (that is, for some years now) health is an uncertain condition, never given, which each person must supervise and “cultivate” with greater or lesser anguish; often with methods of no certain effectiveness.

The fact that “experts” as well as demagogues/shapers of public opinion now talk about an industry, and moreover about a healthcare industry (not a narrowly “medical” one), is the ripe and complex result of 4 or 5 decades of capitalist development (for many, “growth”…). There was no sinister plot by a handful of underworld bosses. However, in the course of these 4 or 5 decades, these bosses, as underworld as the service of their economic interests required, joined the path; they conquered it; they carved it out, paved it, and signposted it. For quite some time now they have been controlling it. So much so that if, for some mysterious reason, all the pharmaceutical industries of the capitalist planet were to suddenly disappear, it would be far more painful (certainly for first-world societies) than if all the wheat were to vanish…

Algorithms: medicine or something more?

Some have commented, in light of the emergence of covid-19, that the pandemic is a mirror: it is the image of each particular society, of its organization, its authorities, its fears… Many, very many, the majority of developed capitalist societies, believe that covid-19 was/is simply a dangerous natural phenomenon, full stop. Certainly, covid-19, like thousands of other viruses, is a real, natural entity. But this is not the whole story.

While a lightning bolt is also a natural phenomenon, a lightning bolt drawn down by a lightning rod is no longer merely a natural phenomenon. In the lightning rod there is a certain technology, a certain scientific or empirical knowledge, that “pushes” the lightning there and not elsewhere. Ultimately, the human species, throughout its journey, constantly “converses” with two things: natural phenomena and social phenomena. Rain is always rain. The tiled roof, the umbrella, or the anti-skid tires of vehicles are forms of the relationship with the natural phenomenon. It is rather unlikely that we can still conceive of a natural phenomenon without including the ways we have (or do not have) of dealing with it, of managing it. Even if a sunset can still draw attention without bringing to mind sunglasses or photographing it, viruses and every other threat to health cannot be perceived apart from the ways we deal with them.

And that is where the covid-19 pandemic is a particularly faithful (even repellent) mirror of modern social, political, ideological, and economic characteristics. The emergence of covid-19 (that is, its specific management) had been gestating for over 15 years. (And this was not / is not even a secret!) It was conceived during the outbreak (that is, in the specific management, but also in the unresolved issues that remained) of SARS, in 2002–2004… It was conceived during the outbreak (that is, in the specific management, but also in the unresolved issues that remained) of H5N1 (“bird flu”), in 2003–2005… It was conceived during the outbreak (that is, in the specific management, but also in the unresolved issues that remained) of H1N1 (“swine flu”), in 2009–2010… It is striking that although these “pre-sales” of mass hygienic terror are quite recent, and although at the time they genuinely shocked first-world societies, they were erased from memory, forgotten. So radically that when, with the emergence of covid-19, the same “protagonists” did the same job from the outset, few remembered that they had seen this show before! This collective amnesia/forgetfulness must have some serious explanations… What is certain is that it was very useful when the show was re-staged, this time more completely…

In 2006 we wrote, among many other things, under the title Some Who Make Us Sick:6

Some influenza pandemics are mild. Others prove to be highly lethal. If the virus reproduces rapidly before the immune system has time to produce antibodies, a severe and often fatal worsening of the disease will occur, resulting in a scourge that could easily cost more lives in one year than AIDS has claimed in 25 years. Epidemiologists warn that the next pandemic could affect one in three people on the planet, hospitalize many of them, and prove fatal for tens to hundreds of millions. Such a disease will spare no country, race, or income group, and there will be no sure way to avoid it.

W. Wayt Gibbs, Christine Soares – Scientific American (Greek edition, December 2005).

The scenario is going smoothly. Viruses mutate, and this can make them dangerous to the human species. Flu outbreaks usually come to the West (although not always) from the outer and distant East. Common carriers and hosts of flu viruses are animals, mainly waterfowl.

Consequently, bird flu could have been just the common cold of the season, magnified as much as a few minutes of publicity require. But we know it is something different. It is a super-production; just how big a one is not apparent at first glance. The noise of the media and the experts is as routine as striking dramatic tones over the jump of a flea.

However, avian flu is not an economic scandal, nor yet another diplomatic crisis here or there. It is not even a greenhouse phenomenon, a risk on credit. As the latest link in the rich chain of health threats, it has incorporated all the dramatic experience of mad cow disease, SARS, and other less famous but significant episodes. The best thing about it (or the worst, depending on one’s position and where one finds oneself and is about to find oneself) is that it has, as a threat, highly opportune timing: it coincides with artfully articulated prophecies (or methodologies, again the characterization is a matter of position) for impending unprecedented public order crises…

Avian flu has all the apocalyptic characteristics that fit the post-AIDS conditions of capitalism. However, anyone who imagines that it is simply, or mainly, an ideological operation is mistaken. Ideology plays a significant role, since it is on the front line. But it is not alone. Moreover, for the most part, the theatricality of this particular threat has no new components. It rather repeats the conquests of recent years. It keeps fresh, timely, and militant the standard elements of hygienic panic, social psychosis, and metaphysical dogma.

The novelty lies rather in the institutional scaling up of preparations for a kind of war that we have never experienced before. And this is happening behind the scenes, or at least behind the level at which the average citizen perceives reality.

While noisy citizens, constantly on the brink of panic, consider themselves threatened by swallows, pigeons, chickens, swans, or ostriches (depending on the local fauna), the scientific community has before it antigens, DNA recombinations, and cytokine storms. Both the former and the latter refer to avian flu. The difference and distance in the representations (of the problem, the risk, the threat) between specialists and non-specialists is impressive. And not only in terms of scale.

What is remarkable about public opinion is its mimetic primitivism, at least in relation to the imagery of the AIDS threat. At that time, representations of hygienic battlefields and risks entered the social imaginary that had no historical precedent, yet were consistent with technological development: the immune system, the T4 guard cells, retroviruses. Of course, there was also a more accessible level of riskological spectacle: sexual practices, blood transfusions, and intravenous drug use.

Prevention is an interesting condition: it has no limits in duration or intensity, except, of course, the appearance of the condition it intends to prevent. Speaking of health protection, prevention is a state of prolonged readiness, a kind of mass reserve mobilization. And precisely because we are talking about a health industry and not merely ideology, prevention is a lot of money. For some, prevention means painful loss of money: this is the case with the industry of chicken-derived edible products and eggs… For others, on the contrary, prevention means easy and enormous profits.
This is the case of a drug from the Swiss pharmaceutical company Roche, which became famous for reasons no one particularly investigated: Tamiflu. Tamiflu is the commercial name of oseltamivir, a new-generation antiviral; there is also a less famous (at this moment) drug of the same category, zanamivir, sold under the commercial name Relenza and manufactured by the equally major pharmaceutical company GlaxoSmithKline.

Tamiflu literally sold out in the fall of 2005, even though it is not only completely useless against what the specialists fear, a genetically recombined variant of H5N1, but has minimal effectiveness even against the existing version of the virus. (The New England Journal of Medicine published in December the results of Oxford University research in Vietnam on the therapeutic effectiveness of Tamiflu: the two patients to whom it was administered eventually died, after H5N1 developed resistance to oseltamivir. Of course, these Westerners were not the first to discover Tamiflu’s uselessness; Vietnamese doctors had preceded them.)
How then did Roche’s commercial miracle happen? Preventively.

Whether, in the case of a pandemic of the mutated H5N1, 10 or 100 million people will die on the planet, and whether 30 or 300 million will need care, are simply macabre gambler’s questions, eager to bet on the right number. The serious question is different: what consequences would the possibility of any pandemic have on the economic and administrative structures of capitalist societies? Or, put differently: if prophecies of global flu apocalypse are ideologically and commercially convenient, what is the real dimension of such a probability at the heart of capitalist relations? Will there be labor, governments, social class stability under these conditions? And under what terms?

The first hint of an answer comes from a side that is unexpected, if one approaches today’s reality superficially: the military analyses that became fashionable in the 1990s, and with greater intensity from the end of that decade onward. We reproduce, indicatively, one such analysis of Greek origin (meaning: one copying and adapting international trends), from the “Institute of Defense Analyses,” which is chaired by a retired lieutenant general, dated 2004:

The global nature of many problems is now indisputable: overpopulation in the South of the planet, which, in a repetition of Malthusian theory, raises concerns about global food sufficiency [“food security”], while combined with poverty and underdevelopment it creates uncertainty regarding the ability to meet basic needs for a large portion of the planet’s population; the spread of AIDS (according to the WHO, carriers of the disease have exceeded 40 million, 90% of whom are in developing countries) and other infectious diseases (SARS, the Ebola virus, etc.); environmental pollution (destruction of the ozone layer, greenhouse effect, destruction of tropical forests, extinction of many species of flora and fauna, etc.); terrorism; drugs; organized crime; the problems of economic and political refugees and illegal immigrants; etc.
All these problems exceed the traditional definition of security threats and can be characterized as Gray Area Phenomena… Thus, “human security” and state security are mutually supportive concepts and actions, in order to achieve the broader goal of national security which in the 21st century concerns both the defense of the territorial integrity of the state and the protection of the economic and social welfare of its citizens.

Impressively magnanimous is the willingness of the caravans of the developed capitalist world to adopt as their own problem – under the altruistic concept of national security – every individual crisis of capitalist development… including public health crises.

One could be comforted by estimating that in such parades of risks for modern societies, starting from labor migration and ending with the melting of the polar ice caps, nothing more is reflected than the militaristic voluntarism of a caste of officials from state bureaucracies who simply want to justify their salaries. Indeed, this is happening. But not only this. And, ultimately, not mainly this.

A month later, in Sarajevo.pdf no 148a we wrote among other things about the next “wave of death”, the one associated with H1N1:7

…What, then, were the measures taken by states against such a sanitary Armageddon as the H1N1 was projected to be?
On May 7, 2009, the WHO announced that physical restrictions were not feasible, and that countries should focus on mitigating the consequences of the virus. As late as November 27, the WHO advised that air travel was safe, with certain precautions. Unilaterally, some states decided that travelers coming from areas with H1N1 flu outbreaks up to 2 weeks before their arrival would be placed in quarantine. Beijing announced it would do something like this on April 26, 2009. When in May the Chinese regime confined 21 students and 3 teachers from the US to their rooms in the hotel where they were staying, the US State Department issued a protest and advised its citizens not to travel to China. In Hong Kong, a hotel with 240 people was placed under quarantine; and the Australian government held a cruise ship with 2,000 people away from its ports because a case appeared on board. Japan also quarantined 47 air travelers in mid-May 2009. The sporadic nature of these actions—and the fact that they are still being mentioned—shows that they were exceptions.
Practically what airlines did was improve the ventilation systems of the aircraft, and disinfect them during long breaks between flights. Masks were not part of daily protocol, neither for passengers nor for crew (no “healthcare organization” raised such an issue!) although some Asian airlines introduced them for flight attendants. In some airports, (digital) thermometers were used to check the temperature of those passing through arrivals, but this was quickly deemed a waste of effort.
Several kindergartens closed in the US, but the general advice of the American CDC was against closures. It simply recommended that any students or teachers who were sick or felt they might get sick stay home for a week. Lessons on “personal hygiene” were also organized in schools. The same applied to workplaces: employers were advised to grant “easy” leave to any workers showing flu symptoms.

Why was there only this reaction, which by today’s standards would be considered either simply logical or (by hygiene fanatics) “inhumane”? Despite the experts’ “predictions” of a hygienic Armageddon, and despite the fact that these predictions found their way into the media, in 2009 and 2010 there was another, much more serious issue with much more tangible consequences: from the collapse of Lehman Brothers in September 2008 onward, and especially in 2009 and 2010, the intensification of the financial phase of the ongoing global capitalist crisis/restructuring left little room. Moreover, as we have already said, much was still missing for hygienic terror to be transformed into tangible discipline and mass change of behaviors.

the chain and its links

The gestation of covid-19, that is, the gestation of a pandemic management beneficial to the healthcare industry (and not only to it) in the early steps of the 4th industrial revolution, would be revealing in itself. How much of what was/is related to covid-19 was the same as what had “played” 10 or 15 years earlier, and where were the serious changes? A complete retrospective giving the whole answer is impossible here. We limit ourselves to the somewhat more specific issue of health big data. We need to answer: did the development of big data in general, and the (business) demand for health big data in particular, strategically shape the management of covid-19?

One of the most fundamental points of this management, almost the cornerstone upon which all the rest was built (measures, arguments for them, etc.), was the revival of the ideological, behavioral, and hygienic framework that had first emerged with AIDS/HIV. Of course, no one openly admitted this, for obvious reasons. What was this framework? The asymptomatic individual who transmits the virus, the asymptomatic carrier who infects—through irresponsibility, negligence, or lack of knowledge (within or outside quotation marks…).
It was an “innovative” and very useful idea/framework then, during the era of AIDS-related fear. In the years that followed, it was not reused. Reasonably enough: in the anticipated (albeit imaginary…) apocalypses of 2004–2005 (H5N1) and 2009–2010 (H1N1), the protagonists were flu viruses. For these, it is well known that transmission occurs either while someone is already sick or immediately afterward, during recovery. Although those flu strains had been presented as more or less the end of our species, there was this critical limitation: it was empirically easy for anyone to avoid close contact with someone visibly ill—as is, indeed, common social practice.

In order to criminalize the entire population and legitimize indiscriminate, blanket prohibitions, this “patient recognizable by symptoms” (of the flu) didn’t work at all… It didn’t fit; it didn’t help. The coronavirus, however, served the purpose: upon it the idea of the asymptomatic “carrier,” infectious toward everyone, could be revived from the reserves! Covid-19 was convenient: upon it could be (and, for capitalist needs, had to be!) constructed and imposed the idea of the blind contagiousness of all social relations!
There was, however, one more problem. A flaw of covid-19. Although it spread easily, it killed only exceptionally: elderly people already suffering from serious health problems; or, rarely, younger people, also with serious health problems. How could the idea of the accidental asymptomatic killer be exploited when the killing itself is rare? The ideological staffs did not take long to devise the ingenious solution: as one person passes the virus to the next, without symptoms/signs appearing and without anyone falling ill, some link in the chain would happen to be already burdened in their health! Some grandfather, some grandmother… In the end, the asymptomatic “carrier” still kills; just not directly, by contact…

The construction of the idea of a “transmission chain” of a virus that can in some cases cause complications and ultimately death is, viewed in itself, monstrous. Not because it is not valid. But because never before in history had anyone (neither citizen nor ruler) conceived that, in order to protect a population from a mild contagious infection, that population must be completely locked down! In 2020, however, the time had come when the hitherto unthinkable not only became conceivable but also technically feasible and politically useful.

What does the blind and invisible transmission chain of an infection mean? Is it simply an indisputable “natural fact”? Or is it a very specific—and ultimately deceitful—modern construct? If someone answers the former, then they also shape the corresponding idea of “health.” The logical conclusion is that “healthy,” in the sense of excluding any infectious transmission, is only the one who lives desperately alone, without any direct or indirect contact not only with other individuals of their own species but also with any animal species! This deadlock, the deadlock of the absolute hermit, would logically be “most natural”… And then everyone should wonder why “nature” made this terrifying mistake—the coexistence of species, but also the direct affinity between members of the same species!
The “natural” dimension of the blind and invisible transmissibility of an infection (of the microorganism that causes it) is, ultimately, impossible to take seriously as a factor of health, individual and/or collective. That is why it had not been put on the table—either from a social or a medical perspective—until very recently. What made it possible to launch it now was not its “naturalness” at all, but a strong ideological (and bio-political) investment in it: the idea of the individual-as-link-of-a-chain-that-must-be-monitored-so-that-it-can-be-“broken.”
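Since the argument turns on what this idea presupposes technically, a minimal sketch may help (in Python, with hypothetical names and contacts; this illustrates the generic data structure, not any particular tracing app): once encounters are logged, a “transmission chain” is nothing but a traversal of the contact graph.

```python
from collections import defaultdict, deque

# Hypothetical contact log: pairs of people who met
# (the kind of record a tracing system accumulates)
contacts = [("ana", "bob"), ("bob", "cleo"), ("cleo", "dan"), ("eva", "fil")]

# Build an undirected contact graph from the log
graph = defaultdict(set)
for a, b in contacts:
    graph[a].add(b)
    graph[b].add(a)

def transmission_chain(source):
    """Everyone reachable from `source` through logged contacts (breadth-first),
    i.e. the 'chain' that monitoring is supposed to make breakable."""
    seen, queue, order = {source}, deque([source]), []
    while queue:
        person = queue.popleft()
        order.append(person)
        for other in sorted(graph[person] - seen):
            seen.add(other)
            queue.append(other)
    return order

print(transmission_chain("ana"))  # ['ana', 'bob', 'cleo', 'dan']; 'eva' and 'fil' form a separate chain
```

The computation is trivial; what the critique above points at is the premise, namely that every encounter gets logged somewhere in the first place.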

How could asymptomatic contagiousness through contact (that conquest of the notion of AIDS/HIV in the ’80s and ’90s) evolve into asymptomatic contagiousness from a distance, into a “tele-” version (for the needs of representing the threat of covid-19 in the ’20s)? And what idea would each of us have of ourselves as an “intermediary,” as a link in these “transmission chains”? If you haven’t already guessed the already-conquered pattern upon which this shift from “through contact” to “from a distance” proved so easy, here it is: the internet, especially in the abstract format of social media! That is where hundreds of millions of users were trained to perceive themselves either as “transmission nodes” or as chained transmitters! The representation of the contagious fatality of covid-19, the idea of multiple blind and invisible transmission chains of the virus running through all directions of social relations until they find a “target” (someone with serious health problems to kill), was/is nothing more and nothing less than the hygienic perversion/translation of the mass “idea”/experience of the internet of social relations!!! Anti-social relations, of course…

This (we argue) was the technological/social “conquest” at the beginning of the 3rd decade of the 21st century that made the transformation of direct (unmediated by digital machines) social relations into infectious ones, into factors/channels of disease, “convincing” (imaginary though the transformation was, this by now mattered little). This was the technological/social “conquest” that a decade earlier had not existed in such maturity, but that now constituted the compact ideological/symbolic/imaginative foundation8 upon which the spectacle of generalized, invisible, and blind contagiousness could be built. We observed:9


See some characteristic differences between 2009 and 2019:

  • In 2009, internet users were, in millions (in parentheses, the corresponding figure 10 years later, in 2019):
    Asia: 764.4 (2,300.47)
    Europe: 425.8 (727.56)
    North America: 259.6 (327.57)
    South America / Caribbean: 186.9 (453.70)
    Africa: 86.2 (522.81)
    Middle East: 58.3 (175.50)
    Australia: 21.1 (28.64)
Overall, internet users were 1.1 billion in 2005, 1.77 billion in 2009, and 4.13 billion in 2019.

  • In 2009, Facebook was five years old and had 250 million users. By 2019, it had nearly 3 billion. In 2009, Instagram did not yet exist, and Twitter and WhatsApp were still in their infancy. YouTube did exist, with around 400 million users. By 2019, it had nearly 2 billion.
  • The first (and impressive) iPhone appeared commercially in June 2007. By 2009, approximately 480 million smartphones had been sold (and were in use, given the applications available at the time). In 2019, the figure reached 3.2 billion.

However, it wouldn’t (have) been only that. From every aspect, the lessons and achievements of the 3rd capitalist industrial revolution, as they are restructured in the 4th, have universality. Since everyone is a transmission channel (of infectious conditions), and since infectious conditions are basically “information,” each person must also be controllable as a carrier/creator of data (of this transmission, and of any other…). Each must be a continuous, inexhaustible source of data on their health status… This is precisely the moment when health big data emerges not as a demand of the industries but as a social necessity – for health, always!

Let us recap for a moment. Only populations that had already internalized the doctrine that only tests certify health, and that had at the same time forgotten how contagious many common diseases are, could unquestioningly accept this mass, generalized “stigmatization”… Only populations that had already mythologized the concept of prevention could accept that millions of healthy people be confined at home for preventive reasons… Only populations that had already mythologized the “transmission of information” (and of poisonous lies among themselves), viewing it as a preferred social relationship, could accept the possibility that they are (out of ignorance…) “carriers” of death… Only social relationships already massively based on tele-situations could be guiltily “folded back into the home”… Only such populations could accept continuous self-monitoring (…for reasons of public health…) and call it social responsibility…
If neither the technical nor the ideological aspect of the digital mediation of daily life had progressed so far, the threat of covid-19 would have remained simply, very simply, at the noise level of H1N1…

The data, then. The data! This was the goal that became technically feasible for the “competent authorities” to obtain, to appropriate. If from a narrow political perspective the health coups had multiple benefits (which will become apparent in the future), from a “medical” perspective the data, the big data of health, their collection, processing, and storage, and the interaction between citizens and knowledge/power “bodies” through digital machines: this is the utility of covid-19. Not, of course, of the virus itself as a biological organism, but of the treatment to which it was subjected!

health big data: the “productivity” and the “economy” of general digital mediation and recording

We argue that the business drive for the dynamic increase and exploitation of health data preceded the emergence of covid-19. And that it was one of the few strategic reasons that led to the pandemic being managed the way it was.
In a related article of early December 2018, titled Big Data is set to have explosive growth, creating challenges for healthcare organizations and subtitled Big Data in the healthcare industry expected to continue growing rapidly through 2025, but related organizations still face significant challenges in managing data,10 one could read, among other things:

The volume of big data in healthcare is expected to grow faster than in any other sector over the next seven years, creating challenges for healthcare organizations in managing extremely large concentrations of data, according to a report by International Data Corporation funded by Seagate Technology.
Researchers found that healthcare data is expected to grow faster than data in manufacturing, financial services, or media. Healthcare data is projected to have a compound annual growth rate (CAGR) of 36% through 2025.
In comparison, manufacturing-sector data is expected to have a CAGR of 30%, financial services 26%, and data from the media and entertainment industries will grow by 25%.
This rapid increase in data volume is due to advances in big data analytics tools and medical imaging, as well as the growing availability of real-time health data to support clinical decisions.
“Providers benefit from the greater intelligence embedded in diagnostic equipment and patient devices that can collect data from them, upload it to the cloud or central data centers for analysis or diagnosis, and then produce guidance or assessments based on the patient’s specific needs,” the report states.
The increased use of chatbots and virtual home assistants will also increase the volume of healthcare data.
“In the future, home health robotic assistants will monitor elderly patients and provide alerts if individuals need help, ensure medication is taken, and perform simple duties,” IDC predicts.

As the volume of healthcare data continues to grow, finding solutions to related issues will become increasingly critical. The report suggests that organizations invest in health artificial intelligence, blockchain technology, and priority-based analysis algorithms, and seek external professional assistance to close the digital transition gap…
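To make the quoted figures concrete: a compound annual growth rate multiplies the data volume by (1 + rate) every year. A minimal sketch of what the report’s rates imply, assuming a 2018 baseline and reading “through 2025” as a seven-year horizon (our assumption):

```python
# Implied total growth from the CAGRs quoted above (IDC/Seagate report).
# The seven-year horizon 2018 -> 2025 is our assumption from "through 2025".

def growth_multiple(cagr, years):
    """Total multiplication factor after `years` of compound growth."""
    return (1 + cagr) ** years

sectors = {
    "healthcare": 0.36,
    "manufacturing": 0.30,
    "financial services": 0.26,
    "media & entertainment": 0.25,
}
for sector, cagr in sectors.items():
    print(f"{sector}: x{growth_multiple(cagr, 7):.1f}")
# healthcare: x8.6, manufacturing: x6.3,
# financial services: x5.0, media & entertainment: x4.8
```

In other words, a 36% CAGR means healthcare data volume multiplying almost ninefold over the period, nearly twice the multiple of the next sector; which is what makes the “gold mine” framing of the title more than a metaphor.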

Such articles might have seemed futuristic at the end of 2018. By mid-2020, they were much less so. It took a quite drastic, dictatorial prohibition of almost every direct social relationship (in the name of public health, defined as narrowly as possible) for the digitized, individualized recording of each person’s health status (and movements, and all relationships) to emerge as a “new freedom with health security.” Tracking, contact tracing, became a therapeutic method… If the term flexisecurity was invented to symbolize “flexibility in the labor market with some minimum security of survival,” perhaps a new word of the same kind, healthflexisecurity, will be invented, if it hasn’t been already.

Practically, the generalized lockdowns became feasible ONLY because the technologies and habits of digital communication were already available to most of society (and certainly to the relatively younger ages); that is, because a level of continuous data creation already existed. At the same time, thanks to the sudden (engineered, we say) increase in demand for such “services” from the billions of house-bound captives (half the global population; an abnormal magnitude!), and therefore to the exponential increase in data creation, applications and the digital “services” on offer skyrocketed in a minimal historical timeframe.
Pranav Pai, an engineer and founding partner of the venture capital firm 3one4 Capital, based in Bangalore, India, recently captured in a few words the distinction between low and high intensity of technological capital in the case of “covid-19 pandemic management”:

… The aftermath of covid-19 makes it an interesting case study for technological diversifications. On one hand, social distancing, self-isolation, hand washing, and masks are direct application, low-technology tactics to prevent the spread of the virus. On the other hand, rapid diagnostic tests, large-scale mobile tele-monitoring of health, computer-based chemical simulations for drug development, and direct payment of allowances to citizens on a national scale are complex uses of technology that enhanced different countries’ response to the epidemic… New technologies proved their effectiveness in managing the covid outbreak, and what one may expect is even greater functional progress on the path of digitization worldwide… There had not been until now a stronger incentive for transitioning business operations to cloud-based systems…
… Telemedicine will solve many problems. Doctors usually have many routine tasks to perform, apart from their strictly medical duties: entering data for each case separately, managing the patient, selecting the appropriate medication, confirming that the proper equipment is available wherever and whenever needed, and the like… These processes need to be automated, and this became intensely clear during the pandemic, since almost every hospital was transformed into a complex “patient management system.”

Certainly, the push for the continuous extraction of personal data, especially health data, did not fall from the sky. It had been maturing before the first months of 2020. For example, on July 20, 2017, in an article in the electronic edition of the German magazine Der Spiegel, under the impressive title “Dr. Smartphone: The Medical Profession’s Digital Revolution Is Here,” one could read:

… For thousands of years, patients depended on others for help: a healer or a doctor. But now mobile devices have begun to change this ancient condition. Powered by artificial intelligence, the mobile phone promises to become a fundamental challenge to healthcare. Many medical tests that were once possible only in the doctor’s office can now be done at any time, by anyone, even while sitting in an armchair at home.
Using small and inexpensive accessories, smartphones can measure brain electrical activity, intraocular pressure and blood pressure. They can perform an electrocardiogram, detect atrial arrhythmia, check lung function, record heart murmurs, take photographs from inside the ear, scan the aorta, and even perform DNA analysis.
Soon there will be little technical difference between a general practitioner’s office and a fully equipped smartphone. On the contrary: there are already cases where patients are better served by a mobile phone.

The first wave of digital health apps emerged with fitness trackers and similar accessories that became famous as pedometers. But the second wave is developing as a significant player in medical technology. Investors have begun referring to this development as “serious health” and there is money to be made from it. Lots of money. But there are other issues at stake, such as trust and the ability to overcome the traditional healthcare system…

Almost a year later, on May 30, 2018, with the British healthcare system (NHS) moving towards experimental uses of wearables by elderly patients in order to promote more effective home care, the author of a commercial technology site wrote, under the title “Can wearable tech keep the NHS on track?”:

The debate over NHS funding gaps never goes out of date. Recent reports suggest that taxpayers would have to pay approximately £2,000 more each year to cover the shortfall. Indeed, one study showed that the NHS funding gap would reach £30 billion in 2020–2021. Some attribute this scale to population aging … and in other cases to unhealthy lifestyles that overload the healthcare system. The wearables market, then, may well have positive implications for both technology and healthcare.
… There are conflicting studies on how lasting the benefits of physical activity trackers (wearable trackers) are, with evidence indicating that their usage declines after 12 to 16 weeks. However, given the WHO’s estimate that lack of regular exercise increases the risk of death by 20–30%, even a basic wearable can have positive effects. This is indeed emphasized in Gartner’s “2018 Predictions: Personal Devices” report, which shows that a growing number of users actually change their behaviors for the better once they choose to wear such a device.
… Real-time data collection from patients offers significant benefits for both patients and healthcare professionals, who can gain a better assessment of conditions through such data aggregation. Data analytics will play a central role in the future. Undoubtedly, the healthcare industry is making increasing use of smart wearable technologies… The global market for medical wearables is estimated to reach £3.3 billion by 2020. But the savings that can be achieved this way in the healthcare sector could run to just as many billions…

At the end of November 2019, the news was even more optimistic. Under the title “Wearable Healthcare: Vital Signs for CTO Innovation,” a specialist on the subject wrote, among other things:

… With personalized health monitoring on the rise, health wearables are set to become the next mass market.
What started as a limited number of home health monitoring devices has evolved into a new market sector, with wearable health care technologies now able to monitor a wide range of physiological health and wellness indicators.
The market is growing. According to Berg Insight’s forecasts, 239 million such devices will be sold by 2023… An excellent example of wearable health technology in use is Current. The company has developed a wearable device that can remotely monitor patients after their discharge from hospital. Worn on the upper arm, the device uses artificial intelligence to analyze patient data, providing significant insights to doctors. The Dartford and Gravesham NHS Trust reported a 22% reduction in home medical visits after implementing the technology.
“The value of Current was evident from our first patient – a chronic patient whose oxygen saturation dropped, which Current detected faster than conventional care would have, allowing us to intervene more quickly at the patient’s home,” says the trust’s technical lead Neil Perry.

… Perhaps the Apple Watch is the most visible use of a wearable device for health monitoring. The latest model monitors heartbeats and warns if there is a risk… University Hospitals Birmingham and BT have created the first “connected ambulance”: a vehicle with a 5G connection and virtual reality headsets for the paramedics. The clinician can see exactly what the paramedic in the ambulance sees. Using a joystick, they can then guide the paramedics in real time to perform any necessary scans, as well as focus on the patient’s injuries. Fotis Karonis, technical director of BT Enterprise, declared enthusiastically:
“We are very pleased to present this cutting-edge technology here in Birmingham, which is among the first places in the UK and Europe to be connected with 5G communications… Not only is the 5G network capable of ultra-fast speeds, but it also has much lower latency, meaning there is almost no delay in transferring data over the internet. This means that everything happens in ‘real time.’ This is important for the NHS due to its potential for medical applications, such as diagnosis and preventive healthcare…”

… The wearables market is growing steadily. Vincent Grasso of IPsoft says: “In the next five years we will see major changes in the healthcare delivery environment, in relation to medical devices and technology in general. The expansion of the building blocks of the artificial intelligence ecosystem, such as voice-enabled computing, machine learning, smartphone applications and others, will have radical consequences for care delivery…”

And Antonios Oikonomou, responsible for the development of wearables and optoelectronic devices at the industrial arm of the Graphene Flagship, Europe’s largest consortium of universities, research centers and companies for the exploitation of graphene, concludes: “Long-term forecasts show that wearable technology will contribute to a global reduction in healthcare costs of about 200 billion. This saving is partly due to the shift from occasional use of wearables in special situations to continuous patient care. A reduction in hospital costs of around 16% over the next five years is achievable through this transition. The digitization of the healthcare sector and the development of low-energy, highly compatible, multifunctional solutions are the key trends in the healthcare industry.”

All of this was happening (and/or being announced) BEFORE the emergence of covid-19!… Could it have been exploited to such an extent if covid-19 had been managed like a moderate flu, with additional protection for vulnerable social groups? Not at all!!! Only coups (in Western democracies…), generalized social estrangement and its wholesale replacement by digital applications, panic, the “social responsibility” of tracking and the rest could give it significant momentum. And so it happened.11

The related friction (it’s not just about medicine, it’s about bio-power!)

The technological leap, both in mass application and in the management / improvement of the corresponding digital technologies in the name of “combating covid-19,” was achieved rather easily in Asian countries (China, South Korea, Japan, Taiwan, Singapore), and was judged technically successful and effective (by the criteria of digital public health). In Western democracies, the corresponding effort was made in some places “unorthodoxly” (with the involvement of secret services, as openly in Israel) and in others in a limited or hesitant way, at least as far as public knowledge goes; usually under the umbrella of voluntary participation. The sticking point is the heightened (Western) concern over personal data and their “fate.”
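Since much of the friction concerns what these tracing applications actually do with data, a simplified sketch may be useful. The “decentralized” schemes (DP-3T, the Google/Apple exposure notification framework) have phones broadcast short-lived identifiers derived from a daily key that, in principle, never leaves the device. The real protocols use HKDF and AES with precise parameters; the HMAC-based derivation below is only our illustration of the general idea, not any official implementation:

```python
import os
import hmac
import hashlib

def new_daily_key() -> bytes:
    # A fresh random key generated on the phone once per day
    # (what the real protocols call a "temporary exposure key").
    return os.urandom(16)

def rolling_identifier(daily_key: bytes, interval: int) -> bytes:
    # Derive the short-lived identifier broadcast over Bluetooth.
    # Real schemes rotate it every 10-20 minutes and use HKDF/AES;
    # HMAC-SHA256 here merely illustrates the one-way derivation.
    msg = interval.to_bytes(4, "big")
    return hmac.new(daily_key, msg, hashlib.sha256).digest()[:16]

# A phone broadcasts a new identifier each interval; nearby phones store
# what they hear. Only if the user tests positive is the daily key
# published, letting other phones recompute the identifiers locally and
# check for matches, without any central register of contacts.
key = new_daily_key()
print([rolling_identifier(key, i).hex() for i in range(3)])
```

Even in this “decentralized” best case, the metadata surrounding the uploads (who, when, from which network) remains a separate question; which is precisely where the frictions described here begin.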

The protection of the privacy (of data) is not the only aspect of the algorithmic turn (in healthcare). It is, however, the aspect that preoccupies far more than any other. From this perspective, an ambitious European “personal data protection” effort, officially and fully implemented only recently (on May 25, 2018), the General Data Protection Regulation / GDPR12, has been watered down in various parts in the ongoing “war against the invisible enemy,” much to the delight of data-driven companies. The handling of covid-19 could be seen not merely as a slap but as a full kick in the face of the GDPR.
This should not seem strange, considering that states (which supposedly bear full responsibility for implementing the GDPR in matters concerning public bodies) have “comfortably” violated their own constitutions! Would they respect the GDPR? For example: by placing health data, across their entire spectrum, in the most sensitive and most strictly protected category of personal data, the GDPR set as a fundamental prerequisite even for their collection (let alone their “processing”) the informing, and the clear, explicit consent, of each individual concerned. This reasonable limitation was literally sidelined. Few invoked it, and in vain: the (staged) “state of emergency” swept away all obstacles. Where the health data already collected, or still being collected, have ended up is, and will remain, unknown. Obviously the various “assurances” of government officials about “protection” are laughable and utterly meaningless: for these officials to bear legal responsibility for the mess that has evidently occurred would require very bold and long-term legal campaigns… They have, therefore, every reason to deceive without consequences.

Exactly the same happened with mobility data. European directives require such data to be anonymous or, otherwise, to be used only with the individual’s consent. But how “anonymous” can the data stream of any smartphone really be, when, even stripped of name and surname, it carries far more identifying information than these? Not anonymous at all.
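How weak this “anonymity” is can be shown with a toy example. De Montjoye et al. (“Unique in the Crowd,” Scientific Reports, 2013) found that four spatio-temporal points are enough to uniquely identify about 95% of the individuals in a mobility dataset. The following sketch, with invented data, shows the mechanism:

```python
# Toy "anonymized" mobility traces: no names, just antenna cells and hours.
traces = {
    "id_001": {("cell_17", 8), ("cell_03", 13), ("cell_17", 19)},
    "id_002": {("cell_17", 8), ("cell_42", 13), ("cell_09", 21)},
    "id_003": {("cell_55", 9), ("cell_03", 13), ("cell_55", 18)},
}

# The adversary knows only two facts about the target: the cell near
# their home at 8:00 and the cell near their workplace at 13:00.
known_points = {("cell_17", 8), ("cell_03", 13)}

matches = [uid for uid, trace in traces.items() if known_points <= trace]
print(matches)  # ['id_001'] -- two points already single out one trace
```

And whoever holds the full stream (telecoms, operating systems, applications) needs even less outside information than this.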
Then there is facial recognition through cameras. Countries that have no problem with generalized surveillance (such as China or Russia) have openly admitted to using it, for the sake of public health: to spot potential carriers or patients within crowds, and other such dramatic measures. Let no one think that Western states have been left behind: such technologies are already officially used in Britain, for the “fight against crime” as they say, and the “state of emergency” due to covid-19 offered a first-class opportunity for every national security apparatus to train (and to accumulate as much as possible…) amid widespread panic…

Admittedly, from a formal standpoint the GDPR provides an escape valve: it allows European states to pass emergency legislation of limited duration for the collection and processing of personal data for purposes of protecting public health; but with certain minimum guarantees and, certainly, clear and public oversight mechanisms… Obviously most states avoided even this; why give additional false promises of “privacy protection” when they can act undercover?

The reactions from below, focusing mainly on tracking (contact tracing), were slow to emerge. The experts of states and corporations, together with their paid or volunteer lackeys, were quick to throw the false dilemma “freedom or health” in everyone’s face; they didn’t even bother to pose the adjacent one, “privacy or health,” evidently considering it unnecessary. They shut mouths, not everywhere and not with the same success, but they nevertheless created that mass (social) confusion which, combined with fear, produced the necessary fog for appropriators of health data of every kind, and not only of health data. These are the same people who sell the digitization of healthcare packaged with professed concern for the protection of everyone’s personal data.

In reality, it is doubtful whether the richest and/or most powerful appropriators can be prevented from getting their hands even on so-called metadata. What would have real practical significance and consequences would be if the mass of existing users realized that the only safe course is to withdraw, as far as possible, from all these digital rogues that have become a daily habit. Before their use becomes mandatory, imposed by (strict) laws…

temporary epilogue

Through a historical/genealogical presentation (necessarily brief), we have tried to answer what Walter Benjamin rightly condemned as the ahistorical shock of “how is it possible for this to happen?”… If various (business) guardians of health had been regularly issuing announcements in recent years (for a decade and a half now) about the coming of a “catastrophic pandemic,” it was because they were preparing to exploit it in every way and against everyone! Covid-19 was not the killer that would justify what was done and what will be done; no problem! It was dressed up as if it were!!! And so it was served.
Elsewhere (relentless machine) we argued that Western states copied certain totalitarian methods of population control used by the Chinese state (the universal restrictions on movement and all the related measures) in order to boost the technological restructuring of their capitalisms and their societies, which clearly lag “behind” those of Asia (in more detail in The Shape of Things to Come, in this issue). Perhaps so as to stop losing ground in global intra-capitalist competition. It is “logical” that such an undertaking calls for unlimited risk-taking, and for unlimited confusion and fear to be spread.
What is not logical at all is that, in all the years this process has been unfolding, no social and political structures with antagonistic aims were created from below; structures that would, on the one hand, be informed in time and able to explain, and, on the other, reveal the extremely dangerous character of this creative destruction, imposed at so many levels by the terror campaign around covid-19.

To say that at Cyborg we have put in far more than a proportionate effort to prepare for both what came and what lies ahead would be accurate, but not sufficient. Unfortunately the world, even the part of it that poses as hostile to the system, lives in an ahistorical, anti-historical, historicist manner. It obsesses over every triviality, and stands stunned when a mega-scale event emerges that was never hidden from it, but that it never considered itself obliged to confront.

The tradition of the oppressed teaches us that the “state of emergency” in which we live is not the exception but the rule. We must arrive at a conception of history that corresponds to this insight. Then we will clearly see that our task is to bring about a real state of emergency, and this will improve our position in the struggle against fascism. One reason fascism has a chance is that, in the name of progress, its opponents treat it as a historical norm. The amazement that the things we are experiencing are “still” possible in the twentieth century is not philosophical. It is not the beginning of knowledge, unless it is the knowledge that the conception of history from which it springs is untenable.

Walter Benjamin, Theses on the Philosophy of History

Ziggy Stardust

  1. Available at https://www.datapine.com/blog/big-data-examples-in-healthcare/ ↩︎
  2. The refinement of this simple device came in 1840, by George Camman in New York. The stethoscope he created remained in use, unchanged, for over a century, until the early 1960s… ↩︎
  3. Even as regards the contagiousness of certain diseases, social attitudes and medical advice used to be completely different from what became mainstream, triumphantly, with covid-19. For childhood illnesses for which there was no vaccine (: artificial herd immunity), families would push their children to “catch” them, so as to acquire immunity from an early age (: natural herd immunity)… ↩︎
  4. Beyond Health Care: From Public Health Policy to Healthy Public Policy, Trevor Hancock, from the Canadian Journal of Public Health, vol. 76, May/June 1985. ↩︎
  5. The smoking ban is the most well-known and widespread “outlet” of these state designs… ↩︎
  6. First limited publication in the magazine g (flip side insert), January 2006. Recent republication Sarajevo.pdf no 147a, March 15, 2020. ↩︎
  7. April 2020. ↩︎
  8. With the concept of the Imaginary institution of society, by C. Castoriadis. ↩︎
  9. Sarajevo.pdf 148a, April 2020. ↩︎
  10. Big Data to see explosive growth, challenging healthcare organizations. ↩︎
  11. More than these pages here:
    on wearables: in Cyborg 2 – February 2015 (fitter, happier, more productive – on self-quantification), Cyborg 7 – October 2016 (body, pregnancy, recording, representation: the change of the normal), Cyborg 10 – October 2017 (wearables, wearable, subcutaneous: the body as motherboard).
    on biotechnologies: a large part of Cyborg 5 – February 2016, Cyborg 7 – October 2016 (genetics / cybernetics: a white wedding), Cyborg 8 – February 2017 (genetic tailoring: the big loophole), Cyborg 11 – February 2018 (the mutations died; long live the mutations!!!), Cyborg 13 – October 2018 (crispr/cas9: the “scissors of god” are unleashed!), Cyborg 14 – February 2019 (the worker and the queen: epigenetics on the determinism of dna).
    on medicine: Cyborg 9 – June 2017 (precision medicine: the personalization of medicine), Cyborg 12 – June 2018 (Watson for president!).
    on data: Cyborg 11 – February 2018 (data: the new “raw material”), Cyborg 13 – October 2018 (what is data and who do they belong to?), Cyborg 15 – June 2019 (Citius, Altius, Fortius… every day, all day long… – health as continuous upgrading), Cyborg 16 – October 2019 (big data: surveillance and shaping of behaviors in the 4th industrial revolution). ↩︎
  12. Cyborg 12 – June 2018: General Data Protection Regulation: a digital habeas corpus? ↩︎