Being a writer makes more sense than calling oneself a content creator

Having been a writer for most of my working life, I have striven to understand the requirements and interests of readers. Understanding the audience shapes the type of writing I use and the topics I cover. Instructional material is one example – guiding users of Microsoft Word through copying and pasting text is a straightforward reader requirement.

Over the years, however, I have seen the label content creator become more common. I must confess, I still do not know what it means. Yes, I am well aware that writing involves more than just text. Images, audiovisual presentations, podcasts, audio, webinars – all these are now part and parcel of good writing.

The term content creator sounds like a corporate buzzword, intended to obscure rather than inform. Keeping things concise and easy to understand is the hallmark of an effective writer.

The days of taking your manuscript to a publisher, and sweating over the outcome of its reception, are long gone. The writer must now have a say in the design, the book cover, the graphics, the data analytics of a potential audience – these are new tasks. However, you are not a content creator; you are still a writer.

The world of Instagram, TikTok and Snapchat has changed the ways writers interact with the reading public. That is great – having direct access to a reader audience is wonderful. However, do not be dragged into the cesspit of acting as a social media influencer. There is a profusion of influencers – I say misinformation peddlers – who are damaging the reputation of writers by circulating all kinds of conspiracy fictions, hyper-weaponised memes and content that belongs in the sewer.

Writers do appreciate larger audiences, but they are mainly aiming for credibility. The latter is difficult to find on social media.

Language emptied of any meaning

We are all aware of the ubiquity of corporate buzzwords. A particular type of jargon, corporate buzzwords are a way of making adults seem like grownups, in the words of Olga Khazan. Terms such as ‘innovation’, ‘disruption’ and ‘synergies’ all carry meanings we are meant to discern. While every profession has its jargon, corporate buzzwords are often euphemisms, deployed to disguise an ugly reality beneath them.

Mass layoffs are described as ‘pivots’; the closure of departments is described as a company ‘synergising its ongoing trajectories into operational capabilities’ – you get the idea. The rise of the tech giants – Apple, Google, Microsoft and the Silicon Valley behemoths – has produced its own IT-related buzzwords. These have seeped into other industries, including sales and marketing, where the term content creator originated.

Job descriptions are becoming ever more vague, yet grandiose. Content creator is one of those vague, yet grand-sounding jobs. The position of a salesperson can be recycled as a customer happiness enhancement officer. Corporate buzzwords make harsh realities sound softer. Lora Kelley, writing in The Atlantic, describes corporate buzzwords as euphemistic bubble wrap.

Language that is hollowed out, denuded of all meaning – corporate buzzwords then become their own reality. The market has earned a place as hallowed ground, where all participants – sellers, buyers, workers, owners, corporations, small business entrepreneurs – are equal actors in an economic utopia. Obscured beneath the all-encompassing term ‘market’, or indeed ‘free market’, is a network of social relations determining the power relationships in that market.

The word privatisation entered the public lexicon in the 1970s and 80s, a euphemism for the transfer of state assets into private hands. The pursuit of private corporate profit was disguised with positive-sounding buzzwords – ‘efficiency’ and ‘reform’. Even the word ‘austerity’ has a positive connotation to it. Who but the cantankerous would be against spending within our means and cutting back profligate spending? Yet privatisation has ended up as one of the worst money-wasting scams of our times.

While most working class people can see that privatisation is a trick, it is still relentlessly pushed by political elites. The euphemistic bubble wrap is part of the package, softening public opinion for the final blow. No, I am not suggesting that people are fools – far from it. But if the corporate buzzwords take over the conversation surrounding socioeconomic issues, then people will be convinced to acquiesce to foolish things.

Monetisation is another of those corporate buzzwords that has taken off in the IT era. Blogging and content creation for the purpose of turning a buck is all well and good. But what happens when large tech companies, which have collected all our data, use that information for monetisation? Big tech is quite aware of a mental health crisis gripping the population. Should they use that to monetise our data? They are doing that already. Tim Cook, the CEO of Apple, stated that his corporation’s biggest contribution to humanity will be in the area of health.

The rise of the content creator is, in its own way, confirmation of the power of words. In the days before the internet and social media, it was novels, literature and books that disseminated the words of the author. If words were not powerful or impactful, then institutions such as the CIA would not have spent billions of dollars financing writers and publications to propagate its business-friendly message to the world.

Please be a content creator if that is your passion. We all have to adapt to changing times. But do not lose the value of insights from the past. AI may be a necessary accompaniment for copywriters, editors and content creators, but it can never be a substitute for human creativity and credibility.

Before Copernicus and Newton, there was the Islamic civilisation’s pioneering development of science

The story of science we are taught in Anglophone nations usually traces the origins of the scientific method to Europe. European scientists – Bacon, Kepler, Galileo and so on – are regarded as starting a scientific revolution. Along come Copernicus, Newton and Descartes (the latter being a philosopher of science), and here we have the commencement of a global scientific revolution.

We like to place ourselves within a tradition tracing back to Ancient Greece and Rome, something we label western civilisation. The Anglocentric settler nations regard themselves as the inheritors and developers of the Greco-Roman school of science and philosophy.

There was a scientific revolution in Europe as a component part of the cultural movement known as the Renaissance, but it was definitely not global. That story describes how European Christendom discovered the scientific method. The rest of the world, in particular the Islamic empire, was already light years ahead of Europe when it came to the sciences, and the philosophy underpinning the scientific method.

We have all heard of Nicolaus Copernicus and Isaac Newton, but how many of us have heard of Ibn al-Haytham (c. 965 – c. 1040)? He lived and worked as a scientist centuries before the heavyweights mentioned above. We will get to al-Haytham in a minute, but first, some observations are in order.

It is a great shame, and an indication of our own prejudices, that the ubiquitous stereotype of Islam is that of terrorism. A Google search easily turns up millions of images of bearded men brandishing guns, yelling purportedly Islamic slogans, demanding the death and destruction of infidel societies. The immediate association of Islam with violence, savagery, beheadings and other unspeakable atrocities serves to promote the stereotype of the irrational Muslim, resistant to rational thinking and impervious to scientific enquiry.

We in the English-speaking world believe that it was only in Ancient Greece, or the Renaissance, that profound scientific and philosophical developments occurred. Not so – while Europe was mired in the relative ignorance of the Dark Ages, the Islamic world was developing science, philosophy and technology far ahead of anything Europe could offer. In fact, Europeans owe a great debt of gratitude to the Islamic world for its innovative discoveries and philosophical insights – scientific outpourings which prefigure the debates of contemporary times.

Remember the Islamic polymath mentioned above, Ibn al-Haytham? His name is sometimes Latinised as Alhazen. He was the first experimental scientist, and the father of the science of optics – no disrespect to Isaac Newton. Al-Haytham, born in Basra, Iraq, criticised the theories of Aristotle, questioning whether the planets really moved in perfect circles.

He birthed the scientific method, using experimental evidence to verify (or falsify) hypotheses. He overturned centuries of received wisdom regarding the nature of light and the eye. The ancient Greeks had held that the eye emits rays which bounce off objects, thus forming the basis of vision. Not so, said al-Haytham. Overturning the emission theory, as the prevailing view was known, he proposed that light rays enter the eye, which acts like a pinhole camera.

As Jim Al-Khalili, the physicist and historian of science, points out, al-Haytham did not invent the pinhole camera. The latter, known for centuries as the camera obscura, had been developed by numerous civilisations, such as the ancient Chinese. However, al-Haytham was the first to elaborate the mathematics underlying the operation of the pinhole camera, and to recognise that the eye possessed a similar structure.

Al-Haytham was the first to connect visual perception with subjective experience. He regarded vision not only as a function of the eye, but as an experience of the brain and mind. He was not alone among the Islamic scholars in exploring what we today would call the mind-body problem. Centuries before René Descartes and Cartesian dualism, the Muslim scholar Avicenna (Ibn Sina, c. 980 – June 1037) was conducting thought experiments about perception, individual experience and the nature of mind.

We have all heard the seemingly intelligent observation – Islam did not have an Enlightenment. The cynical implication of this claim is insidious: that Muslims are far behind us in the enlightened English-speaking world, and so require an education in the scientific method. It is true that the Islamic world never had an Enlightenment phase – because it did not need one.

As Sean Ledwith points out, the Abbasid caliphate, which covers the time period of the Golden Age of Islamic science, is distinguished by its deliberate cultivation of a state-sponsored Enlightenment. The authorities, having conquered their Persian and Byzantine neighbours, absorbed the cultural achievements of those societies, and proceeded to develop their own scientific and philosophical innovations. Science and the cultivation of knowledge were actively promoted during the Islamic golden age.

The Islamic world did not require a specific Reformation or Enlightenment period to push back against theological mysticism, as was required in European Christendom. It translated the Ancient Greek and Roman texts, but also went on to blaze its own trail of cultural and scientific flourishing.

It is relevant to note here that one thousand years before Charles Darwin, Islamic scholars such as the zoologist Al-Jahiz (c. 776 – 868/69) were discussing evolution. No, they never used the now-familiar expression natural selection. They were discussing competition in the natural world for finite resources, adaptations of characteristics to environmental conditions, and branching speciation. These lines of enquiry are precisely those explored in evolutionary biology today.

Am I suggesting that Islam is an inherently superior system to all other religions? No, I am not. I am suggesting that we re-examine our own anti-immigrant prejudices, especially in the light of the resurgence of far right parties in Europe. Attacking the allegedly ‘barbaric’ outsider may make us feel good about ourselves, but it only serves to inhibit cross-cultural cooperation and solidarity.

D-Day commemoration, the 1924 Immigration Act, and the long-lasting legacies of eugenics

A number of news items, seemingly unrelated, come together to form a coherent subject. Indigenous Australian news is the starting point for us today, and this will lead us into an examination of racism, eugenics and World War 2. Let’s begin…

University of Melbourne truth-telling project

This year, three scholars from the University of Melbourne released a documentary report regarding the treatment – indeed, mistreatment – of indigenous Australians by the University of Melbourne’s governing forebears. The report, called Dhoombak Goobgoowana – which translates as truth-telling – is a disturbing account of the dark underbelly of racism and eugenics underpinning the institution of the university.

In the words of the authors of the project, the university throughout its history honoured racists, eugenicists, Nazi apologists, grave robbers and body snatchers. One professor of veterinary science, Daniel Murnane, not only participated in a massacre of indigenous people, but also advocated restricting the ‘lesser races’ to avoid polluting the supposedly superior white-Anglo stock. Until this year, the university had a scholarship and a building named after him.

The list goes on – numerous Australian academics, members of eugenics societies, advocated the forced sterilisation of ‘undesirables’, meaning those with developmental delays, to reduce the numbers of ‘useless eaters’. Celebrated anthropologists, doctors, scientists and others whose names adorn buildings at the university were proponents of a ‘master race’ perspective, proposing the racial stratification of society and restricting the breeding (and immigration) of nonwhites deemed to be unfit.

Legacy of eugenics

The documentation of the University of Melbourne’s truth-telling report highlights just how ubiquitous the philosophy of eugenics was in academia. However, it is not only in the hallowed halls of university departments that eugenics made a lasting impact – an impact that resonates to this day.

News item: This year is the one hundredth anniversary of the US Immigration Act. Named the Johnson-Reed Act after the main politicians pushing for its approval, the Immigration Act excluded nonwhite ethnicities, including European Jews, from entering the United States – among them the southern and eastern Europeans who would later attempt to flee Nazism in Europe. The act was a practical application of eugenicist philosophy.

It is difficult to overstate just how restrictive the Immigration Act was. The New York Times, commenting on the law signed by then President Coolidge, declared that America’s era as a melting pot had ended. Immigrants from nations deemed inferior – eastern Europeans, Jews, Arabs and other nonwhites – were subject to strict quotas. The alleged purity of the white Anglo stock had to be preserved, so immigrants from northern and western Europe were prioritised.

The eugenicist underpinnings of the 1924 act were well established in the decades prior to its enactment. There were already laws on the statute books prohibiting Asian immigration, legislation passed with the support of American labour leaders. Being of ‘pure’ blood was of incredible importance to American legislators, economists, scientists and journalists. The US enacted numerous ‘one drop’ laws, which classified a person as nonwhite if they had even one nonwhite ancestor.

Oliver Wendell Holmes, Supreme Court justice and advocate of liberal causes, spoke out in favour of eugenics, and wrote the majority opinion in Buck v. Bell (1927), opening the way for thousands of forced sterilisations of those deemed ‘feeble-minded’.

It is interesting to note that the 1924 Immigration Act, and American eugenics laws and programmes, were an inspiration for Hitler and the Nazi party. German scholars, looking for a successful example of a racially stratified society, examined the laws and practices of the United States. Eugenics was a mainstream ideology, influencing the passage of racial laws and antisemitic legislation in the US.

We like to think of eugenics as a relic of a bygone era, consigned to the dustbin after the defeat of Nazi Germany in 1945. In many ways that is true – the immigration act was finally repealed in 1965, after decades of struggle against it by antiracist activists and legislators. The defeat of the Nazi white supremacist regime put a seemingly definitive end to the antiquated notions of breeding a ‘superior’ stock of humans through restricting immigration and forced sterilisation.

It would be wrong to let knowledge of eugenics fall into an amnesiac gap. Ignoring the strong and intimate connections between American and German eugenicists in the prewar years is a serious omission, leading to widespread ignorance regarding the crucial and very real role of white supremacy in shaping domestic legislation.

News item: this month witnessed the 80th anniversary of the D-Day Allied landings. The veterans of that campaign confronted a monumentally powerful German military, implementing white supremacy in Nazi-occupied Europe. When African American soldiers returned home, they found a society unwilling to accept them as equals.

Eugenics once dominated academic thinking and legislative policies on population and immigration. Today, in Europe and across the Atlantic, the Great Replacement conspiracy theory is becoming normalised and mainstreamed, influencing increasing numbers of political parties and policy makers. This trope asserts that white Anglo majority societies are under threat of being ‘replaced’ by mass immigration. Allegedly orchestrated by liberal elites – usually meant to indicate Jewish elites – this ideology motivates the violence of the far right.

In Hungary, under Prime Minister Viktor Orbán, the Great Replacement has become an official ideology. Acting as a lightning rod for the ultranationalist Right, Orbán has supplied the ideological cement to solidify a European, and global, ultrarightist political force.

Both eugenics and the Great Replacement regard nonwhite immigration as an existential threat, unassimilable into the Anglo-majority national culture; the latter reinvigorates white supremacy with false ideas and demographic paranoia.

Did the D-Day veterans fight white supremacy eighty years ago, only to see the resurgence of that ideology in a new, mutated form today? Portrayals of immigration as a menace to Western society have a long pedigree. Elevating them to mainstream doctrine has real-world consequences.

Trinity downwinders, the longstanding links between astrophysics and the military, and the shadow of the mushroom cloud

We are all familiar with the general facts regarding the US atomic bombings of Hiroshima and Nagasaki. However, the residents of those cities were not the first victims of atomic warfare. With all due respect to the Japanese who died in those attacks, the first victims of the nuclear age were Americans. Specifically, those who were downwind of the first atomic explosion – at Alamogordo, in the New Mexico desert, in July 1945.

On July 16, 1945, in the New Mexico hinterland, the first atomic test was conducted by the scientists and military officials of the Manhattan Project. Codenamed Trinity, the explosion was more powerful than the physicists anticipated, and the fallout zone was more extensive than they had calculated. The area surrounding the explosion was sparsely populated, and US military authorities would later claim that the area was uninhabited.

Barbara Kent was 13 years old when the explosion occurred. She and her classmates – they were at a summer dance camp in Ruidoso, New Mexico – were thrown out of their bunk beds by the force of the blast. The detonation at Alamogordo was so bright it was seen from hundreds of miles away. In another small town, Carrizozo, residents ran into the local church, believing the Rapture and the end times were upon them.

Kent and her schoolmates ran outside, and began playing in what they thought was snow. Snow in July? It was strange, but teenagers intent on playing cannot be stopped. The white dust which enveloped their location was radioactive fallout from the explosion. The kids rubbed it on their skin and their faces – the dust contaminated their drinking and cleaning water. Farm animals consumed it with the grass and crops.

The mushroom cloud – the iconic symbol of nuclear power – rose 50,000 to 75,000 feet into the air from the Alamogordo blast, higher than anticipated, and the extent of the fallout from the explosion is only now slowly being uncovered. The US military failed to evacuate people from the immediate vicinity of the blast, and the Trinity downwinders, as they are known, have been fighting for recognition, an apology and compensation.

Trinity downwinders were told, in 1945, that the explosion they witnessed was from an ammunition dumping ground. It was only after the atomic bombings of Hiroshima and Nagasaki that the downwinders began to understand the magnitude of what had befallen them.

The families of the Trinity downwinders have experienced generations of cancers – stomach, thyroid, pancreatic, among others. Tina Cordova, a resident of Tularosa, NM, has been fighting for the Trinity downwinders. Her home is only 34 miles from the original atomic test site. Her family, from her great-grandparents onwards, has been afflicted with various types of cancers. In the three months immediately after the Trinity test, infant deaths from cancer in NM jumped by 56 percent.

Cordova formed the Tularosa Basin Downwinders Consortium in 2005, and has painstakingly accumulated medical evidence and statements from downwinder families about the cancers they have suffered, along with the official neglect they have encountered. It is true that in 1990, the US Congress passed the Radiation Exposure Compensation Act (RECA), which provided limited and partial compensation for those affected by radiation exposure from US military atomic tests.

RECA excluded the Trinity downwinders, but provided compensation to those affected by the hundreds of post-1945 atomic tests throughout the United States. From 1945 to 1992, the US conducted 1,054 atomic tests, including tests in atmospheric and underwater environments. Each one took its ecological and financial toll.

The other group of Americans excluded by RECA is the uranium miners, most of whom are from the Navajo indigenous nation. The tests performed in south-west Nevada, for instance, required the laborious exertions of labourers from the indigenous and Hispanic communities.

It is indicative of the priorities of the US Republican Party when they complain about the financial burden of RECA. In what way? The $2.5 billion paid out to radiation victims over the last approximately 30 years pales into insignificance compared to the hundreds of billions spent to upgrade and maintain the stockpile of nuclear weapons. Somehow, financial burdens are absent when considering the maintenance of nuclear weapons.

When Soviet dissident and nuclear physicist Andrei Sakharov (1921 – 1989) denounced the Soviet nuclear programme, and upheld the basic principles of human rights and civil liberties, he was hailed as a hero in the West. Sakharov was granted easy access to the Reagan administration, and received overwhelmingly positive coverage in the corporate media.

That is all well and good, except for the glaring hypocrisy at the heart of the Sakharov human rights project. The Reagan administration was itself a ferocious advocate of nuclear weapons, massively increasing spending on that military technology. Astrophysics and the military also have a close, intertwined relationship. Authors Neil deGrasse Tyson and Avis Lang describe that relationship as a double-hinged door in their 2018 book Accessory to War.

The principles of high energy physics are partly based upon the thermonuclear fusion that occurs in cosmic environments. Astrophysicists who study the collisions and emergence of particles at such high energy states are fully aware of the military implications of their work, sharing laboratory facilities with weapons researchers and sometimes working within shouting distance of them.

Our understanding of cosmic interactions, and the particles that pop in and out of existence, is inextricably linked with government programmes to support military research. Sakharov was well aware of this symbiotic relationship. His failure to even acknowledge this issue blasts a huge hole in his credibility as a universal human rights peacemaker.

How about we begin to acknowledge the ecological and medical harm caused by US nuclear testing? As Cordova explains in her documentary film, First We Bombed New Mexico, thousands of Americans were lied to about the Trinity test, exposed to dangerously high levels of radiation, and then neglected for generations. A historic injustice needs to be corrected.

Cold War defector stories, Hong Kongers in China, and morality tales

Defector stories – accounts of high-level personnel who escaped from the former USSR and sought asylum in the West – made for gripping propaganda. Ideologically driven people, attracted by the magnetic appeal of freedom and capitalist consumerism, risked their lives to escape the dreary life of Communist tyranny. It sounds like a great morality play: the triumph of the indefatigable human will to freedom over totalitarian conformity.

How true is this picture? While there is a grain of truth in this fable, the stories of defectors, and the motivations of US and British policies in utilising them, paint a more complicated picture. This subject should be examined closely, because it has relevance for our times. US imperial policy towards regimes it deems adversarial remains essentially unchanged – mobilise extreme ultranationalist groups to advocate for regime change, and create a domestic culture sympathetic to that objective.

It is no secret now that defectors from the Eastern bloc were viewed by US and British intelligence agencies as more than heroes to be cultivated. Numerous scholars, such as Benjamin Tromly, have established beyond a shadow of a doubt that US intelligence agencies encouraged and publicised the plight of defectors, attempting to create a consensus for capitalist economic and cultural policies at home, while demonstrating to the Moscow leadership the ‘superiority’ of capitalist consumerism.

Defectors were also used as political and intelligence assets, valuable sources of information about the inner workings of the Soviet security and military apparatus. Russian emigres, for instance, were actively recruited and organised by the CIA and its predecessor, the OSS, to serve as anticommunist forces in covert actions against the Eastern bloc.

Consider the high-profile defection, in 1948, of two Soviet combat pilots. Flying their plane into the American zone of occupation in Austria, Peter Pirogov and Anatoli Borzov were transferred to the United States, where they were celebrated as heroes. Welcomed by the American intelligence establishment, they received adulation in the media. Was this not clear evidence that American consumerism and individual liberty were superior to Soviet drudgery?

The defectors, who had undergone miserable experiences in the Soviet Union, received surreptitious support from the CIA. Pirogov, only a few months after his escape, wrote a bestselling book in English, and settled into a house paid for by the CIA. He was immediately cultivated for intelligence information about the inner workings of the Soviet military system.

Defectors’ stories provided a feeding frenzy for the political sharks of the London-Washington axis. Pirogov, who retreated into a quiet life with his new family, was dropped from the CIA payroll in the 1950s for failing to actively participate in an anticommunist emigre organisation set up by the intelligence community. He had gone off-script from the narrative of the freedom-loving defector.

Borzov strayed even further from the scripted role of the ideologically zealous defector – he returned to the USSR six months after his defection, dissatisfied with American consumerism and unimpressed by American supermarkets.

The 1950s was a period of intense CIA activity among anticommunist Russian and Eastern European communities. Mobilising defectors was one plank of a multifaceted strategy of using ultranationalist and ex-Nazi collaborators from Eastern European nations as private armies in the Cold War.

It is certainly no crime to seek asylum. During the Cold War, the United Nations adopted the Convention Relating to the Status of Refugees (1951) to elaborate the specific rights of asylum seekers. This was a time when refugees from the Eastern European nations made up the bulk of those fleeing Communist countries. Being receptive to those fleeing Communism was in line with human rights doctrine; it was also a cynical exercise in encouraging illegal immigration, an activity demonised by the media in our times.

Non-ideological reasons for defecting from the Eastern bloc nations were routinely minimised. There were defectors escaping imminent arrest or avoiding criminal charges; they were given lenient treatment because they came from the ‘captive nations’. The needs of political propaganda superseded considerations of law and order. American-backed dictatorships which produced an outflow of refugees somehow escaped classification as ‘captive nations’ in the calculations of the Washington Beltway foreign policy experts.

How does this relate to contemporary times?

Investigative journalist Kit Klarenberg drew attention to an interesting story published by Bloomberg. Remember the protests which gripped Hong Kong in 2019, in the wake of the earlier Umbrella movement? Thousands of activists were on the streets protesting laws that would bring the enclave closer to the rules and regulations of Beijing. It appears that the participants in that failed adventure are rethinking their positions.

In 2024, increasing numbers of Hong Kongers, including Umbrella activists, are choosing to work, live and study in Shenzhen, China. The record economic growth achieved under Chinese president Xi Jinping, including the construction of high speed rail, has convinced young Hong Kongers to avail themselves of the economic opportunities in China.

Shopping malls with loads of much-coveted consumer goods, entertainment, booming cultural parks, and all the modern conveniences have appealed to the former democracy activists. They have largely reconciled themselves to the rule of the Communist Party of China (CPC). It is a sign of a regime’s effectiveness when it can persuade its erstwhile opponents to accommodate and accept new realities. Providing a lower cost of living, cheap housing and work and educational opportunities has certainly placed the Umbrella movement’s NGO-style astroturf revolution into perspective.

Am I suggesting that we all migrate to China this instant? No, I am not. Am I advocating adopting Xi Jinping Thought as an official doctrine? No, I am not. I am suggesting that defector stories made for exceptional morality tales to soothe our collective conscience. They serve a particular propaganda purpose, disguising the cynical political motivations of the US and British authorities.