Ray Kurzweil, a computer scientist and entrepreneur, published a prophetic text in 2005 about what he called "the singularity". Kurzweil predicted a moment in the near future when superintelligent technology would surpass all conceivable human abilities, incorporate humanity into its operations, and extend its mastery over the entire universe. The title of his book, "The Singularity Is Near", hinted at the inevitability of this event, and he even gave a date: 2045.
This year, almost halfway between 2005 and 2045, Kurzweil has released an update to his forecast. The title of the book is now somewhat less ominous: "The Singularity Is Nearer".
To understand Kurzweil and the techno-prophets who followed his lead, it is worth considering the nature of prophecy. Even in its ancient and religious forms, the purpose of prophecy was never to predict the future. It was always to influence the present - to convince people to live their lives differently today, preparing for a tomorrow that might be purely hypothetical.
In this context, it is interesting to ask why so much discourse about new technologies has become so apocalyptic. What can such discourse achieve? Does predicting the impending eclipse of humanity give anyone a reason to act now or change any aspect of their life? Or is the projection of inevitability more likely to convince people that nothing they do can have consequences?
There is no doubt that proclamations of the end times hold a dark attraction - their ubiquity throughout human history suggests as much. But there are more productive and balanced, if less sensational, ways of thinking and talking.
Marcus Smith
Marcus Smith's new book, "Techno: Humans and Technology", takes one of the more moderate approaches to this topic.
Of course, like everyone else in this genre, Smith quickly suggests that the present moment is exceptional and unique. The first sentence of his book reads: "We are living in the midst of a technological revolution." References to the concept of "revolution" are scattered throughout the text.
But the central argument of Techno is that we must regulate technology. More importantly, Smith argues that we can. As an associate professor of law at Charles Sturt University, he suggests that the law has more than enough resources to bring machines under human control.
In fact, according to Smith, Australia is uniquely positioned to lead the world in technological regulation precisely because it is not home to the large tech corporations that dominate American and European society. This explains why Australia, in Smith's words, "punches above its weight" in this area.
Threat to democracy
Smith divides his book into three tightly structured parts that examine the relationship of technology to government, the individual, and society.
In the first part, he addresses major political issues such as human-caused climate change, the application of AI to every aspect of public life, and social credit systems enabled by digital surveillance and big data.
Perhaps the most interesting argument here concerns the similarity between the notorious social credit system operated by the Chinese government and the social credit systems being developed by commercial actors.
It is easy to criticize a government that uses a range of technological methods to observe, rate, and regulate the behavior of its citizens. But don't banks collect data and make judgments about potential clients all the time - often with deeply discriminatory results? And don't platforms like eBay, Uber, and Airbnb use reputation credit scores as part of their business model?
For Smith, the question is not whether social credit systems will exist. It is almost inevitable that they will. He calls on us to think long and hard about how we will regulate such systems and ensure that they are not allowed to override what he considers the "core values" of liberal democracy. Among these, Smith includes "freedom of speech, movement, and assembly", and "the rule of law, separation of powers, freedom of the press, and the free market".
The second part of Techno focuses on the individual and the threat that new technologies pose to privacy rights. The main concern here is the vast amount of data collected about each of us every time we connect to the internet - which means, for most of us, more or less all the time.
As Smith points out, while this is obviously a global phenomenon, Australia has the dubious honor of leading the world's liberal democracies in legislation that allows government access to this data. Private tech companies in Australia are legally required to insert back doors into the encrypted activities of their clients. Law enforcement agencies have the power to take over accounts and disrupt these activities.
"The fact is that liberal-democratic governments operate in the same way as the authoritarian regimes they criticize," writes Smith:
They may claim that they do so only in specified and justified cases under a warrant, but once the technology becomes available, it is likely that some government agency will expand its powers, believing its actions justified by the benefits its work brings to the community.
The rise of big data thus inevitably "pushes liberal democracies toward a more authoritarian stance". But for Smith, the solution is clear:
If rights such as privacy and autonomy are to be maintained, then new regulations are key to managing these new issues of privacy, security, and politics.
Practical difficulties
The final part of Techno focuses on the relationship between technology and society, by which Smith mostly means the economy and markets.
He provides a useful overview of the blockchain technology used by cryptocurrencies, which promised to alleviate inequality and create growth by decentralizing exchange. Here again, Smith avoids a triumphalist or catastrophic approach. He poses reasonable questions about how governments might mediate such activity and keep it within the bounds of the rule of law.
He points to China and the European Union as two possible models. The first emphasizes the role of the state; the second attempts to create legislative conditions for digital markets. While both have serious limitations, some combination of the two is likely to prove most successful.
Only at the very end of the book does Smith's central concern - regulation - come fully to the fore. He has no difficulty expressing the significance of his project. "Regulating technology," he writes, "is probably the most important public policy issue facing humanity today."
Stating that we must regulate technology, however, is much simpler than explaining how we can do it.
Techno provides only the broadest outline of how. Smith suggests that it would require "engaging key stakeholders" (including technologists, corporations, and ethicists), "regulation by technology" (that is, using technological means to enforce laws on technological systems), and establishing a "special international agency" to coordinate regulatory processes.
But Smith does not really address the complexity of implementing any of these recommendations in practice. Moreover, it is possible that, despite his considerable ambition, his approach fails to encompass the true scale of the problem. As another Australian academic, Kate Crawford, recently argued, we cannot understand intelligent technologies simply as objects or tools - a computer, a platform, a program. This is because they do not exist independently of the intricate networks of relationships between people and the world.
These networks extend to the lithium mines that extract the minerals that make the technology work, the Amazon warehouses that ship components worldwide, and the digital sweatshops where people are paid a pittance to create the illusion of machine intelligence. All of this harms the environment, exacerbates inequalities, and facilitates the erosion of democratic governance.
If a regulatory project were to touch on phenomena of this kind, it would have to be much broader and more comprehensive than even Smith suggests. It could mean reconsidering, rather than simply trying to secure, some of what Smith calls our "core values". It could require asking, for example, whether our democracies have ever really been democratic, whether our societies have ever really aimed at equality, and whether we can continue to believe in the so-called "free market".
Asking such questions would certainly not mean an apocalypse, but it could mean a revolution.
Charles Barbour
Associate Professor, Philosophy, Western Sydney University
Published: 18 July 2024