The Great Multiplier

A while back, I made the observation that technology is, of itself, neither good nor evil, but that its basic function, whether intended or not, is as a multiplier. In warfare, technology multiplies the force wielded by an individual or a group of individuals; it multiplies the distances from which one can strike and the impact of that strike. In transportation, it multiplies how far and how fast one can travel. In communications, technology allows the transfer of more information almost instantly [at least on our planet] to more people.

But there’s one aspect of technology that’s seldom mentioned, and that’s its impact not only on the person or people affected by the technology, but also on the individual using it: often technology multiplies both the ability to do harm and the ability to avoid being caught or punished for that harm.

Donald Trump effectively mobilized somewhere between thirty thousand and a hundred thousand protesters (depending on where people were counted and by whom) on January 6th, from all across the nation, and more than a thousand actually stormed the Capitol, of whom more than 700 so far have been arrested and charged, with most being convicted or pleading guilty.

The problem with technology, in the case of Trump and others, is that while technology multiplies their abilities, it fails to multiply their accountability. In fact, in the case of Trump, his use of technology has made it difficult to enforce any accountability.

Con men and swindlers can commit thefts from places where they can’t be discovered, let alone prosecuted. Cyber-bullying among teenagers has become endemic, and definitely contributes to increases in teen suicide. Trolls can badger and harass people with little fear of either retaliation or repercussion.

Functioning societies fall into two categories – autocracies and those based on popular trust, generally but not exclusively democracies. But technology is increasingly being used in ways that isolate people and create greater mistrust of anyone who is different. Because isolation and mistrust undermine governments, one of the questions facing democracies is how to stop the increasing misuse of technology, because, skeptic that I am, I sincerely doubt that the people who are using technology to harm others are going to stop of their own free will. Trump and Putin certainly aren’t, nor are all the others.

10 thoughts on “The Great Multiplier”

  1. KevinJ says:

    Not to mention those who see a profit in technology. Would cyber scams and bullying be as prevalent if the internet had been modified early on for accountability? But no one was going to wait for that when there was $ to be made.

    (Misusing that accountability against, for example, dissidents is a separate issue. Just as important, of course.)

  2. Mayhem says:

    The lack of accountability isn’t a flaw, it’s a feature, innately designed into the systems.
    Much of the past 50 years’ worth of progress has been built around insulating the people who act from the consequences of their actions, for good or ill.

    Initially that was for protection – e.g., limited companies and directorial independence allow you to protect your own assets from bankruptcy or seizure, whistleblower protections allow the revealing of problems, and so on. However, over time those same protections were corrupted to protect those who have from having to contribute – avoiding taxes, public service, liability for their sins, and so forth.

    It’s only in the past decade that society has changed enough that those corrupted protections are being exposed, with things actually being done about them. And even then, what is being done is surface level.

    The free movement of capital is so innately corrosive that I’m not sure there is any easy fix.

    1. Shannon says:

      Not sure what you mean by ‘free movement of capital.’ Freedom is required for innovation. As you pointed out, the internet was designed with freedom, not accountability or safety, in mind. Our society has made a decision to prioritize freedom over accountability. That’s enshrined in the justice system, which imposes the burden of proof on the government with restrictions on its ability to investigate. I don’t view that as all bad.
      Technology multiplies the harm that can occur and may cause us to re-evaluate the acceptable balance between freedom and accountability. But life isn’t fair and the government has limited ability to change that. The more government attempts to regulate, the less freedom everyone has and the more potential for unintended consequences and stagnation.
      We can’t have maximum freedom while also having maximum safety or innovation.

      1. Mayhem says:

        Oh, I’m not disputing the freedom to innovate. But innovation rarely happens in large organisations; it’s the province of small, flexible companies. I’m deeply suspicious that the big tech players today have any interest in innovation beyond renting access to their various walled gardens and stomping out potential competition, just as Rockefeller and his fellows did in the 1900s.

        By free movement of capital I mean that in the modern world, money has no borders. Vast flows of money out of Russia, China, and the Middle East have had wildly destabilising effects all around the world – in housing, in politics, even reshaping how and where towns exist as mineral exploitation creates new roads and communities literally relocate to where the roads are now.

        No matter what regulations are created, money soon erodes the protections, or ownership registration moves offshore so the regulations no longer apply. Nested anonymous shell companies pretend to have a relationship in order to move the money from where it is generated to where it can be reused in ways that hide the source and the process from taxation but also from legislation and sanctions.

        I suspect Ada Palmer has the most plausible future, with the nation state vanishing in favour of global collective societies linked by common interests and goals. Certainly it’s better than one ruled by multinational corporations.

        1. Shannon says:

          Unfortunately, if you have enough money, regulations don’t matter. I view the law as applicable to 99% of people and resign myself to the remainder as not worth the resources to pursue.

  3. Tom says:

    “Our society has made a decision to prioritize freedom over accountability.”

    I am not sure that individuals think that way. They are certainly interested in the figment of ‘freedom’, even over ‘accountability’. In democracies, it seems that we allow minorities to outweigh the majority to the extent where people honestly believe that personal sovereignty is paramount (and everything in the Universe is responsible for itself alone).

    As the seventy-plus works of LEM point out, there is nothing that does not have something that it needs to balance with in order to function ‘properly’. That includes the internet and the rest of the “multiplier” media.

    Such a balance is difficult to arrange especially as it will mean a restriction in our ‘freedom’. The US and EU are already fighting about some of the factors involved. Russia, China, Iran, etc. have taken care of the problem their way: I hope we do better.

    1. Shannon says:

      I agree with you. We allow minorities to outweigh the majority. I do wonder whether the majority actually understands the implications of choosing that figment of freedom, or has a comprehensive enough understanding to grasp the implications of that choice.
      LEM’s books do an excellent job of exploring balance and trade-offs.

  4. R. Hamilton says:

    Technology also provokes an ever increasing regulatory intrusion, much of which could be unnecessary if laws were better written. Fraud is fraud, theft is theft, even breaking and entering is not entirely dissimilar whether done by ancient methods or in cyberspace (a researcher at Bell Labs published an article once on how to reverse-engineer master keys, so even the methods can be more similar than one might suppose). Libel laws could cover some willful and malicious misinformation…and occasionally provide the opportunity to refute the unjust labeling of something as misinformation.

    Right and wrong haven’t changed, although I suspect that the degree of skepticism of law enforcement and even judges is near or at an all-time high, which makes transparency (as much as possible) all the more urgent.

  5. Tom says:

    “… I do wonder whether the majority actually understands the implications of choosing that figment of freedom or has a comprehensive enough understanding to grasp the implications of that choice. LEM’s books do an excellent job of exploring balance and trade-offs.” -and- “But innovation rarely happens in large organisations, it’s the province of small flexible companies.”

    Increasing human populations reduce individuals’ unrestricted physical space and increase the complexity of human living: specifically, individual and group obligations and responsibilities change. In “The Dawn of Everything”, an anthropologist makes such points about the evolution of humans, basing them on ‘corvée’ or ‘service’ (something that I used to think of as basic to US culture). He brings up Ethno-Mathematics as the basis for societal rotation, serial replacement, and alternation (which I considered to be “cooperation”), and which resulted in the existence of a flexible system of group living. The mathematics created rules, but the choice of implementation and enforcement was up to the chosen governance and depended heavily on the balance of trade-offs.

    In large organizations, the increasing complexity requires increasing rules which restrict individual excess for the benefit of all; but these rules and regulations must also allow for freedom of innovation, or the group dies. In a small group, basic rules such as the biblical Ten Commandments are easily understood and can be adhered to. In large groups, a political policy such as the US Constitution has items that do not bear directly on everyone’s life and thus are more difficult to understand and apply to a US individual’s life.

    Elected governance has to constantly review the restraints/restrictions on the group for which it has responsibility. This is a very difficult job and more so when rules and regulations are altered rather than replaced: alteration/remodeling/repair does not in fact preserve what was and does not produce something more applicable to the present social group (usually resulting in imbalance).

    I have indicated before that it seems to me that AI can also be a reducer (not a divider), specifically as a tool in managing the increasing complexity of group behavior as humans evolve (similar to psychologists’ use of weighting factors in their assessments and tests).

    1. Mayhem says:

      Yes, larger groups really need simpler rules; the more complex the rules are, the quicker you’ll find an edge case which breaks them. It’s one solid argument for laws being made up of prescriptions mixed with precedents, because case law is more flexible than prescriptive law. That being said, there’s probably something to be said for rewriting laws to be simpler every so often, in order to reset the complexity and reduce the accumulated corruption.

      The problem with AI in the present day is that it has to be trained, and the training showcases the inherent assumptions and flaws of the trainers more than it does the intelligence of the AI.
      See, for example, Dall-E, which depicted males as manly doctors and firefighters while depicting females in sexualised versions. Or Microsoft’s chat bot, which rapidly became virulently racist. Or Amazon penalising job applicants who were female because the training data was 90% male.

      There is also an incredibly strong US bias, which bakes in all the US-centric issues of racial and societal inequity. These are different from, say, the core issues in India, or China, or the various European nations, all of which have their own wildly different profound issues.

      As our internet becomes increasingly fractured and walled off, training data will likely become worse – already the problematic datasets of today are being onsold and baked into the models of tomorrow, and training by scraping the cesspits that social media has become … well, we already see the problems there. And the use of simple keywords to tag huge volumes of pornographic material or memes has led to those keywords becoming inherently contaminated when searched within apparently benign datasets.
