Re “Could this be the end of the Internet as we know it? Questions swirl around how Supreme Court will rule on Internet law and social media” by Hiawatha Bray (Tech Lab, Business, Feb. 24): Some, including, it seems, Justice Clarence Thomas, regard algorithms as neutral programming tools. This is wrong. They are formalized statements of human-derived policies and priorities, without which code could not be written. Algorithms are designed, by humans, to interpret a company’s priorities and curate content for specific individuals. The companies make money when they target successfully.
Anger and violence draw users to certain platforms and increase corporate profits. A line has been crossed, and a choice made, when a company’s officers decide to promote these kinds of drivers to make money. As a tech professional, I can assure you that neutrality, if it ever existed, is not the job of today’s engagement algorithms.
Making money by aiding and abetting criminal behavior is nothing new. If I lent my vehicle to a local drug dealer, knowing they intended to break the law, a court would be unimpressed by the defense that I just wanted my logo to be seen by a larger number of people.
Social media does far more than lend out the truck, as it were. It provides the gasoline, the maps, the directions, a driver, and a getaway car as a single package, all while happily banking the funds it receives for these services.
Social media companies comply with laws regarding Nazi propaganda in Germany and bans on child pornography. They curate content and spend millions to identify who should view it, based entirely on their own corporate policies and goals.
Code does not write itself, and the law should be no shield when those choices bring harm to others.
Susan Franz
Uxbridge