the purpose of """ai"""
2024-09-18
we've all been on the """AI""" hype train for quite some time now, and gods has the ride been bumpy.
i slap """AI""" in scare quotes here, not because it's inaccurate or misleading, but because the term itself is a bit meaningless. the longer all this goes on, the less coherent any given definition of "AI" is. it's been used to refer to generative models, algorithms, the concept of an algorithm itself, capitalism-in-a-box, hypothetical god-like superintelligence, and literal magic. sometimes "artificial intelligence" doesn't even need to be artificial at all, as with Amazon's Just Walk Out stores, which just turned out to be around 1000 workers in India[1] manually verifying and/or fixing 70% of the sales.
with all this nonsense in play, it becomes easier to understand "AI" as a marketing term first and an actual descriptor of technology second. someone calling something "AI" is trying to sell that thing to you, and "AI", as a descriptor, will mean anything it needs to mean in order for you to buy. all the waffling about neural networks and machine learning becomes buzzword-laden technobabble, the magic sauce that makes everything work, or at least, seem to work until the cash changes hands and the contracts are signed, and you're stuck with the whatever-it-was-they-sold-you.
with that said, i will use "AI" for the remainder of this article to describe the bubble itself, and where applicable, i will use the more specific names of things when describing e.g. generative models.
it usually takes a while for tech industry hype bubbles to burst. for a while in 2018, the buzzwords du jour were "blockchain" and "cryptocurrency". the hype was inescapable -- multiple companies announced they were "pivoting to the blockchain" (and seeing massive increases in investment as a result[2]) and major industry titans like Amazon[3] and IBM[4] geared up to sell shovels in the gold rush. it took until about 2020 for everyone to realize that blockchains, as a technology, are painfully slow, cataclysmically wasteful in both electricity consumption and burnt GPUs, and utterly useless. the waning hype got a second wind in 2021 with NFTs, but i think the FTX collapse in late 2022[5] finally marked the end of the crypto craze, at least in the mainstream. (companies like IBM[6] have since quietly downplayed and/or nixed some of their involvement with blockchain, though they will still happily sell you on the benefits of enterprise blockchain, despite any reason to buy having evaporated already.[7])
also in 2022, we witnessed "AI" start to enter the tech hype memeplex. unlike "blockchain", "AI" seems to have a lot more wind in its sails, with companies like Google[8] and Microsoft[9] skipping the shovel-selling strategy and trying to dig for gold directly. (if anyone's selling shovels, it's NVIDIA, and they're making bank.[10]) there have been some hints that the music stopping is on the horizon -- OpenAI has basically been bleeding money since inception[11] and shows no signs of stopping, and it doesn't really look like anyone else is turning a profit either[12].
trying to predict when exactly a bubble bursts is a gambler's task -- something something, markets staying irrational longer than you can stay solvent. i personally feel that, much like with crypto, we aren't going to get some big revelation where everyone comes to their senses about the inherent limitations of chatbots or image generators. instead, it will be a slow winding down as companies and people realize the market has been operating on (and trying to inflate) synthetic demand. maybe we'll get some big AI scandal or fraud that deals a major blow to the whole enterprise -- OpenAI turning out to be massively inflating its sales figures or something -- but unfortunately i think the sins are going to be the regular, boring, unpunished ones; that is, the ones committed en masse at the expense of the unprivileged. like traumatizing offshore workers in Kenya for about $1.40 an hour[13], or scraping the internet for artists' and photographers' work to use for "research" (until the money started flowing in, at which point they dropped the "we're not for-profit" lie[14]).
"the purpose of a system is what it does". this phrase, originally from the cybernetician Stafford Beer, has been bouncing around a lot in my head recently. i think of it as a call to explicitly reject enduring narratives about systems -- capitalist, social, political -- in favor of studying their actual, material outputs. it's extremely useful for cutting through metaphors, empty promises, and bullshit stories about companies, states, and political superstructures.
under this lens, Facebook is not a social media site for you to "connect with friends and the world around you"; its purpose is to sell mass-surveillance-as-a-service[15][16] and sway elections for fascists[17]. Amazon's goal isn't to be "Earth's most customer-centric company", it's to provide facial recognition[18] to the FBI and police, whose goals aren't to "protect and serve", but to uphold white supremacy[19] and brutalize Black people. Google's goal isn't to "organize the world's information and make it universally accessible and useful", it's to supply tech[20] to facilitate genocide[21]. that is what they do, and therefore, that is their purpose.
what about "AI", then? what's its purpose?
the internet is now full of autogenerated bullshit. it had plenty of bullshit before, but the sheer volume of top-ten listicles, waffling non-speech sites, and completely fabricated Q&A "forums" is gumming up search results. search engine companies like Google and Microsoft seem totally okay with, if not downright giddy at the prospect of, allowing garbage to take over their information tools.[22] in the worst cases, they're happy to be the garbage-provider themselves, as long as it keeps the stock ticker going up.
some information domains, like product reviews and consumer advocacy, have been all but completely overtaken, stuffed to the brim with affiliate link spam and five-star reviews from John Q. Realperson.[23] the effect is a zero-information environment, where it becomes impossible to determine whether any feature of, or promise about, a given product holds up at all. this is an environment that erodes even the most basic forms of consumer evaluation and explicitly rewards the dishonest, underhanded companies with the resources to fabricate positive appraisals.
beyond just products and services, these tools are also easily deployed to try to sway public opinion on policy. we've had state-sponsored social media sockpuppet campaigns to spread disinformation[24] and try to swing elections before[25], but the cost-effectiveness of spinning up a couple hundred or thousand accounts to push conspiracies has dramatically shot up.
the end result is that AI exists to consolidate power. those with the capital to do so use AI to create ideological bubbles and media environments for financial and political gain. the hype cycle, in turn, exists to spur tech companies to develop the technologies that carry out this process. in a world wracked by the consequences of increasing inequality, AI must be resisted.
from The Information; alternate article at USA TODAY. Amazon attempted to deny these claims in a post on their blog, claiming that the workers weren't hired to actively watch live video of shoppers. however, since that's the only claim they deny, i consider it safe to assume the rest is, by omission, true. ↩︎
of particular note is the company that did nothing but add "blockchain" to its name and watched its stock surge nearly 400% (archive), in case the sheer irrationality of the market needed any more demonstration. ↩︎
Amazon Managed Blockchain managed to hook companies like AT&T and Nestlé on the hype. ↩︎
IBM Blockchain, somehow, managed to sucker the state of New York into using their blockchain tech to store... some kind of information for their COVID vaccination pass system. ↩︎
courtesy of Web3 Is Going Just Great. ↩︎
NVIDIA's own Q4 and fiscal-year report for 2024 shows them up 126% in full-year revenue from FY 2023, to the tune of $60 billion. ↩︎
from The Information, alternative article at Futurism. ↩︎
from the Wall Street Journal, (archive). ↩︎
from TechCrunch. technically, if they make excess profit above a fixed limit off an investment, anything over that limit goes to the parent nonprofit. naturally, they set their profit limit to 100x investments, just to make sure they never actually have to pay out. ↩︎
like the time it sold access to names of friends and private messages on Messenger to Netflix and Spotify (archive)... ↩︎
...or the time they did (allegedly-)illegal man-in-the-middle attacks on users of Snapchat, Amazon, and YouTube to steal private data... ↩︎
...namely, the time it illegally sold user data to Cambridge to do targeted advertising in service of the 2016 election of Donald Trump. ↩︎
i could find thousands of instances to cite here, but given police in the US started out as slave patrols, the point is a bit moot. ↩︎
Project Nimbus at Wikipedia. ↩︎
oftentimes making legitimate research tasks impossible, in case the disdain for academia wasn't obvious enough. ↩︎
it's old and not directly related to "AI", but see HouseFresh's analysis of how Google's SEO policies encourage product "review" spam and compare their examples to spam sites now. ↩︎
such as when the US ran an anti-vaccine conspiracy campaign in the Philippines. ↩︎
probably most notably in the Russian interference in the 2016 US election. ↩︎