🌀🗞 The FLUX Review, Ep. 91
March 16th, 2023 — Available at read.fluxcollective.org/p/91
Contributors to this issue: Ben Mathes, Erika Rice Scherpelz, Justin Quimby, Dimitri Glazkov, Jon Lebensold, Neel Mehta, Boris Smus, Ade Oshineye
Additional insights from: Gordon Brander, a.r. Routh, Stefano Mazzocchi, Alex Komoroske, Robinson Eaton, Spencer Pitman, Julka Almquist, Scott Schaffter, Samuel Arbesman, Dart Lindsley, Lisie Lillianfeld
We’re a ragtag band of systems thinkers who have been dedicating our early mornings to finding new lenses to help you make sense of the complex world we live in. This newsletter is a collection of patterns we’ve noticed in recent weeks.
“High up in the North in the land called Svithjod, there stands a rock. It is a hundred miles high and a hundred miles wide. Once every thousand years a little bird comes to this rock to sharpen its beak. When the rock has thus been worn away, then a single day of eternity will have gone by.”
— Hendrik Willem Van Loon, The Story of Mankind
🎥📱 What vaudeville teaches us about disruptive innovation
Oftentimes, the first use of a new technology is just a projection of existing things onto a new format. The early days of film relied on taking well-known forms of stage performance, like vaudeville, and simply recording them as movies. The creative choices we take for granted today — panning, close-ups, lighting, and so on — were not yet part of the common toolkit for film producers, whose primary reference point was stage productions. Similarly, early websites were little more than static news pages, and the first mobile apps were websites on phones. It was the same product in a different medium.
Disruption comes as people realize how to utilize the unique properties of the new medium. On the web, the content can change as the information and the viewer change. Now we have web applications that are more dynamic and personalized than even the freshest newspapers. Mobile devices can use location and real-time communication to open up new use cases such as ridesharing applications, which were never feasible on desktops.
We are going through one of these moments with AI. Many of the current applications of AI look like sustaining previous use cases. We can imagine GitHub Copilot for everything: an assistant helping you do what you were already doing in coding, documents, presentations, design, and so on.
These use cases may open the doors to disruption, but they will not, themselves, be the disruption. Disruption will look different in ways that we don’t yet know; the new thing will build upon the novel aspects of the medium. One aspect we can already see is that AI is fuzzy. That means assumptions we take for granted — like being able to compose seven deterministic API calls into a single reliable service — no longer hold. Much of the industry’s assumptions about how software gets designed and built will have to change.
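The “seven calls” point can be made concrete with a back-of-the-envelope calculation. The success rates below are illustrative assumptions, not measurements: the sketch only shows how quickly per-step fuzziness compounds across a pipeline.

```python
# Back-of-the-envelope: reliability of a pipeline of independent steps.
# A chain of deterministic API calls succeeds essentially every time, but
# if each "fuzzy" AI step succeeds only, say, 95% of the time, composing
# seven of them makes end-to-end failure routine. (Both success rates
# here are hypothetical, chosen purely for illustration.)

def chain_reliability(per_step_success: float, steps: int) -> float:
    """Probability that all `steps` independent steps succeed."""
    return per_step_success ** steps

deterministic = chain_reliability(0.9999, 7)  # traditional API calls
fuzzy = chain_reliability(0.95, 7)            # hypothetical AI steps

print(f"7 deterministic calls: {deterministic:.3f}")  # ≈ 0.999
print(f"7 fuzzy AI calls:      {fuzzy:.3f}")          # ≈ 0.698
```

Under these assumed numbers, a pipeline that “just works” in the deterministic world fails almost a third of the time in the fuzzy one — which is why composition patterns built for deterministic services don’t transfer unchanged.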
Organizations that are structured around today’s assumptions will need to transform — or die. Disruption theory says most will likely die instead of enduring the pain of metanoia. In a classic innovator’s dilemma, the people in positions of power who won the old game won’t want to take the risk of changing the game. Plus, it is just so hard to see the new thing when everything around us is built around old assumptions.
When all signs point to a new, large opportunity space opening up, be prepared for this intermediate period of projecting existing forms onto the new medium before wholly new forms emerge. That first transition may look a bit boring and silly, but it unblocks the second transition, which acts as the powerful engine of transformation, spurring the gale of creative destruction.
Clues that point to where our changing world might lead us.
🚏🥧 Meta’s leaked language model LLaMA can now run on laptops, phones, and Raspberry Pis
A fully trained version of Meta’s LLaMA language model leaked at the beginning of March, and there’s been an “explosion of development” around it ever since. Notably, developers have released open-source tools that let you run LLaMA locally on MacBooks, Windows PCs, Pixel phones, and even miniature Raspberry Pis — a huge change compared to the more famous GPT-3 model, which requires multiple “datacenter-class” GPUs. Similarly, Stanford’s Alpaca model, which builds on top of LLaMA, can be fine-tuned for less than $100.
🚏👁 Be My Eyes is using GPT-4 to help visually impaired people in real time
The popular Be My Eyes app has traditionally relied on human volunteers to help blind or visually impaired people with everyday tasks: the person points their phone at something and a volunteer helps answer questions or explain what’s going on. The organization has now announced a new “Virtual Volunteer,” powered by the new GPT-4 model, that uses AI to instantly answer questions about photos in real time — from reading subway maps to finding items in vending machines to navigating physical spaces.
🚏🤐 TikTokers are coining cryptic words to evade content moderation
“Algospeak” has become popular on social media apps like TikTok as users seek to dodge the platforms’ (often heavy-handed) content moderation filters. Marginalized groups often avoid directly saying the names of the challenges they’re facing (“cornucopia” instead of “homophobia”), and others try to disguise even commonplace words for fear of getting demonetized (“unalive” for “kill,” often used in frank discussions of suicide). Put those trends together and you get surprising coinages like “le dollar bean,” which TikTok’s auto-captioning tool renders as “Le$bian.”
🚏🍿 DreamWorks open-sourced its renderer used for major animated movies
DreamWorks Animation Studios, known for franchises like Shrek and Madagascar, has open-sourced its 3D rendering software called MoonRay. MoonRay was used to render several notable DreamWorks movies in recent years, including Puss In Boots: The Last Wish (2022) and How to Train Your Dragon: The Hidden World (2019). The full software package, which also contains virtual camera, lighting, and geometry tools, is now available on GitHub.
🚏🏭 China is rapidly becoming an expensive place to do manufacturing
Much of China’s economic boom has been due to the historically low cost of manufacturing in the country, even compared to other Asian countries. But over the last decade, manufacturing labor costs in China have skyrocketed: they’re now about three times higher than the costs in Malaysia, India, Vietnam, the Philippines, or Thailand.
📖⏳ Worth your time
Some especially insightful pieces we’ve read, watched, and listened to recently.
How Complex Systems Fail (Richard I. Cook) — A classic list of 18 insights into failure in complex adaptive systems. We especially enjoyed #3 (“Catastrophe requires multiple failures — single point failures are not enough”), #5 (“Complex systems run in degraded mode”), and #16 (“Safety is a characteristic of systems and not of their components”).
The Expanding Dark Forest and Generative AI (Maggie Appleton) — Argues that Large Language Models (LLMs) will only get better at emulating B+ college essays, then asks: how can we real humans differentiate ourselves from these AIs? The conclusion: strive for original insights and show up in the physical world.
The End of Writing (iA Writer) — In our age of LLMs that churn out B+ human writing, a poignant (if curmudgeonly) reminder that the important thing about human language is that it connects one human being to another, through space and time.
Prairie Strips (Sam Knowlton) — Describes how turning a small slice of Midwestern farms back into native grasses can have huge benefits for farmers and the ecosystem alike: “By converting 10% of cropland to native prairie, farmers can reduce soil loss by 95%, total phosphorus loss by 90%, and total nitrogen loss by 85%.” These strips replenish topsoil, capture harmful chemicals, attract pollinators, and defend against pests.
The Surprising Effects of Remote Work (The Atlantic) — Observes that the US’s fertility rate (long on the decline) ticked upward in 2021. Remote work may be to thank; one study found that remote workers in the US were slightly more likely to get married and have babies. Scientists hypothesize that the lack of commutes gives workers more time to devote to family, and that increased geographic flexibility helps people “settle down” more easily.
🔍📆 Lens of the week
Introducing new ways to see the world and new tools to add to your mental arsenal.
This week’s lens: Moat Digging and Moat Filling.
Continuing our discussion from the main article, another pattern we see around technological advancement is the cyclic swing between moat digging and moat filling.
During the personal computer revolution, PCs unleashed an explosion of productivity tools and software services for back offices. Calendaring, word processing, spreadsheets: all of this “boring” software was revolutionary at the time. But that power came with a cost. Proprietary data formats meant that choosing one offering locked you into that vendor for years. Big Tech was building amazing tools, digging wider and wider moats in the process.
Then came the moat-fillers, enabled by the connectivity of globally networked computers. First were the eager open source developers who built an equivalent set of tools. These volunteers relied heavily on reverse-engineering those proprietary formats, but over time the open source software movement developed open protocols and software that, in some cases, were eventually adopted by the moat diggers.
Software as a Service (SaaS) and cloud computing built on that work to fill the moats and replace them with an ecosystem of tools that can be used together with much greater composability than their ancestors. With the moats filled in, organizations had the freedom to choose which tools were best suited to their needs. Interoperability and portability aren’t perfect, but at least they are often plausible.
Generative AI systems seem ripe to go through a similar cycle. Right now, we have moat diggers building bigger, more powerful models, while open-source startups and researchers release ever more capable alternatives. Today’s product launch strategies look for ways to build a “defensible” business model, and ecosystem interoperability is not foremost on most people’s minds.
If history is any indication, once the moat diggers have taken a pass at figuring out where the value lies, the moat fillers will come along. It’s exactly because they are the second movers that the moat fillers have the time and resources to fill in the gaps, create new connections, and, we can hope, set the stage for another round of long-term growth.
© 2023 The FLUX Collective. All rights reserved. Questions? Contact email@example.com.