Postcards from the Edge of the World (Vol. 9)

The Stadium's Been Filling the Whole Time... And Now the Real Pressure Begins.

To Whom It May Concern (You):

Three days ago, a research company outside the United States contacted me. It was an expert network firm that needed consultants in a very niche area of financial analysis and alternative investment management. A hedge fund wanted someone with a background in model portfolios and capital-allocation strategies… blah, blah, blah. I was interested in linking up with the client.

Friday morning, the research company sent me "Terms and Conditions" and an informal offer… As I read through the agreement (and yes, you should read all agreements when you update software or accept online terms…), the company was EXPLICIT about one thing. At no point when sharing industry expertise with a client may a consultant use any form of Artificial Intelligence (AI). No ChatGPT, no Claude, no Gemini. Nothing at all for research, or even to help frame opinions and ideas. Which is interesting, given how ubiquitous AI is. There were legal statements and clawback provisions explicitly tied to AI use.

This clause exists for a reason. And Occam's Razor suggests that at some point, many so-called experts in that network began outsourcing their thinking to AI. That creates enormous liability, not because AI is malicious, but because it has no accountability. If $1 billion in capital is misallocated based on an AI-generated framework rather than lived experience, no machine will face any consequences. Humans will. Meanwhile, information, even expert-level insight, has been commoditized. The gap once created by specialization has narrowed dramatically.

I don't care whether people use AI. I suggest that you shouldn't either. Heck, run your next Apple agreement through ChatGPT and ask if there are any red flags. You'll be stunned by what comes up. What matters is whether people outsource judgment and responsibility to it.

In the equity markets, I've watched that same compression hit financial editorial. Most online editorial content over the next few years will be generated by artificial intelligence, often without meaningful human guidance. That pressure is already reshaping how analytical and creative work is valued. I can't control the compression rate, but I can control how I respond to it.

I've built a career around words and how to construct them into clear passages. I've learned how to explain systems simply… and to recognize patterns in a world that shouldn't be recognizable. Here at Postcards, we connect history, global markets, and incentives into a framework that explains how the world works and what to do about it… (at least, I hope I'm connecting). But I have to be honest with myself. There's one thing that matters, and it's now under threat.

My identity.

This pressure isn't happening in a dramatic, headline-grabbing way. No one is knocking on my door, ordering me to turn off my computer. No one will give me a pink slip. But I can look ahead and decide how I adapt. Something subtle is happening, and it's happening to many people. People don't talk about it, even as it reshapes their lives. We must get ahead of this trend, not only as investors but also as human beings. This is the most important conversation we've had about extraction.

Welcome back to the Edge of the World.

The Ways They Take

The changes in the years ahead won't be slow and gradual. They will come in large leaps, sizeable disruptions, and career-altering moments.
It won't be one foot in front of the next. Change will be exponential.

What does "exponential" look like? Take an example that Chris Martenson gave me during a research project 15 years ago.

Imagine you're sitting at an NFL football game. It's the start of the first quarter. In the first minute of the game, a single drop of water lands on the 50-yard line. No one notices. A minute later (minute 2 of the game), two drops fall on the 50 (on top of the first one). At minute 3, four more drops are added. At minute 4, eight drops fall. We continue that pace, doubling every minute.

At the end of the first quarter, a puddle has quietly started to form. The crowd's still ordering beer and arguing about a penalty. No one is thinking about the weather forecast. But the number of drops continues to double every minute of the game. By halftime, there's a half-inch of water across the field. By minute 35, the field is under a foot of water… At minute 38, you have a six-foot pool. And by the time the fourth quarter starts, you've created the Mariana Trench with end-zone paint. Everyone swears it happened all at once. It didn't.

This aquatic example is how exponential change works and feels. There isn't a dramatic start. Things just quietly compound while everyone else is busy looking at distractions. (I've sketched the rough math below.)

This is what will happen, and continue to happen, with AI. AI doesn't accelerate because it's smart or clever. It does so because its growth process is exponential, and it doesn't care what it does to people in its wake.

The economic focus on this AI trend naturally centers on white-collar jobs. Goldman Sachs offered a seismic prediction in August 2025 that AI would displace roughly 6% to 7% of the U.S. workforce by 2030. That's the type of percentage level - and displacement level - that fuels drastic shifts in consumer habits, macroeconomic projections, and voter sentiment. This is what can lead a fringe candidate to the front of the primary line… a few percentage points.

The political fallout will relate to job displacement and loss. But I don't think the job loss narrative is the real or most important story. Job figures are something that we can measure… so it feels like we're doing our jobs as economists and policymakers. But how do you measure something that is far more important but doesn't show up in hard data?

I want to measure something more meaningful and difficult to quantify… Something politicians should confront, but they don't know how, because it forces them to examine a lot more about themselves and their priorities.

That's human relevance.

I'm focused on the quiet fear that AI not only takes away a person's job but also impacts that person's identity, because what they know is now abundant, cheap, and automated. I repeat: AI doesn't just threaten labor. It threatens our personal identity.

For most of modern history, a person's professional relevance came from knowing something other people didn't. In the business world, experience, credentials, and niche knowledge matter. When you go to a specific, prestigious school, that action and work open up networks. People earned influence by having an answer to questions or an understanding of processes that others didn't. We called these things on a resume: skills.

The deal for centuries was clear. You would invest years of your time into learning something difficult. In return, you gained a reputation, identity, and brand as the person who knew what to do and how to think around a niche.
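For anyone who wants to check the stadium math, here is a minimal back-of-the-envelope sketch in Python. The drop size (about 0.05 mL) and field dimensions (roughly 110 m by 49 m, end zones included) are my own assumptions, not figures from Chris Martenson; the point is the shape of the curve, not the decimals.

```python
# Rough sketch of the doubling-drops example. Assumed numbers, not official ones:
# a single drop is taken to be ~0.05 mL and the field ~110 m x 49 m.

DROP_VOLUME_ML = 0.05          # assumed volume of one drop of water
FIELD_AREA_M2 = 110 * 49       # approximate NFL field area, end zones included

def water_depth_m(minute: int) -> float:
    """Water depth on the field after `minute` minutes of drops doubling each minute."""
    total_drops = 2 ** minute - 1                         # 1 + 2 + 4 + ... per minute
    total_m3 = total_drops * DROP_VOLUME_ML / 1_000_000   # milliliters -> cubic meters
    return total_m3 / FIELD_AREA_M2                       # spread evenly across the field

for minute in (15, 30, 35, 38, 45, 49):
    depth = water_depth_m(minute)
    print(f"Minute {minute:2d}: {depth:10.4f} m deep (~{depth * 39.37:8.1f} inches)")
```

At minute 15, the "puddle" is thinner than a coat of paint. By halftime it's roughly half an inch, by minute 35 about a foot, and a few minutes into the fourth quarter the depth is measured in kilometers. The exact assumptions barely matter; the doubling does.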
Well… AI officially breaks this contract.

AI doesn't sleep, forget, or need meetings to think about ideas or strategy. It doesn't get distracted, and it doesn't need help finding another word for "change mechanism." AI does something that we aren't talking about enough these days. It erodes the scarcity of knowledge and specialization. It speeds up the analysis process to a pace that even a quantitative genius on a treadmill can't maintain forever. And this shift will be much harder than people want to admit. It will extract the expert's identity… and that has real consequences.

What's Beneath the Noise

The people I speak to about this aren't feeling anything irrational. Researchers have increasingly focused on this fear in recent years. A 2021 study in Electronic Markets finds that changes to work and perceptions of status loss are predictors of what the authors call "AI identity threat." Employees don't just fear losing tasks. They fear losing what defines their roles and what control they have.

Research on work disruption shows that changes in role and status can impact professional identity and autonomy. Workers describe emotional and identity-related effects that last long after the immediate job transition or displacement. It's not really a fear of unemployment. It's about losing the identity that defined the job. A lot of people are already in a mourning period over their current jobs and future prospects.

Recent studies show that greater awareness of AI adoption is associated with increased feelings of job insecurity and workplace stress, though the effects vary by individual and context. The psychological toll compounds unless people have what researchers call "career resilience." That term basically means that people are capable of adapting and persisting despite change or disruption…

But this really isn't limited to work. This stress is evident in other places in the world. Parents feel it when their kids stop asking questions. Leaders feel it when decisions no longer flow through their office and they sense a loss of control (which is brutal for the ego and tends to lead to defensiveness). Experts feel it when the room no longer waits for them to speak, or clients stop calling to request their insights on specific topics.

This AI-driven impact on personal identity is the quiet crisis of our time. People are afraid of becoming irrelevant. Strip a person of purpose, and they'll quickly turn inward and protest... We're reaching a point where no one should have to confront this alone. AI didn't invent this fear… It has only accelerated it.

How Power Really Works

Every system in the world extracts something. That is because the purpose of a system is to achieve the results for which it was designed. And if there isn't a stated purpose (especially from policymakers), just look for patterns in the results.

Money printing and similar monetary and fiscal processes extract purchasing power. Insurance extracts permission to own something. Bureaucracy extracts worker autonomy and independence.

AI extracts something far more personal. In business, it indirectly extracts the part of us that makes us relevant. Consider the impact on memorized expertise (the hallmark of our education system), our ability to recognize patterns, our ability to establish technical authority… and (this one can hurt) the credential-based identity.
AI has most, if not all, of us beat… Will Hunting joked that the long-haired Harvard grad could have gotten the same education in economics "for $1.50 in late fees at the public library." Now… you don't even need the library. You just need AI-prompting skills and the ability to ask questions and shape narratives. Anyone, anywhere, can be an expert now on financial markets, Greenland, or geopolitics… And when everyone is an expert on something… no one is.

That extraction feels like a loss because it is. Pretending otherwise just creates resentment and denial. People aren't just uncertain anymore. They're increasingly anxious and unsettled about the loss of self-worth, agency, and purpose. Nobody planned this, but history has never rewarded people who clung to identities built on scarcity. It rewards those who re-anchor… which we must do.

The Everyday Hustle

There are things AI does very well in my field... It can compress research time and flatten certain kinds of analysis. And it makes competence cheaper than ever… Why even have the expert network? That's a cynical question… Let's be a bit more optimistic.

We don't control the disruptions or the way the system is structured… We don't win because one of us is the smartest person in the room. We can win because we're the ones still talking when everyone else is optimizing. There is one thing we still control. And it's the only thing that really matters now.

We control how human we remain.

We'll always have the unique human ability to control and shape our individual voice and tone. And we're the ones whose sense of humor actually registers. But it goes beyond that… because our regulation of self is what cannot be taken away. We must be willing to admit uncertainty without collapsing into it. We must ensure our judgment is sound when efficiency collides with ethics. That will be a massive battle in the future. Forget the constant debate about profits versus human beings. Instead, consider the extraction of more than money… and the temptation to take unreliable shortcuts at scale.

A machine can generate text. It cannot give you a reason to root for the person writing it. And it turns out, in a world where intelligence is abundant, things like trust, purpose, and meaning become scarce. So too does authenticity. Research shows that people react differently (and more positively) when they know an interaction is with a human versus AI. There's something invaluable and unquantifiable about human presence that matters. And that's one of the core things that machines can't extract.

The Real Economy

A lot of people will struggle to adapt to this new, exponential world. It will all come down to how they react. They might think the answer is to compete with the machine. They may focus on faster outputs, louder certainty, greater authority, and more control. That's a recipe for burnout. Some may even use AI as much as they can - and even cheat in the expert network I discussed above, because there's every incentive to do so.

That instinct is understandable, but it's also fatal. Every time humans have tried to out-optimize systems, they've lost. Speed is not what survives. Stewardship and judgment do. AI can't make or properly measure moral trade-offs. AI can't accept responsibility or reason about the consequences of its decisions. Someone… SOMEONE… still must own the outcomes. And that someone is a human being.

Meanwhile, I continue to stress that purpose matters as much as skill. We have to ensure our identities shift away from job titles or functions.
Instead, they need to focus on personal growth, relationships, and responsibility. This is a key way to reduce anxiety and achieve greater resilience… You'll achieve more than any person who only focuses on their static expertise.

Household Moves... Reduce What They Extract

Sovereignty is about who you remain when things change and systems shift. I knew when I started this letter that not everyone can invest or has the capital to deploy. That's why the first part of my job is to help you build enough that you eventually can. Even if you do invest, the sovereign move only captures part of the picture. The other part? Reducing what they extract from your identity, starting now...

You can't beat extraction systems by yelling at them or by going on television to warn about what's coming next... You reduce extraction by repositioning who you are. Start with these simple steps…

Own decisions, not information.

Information is now abundant because of AI, and it's been this way since the growth of search engines. That said, decisions (and the people who know how to make them) aren't abundant... AI can summarize, simulate, and suggest ideas and scenarios. But AI cannot and will not ever own the outcomes related to these things… If you want to stay relevant, be the person who decides, not the one who just informs. Accept accountability visibly… in front of people… via emails… Own processes. When you say "this is my call," mean it. Relevance migrates toward responsibility when knowledge becomes cheap.

Build a voice, not a resume.

Resumes are basically just lists of functions. However, your voice is the pattern that people will recognize. Your voice is what defines the way you frame problems and the details that you notice. It also (and this is so important for editorial) helps define and structure what you're willing to say out loud when others hedge or won't admit what they feel. AI can replicate editorial style. It will never replicate the trust that a person earns over time. It can't capture your real voice, even if it reads and builds models over and over again. If people would recognize you without your title, you're protected.

Move toward moral weight.

Look for places where decisions have consequences. So, I guess maybe consider staying out of politics? Machines (much like politicians) avoid moral weight, but ordinary humans absorb it. There's a reason why doctors still sign charts and lawyers sign filings… Do your best to get as close to consequences as possible. It will make you harder to replace and give you greater purpose and meaning.

Become the one people call when things break.

Systems are fragile, and the ability to make clear decisions under stress is rare. Ask yourself who calls you when something goes wrong. Who wants you in the room when they need answers? AI can mimic interaction, but it can't match the power of human loyalty. Keep small, durable circles of people you trust and people who trust you. You MUST show up consistently and follow through without making a scene. If your relevance depends on an algorithm, it isn't sovereign.

Anchor part of your identity in the physical world.

The digital world abstracts fast, but reality does not. An identity grounded solely in digital output is dangerous. Bury yourself in craft, teaching, building, or caring for something living. Get a freaking pet if you need to… volunteer at places where people have needs. Listen to people who need help… and their stories.
You'll smile when you hear "It's so good to see you again…" or when you open a door and that tail is wagging. These small things mean something. The more virtual the economy becomes, the more stabilizing and important the physical becomes.

Practice taste, not speed.

Speed is being commoditized, but taste is not. Taste means knowing what not to include, recognizing quality, and connecting things that don't obviously belong together. AI generates more information and insights, but humans choose what matters.

Be someone worth rooting for.

This is the quiet feature, and easily the most important. This is about authenticity… People don't follow perfection. They follow humans who remain human under pressure. That means showing your humor without cruelty. It is easy to mock like a late-night comic these days… to be the bully when you say, "I'm just making a joke." Late-night comedy viewership isn't declining because people are going to bed earlier. It's because the "ranting" and "clapter" elements get old fast. People don't want to be outraged by the world before bed. These shows just aren't authentic. There is incredible comedy outside of mindless jabs that people ignore... Engage in cooperative laughter, share human experiences, and revel in the absurdity of it all. It's hard to do, because cheap laughs are easy to come by… but making people think and laugh at the same time, and finding relational experiences in humor, creates a bond.

It also means showcasing confidence in uncertain times, demonstrating your conviction without arrogance, and remaining humble even when the situation doesn't demand it. A machine can produce answers, but it cannot earn goodwill.

People want to know the secret to life. Mine is simple: First, find a purpose… whatever it is. Then surround it with your personality, humor, and conviction. That's sovereignty at the household level.

The Back Page

AI runs on electricity, grids, and copper. It's not a real thing… take off the makeup of an AI girlfriend, and it's just a bunch of computer chips and hardware. AI requires an unnatural number of transformers and backup battery systems. We need power generation and cooling infrastructure. And none of this stuff scales at the speed of software.

We should be investing here… building our wealth here. Extracting from the very things that are extracting from us. The more intelligence we automate, the more energy becomes the bottleneck. That's a civilizational issue... Every exponential system eventually runs into a constraint. And when it does, the constraint becomes the asset to own… The bottleneck… the toll road… the… pipeline...

As we focus on AI, we have to turn to the real bottleneck… the power and energy systems, and the physical backbone that makes the digital illusion possible.