Interview

Scott Belsky on AI and creativity: 'Personalization effects are the new network effects'

Jun 10, 2025 with Scott Belsky

Key Points

  • Scott Belsky argues personalization effects are replacing network effects as AI's core moat, with OpenAI's third-party login strategy designed to deepen user lock-in across the broader web.
  • Enterprise AI adoption faces an unsolved data ownership question: whether years of employee context stored in AI systems belongs to the individual or the company, likely settled through bilateral backroom negotiations rather than open APIs.
  • As AI tools compress the product development workflow, taste becomes the differentiating human skill over technical ability, requiring judgment about culture and non-obvious conceptual choices that remain difficult to automate.

Summary

Scott Belsky — founder of Behance, which now hosts around 58–60 million creatives, and a former Adobe executive who led its emerging-products group — argues that personalization effects are the new network effects. The claim is structural: as AI context windows grow and accumulate user history, the tool that knows you best becomes the hardest to leave. OpenAI's recent move to let users log into third-party tools with their OpenAI account reads, in his view, as a direct play to deepen that personalization layer across the broader web.

Collective memory and enterprise data wars

The thornier unresolved question is what happens when individual AI memory becomes organizational. Belsky frames it plainly: if an employee's years of context — their decisions, reasoning, and institutional knowledge — are stored in an AI system, does that data belong to the person or the company? And can successors query it after they leave?

Big tech's response to this won't be clean API openness. Belsky expects a "data war" settled largely through bilateral backroom negotiations — companies agreeing not to cut off each other's connectors because mutual access serves everyone. The practical upshot for enterprises is that it may not matter which tools create the data, as long as everything lands in one of three or four clouds and can be surfaced through AI. The moats that survive, he argues, will be personalization and permissioning — the latter being genuinely hard to crack at enterprise scale, where controlling who can ask what and retrieve which columns of data is an unsolved product problem.
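The permissioning problem Belsky points to — controlling who can ask what, and which columns of data come back — can be sketched in miniature. The sketch below is purely illustrative: the roles, column names, and `filter_columns` helper are hypothetical, not drawn from any real enterprise product.

```python
# Toy column-level permissioning: an allowlist mapping each role to the
# columns it may retrieve. Roles and columns here are invented examples.
ALLOWED_COLUMNS = {
    "analyst": {"region", "revenue"},
    "hr": {"employee_id", "salary"},
}

def filter_columns(role: str, requested: set[str]) -> set[str]:
    """Return only the requested columns this role is permitted to see."""
    return requested & ALLOWED_COLUMNS.get(role, set())

# An AI query layer would run a check like this before assembling an answer:
visible = filter_columns("analyst", {"region", "salary"})
print(visible)  # the 'salary' column is silently dropped for this role
```

The hard part at enterprise scale, as the paragraph above suggests, is not the lookup itself but maintaining and enforcing these mappings across every tool that feeds the AI layer.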

Apple's privacy trap

Apple is late on AI, and Belsky doesn't discount that. But the more interesting tension is structural: Apple built its brand on keeping others away from user data, and that's precisely the data that enriches AI personalization. Its privacy-first posture, once a competitive asset, is now a constraint on the experience layer AI companies are racing to build.

The bull case for Apple is that on-device models close the gap. If small local models keep improving and chip performance follows, a fully local AI that answers questions, searches the web when needed, and stores nothing in the cloud becomes a viable product — and Apple would be better positioned than anyone to ship it. Whether Apple is actually optimizing for that future, or just moving slowly, is genuinely unclear.

Knowledge arbitrage

Belsky draws a direct parallel to the early social media era, when people in their late teens and early twenties could charge Fortune 500 CMOs consulting fees just for understanding what Twitter was. The same window is open now. Fortune 500 leaders are scrambling to understand AI while a generation that used ChatGPT through college treats it as native infrastructure. The difference this time, he argues, is that AI cuts across every function — compliance, internal comms, finance, legal — whereas social media was largely a marketing story. The arbitrage opportunity is wider.

The hiring evidence is immediate. TBPN's most recent hire vibe-coded a guest directory on a Thursday, was seen by the team Friday, and started Monday. The ability to ship a functional V1 with AI tools was sufficient proof of value.

Collapsed stack talent and the role of taste

The product development workflow is compressing. The old sequence — scoping, Figma prototypes, redlines, engineering handoff — is giving way to parallel vibe-coding, designer refinement, and rapid deployment. Belsky's hiring thesis follows: the most valuable people now span multiple layers of the stack, whether that's engineering plus design, or product plus copy, because the tools let one person do what used to require a handoff chain.

As technical skills get offloaded to compute, taste becomes the differentiating human input. Belsky defines it broadly: judgment about what to leave to the imagination, knowing where culture is going rather than where it is, the ability to combine unrelated ideas in ways that produce a reaction. He cites the Harry Potter Balenciaga video as a durable example — AI-generated aesthetics, but a distinctly human conceptual leap that made it memorable long after the content cycle moved on.

Whether taste can be taught is a question he leaves genuinely open. Skills have always been teachable. Taste, he suggests, is built from human experience, accumulated exposure, and the confidence to make non-obvious choices — harder to train, and for that reason, harder to automate away.