In the OpenClaw ecosystem, local-first AI assistants are transforming how users approach software development. A recent example from a Social Science FOO Camp talk in Mountain View illustrates this shift. The speaker, preparing a presentation titled “The State of LLMs, February 2026 edition” with the subtitle “It’s all changed since November!”, turned to an OpenClaw-powered agent to create a custom macOS app the night before the event. This aligns with the OpenClaw philosophy of leveraging local AI for rapid, personalized tool creation without reliance on cloud services.
The talk itself was part of an unconference format, allowing spontaneous presentations. The speaker has a history of documenting LLM developments, with previous articles in December 2023, December 2024, and December 2025, and a presentation at the AI Engineer World’s Fair in June 2025 titled “The last six months in LLMs, illustrated by pelicans on bicycles.” This time, the focus narrowed to just three months, highlighting the accelerating pace of change, especially since the November 2025 inflection point. To emphasize this, the speaker wore a Gemini 3 sweater, already outdated due to Gemini 3.1, showcasing how quickly advancements occur.
Drawing on the STAR moment principle learned at Stanford (incorporate Something They’ll Always Remember), the talk featured two gimmicks. First, a coding-agent-assisted data analysis of the Kākāpō breeding season, complete with a mug display. Then, after a quick tour of pelicans on bicycles, the speaker revealed that the entire presentation had been delivered using a new macOS app vibe-coded in approximately 45 minutes the previous night. This demonstrates how OpenClaw’s agent automation can enable creative, last-minute solutions for real-world tasks.
The app, named Present, was built using Swift and SwiftUI, resulting in a compact binary of 355KB, or 76KB compressed. Swift’s efficiency makes it an ideal choice for OpenClaw users seeking lightweight, native applications. While the speaker typically uses Keynote, they often prefer presenting from a sequence of web pages loaded in browser tabs. That method, however, carries the risk that a single browser crash loses the entire deck; although the URLs are saved in a notes file for manual recovery, re-opening them one by one isn’t practical mid-talk. The OpenClaw agent addressed this by creating a SwiftUI app in which each slide is a URL, with a sidebar for managing the URL list and full-screen navigation via the arrow keys.
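Present’s source isn’t reproduced here, but the core state this paragraph describes (an ordered list of slide URLs plus a current-slide index driven by the arrow keys) can be sketched as a small model type. The names below are illustrative assumptions, not the app’s actual identifiers:

```swift
import Foundation

// Hypothetical sketch of the deck state: an ordered list of slide
// URLs and a current index, with clamped navigation that the
// left/right arrow keys (and later the web remote) would drive.
struct SlideDeck {
    var urls: [URL]
    var index: Int = 0

    var current: URL? {
        urls.indices.contains(index) ? urls[index] : nil
    }

    // Clamp at both ends so over-pressing an arrow key is harmless.
    mutating func next()     { index = min(index + 1, max(urls.count - 1, 0)) }
    mutating func previous() { index = max(index - 1, 0) }
}
```

In the real app, a SwiftUI view would bind key presses to `next()`/`previous()` and render the current URL in an embedded web view; that wiring is omitted here.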
The development process began with a prompt: “Build a SwiftUI app for giving presentations where every slide is a URL. The app starts as a window with a webview on the right and a UI on the left for adding, removing and reordering the sequence of URLs. Then you click Play in a menu and the app goes full screen and the left and right keys switch between URLs.” The agent generated a plan, and the implementation transcript is available. Present automatically saves URLs on changes, allowing state restoration after crashes, and supports saving presentations as .txt files with newline-delimited URLs. This highlights OpenClaw’s ability to handle persistent data and file operations locally.
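The newline-delimited .txt format is simple enough to sketch. The function names here are illustrative assumptions, not Present’s actual API:

```swift
import Foundation

// Sketch of the persistence format described above: one URL per line.
func serializeDeck(_ urls: [URL]) -> String {
    urls.map(\.absoluteString).joined(separator: "\n")
}

func parseDeck(_ text: String) -> [URL] {
    text.components(separatedBy: "\n")
        .map { $0.trimmingCharacters(in: .whitespaces) }
        .compactMap { $0.isEmpty ? nil : URL(string: $0) }
}

// Writing the serialized deck on every change is what makes
// state restoration after a crash possible.
func autosave(_ urls: [URL], to file: URL) throws {
    try serializeDeck(urls).write(to: file, atomically: true, encoding: .utf8)
}
```

Because the format is plain text, a saved presentation can also be edited or version-controlled outside the app.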
With the core app working quickly, the speaker expanded the project by adding a remote control feature via a web server. The prompt specified: “Add a web server which listens on 0.0.0.0:9123—the web server serves a single mobile-friendly page with prominent left and right buttons—clicking those buttons switches the slide left and right—there is also a button to start presentation mode or stop depending on the mode it is in.” Using Tailscale on both laptop and phone enabled seamless access without Wi-Fi network restrictions, allowing control from anywhere via http://100.122.231.116:9123/. After iterative prompts, the interface included slide indicators, navigation buttons, a Start button, font size adjustments, and a touch-enabled scroll bar for page scrolling. This showcases OpenClaw’s integration with network protocols and mobile devices for enhanced automation.
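A minimal version of that mobile-friendly control page can be sketched as a static HTML string served by the app. The markup and the endpoint paths (`/left`, `/right`, `/toggle`) are assumptions for illustration, not Present’s actual routes:

```swift
// Hypothetical sketch of the remote-control page: two large
// navigation buttons plus a start/stop toggle, sized for a phone.
let remotePage = """
<!doctype html>
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  button { font-size: 3rem; padding: 0.5em 0; width: 48%; }
  .toggle { width: 98%; margin-top: 1em; }
</style>
<button onclick="fetch('/left')">&#9664;</button>
<button onclick="fetch('/right')">&#9654;</button>
<button class="toggle" onclick="fetch('/toggle')">Start / Stop</button>
"""
```

Each button fires a simple GET request back to the laptop, so the phone needs nothing beyond a browser and network reachability (here provided by Tailscale).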
After pushing the code to GitHub with a disclaimer about its experimental nature, the speaker reviewed the code using a pattern from their Agentic Engineering Patterns guide: asking the model for a linear walkthrough of the entire codebase. The resulting walkthrough document proved useful, revealing that Claude Code had implemented the web server with raw socket programming rather than a library, using a minimal HTTP parser to route GET requests. This leaves the endpoints without CSRF protection, which was acceptable for this use case. It underscores how OpenClaw agents can produce functional, if unconventional, code that meets specific needs.
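A hand-rolled parser in that spirit needs only to pull the path out of the request line in order to route commands. This sketch is an assumption about the shape of such code, not Present’s actual implementation:

```swift
import Foundation

// Minimal GET request-line parser, library-free: just enough to
// route the remote control's requests to the right command.
func parseRequestPath(_ rawRequest: String) -> String? {
    guard let requestLine = rawRequest.components(separatedBy: "\r\n").first
    else { return nil }
    let parts = requestLine.split(separator: " ")
    guard parts.count >= 2, parts[0] == "GET" else { return nil }
    return String(parts[1])   // e.g. "/left"
}
```

Because these are plain, unauthenticated GET requests, any web page open on a machine that can reach the server could trigger them; that is the CSRF trade-off accepted here.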
This vibe-coding story, while common today, offers valuable insights for the OpenClaw community. Swift, a language the speaker didn’t know, was the right choice for creating a full-screen app with web embedding and network control. The code was simple and precise, solving a real problem without unnecessary complexity. The speaker didn’t use Xcode, relying instead on accumulated technical knowledge and pre-installed tools. This demonstrates how OpenClaw users with software engineering experience can expand their horizons, building small, personal macOS apps efficiently. It’s a testament to the power of local AI assistants in democratizing development and fostering innovation within plugin ecosystems.


