EPISODE · Oct 26, 2025 · 47 MIN
#1677: Snap’s AR Developer Relations Plan for 2026 Specs Consumer Launch with Joe Darko
from Voices of VR
I did an interview with Joe Darko, Global Head of Developer Relations at Snap, at Snap's Lensfest developer conference. See more context in the rough transcript below. You can also check out all 11 episodes in this Snap Lensfest series here:

#1667: Kickoff of Snap Lensfest 2025 Coverage & SnapOS 2.0 Announcements
#1668: Snap Co-Founders Community Q&A about Specs 2026 Launch Plan
#1669: Snap's Resh Sidhu on the Future of AR Commerce & Developer-Centered Innovation
#1670: Snapchat's Embodied Gaming Innovations with AR Developer Relations Head
#1671: Reflecting on Snap's AR Platform & Developer Tools Past and Future with Terek Judi
#1672: Niantic Spatial's Project Jade Demo Shows Latest Location-Aware, AI Tour Guide Innovations
#1673: Snap Lensfest Announcement Reflections from AR Gaming Studio DB Creations
#1674: 3rd Place Spectacles Lensathon Team: Fireside Tales Collaborative Storytelling with GenAI
#1675: 2nd Place Spectacles Lensathon Team: CartDB Barcode-Scanning Nutrition App
#1676: 1st Place Spectacles Lensathon Team: Decisionator Object-Detection AI Decision-Maker
#1677: Snap's AR Developer Relations Plan for 2026 Specs Consumer Launch with Joe Darko

Here are some concluding deep thoughts that I just posted in a LinkedIn post.

Reflections on Snap Lensfest XR & AI Trends Covered in Latest Voices of VR Podcast Series

Snap brought me down to LA to cover their Lensfest developer conference, where they made a lot of AR developer platform announcements, held a hackathon featuring those new capabilities, and are gearing up for the 2026 consumer launch of Specs, their fully 6DoF, hand-tracking-enabled AR glasses. It's been a full year since their Spectacles dev kit was announced and made available to developers, and I feel like Snap is on the bleeding edge of where the overall XR industry may be headed. These latest 11 Voices of VR podcast episodes, spanning nearly 7 hours, dig into deeper trends that go beyond the headline announcements from Snap Lensfest.
I recorded five interviews with various Snap employees, and I had a chance to catch up with some of the leading AR developers in the space, including Niantic Spatial's latest VPS guided tour experience on the Spectacles with an AI virtual being. I also served as a preliminary hackathon judge, where I got hands-on with all of the AR experiences exploring what's possible with the latest Snap Cloud announcements, and I'm featuring interviews with the top three Lensathon teams from the Spectacles track.

Snap's Latest AR Developer Platform Announcements

Snap is gearing up for a 2026 launch of Specs, by which point the Spectacles dev kits will likely have been available for nearly two full years. So this Lensfest marks a halfway point towards a consumer release, and the product team has been busy rapidly iterating on their bespoke AR app production pipeline. Dedicated AR glasses are very resource-constrained, so Snap has been continuing to evolve their Lens Studio developer tool and optimizing their SnapOS platform for Spectacles. Snap didn't share any news on their target specifications for the Specs, but they shipped eight significant releases of their development tools over the past year, with some of the biggest announcements being the primary focus at Lensfest. Snap is launching Snap Cloud, based on a deployment of Supabase, the open-source, PostgreSQL-backed hosted backend solution. This will allow developers to dynamically load assets, call edge functions, and more easily set up database backends. This will hopefully help Spectacles AR lenses go beyond bite-sized entertainment and rapidly prototyped experiments into more fully-featured applications that also leverage cutting-edge AI models and computer-vision capabilities.
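Snap didn't detail the Snap Cloud API surface in these announcements, but since it's a Supabase deployment, an edge-function call from a lens backend would presumably follow standard Supabase HTTP conventions. As a minimal sketch (the project ref, function name, and payload below are hypothetical illustrations, not Snap's actual API):

```typescript
// Hedged sketch: Supabase edge functions are served under the /functions/v1/
// path of a project's URL. The project ref "abcd1234" and the function name
// "describe-object" used below are hypothetical.

type EdgeRequest = {
  url: string;
  method: string;
  headers: Record<string, string>;
  body: string;
};

// Construct the standard Supabase edge-function URL for a project.
function edgeFunctionUrl(projectRef: string, fnName: string): string {
  return `https://${projectRef}.supabase.co/functions/v1/${fnName}`;
}

// Build the POST request a lens's cloud call would send; the project's anon
// key is passed as a bearer token, per standard Supabase auth conventions.
function buildEdgeRequest(
  projectRef: string,
  fnName: string,
  anonKey: string,
  payload: unknown
): EdgeRequest {
  return {
    url: edgeFunctionUrl(projectRef, fnName),
    method: "POST",
    headers: {
      Authorization: `Bearer ${anonKey}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify(payload),
  };
}

// Usage (not executed here, since it requires a live Supabase project):
// const req = buildEdgeRequest("abcd1234", "describe-object", ANON_KEY,
//   { image: capturedFrameBase64 });
// const res = await fetch(req.url, {
//   method: req.method, headers: req.headers, body: req.body,
// });
```

The edge function itself would be where a developer proxies out to whatever AI service they choose, which is also why those developers end up footing the bill for the AI calls their lenses make.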
Spectacles developers have been limited by a 25MB lens size cap, but the Snap Cloud announcement means that larger assets can now be dynamically loaded. I expect to see more sophisticated experiences, more AI-driven applications using various cloud services, and lenses with data persistence that make users more likely to want to come back to them.

Are AR Glasses as a Front-End to AI a Viable Path?

There's certainly a lot of experimentation happening with the various AI services that are cropping up, and Snap is very much embracing the exploratory potential for AR devs to see what's possible by making it easier to integrate with these services. While many are excited about the possibilities of mashups between AI and AR, there are also a ton of open questions that have yet to be answered about what types of business models will prove to be sustainable. We very well could be in an AI bubble where many of these emerging AI services prove to be economically unsustainable, as the costs to run them may continue to outpace the revenue that they generate. But Snap seems content to go all-in on enabling AR developers to see what's possible, while also trying to mitigate the risks through experimental flags and requests for consent from end users. See my conversation with Terek Judi for more context on how Snap is striking this balance between innovation and AI trust & safety. Snap doesn't have their own preferred LLM or AI service, so the Spectacles and Specs may be one of the only AR devices that allow developers to freely explore all of the various AI options that are out there. But at the same time, the developers of these apps may also be on the hook to foot the bills for whatever AI-driven services they create.
The business models for all of these AI-driven AR experiences have yet to be fully fleshed out, and the flywheel of innovation is at the point of pure experimentation to see what types of compelling AI-driven experiences may be enabled by the convenience of a face computer. An oft-repeated adage in a number of my conversations is that AR will likely serve as the experiential UI and frontend to an AI backend. Therefore, Snap is very much interested in empowering developers to experiment with these new AI capabilities. The prompt for the Spectacles Lensathon participants was to leverage the new Snap Cloud features from Supabase, including calling edge functions to various AI services, implementing database-driven apps, or facilitating some sort of live multiplayer and social interaction on the Spectacles. Serving as a preliminary judge for the Lensathon gave me a chance to experience what the ten Spectacles track teams were able to pull off in a quick 25-hour hackathon. I share more about some of the trends that emerged in the introduction to my interview with the Lensathon winner, as well as within my interviews with the 2nd place and 3rd place teams. Yes, technically many of these AR apps could also be phone-based apps, but the convenience of hands-free, gesture-based triggers with a head-mounted camera on your face may lower the friction enough to make new AR applications much more viable than a phone-based equivalent.

Snap as a Dark Horse in the AR Glasses Race

Overall, I see Snap as a bit of a dark horse in the race towards fully functional AR glasses, and the big differentiating factor may be what types of experiences developers will be able to make for the Snap Specs launching sometime next year (likely sometime after Labor Day, in either Q3 or Q4). This dark horse status is mainly because Snap is going up against some of the biggest companies in the world.
Snap's 14th anniversary on September 8, 2025 was marked by an email that Evan Spiegel sent out to all of his employees. In the letter, Spiegel says, "The cutoff for inclusion in the Fortune 500 was $7.4 billion in revenue in 2025, and with analyst estimates suggesting Snap could reach nearly $6 billion in revenue in 2025, we're not far from achieving Fortune 500 status." Snap is competing in the XR space with companies near the top of the Fortune 500 list by revenue, with Apple at #4, Alphabet (Google) at #7, and Meta at #22. By profit, Alphabet is #1, Apple is #2, and Meta is #6. It takes a lot of money to do a proper consumer launch of XR hardware, and reporter Alex Heath published a report last week on his new "Access" Substack that Snap CEO Evan Spiegel will be in Saudi Arabia this week speaking at their Future Investment Initiative with the intent to raise a $1 billion round of funding for the Specs release. Heath reports that "sources say Snap plans to turn its Specs hardware unit into an independent subsidiary that can continue raising capital from investors. The idea under discussion is to structure it similarly to Waymo, which operates independently within Alphabet, rather than fully spin off Specs into a new company outside of Snap." Heath's report answers some of my own logistical questions, and could provide some additional puzzle pieces for how Snap would continue to punch above its weight in releasing consumer AR glasses in competition with some of the largest companies in the world.

Snap Betting on Developer Relations as Differentiating Factor

Given that Snap is an underdog in the race towards AR glasses, they have had to differentiate themselves in some fashion, and Snap is betting on their developer relations strategy as the key differentiating factor.
Meta has de-emphasized collaborating with third-party developers for their AI Glasses and Meta Ray-Ban Display glasses; their smart glasses had been on the market for a couple of years before Meta finally announced a pathway for developers to have their own apps interface with them. In contrast, Snap has been taking a much more developer-centric approach to their AR glasses strategy, with the Spectacles dev kit being made available on a subscription basis. Despite the odds, the Spectacles dev kit feels like it is on par with what Micro