Roblox has announced “Roblox Moments,” a new short-form video feature similar to TikTok, allowing users aged 13 and above to capture, edit, and share 30-second gameplay clips. Users can add music, write descriptions, and browse a scrollable feed of community-shared clips, reacting with emojis or jumping directly into the featured experiences. All content is moderated before posting, and users can report inappropriate videos. Roblox plans to introduce API-based tools later this year, enabling creators to build custom in-game creation and discovery systems. To support its creator community, Roblox is increasing the Developer Exchange (DevEx) rate—100,000 Robux will now convert to $380 instead of $350. The platform is also rolling out new AI tools that allow creators to generate fully functional objects, starting with vehicles and weapons. Text-to-speech and speech-to-text APIs will enhance immersion, enabling NPC dialogue and voice commands with ten customizable English voice presets. Performance improvements include a new “Server Authority” mode to reduce cheating and improve physics realism. Avatars will soon move more naturally, with lifelike running, climbing, and object interaction. Roblox also plans to boost visual fidelity across devices without requiring extra effort from creators or sacrificing performance. These updates aim to enhance creativity, interactivity, and user experience across the platform.
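For a rough sense of what the DevEx change means in practice, here is a back-of-the-envelope sketch comparing payouts at the quoted rates ($350 versus $380 per 100,000 Robux). The function name and the example earnings figure are illustrative only and are not part of any Roblox tooling; at the new rate, the effective exchange works out to $0.0038 per Robux, up from $0.0035.

```typescript
// Illustrative only: compares DevEx payouts at the old and new quoted rates.
const OLD_USD_PER_100K_ROBUX = 350;
const NEW_USD_PER_100K_ROBUX = 380;

function devexPayoutUsd(robux: number, usdPer100k: number): number {
  return (robux / 100_000) * usdPer100k;
}

// Hypothetical creator earnings of 1,250,000 Robux.
const earned = 1_250_000;
const before = devexPayoutUsd(earned, OLD_USD_PER_100K_ROBUX); // 4375 USD
const after = devexPayoutUsd(earned, NEW_USD_PER_100K_ROBUX);  // 4750 USD
console.log(`Old rate: $${before}, new rate: $${after}, difference: $${after - before}`);
```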
Snap OS brings a native browser for its AR glasses; new Spotlight Lens anchors vertical video in space, overlaying creator content hands‑free while users move through daily tasks
Snap has unveiled the second version of Snap OS, the software powering its AR glasses, Snap Spectacles. The update includes a new native browser, WebXR support, and more. It follows the rollout of the fifth-generation Spectacles for developers last year, with a consumer version planned for sometime in 2026. Snap OS 2.0 introduces a faster browser that’s easier to use, the company says: Snap has optimized page loading speed and power usage so users can navigate quickly. There’s also a new home screen with widgets and bookmarks, alongside an updated toolbar that lets users type or speak a website URL, navigate history, and refresh the page. Plus, users can now resize windows just as they would on a laptop. Snap notes that the browser now supports WebXR, allowing users to access augmented reality experiences directly from any WebXR-enabled website. There’s also a new Spotlight Lens that spatially overlays content onto the real world, so users can do things like wash dishes while watching videos from their favorite creators. In addition, Snap is making it easier to relive and share favorite memories with its new Gallery Lens, which lets you view your Spectacles captures in an interactive layout: you can scroll through a carousel of your videos and organize your favorites before sending them to a friend or posting to your Story. Finally, a new Travel Mode feature stabilizes AR content and tracking systems while you’re on the move, such as in a car or on an airplane.
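The WebXR support means any WebXR-enabled page can request an AR session through the standard WebXR Device API. The sketch below shows the generic pattern (feature detection plus an immersive-ar session request on a user gesture); it is ordinary web-standard code, not Snap-specific, and it assumes WebXR type declarations (e.g., @types/webxr) are available in the project.

```typescript
// Generic WebXR Device API usage (standard web API, not Snap-specific):
// detect AR support, then request an immersive session on a user action.
async function startArSession(): Promise<void> {
  if (!navigator.xr || !(await navigator.xr.isSessionSupported("immersive-ar"))) {
    console.log("Immersive AR is not available in this browser.");
    return;
  }
  // Session requests must originate from a user gesture (e.g., a button click).
  const session = await navigator.xr.requestSession("immersive-ar", {
    optionalFeatures: ["local-floor", "hit-test"],
  });
  session.addEventListener("end", () => console.log("AR session ended."));
  // Rendering setup (WebGL layer, reference space, frame loop) would follow here.
}

document.querySelector("#enter-ar")?.addEventListener("click", () => {
  void startArSession();
});
```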
Warby Parker and Google’s smart glasses with multimodal AI could streamline ‘spatial commerce’ tasks by connecting the smart glasses directly to a user’s phone, without requiring users to download separate apps
Warby Parker and Google are jointly developing AI-powered glasses intended for all-day wear. The companies intend to launch a series of smart eyewear products that combine multimodal AI with prescription and non-prescription lenses as soon as 2026. The glasses will be built on Google’s Android XR eyewear platform and equipped with a camera, microphones, and speakers. Android XR glasses work in tandem with a user’s phone, giving them access to their apps, while an optional in-lens display privately surfaces information when the user needs it. The Warby Parker-Google model may prove more successful with shoppers, for a few reasons. Both companies cited Warby Parker’s history of stylish design in announcing their joint initiative; if the new AI-equipped glasses look smart as well as act smart, a major hurdle to widespread adoption will be removed. By connecting the smart glasses directly to a user’s phone, Warby Parker and Google are also streamlining the steps a user has to go through to perform spatial commerce tasks. And with smart glasses no longer seen as the stuff of science fiction, and Apple Vision Pro and Meta Quest generating real sales figures, the two companies are entering a developing but established market. For now, whether they can crack the code that leads large numbers of consumers to engage in spatial shopping remains a matter of speculation.
Epic Games offers a data API that lets creators access several data points about Fortnite islands, including engagement and retention metrics, without requiring any authentication
Epic Games is launching the Fortnite Data API, which makes several data points about Fortnite islands public for creators to access. This data includes engagement and retention metrics, allowing other creators to better understand what makes certain kinds of games successful with players. According to Epic, it plans to reveal more about the data’s uses at the upcoming Unreal Fest in Orlando. The Fortnite Data API includes various data points on each island, including:

- Total minutes played by all players
- Average total minutes per unique player
- Number of game sessions played
- Number of times players favorited the island
- Number of times players recommended the island (previously “Likes”)
- Peak concurrent player count
- Number of distinct players
- Day-over-day retention
- Week-over-week retention

Each of these data points is available for third-party analysis and insights for a period of up to seven days. The data is publicly accessible and doesn’t require authentication to access.
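Because the API is public and unauthenticated, pulling an island’s metrics amounts to a plain HTTP GET. The sketch below illustrates that idea only: Epic’s actual endpoint, path, and response field names are not documented here, so the URL and the IslandMetrics shape are placeholders to be replaced with the real ones from Epic’s documentation.

```typescript
// Sketch only: the endpoint URL and response fields are assumptions,
// not Epic's documented API shape. The API itself requires no auth token.
interface IslandMetrics {
  minutesPlayed: number;           // total minutes played by all players
  averageMinutesPerPlayer: number; // average total minutes per unique player
  sessionsPlayed: number;
  favorites: number;
  recommendations: number;
  peakCcu: number;                 // peak concurrent player count
  uniquePlayers: number;
  dayOverDayRetention: number;
  weekOverWeekRetention: number;
}

async function fetchIslandMetrics(islandCode: string): Promise<IslandMetrics> {
  // Hypothetical endpoint; substitute the real one from Epic's documentation.
  const url = `https://api.fortnite.example/v1/islands/${islandCode}/metrics`;
  const response = await fetch(url); // no Authorization header needed
  if (!response.ok) {
    throw new Error(`Request failed: ${response.status}`);
  }
  return (await response.json()) as IslandMetrics;
}

fetchIslandMetrics("1234-5678-9012")
  .then((m) => console.log(`Peak CCU: ${m.peakCcu}, D1 retention: ${m.dayOverDayRetention}`))
  .catch(console.error);
```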