Pinwheel, a kid-friendly tech company, is introducing a new solution for parents who want to stay connected with their children without giving them a phone. The Pinwheel Watch is a recently launched smartwatch designed specifically for kids aged 7 to 14, offering a child-safe alternative that prevents access to social media and the internet. It features parental management tools, GPS tracking, a camera, voice-to-text messaging, fun mini-games, and — here’s a surprise — an AI chatbot. The smartwatch itself features a sleek black design and a screen that is slightly larger than that of an Apple Watch. Beyond a more standard set of parental controls, the feature some parents might be wary of is the watch’s AI assistant, “PinwheelGPT.” PinwheelGPT is designed as a safer alternative to typical AI chatbots, enabling kids to ask questions about everyday curiosities, social interactions, and homework. In addition to the AI feature, kids and tweens can make calls and send texts on the watch using voice commands or a keyboard. There’s also a camera for video calls and selfies, along with a voice recorder app. The parent-monitoring features are available through the “Caregiver” app, which allows parents to create a “Safelist” of contacts their children are permitted to talk to, as well as block certain phone numbers from being added to the list.
Truv, a provider of direct-to-source income, employment, and asset verification solutions, integrates with Blue Sage Solutions, a cloud-based digital lending platform for mortgage originators; direct-to-source verification improves processing turn times and reduces time to close
Truv, a provider of direct-to-source income, employment, and asset verification solutions, announced a strategic integration with Blue Sage Solutions, a cloud-based digital lending platform for mortgage originators. The integration gives lenders access to Truv’s advanced verification capabilities within their existing workflow in the Blue Sage Loan Origination System (LOS), creating a streamlined verification process that significantly reduces costs and improves efficiency for mortgage lenders from application to closing. The integration delivers substantial benefits to mortgage lenders and borrowers:
- Significant Cost Savings: Lenders using Truv save 60-80% on verification costs compared to traditional solutions, increasing margins per loan file.
- Accelerated Loan Processing: Direct-to-source verification data, coupled with Blue Sage’s automation capabilities, improves processing turn times and reduces time to close.
- Enhanced Borrower Experience: The fully digital verification process reduces the paper chase and cumbersome manual steps for borrowers.
- Streamlined Implementation: Lenders can go live quickly with minimal effort, using straightforward configurations.
Carmine Cacciavillani, CEO of Blue Sage Solutions, remarked, “This partnership with Truv aligns perfectly with our mission to modernize mortgage lending through technology. The integration provides our clients with instant access to critical verification data, eliminating manual processes while ensuring compliance and accuracy.” Andrew Badstubner, CIO at First Community Mortgage, said, “The integration between Truv and Blue Sage supports that mission by eliminating document clutter and verification delays by providing fast, reliable data straight from the source.”
New technique for detecting tampering in PDF documents uses Python to generate hashes of intricate PDF structures such as metadata and images, embedding them as hidden keys in the relevant file’s page objects
Researchers from the University of Pretoria have developed a new technique for detecting tampering in PDF documents by analyzing the file’s page objects. The new prototype uses Python to detect changes to a PDF document, such as edits to text, images, or metadata. PDFs are widely used across industries and are a target for criminals who want to tamper with contracts or spread misinformation. Current techniques for detecting changes in PDFs rely on watermarking and hashing, which only cover the visible parts of a PDF. Because these methods do not analyze hidden elements like metadata or background data, it is difficult to identify exactly where or what was changed. The new prototype uses the hashlib, Merkly, and PDFRW libraries to generate hashes and access intricate PDF structures. It performs two primary functions: protecting a PDF and assessing a PDF for forgery. To protect a PDF, the prototype reads the document and calculates unique digital fingerprints, known as hashes, from its various elements. These hashes are secretly embedded as new, hidden keys in the relevant page objects and in the PDF’s main “root” object. The PDF tampering prototype works well with Adobe Acrobat, but it does not yet detect all possible PDF changes, such as altering a document’s font without changing the actual content, or adding JavaScript code.
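The hash-and-embed idea can be illustrated with a small standard-library sketch. This is only an illustration of the general approach described above, not the researchers’ actual prototype: the element extraction, the pairwise (Merkle-style) combining scheme, and the key names (/ElemHashes, /PageRoot) are assumptions for demonstration.

```python
import hashlib

def sha256(data: bytes) -> str:
    """Hex digest of a single PDF element (text run, image stream, metadata)."""
    return hashlib.sha256(data).hexdigest()

def merkle_root(leaf_hashes: list[str]) -> str:
    """Combine per-element hashes pairwise into one root fingerprint."""
    level = leaf_hashes[:] or [sha256(b"")]
    while len(level) > 1:
        if len(level) % 2:          # duplicate the last hash on odd levels
            level.append(level[-1])
        level = [sha256((level[i] + level[i + 1]).encode())
                 for i in range(0, len(level), 2)]
    return level[0]

def protect(page_elements: list[bytes]) -> dict:
    """Build the hidden keys that would be embedded in a page object."""
    leaves = [sha256(e) for e in page_elements]
    return {"/ElemHashes": leaves, "/PageRoot": merkle_root(leaves)}

def is_tampered(page_elements: list[bytes], embedded: dict) -> bool:
    """Recompute fingerprints and compare against the embedded hidden keys."""
    return protect(page_elements) != embedded

# Usage: a page with a text run, an image stream, and a metadata entry.
elements = [b"Hello world", b"<image bytes>", b"/Author (Alice)"]
keys = protect(elements)
assert not is_tampered(elements, keys)
# Changing a single character in the text is detected:
assert is_tampered([b"Hello w0rld", b"<image bytes>", b"/Author (Alice)"], keys)
```

Because each element gets its own leaf hash, a verifier can also report which element’s hash mismatches, narrowing down where the change occurred rather than just flagging the file as a whole.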
Naext’s indoor spatial computing enables people with visual impairments to navigate large-scale, complex buildings independently through smartphones and smart glasses
Naext, founded by Lukas van Delft and Victor van Dinten, is a European startup focused on creating a more accessible world through innovative, privacy-friendly technology. The company uses AI, computer vision, and immersive experiences to enable people to navigate complex buildings independently, using smartphones and smart glasses. Naext’s most visible application is making Dutch public transportation more accessible for people with and without visual impairments. The startup has raised €1.5 million in funding and plans to roll out its technology across Europe through European mobility hubs. The company competes with the likes of Be My Eyes, GoodMaps, and Niantic, but sees them as ecosystem partners. Naext’s biggest success is the largest hospital in the Netherlands, which now has over 1,000 active users every month. The Brainport ecosystem supports Naext’s development with investors, partners, and talent. However, there is room for improvement in the willingness of large clients to adopt startup technology at scale.
Apple’s AI agent can provide accessible interactions using Street View imagery, analyze what is seen on a route, and describe the details of the elements to offer contextual clues for visually impaired users
A paper released through Apple Machine Learning Research describes SceneScout, a multimodal LLM-driven AI agent that can view Street View imagery, analyze what is seen, and describe it to the user. At the moment, pre-travel advice covers details like landmarks and turn-by-turn navigation, which do not provide much in the way of landscape context for visually impaired users. Street View-style imagery, such as Apple Maps Look Around, presents sighted users with many more contextual clues, which people who cannot see the imagery miss out on. This is where SceneScout steps in, as an AI agent that provides accessible interactions using Street View imagery. SceneScout has two modes. Route Preview provides details of elements the agent can observe along a route; for example, it could advise the user of trees at a turning and other, more tactile elements. A second mode, Virtual Exploration, enables free movement within Street View imagery, describing elements to the user as they virtually move around. In its user study, the team determined that SceneScout helps visually impaired people uncover information that they would not otherwise be able to access using existing methods. If the research pans out, it could become a tool that lets visually impaired people virtually explore a location in advance.
Dynamic Island again rumored to change with iPhone 17
The iPhone 17 range has, again, been rumored to use a new Dynamic Island, changing how the UI elements appear for the new smartphone line. Serial leaker Digital Chat Station has posted a series of details about the iPhone 17 collection, including a mention of the Dynamic Island. “The system has a brand new Smart Island UI,” the leaker says, according to a computerized translation. The Dynamic Island’s UI change is only part of the short list of changes on the way, according to the leaker’s post. The rest of the list mentions that the standard iPhone 17 will have a fine-tuned design, without saying what that entails. The Pro series will have a new-design “horizontal large matrix,” but again, there is no explanation of what this specifically applies to. There is also the expectation of LIPO screens with narrower bezels, the use of a high-resolution 5x optical zoom camera on the Pro models, and the previously rumored camera bump changes. In January, analyst Ming-Chi Kuo said that the Dynamic Island’s size will “remain largely unchanged across the 2H25 iPhone 17 series.” This does give a little leeway for the Pro Max to be changed while the others stay the same, but Digital Chat Station’s latest claim seems to apply to more than one model.
iOS 26 lends a frosted glass appearance to the Lock Screen clock, lets users apply lighting effects to any of the clock fonts and choose a color to tint the glass for a realistic effect, and allows the clock to be resized to better match the iPhone’s wallpaper
Liquid Glass is everywhere in iOS 26, and it starts right when you pick up your device. Here’s what you’ll see first when you upgrade to iOS 26. The two customizable control buttons on the Lock Screen are larger and have a floating, glass-like appearance like the other Liquid Glass interface options in iOS 26. The clock has a frosted glass appearance with the new “Glass” option, using lighting effects to make it look like glass in the real world. Glass can be selected for any of the clock fonts, and you can choose a color to tint the glass. Apple has multiple preset options, or you can select your own. When you tilt your iPhone, light reflects and glints with the movement, for a realistic glass effect. Notifications that are on your Lock Screen have a Liquid Glass aesthetic with a frosted glass look that leaves your wallpaper visible behind them. In addition to having a Liquid Glass aesthetic, the clock can be resized to better match your iPhone’s wallpaper using a new adaptive feature. When you’re customizing your Lock Screen, you can grab the corner of the time and drag it down to expand it. Adjusting the size of the time only works with the first font option, and only with the standard Arabic, Western numbering. With photo wallpapers, the time can automatically expand to fill in missing space, and it can change based on the image if you have Photo Shuffle set. The subject in photo wallpapers is meant to always be visible, and can overlap the time in unique ways in iOS 26. There is a new default wallpaper that was designed for iOS 26. It’s multiple shades of blue, with the same floating glass aesthetic that the rest of iOS 26 features. The wallpaper can subtly shift with iPhone movement. Aside from the Liquid Glass time, Spatial Scenes are the biggest change to the Lock Screen. 2D photos that you set as wallpaper can be turned into 3D spatial images that separate the subject of the photo from the background using depth information. 
When you move your iPhone, Spatial Scenes shift and move along with it, making the images feel alive. Spatial Scenes is a feature in the Photos app too, and it can be applied to any image that you’ve taken with your iPhone, including older ones. Lock Screen widgets can be placed at the top of the display under the time, or at the bottom of the display. With the adaptive clock and new wallpaper options, widgets can also shift down automatically to ensure the subject of an image is always visible. Apple added a new Lock Screen widget for Apple Music search, but there are no other new Lock Screen widget options. What is new, though, is a full-screen Now Playing interface that shows album art, which expands and animates right on the Lock Screen.
Google enables launching AI Mode with a one-tap search shortcut on Android and iOS that does away with the introductory homepage; on iOS, adds a slick animation with a four-color glow that expands to encompass the entire screen
Besides the widget shortcut, Google is making AI Mode faster to access with one-tap search on Android and iOS. Previously, launching AI Mode from the shortcut beneath the Search bar in the Google app or widget would bring you to an introductory homepage. You’d then have to touch the “Ask AI Mode” field before you could start typing. Opening AI Mode now immediately takes you to the input box with the keyboard open. The header just shows the ‘G’ logo (and close button), while the suggested queries carousel disappears after you enter text for a minimalist look. With the previous homepage no longer available, you cannot quickly access conversation history. Google tells us to soon expect direct access from the text field. One-tap AI Mode access is live on both Google for Android and iOS. On the latter platform, Google has introduced a very slick animation. Tapping the AI Mode button will expand the usual Search field to encompass your entire screen as the keyboard pops up. As this occurs, there’s a four-color glow around the expanding perimeter that looks very nice. It fades out just as everything settles, while closing AI Mode also results in a visual effect. There’s no equivalent animation on Android right now, but there are other colorful touches.
Datadog acquires Eppo, a feature-flagging and experimentation platform that offers “confidence intervals” to make it easier to understand and interpret the results of randomized experiments across different versions of apps and models
Datadog has acquired Eppo, a feature-flagging and experimentation platform. Despite the demand for tools that let developers experiment with different versions of apps, the infrastructure required for product analytics remains relatively complex to build. Beyond data pipelines and statistical methods, experimentation infrastructure relies on analytics workflows often sourced from difficult-to-configure cloud environments. Eppo will continue supporting existing customers and bringing on new ones under the brand “Eppo by Datadog.” Eppo offers “confidence intervals” to make it easier to understand and interpret the results of a randomized app experiment. The platform supports experimentation with AI and machine learning models, leveraging techniques to perform live experiments that show whether one model is outperforming another. Eppo co-founder and CEO Che Sharma said “With Datadog, we are uniting product analytics, feature management, AI, and experimentation capabilities for businesses to reduce risk, learn quickly, and ship high-quality products.” For Datadog, the Eppo buy could bolster the company’s current product analytics solutions. “The use of multiple AI models increases the complexity of deploying applications in production,” Michael Whetten, VP of Product at Datadog, said. “Experimentation solves this correlation and measurement problem, enabling teams to compare multiple models side-by-side, determine user engagement against cost tradeoffs, and ultimately build AI products that deliver measurable value.”
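To illustrate what a confidence interval tells you about a randomized app experiment, here is a generic normal-approximation sketch for the difference between two conversion rates. This is a textbook two-proportion interval for illustration only, not Eppo’s actual statistical methodology; the function name and the example figures are made up.

```python
import math

def diff_confidence_interval(conv_a: int, n_a: int,
                             conv_b: int, n_b: int,
                             z: float = 1.96) -> tuple[float, float]:
    """95% CI for the lift of variant B over variant A (difference in
    conversion rates), using the normal approximation for two proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Standard error of the difference between two independent proportions.
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    diff = p_b - p_a
    return diff - z * se, diff + z * se

# Hypothetical experiment: B converted 230/2000 users vs. 200/2000 for A.
lo, hi = diff_confidence_interval(200, 2000, 230, 2000)
# If the interval excludes 0, the experiment suggests a real difference;
# if it straddles 0, the result is inconclusive at this sample size.
print(f"B - A lift, 95% CI: [{lo:.4f}, {hi:.4f}]")
```

Framing results as an interval rather than a point estimate is what makes experiment readouts easier to interpret: it communicates both the direction of the effect and how much uncertainty the sample size leaves.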
Google redesigns the Search bar widget on Android, taking after the Circle to Search revamp earlier this year with an overarching pill-shaped container
Google is rolling out a redesign of the Search bar homescreen widget on Android that better emphasizes the optional shortcut. The previous design was a pill with the Google ‘G’ logo at the left, followed by a custom shortcut, voice input microphone, and Google Lens shortcut. The new design takes after the Circle to Search revamp earlier this year, with an overarching pill-shaped container. It’s slightly taller than before, which aligns with Material 3’s preference for thicker search fields. At the left is a large Search bar that’s unchanged. What’s new is how Google moved the optional shortcut to a standalone circle at the right. This makes the custom button stand out much more and makes it easier to tap. The available options are: None, AI Mode, Translate (text), Song Search, Weather, Translate (camera), Sports, Dictionary, Homework, Finance, Saved, and News. The minimum width to have everything appear is 4×1 instead of 3×1, which might disrupt some layouts. When you adjust the transparency slider, the outer container is what changes the most. We’re seeing this Search bar redesign with Google app 16.17 (latest beta). If you don’t have this change yet, highlight the widget on your homescreen and tap the pencil icon.