AI visionary Sam Altman is leading a project to distinguish real people from software fakes on the internet using eye scans. World, a web3 project started by Altman and Alex Blania that was formerly known as Worldcoin, is based on the idea that it will eventually be impossible to distinguish humans from AI agents online. To address this, World wants to create digital "proof of human" tools. Users create a profile called a "World ID" by scanning their eyes with one of World's silver metal Orb scanners (or now, one of its Orb Minis); in return, World issues a unique identifier on the blockchain to verify that they are human, and as an incentive it is launching its own digital currency. The project is now entering the money transfer and financial services business: users can send money to friends and family free of charge via the World app and will receive an account number for interactions with the banking system. World is also targeting new markets through companies such as gaming specialist Razer and dating platform operator Match Group. With these new functions, World is moving closer to the vision of a super app that covers all possible areas of everyday life, similar to WeChat in Asia. The announcements are part of its effort to get millions of people to sign up.
Wells Fargo to bring Operation HOPE in-branch SME financial coaching program to more neighborhoods in Los Angeles and Charlotte; targets expansion to 50 markets across the U.S. by 2026
Wells Fargo, in collaboration with Operation HOPE, a national nonprofit dedicated to financial empowerment for underserved communities, today introduced HOPE Inside for Small Business, which provides financial coaching and support to small business customers in key markets at no cost. Starting in the Baldwin Hills and Van Nuys neighborhoods of Los Angeles, Calif., and in Charlotte, N.C., this expansion builds on the existing HOPE Inside program that helps empower community members to achieve their financial goals through financial education workshops and personalized coaching.
Wells Fargo first launched HOPE Inside centers in 2022 to serve individuals with their personal finances and help build financial resilience through guidance in areas like budgeting, credit building, and money management. The new small business HOPE Inside centers will offer those same specialized resources to small business owners, in addition to business plan development, education on access to capital, and more. Through one-on-one coaching, entrepreneurs will receive personalized support to help start, grow, and sustain their businesses. "We know that having access to trusted financial guidance is invaluable," said April Schneider, head of Small and Business Banking at Wells Fargo. "This kind of continued community investment showcases how important small businesses are to our communities and our commitment to help them thrive."
HOPE Inside centers are located inside Wells Fargo branches in select markets and feature Operation HOPE financial coaches who foster financial inclusion and economic empowerment. The branches feature redesigned and updated spaces created to deliver one-on-one consultations, improve digital access, and offer financial health workshops. "Small businesses are the backbone of our communities, and we want to provide them with the tools and resources they need to succeed," said Michael Martino, head of the Banking Inclusion Initiative at Wells Fargo. "Expanding HOPE Inside to support small businesses is a natural evolution of our work with Operation HOPE and reinforces our shared vision of building stronger, more financially resilient communities." The expansion is part of a broader national effort through Wells Fargo's Banking Inclusion Initiative to bring HOPE Inside to 50 markets across the U.S. by 2026. The program currently operates 30 HOPE Inside centers serving more than 100 branches and has helped more than 11,000 clients since launching in 2022. All services offered through HOPE Inside are free and available to community members, whether they are Wells Fargo customers or not.
Sakana's Continuous Thought Machines (CTM) AI model architecture uses short-term memory of previous neuron states and allows neural synchronization to emerge, mirroring brain-like intelligence
AI startup Sakana has unveiled a new type of AI model architecture called Continuous Thought Machines (CTM). Rather than relying on fixed, parallel layers that process inputs all at once, as Transformer models do, CTMs unfold computation over internal steps known as "ticks" within each input/output unit, or artificial "neuron." Each neuron in the model retains a short history of its previous activity and uses that memory to decide when to activate again. This added internal state allows CTMs to adjust the depth and duration of their reasoning dynamically, depending on the complexity of the task: the number of ticks varies with the information inputted, and may differ even for identical inputs, because each neuron decides how many ticks to undergo before providing an output (or not providing one at all). As such, each neuron is far more informationally dense and complex than in a typical Transformer model. This time-based architecture represents both a technical and philosophical departure from conventional deep learning, moving toward a more biologically grounded model. Sakana has framed CTMs as a step toward more brain-like intelligence: systems that adapt over time, process information flexibly, and engage in deeper internal computation when needed. Sakana's goal is "to eventually achieve levels of competency that rival or surpass human brains." The CTM is built around two key mechanisms.
First, each neuron in the model maintains a short "history," or working memory, of when it activated and why, and uses this history to decide when to fire next. Second, neural synchronization (how and when groups of a model's artificial neurons "fire," or process information together) is allowed to happen organically. Groups of neurons decide when to fire together based on internal alignment, not external instructions or reward shaping. These synchronization events are used to modulate attention and produce outputs: attention is directed toward the areas where more neurons are firing. The model isn't just processing data; it's timing its thinking to match the complexity of the task. Together, these mechanisms let CTMs reduce computational load on simpler tasks while applying deeper, prolonged reasoning where needed.
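The two mechanisms above can be illustrated with a toy sketch. This is not Sakana's implementation; the neuron model, thresholds, and synchronization rule here are invented for illustration. Each neuron keeps a short history of its own activity, accumulates evidence across internal ticks, and the "model" stops thinking once enough neurons fire in sync, so harder (weaker) inputs take more ticks:

```python
import numpy as np

rng = np.random.default_rng(0)

class TickingNeuron:
    """Toy neuron with a short-term memory of its own past activity.
    It accumulates drive across internal 'ticks' and fires once the
    accumulated drive crosses its threshold."""
    def __init__(self, history_len=5, threshold=1.0):
        self.sensitivity = abs(rng.normal()) + 0.05  # per-neuron input gain
        self.history = []                            # recent drive values
        self.history_len = history_len
        self.threshold = threshold

    def step(self, signal):
        prev = self.history[-1] if self.history else 0.0
        drive = prev + self.sensitivity * signal       # evidence accumulates
        self.history = (self.history + [drive])[-self.history_len:]
        return drive > self.threshold                  # True once it fires

def think(neuron_count, x, max_ticks=200, sync_target=0.8):
    """Run internal ticks until a large enough fraction of neurons fire
    together -- a crude stand-in for CTM's synchronization signal."""
    neurons = [TickingNeuron() for _ in range(neuron_count)]
    for tick in range(1, max_ticks + 1):
        fired = [n.step(x) for n in neurons]
        if sum(fired) / len(fired) >= sync_target:     # consensus reached
            return tick
    return max_ticks

ticks_easy = think(64, x=1.0)   # strong, clear input: syncs quickly
ticks_hard = think(64, x=0.2)   # weak, ambiguous input: more ticks needed
```

The point of the sketch is only the shape of the behavior: reasoning duration is an emergent property of per-neuron memory and group synchronization, not a fixed layer count.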
Adoption of developer-focused AI tools is surging at the expense of freelance platforms such as Fiverr and Upwork; AI writing tools, crowdsourcing and search are fading fast
A new report released by the publicly traded market research and intelligence firm SimilarWeb, covering global web traffic patterns for AI-related platforms for the 12 weeks through May 9, 2025, offers enterprises and interested users a helpful look at the current landscape of generative AI usage online. Using proprietary analytics based on site visits, the report tracks trends across sectors including general-purpose AI tools, coding assistants, content generators, and more. Here are five key findings from the report: 1) Usage of developer AI and coding tools is rising fast: Developer-focused AI tools are surging in adoption, with traffic to the category up 75% over the past 12 weeks. That growth includes Lovable, which exploded with a jaw-dropping +17,600% spike, and Cursor, which grew steadily month over month. 2) We all know DeepSeek had a moment earlier this year, but so did Grok, and now both have fallen back into low plateaus: Grok traffic skyrocketed more than 1,000,000% in March, driven by its branding as an uncensored yet powerfully intelligent platform and its Elon Musk association, before falling back sharply by early May. DeepSeek saw a similar arc, peaking at +17,701% growth before crashing 41%. The takeaway: virality can't replace retention, especially compared to AI leader OpenAI and legacy tech brand Google. 3) AI writing tools are fading fast: Category traffic fell 11% overall, with platforms like Wordtune (-35%), Jasper (-19%), and Rytr (-23%) all trending downward. Only Originality.ai bucked the trend with steady traffic gains, likely due to its focus on AI detection rather than generation. This decline suggests content saturation and possibly growing skepticism over quality or usefulness. 4) AI image generators and design tools show extreme volatility: Design-focused AI remains a mixed bag. While overall category usage dipped slightly (-6% over the 12-week window), some platforms made outsized gains.
The erratic pattern may reflect a crowded landscape of tools that offer similar functionality but compete on novelty or aesthetics. 5) AI is eating into legacy tech such as crowdsourcing and search: Freelance platforms such as Fiverr (-17%) and Upwork (-19%) are losing traffic, possibly as users turn to AI tools for tasks like design, writing, and code. Search engines such as Yahoo (-12%) and Bing (-14%) continue a multi-quarter drop in visits, while consumer EdTech companies like Chegg (-62%) and CourseHero (-68%) are in free fall. The signs point to early-stage AI disruption beginning to erode the utility of some legacy platforms. That also offers a hint to enterprises that leverage or create such services: the time may be coming to reduce dependency on them, whether from a revenue generation, marketing, or overall business perspective.
J.P. Morgan Payments supports Amtrak with enhanced cash reconciliation and centralized treasury management on SAP S/4HANA
Amtrak needed a way to better forecast the company's cash flow. Most cash flow forecasts in the industry rely on a generally accepted accounting principles view of accounting and transactions. However, Amtrak's treasury team knew that a liquidity-based view would be more effective for its unique situation. "We needed to become a data-driven treasury department," says Ashmore. "And we recognized that the best source of data was the bank." While meeting with J.P. Morgan Payments in October 2022, Ashmore requested a forecasting tool and learned that the bank already had such a solution in beta mode. Amtrak then implemented the beta solution and provided feedback as J.P. Morgan Payments finished enhancing and building out the tool. Amtrak went live with the Cash Flow Intelligence tool, which is built into the J.P. Morgan Access® platform, in May 2023. After increasing visibility into its cash flow position, the Amtrak treasury team wanted to further strengthen its financial controls by sharing insights with colleagues in finance. The company began using the J.P. Morgan Payments SAP plug-in for real-time treasury (SAP RTT) to manage station cash reconciliation as part of an ongoing effort to centralize its treasury management system through SAP implementation. Leveraging SAP RTT's real-time cash position, reporting, reconciliation and real-time payment tracking, Amtrak could track payments end-to-end across the company and enable real-time processing for accounts payable and receivable. Ashmore shared that after working closely with J.P. Morgan Payments, Amtrak implemented the solution with only a few clicks. Doing so meant Amtrak had access to cutting-edge artificial intelligence and machine learning technology without having to invest significant internal resources.
Right away, Amtrak put the Cash Flow Intelligence tool to use to find previously invisible patterns in the company's cash flow and to segregate cash flows into categories. For example, it separated daily credit card receipts, monthly receipts from state and agency partners, and more infrequent federal receipts throughout the year. Ashmore shared that this significantly increased projection accuracy because the large payments no longer interfered with daily and monthly forecasting. This accuracy helped Amtrak free up balances it had set aside to cover cash flow issues. By investing these idle balances, Amtrak generated better returns for the company. It also provided an opportunity to receive more grant-based funding for large infrastructure projects. Through its relationship with J.P. Morgan Payments, Amtrak discovered a cash forecasting solution that supported the company's overall business goals of increasing ridership and maintaining excellence in customer service. With J.P. Morgan's digital dashboard within SAP, Amtrak can view balances and transactions received from the bank during the day and more precisely manage its overall cash position. The company plans to continue pursuing digital evolution and system enhancement as it expands its routes and improves its services.
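The accuracy gain from segregating flows is easy to see with toy numbers. The amounts and categories below are hypothetical, chosen only to mirror the article's example of daily card receipts, monthly partner receipts, and a rare large federal receipt; the sketch shows how one outsized payment distorts a blended average while per-category averages stay stable:

```python
from statistics import mean

# Hypothetical quarterly cash receipts as (amount, category) pairs.
receipts = (
    [(100 + i % 7, "card") for i in range(90)]   # small daily card flows
    + [(2_000, "partner") for _ in range(3)]     # monthly partner receipts
    + [(50_000, "federal")]                      # one large federal receipt
)

# Naive forecast of a "typical" receipt: average over everything.
# The single federal payment drags this far above the everyday flows.
blended = mean(amount for amount, _ in receipts)

# Segregated view: average per category, so the rare large receipt
# no longer inflates the daily baseline.
by_cat = {}
for amount, cat in receipts:
    by_cat.setdefault(cat, []).append(amount)
per_category = {cat: mean(vals) for cat, vals in by_cat.items()}
```

With these numbers the blended average lands near 694 while the typical card receipt is about 103, which is the distortion Amtrak avoided by forecasting each category on its own schedule.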
Dell leverages ‘customer zero’ model—using internal teams as first adopters—to refine AI services in real time and accelerate scalable enterprise deployment
Customer zero is becoming a strategic advantage in the age of AI-powered services. Enterprises deploying artificial intelligence at scale are learning that the real advantage isn't just in new tools; it's in being their own first customer. This "customer zero" approach lets them test AI in-house, fine-tune it in real time and apply those insights externally. By embedding intelligence into workflows from day zero, they gain speed, precision and a repeatable model for real-world impact, according to Doug Schmitt, chief information officer of Dell Technologies Inc. and president of Dell Technologies Services Inc. Acting as customer zero gives Dell greater control over AI's impact while grounding innovation in real business needs, not theory. By testing and refining AI internally, Dell builds credibility and sharpens its services-led approach, guiding customers from strategy to deployment. That firsthand experience helps reduce friction and deliver smoother, faster AI transformations across the enterprise, according to Scott Bils, vice president and general manager of product management, professional services, at Dell Technologies. "When you take a look at what we're doing from a professional services standpoint, it's really to help customers on their end-to-end journey and helping them drive AI transformation. It's around helping them deploy," he said. "We provide the consulting services and manage services around it all the way from day zero to day two plus. As we go through the journey internally at Dell, [it] provides us a tremendous amount of insight that we can then take to our customers and help accelerate their journey." This internal-first mindset has also enabled Dell to digitize and refine its own processes, creating a blueprint that customers can follow. AI is layered into workflows that are already disciplined and well understood, allowing for both rapid experimentation and reliable results.
That strong foundation of data, automation and process discipline provides fertile ground for scalable AI and LLM deployment, Schmitt noted. Customer zero is more than a model; it’s a mindset that blends internal accountability with innovation, giving organizations the confidence to build, test and deliver real-world AI outcomes. As AI factories mature and agents begin to automate decisions across the enterprise, that feedback loop between internal use and external delivery will be essential.
Marqeta plans white label app that will allow customers to establish a track record for card programs without having to embed Marqeta's solution into their app or website right out of the gate
Under the leadership of Mike Milotich, Marqeta has been making moves to add new revenue sources beyond Jack Dorsey's Block. The plan is diversification through growth, a task easier said than done against the backdrop of macroeconomic and regulatory uncertainty. Milotich currently serves as Marqeta's interim CEO as well as its chief financial officer. Broadly, flexible planning and an emphasis on execution will help Marqeta as an organization reach those goals, he said. The fintech has started planning in quarterly chunks, with mid-quarter check-ins to assess market conditions. Specifically, Marqeta is looking to broaden its customer base with new products, Milotich said, including an expansion into credit card issuing; the addition of more value-added services, such as tokenization and risk services; and new program management services, where Marqeta runs the card program on behalf of the client. "Before, [program management services] used to be more of a bundle, and now we're breaking them up into more a la carte services, which allows our customers a little more flexibility to pick and choose," he said. The card-issuing fintech is also looking to expand abroad with its pending acquisition of TransAct Pay, a Europe-based BIN sponsorship, e-money licensing and virtual account services company that will allow Marqeta to offer more robust card programs to its multinational clients. "Non-Block [total payment volume] saw continued strength and little to no macro-disruption," KeyBanc Capital Markets analyst Alex Markgraff wrote in a research note. "We view the print as generally positive with respect to new-business, non-Block growth, and macro-related resilience to date. Non-Block TPV grew roughly twice as fast as Block." Block-related revenue was less than half (45%) of Marqeta's total revenue at the end of the quarter, down from 74% at the end of 2022.
Marqeta's biggest bets to increase the diversity of its clientele revolve around creating tools that make doing business with the fintech easier. To that end, Marqeta is launching a white label app that will allow customers to stand up a card program without heavy integration, Milotich said. The white label app lets customers establish a track record for a card program without having to go through the process of embedding Marqeta's solution into their app or website right out of the gate. It is built with the tools that power Marqeta's UX Toolkit, a selection of application programming interfaces released in 2024 that are designed to allow customers to more easily embed card solutions into their app or website. At its core, the white label app is a time-to-market tool.
Google’s contribution to vibe coding is Stitch, a platform that designs user interfaces (UIs) with one prompt
Google is releasing Stitch, a new experiment from Google Labs, to compete with Microsoft, AWS, and other existing end-to-end coding tools. Now in beta, the platform designs user interfaces (UIs) from a single prompt. With Google Stitch, users can designate whether they want to build a dashboard, web app, or mobile app and describe what it should look like (such as color palettes or the user experience they're going for). The platform instantly generates HTML and CSS templates with editable components that devs and non-devs can customize (such as instructing Stitch to add a search function to the home screen). They can then add the output directly to apps or export it to Figma. Users can choose a 'standard mode' that runs on Gemini 2.5 Flash or switch to an 'experimental mode' that uses Gemini Pro and allows users to upload visual elements such as screenshots, wireframes and sketches to guide what the platform generates. Google also plans to release a feature allowing users to annotate screenshots to make changes. Stitch is "meant for quick first drafts, wireframes and MVP-ready frontends."
Senator Warren urges Fed to reconsider Capital One deal for Discover as it would inflict “serious harm” on consumers and the banking system
The top Democrats on congressional banking committees called on the Federal Reserve to reconsider its decision to approve Capital One Financial Corp.’s purchase of Discover Financial Services, saying it would inflict “serious harm” on consumers and the banking system. The decision sounds like the Fed “had predetermined it was going to approve the transaction and either ignored relevant facts or explained them away with baseless assertions copied and pasted from Capital One’s application,” Senator Elizabeth Warren and Representative Maxine Waters said in a letter sent to the Fed. “Treating the transaction as a traditional bank merger was deeply misguided,” the lawmakers wrote. “These are not two traditional banks — they are card giants.” Warren and Waters emphasized the Fed’s review failed to appropriately assess the competitive effects on the credit-card market and didn’t take into account the views of the Consumer Financial Protection Bureau and the Federal Deposit Insurance Corp. The Fed said in an order last month that it consulted with other regulatory agencies including the FDIC and CFPB. Capital One said the deal’s approval follows “an exhaustive, fact-based 14-month examination where legal and regulatory experts examined the deal’s competitive impact, financial stability considerations, community needs, and all other relevant factors.”
IBM's small models, Tiny Time Mixers, tackle network automation challenges where traditional large language models fall short, thanks to their understanding of time-series data
IBM Corp. is leaning into compact, specialized models — such as its new Tiny Time Mixers — to tackle network automation challenges where traditional large language models fall short. The key lies in understanding time-series data, something most large language models simply weren’t built to handle, according to Andrew Coward, general manager of software networking at IBM. “There’s new models, and IBM’s built one called Tiny Time Mixer. Very small parameters, million parameters, and they understand time. We can take network data, and then we can apply it to weather information or TV schedules. Then we can make predictions about what’s likely to happen. What we are seeing is the democratization of AI,” he said. “It’s almost free to put data in and run it against AI models, but if you need to train it, that’s the expensive bit. The training piece is coming down massively in costs.” Using small models, IBM helps address telco infrastructure problems, such as bandwidth congestion and poor network coverage. This explains why AI model accuracy takes center stage, Coward pointed out.
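Coward's example of joining network data with weather or TV schedules can be sketched in miniature. The sketch below is an illustrative stand-in, not IBM's actual Tiny Time Mixer architecture: it uses plain least squares on synthetic hourly load, and shows that a tiny model that is given a known exogenous schedule (here, a prime-time TV broadcast every third evening) predicts congestion spikes that a time-of-day-only model misses:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic hourly network load for two weeks: a daily rhythm plus a
# demand spike whenever a big TV broadcast airs every third evening.
hours = np.arange(24 * 14)
tv_event = (((hours % 24) == 20) & ((hours // 24) % 3 == 0)).astype(float)
load = (10 + 5 * np.sin(2 * np.pi * hours / 24)
        + 8 * tv_event + rng.normal(0, 0.5, hours.size))

# Time-of-day features (what a schedule-unaware model can see).
tod_sin = np.sin(2 * np.pi * hours / 24)
tod_cos = np.cos(2 * np.pi * hours / 24)
ones = np.ones_like(load)

def fit_mae(*features):
    """Ordinary least squares fit; returns mean absolute error."""
    X = np.column_stack(features)
    coef, *_ = np.linalg.lstsq(X, load, rcond=None)
    return float(np.mean(np.abs(X @ coef - load)))

mae_time_only = fit_mae(ones, tod_sin, tod_cos)            # misses TV spikes
mae_with_tv = fit_mae(ones, tod_sin, tod_cos, tv_event)    # catches them
```

The design point mirrors the article: the value comes less from model size than from feeding a small, time-aware model the right external signals, which keeps inference nearly free while training costs stay low.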