M0 Foundation, a platform building infrastructure for crypto apps and protocols to create application-specific stablecoins, has raised $40 million in early funding to let developers deploy custom digital currencies. M0 aims to give developers the ability to build their own applications around a stablecoin without the need for a centralized issuer, providing the infrastructure to launch application-specific digital currencies. To do this, M0 separates stablecoin reserve management from programmability. When stablecoins are created, they must be backed by assets held by the issuer, such as U.S. dollars or government bonds, that cover the stablecoins in circulation. This reserve acts as collateral, allowing users to redeem the stablecoin for the underlying asset and providing confidence in the coin's stability. Using the platform, developers can do more than build their own apps on top of an existing stablecoin, Prosperi said. It allows them to create and manage their own digital money by giving them the building blocks to specialize according to their business and application needs.
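For illustration only, here is a minimal sketch of the reserve-backed model described above, separating a reserve ledger from the application-specific token that draws on it; the class names, amounts and methods are hypothetical and are not M0's actual design or API.

```python
from dataclasses import dataclass

# Minimal illustrative sketch (not M0's implementation): a reserve ledger that
# backs an application-specific token, keeping reserve accounting separate
# from the token layer, as the article describes.

@dataclass
class Reserve:
    """Collateral (e.g., dollars or T-bills, in cents) backing tokens in circulation."""
    collateral: int

class AppStablecoin:
    def __init__(self, name: str, reserve: Reserve):
        self.name = name
        self.reserve = reserve
        self.supply = 0
        self.balances: dict[str, int] = {}

    def mint(self, holder: str, amount: int) -> None:
        # Issuance is only allowed while circulating supply stays fully
        # covered by the collateral held in the reserve.
        if self.supply + amount > self.reserve.collateral:
            raise ValueError("insufficient reserves to back new tokens")
        self.supply += amount
        self.balances[holder] = self.balances.get(holder, 0) + amount

    def redeem(self, holder: str, amount: int) -> int:
        # Redemption burns tokens and releases the underlying collateral.
        if self.balances.get(holder, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[holder] -= amount
        self.supply -= amount
        self.reserve.collateral -= amount
        return amount  # collateral returned to the holder, in cents

reserve = Reserve(collateral=1_000_000)     # $10,000 of hypothetical collateral
coin = AppStablecoin("game-credits-USD", reserve)
coin.mint("alice", 250_000)                 # fully backed mint
refund = coin.redeem("alice", 100_000)      # redeem for the underlying asset
```

In this toy model, minting is only possible while circulating supply stays fully covered by collateral, and redemption burns tokens and releases the underlying asset, which is the property that gives holders confidence in the coin's stability.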
Truist claims competitor Colliers’ raid on its real estate finance arm led to layoffs and major revenue loss
Truist is pushing back against a competitor it says orchestrated a sweeping raid of its mortgage unit, claiming the move drained the bank of revenue and forced layoffs. Charlotte, North Carolina-based Truist Financial Corp. and its mortgage banking arm resisted a bid for a pretrial win by its former executives' new employer, arguing that troves of evidence sustain its claims that over 50 employees were illegally poached, costing the bank tens of millions of dollars in losses. Truist argued in the North Carolina Business Court that it's sufficiently alleged — or at least, facts remain at issue — that former executives Matthew Rocco, Joe Lovell and John Randall were the architects of a scheme to cripple Truist and its real estate finance arm Grandbridge Real Estate Capital LLC. In "weaving its story," competitor Colliers Mortgage Holdings LLC ignored evidence showing that it directed the trio to share trade secrets and divert employees. "In short, in purporting to outline the material facts, Colliers ignores all evidence of its unlawful collusion with executive defendants to concoct a plan that, in both design and effect, would leave plaintiffs' mortgage banking business a shell of itself," according to a partially unsealed version of Truist's brief filed Monday. Truist sued the three former Grandbridge executives and Colliers in early 2023, alleging a coordinated conspiracy to recruit employees and sabotage the bank's mortgage business. All three resigned in December 2022 and joined Colliers. According to Truist, the executives were bitter over Colliers' failed buyout bid for Grandbridge. The trio countersued in May 2024, alleging Truist took control of Grandbridge, seeking to fold it into the larger business, and tried to force them out to avoid severance. That the executives may have made an earlier decision to leave the company is immaterial, as that has no bearing on whether Colliers encouraged them to breach their NDAs or non-solicitation agreements, Truist said. Further, Colliers' arguments that there's no evidence to support those claims are patently wrong, Truist said, citing an order from the court last year trimming the suit, in which Judge Louis A. Bledsoe III determined Truist's allegations are sufficient to assert "Colliers' purposeful conduct," according to the brief. The facts show that Rocco, Lovell and Randall were motivated by "prodigious financial incentives," and, at Colliers' direction, contacted Grandbridge employees about making the jump, Truist said. Colliers' alternative theory for why all the employees left doesn't acknowledge the plain evidence, Truist said. For example, much of the executives' effort to solicit Grandbridge employees was made weeks before Colliers posted any jobs, it said. Colliers' argument for summary judgment also ignored "Project Hornet," which Truist said is a framework for Colliers, Rocco, Lovell and Randall to recruit everyone at Grandbridge. What Colliers pans as "[o]rdinary industry competition" was far from it, as the company had the advantage of working with Grandbridge's top executives who provided a detailed solicitation roadmap, it said. "No other firm had executive defendants feeding it information and assisting it in violation of their fiduciary duties. No one else constructed a master plan to recruit and hire virtually every person who worked at Grandbridge," Truist said. Phone and text conversations show the executives preparing names and "walk over funds," it said.
CFOs adopt "generative finance" to shift from backward-looking data compilation to forward-looking strategic modeling, creating tailored forecasts that simulate the many possible ways a business could evolve
Generative AI is introducing a whole new paradigm in the form of Generative Finance. With it you have a forward-looking capability that takes you from reporting on what happened to a model that more strategically considers what could happen next. It helps you refocus on the windshield, simulating the infinite possible ways your business can evolve. It means you can finally begin looking ahead, testing strategies for what happens next, and making assumptions and decisions based on a more complete picture of everything ahead of you. The shift from hindsight to foresight equips your organization with an edge in strategic thinking. It doesn’t just organize the numbers you have; it intelligently creates new data to show you what’s possible. It learns the unique patterns and rhythms of your specific business from your own data. It lets you explore hundreds of potential outcomes for any decision you’re considering. It presents a range of probable futures, not just a single, rigid prediction. It continually refines its understanding, getting smarter and more accurate over time. With generative finance, you can get detailed, data-supported answers in minutes, not months. When you adopt generative finance, your FP&A professionals are freed from the tedious work of manual data entry and report building. They can stop spending their time compiling spreadsheets and start investing their time in analysis and strategy. They become true business partners who can interpret complex scenarios, stress-test business plans, and provide the insightful guidance your leadership team needs to make better decisions. The real power of this technology is unlocked when it is explicitly trained on your company’s own data. By learning from your unique financial history, sales cycles, and operational details, the AI becomes a true expert on your business. This deep customization is what makes its forecasts and scenarios so reliable. The insights you get from a tailored generative finance model are directly relevant and immediately actionable because they are rooted in your reality.
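To make the "range of probable futures" idea concrete, here is a minimal sketch, assuming a simple Monte Carlo revenue simulation; it is not any vendor's product, and the growth and volatility figures are hypothetical placeholders.

```python
import random

# Illustrative only: simulate a range of revenue outcomes instead of a single
# point forecast, in the spirit of the "range of probable futures" above.

def simulate_revenue_paths(start: float, months: int, n_paths: int,
                           monthly_growth: float, volatility: float) -> list[list[float]]:
    paths = []
    for _ in range(n_paths):
        revenue, path = start, []
        for _ in range(months):
            # Each month draws a growth rate around the expected trend.
            revenue *= 1 + random.gauss(monthly_growth, volatility)
            path.append(revenue)
        paths.append(path)
    return paths

paths = simulate_revenue_paths(start=1_000_000, months=12, n_paths=500,
                               monthly_growth=0.02, volatility=0.05)
finals = sorted(p[-1] for p in paths)
# Report a band of outcomes (10th/50th/90th percentile), not one rigid number.
p10, p50, p90 = (finals[int(len(finals) * q)] for q in (0.10, 0.50, 0.90))
print(f"Year-end revenue range: {p10:,.0f} (p10) / {p50:,.0f} (median) / {p90:,.0f} (p90)")
```

The point of the sketch is the output shape: a band of plausible outcomes rather than a single rigid forecast, which is what lets FP&A teams stress-test plans against many futures at once.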
Nvidia's new GPU, designed for long-context inference, targets video and coding workloads with million-plus-token windows; shipping end of 2026
Nvidia announced a new GPU called the Rubin CPX, designed for context windows larger than 1 million tokens. Part of the chip giant’s forthcoming Rubin series, the CPX is optimized for processing large sequences of context and is meant to be used as part of a broader “disaggregated inference” infrastructure approach. For users, the result will be better performance on long-context tasks like video generation or software development. Nvidia’s relentless development cycle has resulted in enormous profits for the company, which brought in $41.1 billion in data center sales in its most recent quarter. The Rubin CPX is slated to be available at the end of 2026. Inference consists of two distinct phases: the context phase and the generation phase, each placing fundamentally different demands on infrastructure. The context phase is compute-bound, requiring high-throughput processing to ingest and analyze large volumes of input data to produce the first token output result. In contrast, the generation phase is memory bandwidth-bound, relying on fast memory transfers and high-speed interconnects, such as NVLink, to sustain token-by-token output performance. Disaggregated inference enables these phases to be processed independently, enabling targeted optimization of compute and memory resources. This architectural shift improves throughput, reduces latency, and enhances overall resource utilization.
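As a rough sketch of what disaggregated inference means in practice, the toy code below splits the two phases across separate worker types, handing the KV cache from the prefill side to the decode side; the class and function names are illustrative assumptions, not an Nvidia API.

```python
# Conceptual sketch of disaggregated inference: the compute-bound context
# (prefill) phase and the memory-bandwidth-bound generation (decode) phase
# run on separate pools of hardware, with the KV cache handed off between them.

class ContextWorker:            # e.g., a prefill-optimized GPU
    def prefill(self, prompt_tokens: list[int]) -> dict:
        # High-throughput pass over the full (possibly million-token) prompt,
        # producing the KV cache and the first output token.
        kv_cache = {"layers": f"kv for {len(prompt_tokens)} tokens"}
        first_token = 42  # placeholder
        return {"kv_cache": kv_cache, "first_token": first_token}

class GenerationWorker:         # e.g., a bandwidth/interconnect-optimized GPU
    def decode(self, kv_cache: dict, first_token: int, max_new: int) -> list[int]:
        # Token-by-token generation, dominated by memory reads of the KV cache.
        tokens = [first_token]
        for _ in range(max_new - 1):
            tokens.append(tokens[-1] + 1)  # placeholder next-token step
        return tokens

def disaggregated_inference(prompt_tokens: list[int], max_new: int) -> list[int]:
    ctx = ContextWorker().prefill(prompt_tokens)          # phase 1: compute-bound
    return GenerationWorker().decode(ctx["kv_cache"],     # phase 2: bandwidth-bound
                                     ctx["first_token"], max_new)

print(disaggregated_inference(list(range(1_000_000)), max_new=8))
```

Splitting the phases this way is what allows each side to be provisioned and scaled independently, which is the targeted optimization of compute and memory resources the article describes.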
Plaid has agreed to pay JPMorgan for customer data amid ‘open banking’ feud
Plaid Inc. agreed to pay JPMorgan Chase & Co. for its consumer data, the latest accord in a battle between financial technology firms and banks over who can access the sought-after information. Plaid and JPMorgan confirmed the fees but wouldn't disclose the amount. In addition to a fee structure, the updated pact includes commitments from both firms "to ensure that consumers can access their data safely, securely, quickly and consistently into the future." The two firms already had a data-sharing deal in place that didn't include fees. Plaid said the updated agreement wouldn't affect current customer deals and pricing. "This extended agreement ensures ongoing access for the millions of Chase customers who rely on Plaid every day to connect with the products and services they trust," Plaid Chief Operating Officer Eric Sager said. In the past, the bank has said Plaid and its peers "endlessly" access the bank's customer data for free and then profit off it by charging others to use it. Others view the charges as an effort to limit innovation and potentially shut out up-and-coming competitors that threaten to lure away the bank's customers. Plaid's deal with JPMorgan, reached last week, also derailed a lawsuit fintechs were set to file against the bank over the data-access fees. The Financial Data and Technology Association of North America was set to file the lawsuit on behalf of its members and is now reconsidering its options.
M&T Bank is making its data AI-ready with software that speeds up the production of data lineage, provides a single repository and enables interrogation and analysis that would not have been possible before
“Data and AI come very tightly coupled, because it’s quite hard often for AI deployment to be successful without the trusted data that you need for it to be successful,” Andrew Foster, chief data officer at M&T Bank in Buffalo, told American Banker. Like some other data chiefs in the industry, Foster’s remit includes defining and executing both an AI strategy and a data strategy for the bank. He chose Microsoft Copilot. Today, 16,000 of the bank’s 22,000 employees use the gen AI model for first drafts of emails and reports, and to summarize call center conversations. “For anything involving capturing and using and interrogating text, it’s a starting point,” Foster said. Generative AI can also interrogate SQL databases, he noted. M&T’s software developers use GitLab to help generate code. In most such use cases, “gen AI gets you 60% of the way, then a human reviews it and takes it the other 40%,” Foster said. The benefit is an “uplift in human efficiency, which is obviously useful,” Foster said. “It makes everyone’s work better, faster, stronger.” Having generative AI summarize calls, for instance, saves about six minutes per call. Employees quickly grow fond of the tools, according to Foster. At one point, M&T ran a pilot with 800 people, then got pushback when it considered shutting down the gen AI model. “People say, ‘it’s transcendent, I can’t go back to the way things were,'” Foster said. But he also noted one challenge of large language models: the problem of having multiple right answers. “If you ask Copilot, help me craft an email or help me craft a press release, you could get three different versions, and each of them is right for its own version of rightness,” he said. “So we’ve put human decision-making, critical thinking, at the center of AI adoption. You’re not deferring your own judgment to the machine through the adoption of Copilot. It’s giving you more tools to be effective, but the human being retains that accountability.” When Foster arrived at M&T in March 2023, after 12 years in a similar job at Deutsche Bank, he started a data academy providing in-person and remote training on data governance. So far, 2,000 people have gone through the training. And he began a data lineage initiative. “This wasn’t in response to gen AI,” Foster said. “I saw it as a core capability: Do we know where our data comes from and how we use it, how do we bring it to a level where we can interrogate it, how all the data goes from point A to point B?” His team created a repository called Edison that contains authoritative documents and data on all bank policies. The bank deployed data lineage software from Solidatus and from Monte Carlo. The Solidatus software speeds up the production of data lineage, Foster said. It also provides a single repository for the bank’s data, which enables interrogation and analysis that before would not have been possible. It’s helping to make M&T’s data AI-ready. Solidatus integrates with databases and applications, and it retrieves metadata and lineage from within them, explained Tina Chace, vice president of product at Solidatus.
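As a simplified illustration of why lineage matters here, the sketch below shows a minimal lineage graph that records how data flows from system to system and can answer, for any report, where its inputs came from; this is a generic example, not Solidatus's product or API, and the system names are made up.

```python
from collections import defaultdict

# Illustrative sketch: a minimal data lineage graph that records how data
# flows between systems, so upstream sources for any dataset can be traced.

class LineageGraph:
    def __init__(self):
        self.upstream: dict[str, set[str]] = defaultdict(set)

    def record_flow(self, source: str, target: str) -> None:
        """Register that `target` is derived from `source` (point A to point B)."""
        self.upstream[target].add(source)

    def trace_upstream(self, node: str) -> set[str]:
        """Return every system or dataset that feeds into `node`, transitively."""
        seen, stack = set(), [node]
        while stack:
            for src in self.upstream[stack.pop()]:
                if src not in seen:
                    seen.add(src)
                    stack.append(src)
        return seen

g = LineageGraph()
g.record_flow("core_banking_db", "loan_positions_table")
g.record_flow("loan_positions_table", "credit_risk_report")
g.record_flow("rates_feed", "credit_risk_report")
print(g.trace_upstream("credit_risk_report"))
# {'core_banking_db', 'loan_positions_table', 'rates_feed'}
```

Having this kind of end-to-end map is what Foster describes as knowing where data comes from and how it moves from point A to point B, and it is a precondition for trusting the data that feeds AI models.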
Salt Security introduces MCP Protect and Agentic AI Governance controls integrated with CrowdStrike SIEM to secure proliferating agent-driven API interactions
With the rise of agentic AI, API exposure has proliferated. Agents fan out call paths and amplify traffic, effectively turning APIs into the enterprise "plumbing" of operations, according to Michael Callahan, chief marketing officer of Salt Security. This has created the "API fabric" — a complex, constantly moving mesh of connections that enterprises struggle to see, let alone secure. A large part of the API security conversation centers on the role of the Model Context Protocol (MCP), an open standard championed by Anthropic PBC, and A2A, Google's protocol for agent-to-agent interactions, according to Nicosia. Both sit atop existing APIs, acting as brokers to manage data retrieval and collaboration between agents. "For us, the visibility of the AIs and the MCPs … the protocols are so paramount because you can't protect what you don't know," Nicosia said. "Having that visibility from either a zombie API or a zombie MCP protocol server, we give you that visibility. At least you're aware of all of this proliferation that's going on with the organization. And then how do you govern it? And then how do you protect against it?" Salt's momentum has been bolstered by its close partnership with CrowdStrike Holdings Inc. Salt is a Falcon Fund portfolio company and has integrated its API security solutions with CrowdStrike's Falcon platform and next-generation security information and event management. Together, they provide customers with unified visibility across APIs and AI-driven workflows, Nicosia added.
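A simplified way to picture the "zombie" visibility problem Nicosia describes: compare the endpoints actually observed in traffic against a governed inventory and flag anything unregistered. The sketch below is generic and hypothetical, not Salt Security's product, and the endpoint names are invented.

```python
# Illustrative sketch: surface "zombie" API and MCP endpoints by diffing
# observed traffic against the governed inventory.

governed_inventory = {
    "api.payments.example.com/v2/transfers",
    "mcp.internal.example.com/crm-server",
}

observed_in_traffic = {
    "api.payments.example.com/v2/transfers",
    "api.payments.example.com/v1/transfers",      # legacy API still receiving calls
    "mcp.internal.example.com/crm-server",
    "mcp.internal.example.com/old-notes-server",  # unregistered MCP server
}

zombies = observed_in_traffic - governed_inventory
for endpoint in sorted(zombies):
    print(f"ungoverned endpoint observed: {endpoint}")
```

Inventorying plain APIs and MCP servers in the same pass reflects the article's point that agent protocols sit atop existing APIs and multiply the surface that needs governing.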
Block joins S&P 500, a sign of the growing mainstreaming of digital payments and cryptocurrency
Block, owner of Square and Cash App, is preparing to join the S&P 500. The move is a milestone for Block and a sign of the growing mainstreaming of digital payments and cryptocurrency in the finance world. Block will replace Hess on the S&P beginning July 23 at the start of trading. Block has transformed itself from a payments processing company to a FinTech that offers peer-to-peer transfers, merchant services and consumer lending, with Square getting the OK from the Federal Deposit Insurance Corp. (FDIC) earlier this year to offer consumer loans through its Cash App Borrow product. In addition, the company is integrating bitcoin payment capabilities into its Square terminals, which is characterized as a reflection of CEO Jack Dorsey’s longtime advocacy for the popular cryptocurrency. “When a coffee shop or retail store can accept bitcoin through Square, small businesses get paid faster, and get to keep more of their revenue,” Block Bitcoin Product Lead Miles Suter said. “This is about economic empowerment for merchants who like to have options when it comes to accepting payments.”
Treasury and Trade Solutions (TTS) and embedded finance platforms driving growth for banks in cross-border transactions and deposit balances amid operational uncertainty, FinTech fragmentation and growing demand for streamlined, data-rich payments
Financial tools once limited to large firms are now accessible to SMBs via APIs and embedded finance. From large lenders like Citi and JPMorgan to systemically important banks like BNY and Lloyds and major market institutions like Truist, bank executives all stressed to their respective investors the importance of back-office units. Against a backdrop of operational uncertainty, FinTech fragmentation and growing demand for streamlined, data-rich payments, these FIs' TTS and embedded finance platforms are becoming strategic growth engines. Done right, embedded finance shifts from a buzzword to reliable infrastructure. Citi's Services business, for example, posted record second-quarter 2025 revenues of $5.1 billion, up 8% year over year. Market share gains of 40 basis points in TTS were driven by a 7% rise in cross-border transaction value and higher deposit balances. BNY's Treasury Services offerings were likewise up year-over-year. Nearly two-thirds of SMBs would switch providers to access embedded finance solutions. For banks, this underscores the opportunity within treasury and payments services. At the same time, innovations like stablecoins are reshaping what treasury management and payments might look like in the future.
A stablecoin is a blockchain token backed 1:1 by cash or cash-like assets, used as a substitute for fiat in on-chain trade with use cases in trade settlement, remittances and online purchases, whereas tokenized deposits are bank-issued tokens backed by dollars held in client accounts
With the GENIUS Act now law, U.S. banks are expected to increasingly explore issuing blockchain-based assets. While many tout stablecoins for faster, cheaper payments, most banks are actually eyeing tokenized deposits — not stablecoins — as the more viable product. Though both are digital tokens tied to fiat value, their nature and implications differ greatly. Stablecoins (like USDC) are backed 1:1 by cash or equivalents, circulate on public blockchains, and are used broadly as money for trade, savings, and remittances. Tokenized deposits, by contrast, are representations of client bank deposits, issued and moved within a bank’s private network, with value transfers still tied to bank-controlled ledgers. Unlike stablecoins, they’re non-fungible across institutions and don’t circulate freely. Their purpose is to modernize existing bank services, not create new monetary systems. The difference lies in function and intent: stablecoins are a new form of money, while deposit tokens are tools for enhancing traditional banking. As banks increasingly mention blockchain innovations, it’s vital to distinguish between the two.
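To make the distinction concrete, here is a minimal sketch contrasting the two token models; it is illustrative only, not any bank's or issuer's implementation, and the names are hypothetical.

```python
# Illustrative sketch: a stablecoin moves freely between wallet addresses on a
# public chain, while a tokenized deposit only moves between accounts on the
# issuing bank's own ledger.

class Stablecoin:
    def __init__(self):
        self.balances: dict[str, int] = {}   # any wallet address can hold it

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

class TokenizedDeposit:
    def __init__(self, bank: str, clients: set[str]):
        self.bank = bank
        self.clients = clients                # only the bank's own clients
        self.balances: dict[str, int] = {}

    def transfer(self, sender: str, receiver: str, amount: int) -> None:
        # Value never leaves the bank-controlled ledger: both parties must
        # be clients of the issuing bank.
        if sender not in self.clients or receiver not in self.clients:
            raise ValueError(f"both parties must hold accounts at {self.bank}")
        if self.balances.get(sender, 0) < amount:
            raise ValueError("insufficient balance")
        self.balances[sender] -= amount
        self.balances[receiver] = self.balances.get(receiver, 0) + amount

usdc_like = Stablecoin()
usdc_like.balances["0xWalletA"] = 100
usdc_like.transfer("0xWalletA", "0xWalletB", 40)       # any-to-any transfer

deposit_token = TokenizedDeposit("Bank X", {"client_a", "client_b"})
deposit_token.balances["client_a"] = 100
deposit_token.transfer("client_a", "client_b", 40)     # stays on the bank's ledger
```

The structural difference shows up in the transfer rules: the stablecoin moves between arbitrary wallet addresses, while the deposit token only moves between clients of the issuing bank, which is why tokenized deposits are described above as non-fungible across institutions.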