The lines between software vendors, payment processors and merchant services are blurring, changing buying decisions for merchants, who are asking more of their payment providers. Merchants, particularly small and mid-sized enterprises, are looking for service providers to help them run every aspect of their business, including automated employee time tracking, data management, streamlined B2B payments and marketing tools, going well beyond transaction services.

The merchants themselves are expanding their own business models, which increasingly involve selling direct to consumers. In doing so, they are leveraging the power of their brands for digital direct-to-consumer experiences, which require payment facilitation or marketplace models. While these models reduce storefront overhead, they create a new need for help in contending with regulatory changes and security concerns, and in meeting the demands of burgeoning populations of end customers. Value-added services enable those merchants to garner repeat business and to collect data as consumers buy goods and services online, and merchants can offer loyalty and rewards as well as individualized engagement with consumers.

Bank of America, for its part, has surveyed small- to medium-sized businesses (SMBs) to ask what merchant services mean to those clients. The customers said they wanted offerings from service providers that help them run every aspect of their business. “They’re looking to these merchant processors and acquirers to provide more than just a transaction service,” Bank of America Merchant Solutions Head Wally Mlynarski said. These growing companies want automated employee time tracking, data management to optimize inventory and streamlined B2B payments, along with marketing tools to enhance their presence across digital and mobile channels.

For Bank of America, Mlynarski said, “We don’t stop at integrating into the merchant’s business — we want to integrate into the financial lives of the merchants as well. We try to bring the bank to our customers and operate in their territory,” integrating, for example, lending services so that capital is available as needed, and tying payments to back-office functions so that merchants can pay their suppliers efficiently with virtual cards.

Bank of America, Mlynarski said, has used a partnership approach to deliver those value-added services to merchant clients in the retail, restaurant and healthcare verticals, among others. “Building those partnerships fills any gaps we might have,” he said, and underpins long-term sustainable growth. In some cases, he said, client firms are seeking data and analytics functions through those partnerships, with integrations into enterprise resource planning (ERP) systems. Healthcare is a key example, where back-office money movement needs to break free from paper-based communications and paper checks. Bank of America, he said, has been working to digitize those operations.

Looking ahead, whether they are consumer-facing or emanating from the back office, payments themselves should be intuitive and experiential, Mlynarski said, adding that transactions “should be in the background and automatic,” and soon will be, with artificial intelligence (AI) and biometrics.
LLMs can still be prohibitively expensive for some, and as with all ML models, LLMs are not always accurate. There will always be use cases where an ML implementation is not the right path forward. Key considerations for AI project managers evaluating customers’ needs for an AI implementation include the following (a brief code sketch follows the list):

The inputs and outputs required to fulfill your customer’s needs: An input is provided by the customer to your product, and the output is provided by your product. For a Spotify ML-generated playlist (an output), inputs could include customer preferences and ‘liked’ songs, artists and music genres.

Combinations of inputs and outputs: Customer needs can vary based on whether they want the same or a different output for the same or a different input. The more permutations and combinations of inputs and outputs we need to handle at scale, the stronger the case for ML over rule-based systems.

Patterns in inputs and outputs: Patterns in the required combinations of inputs and outputs help you decide which type of ML model to use. If there are clear patterns (like reviewing customer anecdotes to derive a sentiment score), consider supervised or semi-supervised ML models over LLMs, because they can be more cost-effective.

Cost and precision: LLM calls are not always cheap at scale, and the outputs are not always precise, despite fine-tuning and prompt engineering. Sometimes you are better off with supervised models or neural networks that classify an input against a fixed set of labels, or even rule-based systems, instead of an LLM.
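To make the cost-and-precision point concrete, here is a minimal sketch of the sentiment-scoring case mentioned above: a small supervised classifier with a fixed label set instead of a per-call LLM. The training anecdotes and labels are toy placeholders, not data from the article.

```python
# Minimal sketch: a supervised sentiment classifier as a cost-effective
# alternative to calling an LLM for every customer anecdote.
# The anecdotes and labels below are toy placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy labeled data; in practice you would train on thousands of anecdotes.
anecdotes = [
    "The checkout flow was fast and painless",
    "Support never answered my ticket",
    "Love the new playlist feature",
    "App crashes every time I open it",
]
labels = ["positive", "negative", "positive", "negative"]

# TF-IDF features + logistic regression: a fixed set of labels,
# predictable cost, and no per-call API fees once trained.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(anecdotes, labels)

print(model.predict(["The update made everything slower"]))  # e.g. ['negative']
```

The design choice mirrors the article's advice: when inputs and outputs follow a stable pattern, a small model with fixed labels can be cheaper and more consistent at scale than an LLM call per input.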
Gyan is an alternative AI built on a neuro-symbolic architecture, not a transformer-based one, to create hallucination-free models by design
Gyan is a fundamentally new AI architecture built for enterprises with low or zero tolerance for hallucinations, IP risks or energy-hungry models. Gyan gives businesses full control over their data, keeping it private and secure, making it a trusted partner for enterprises in situations where reliability and accuracy are mandatory. Unlike with LLMs, with Gyan businesses can use an AI model without worrying about it making things up. Built on a neuro-symbolic architecture rather than a transformer, Gyan is hallucination-free by design, from the ground up. “If the cost of a mistake is high, you certainly don’t want your AI causing it,” says Joy Dasgupta, CEO at Gyan. “We built Gyan for companies and processes with zero tolerance for hallucination and privacy risks, with compute and energy requirements orders of magnitude lower than those of current LLMs.” Gyan’s state-of-the-art performance on two key life sciences benchmarks (PubMedQA and MMLU) is proof of the efficacy of its language model. Every inference by Gyan is traceable, with full reasoning down to the exact ideas and arguments behind the result, making it readily verifiable; this is not the case for any of the other models on the leaderboard. Gyan provides precise, accurate analysis that users can depend on.
Santander plans to grow Openbank, its digital deposit-gathering platform, into a fully transactional online bank in the U.S.; Verizon deal provides the bank with a new pipeline of deposits
While other European banks have largely retreated from the ultra-competitive U.S. retail market, Santander appears to be firmly staying put. Twenty years after entering the U.S., the company is doubling down on its commitment to invest in and expand its stateside operations. Much of its growth plan is pinned on Openbank, the digital deposit-gathering platform it rolled out in the U.S. in conjunction with a company-wide focus on being a “digital bank with branches.” Openbank, which has been available in parts of Europe for years, offers Santander a way to attract low-cost deposits nationwide to help fund the bank’s sizable auto loan book.

Overseeing the evolution of Openbank into a fully transactional online bank is the top priority for Christiana Riley, who took over as CEO of Santander’s U.S. operations. Riley is also charged with unifying Santander’s previously disconnected U.S. business units to increase the company’s revenue and profitability. The years that her predecessors spent laying the groundwork mean that Santander can now grow quickly, Riley said. Early results are promising. Openbank, whose high-yield savings account attracts customers with a premium interest rate, has reeled in nearly $4 billion of deposits since its launch in late October, Riley said. That’s “well in excess” of what management had expected, she noted.

The pace at which Santander grows may depend on how the U.S. economy fares in the coming months. If the Trump administration imposes global tariffs as planned, and a trade war ensues, the economy could fall into a recession, some economists have warned. Any headwinds the Spanish company faces “will be related to the U.S. economy,” said Arnaud Journois, an analyst at Morningstar DBRS who covers European banks. In addition, it will take time to see whether Santander’s deposit-gathering strategy leads to cheaper funding for its auto loans, he said. “I think they need a track record to see how it will unfold,” Journois said. The company has taken a different pathway from its overseas peers, Journois said. At the company’s 2023 investor day, executives laid out plans for higher growth and profitability, largely by leveraging the firm’s global scale and business diversification. The U.S. market factored heavily into the overall strategy, with a focus on reducing funding costs.
Keys to creating an agentic AI business: vertical specialization, chasing labor spend rather than enterprise IT budgets, focusing on cognitive reasoning and human judgment, and designing end-to-end workflows
Startups that succeed in the agentic AI space are betting on vertical specialization, digital labor and new kinds of software primitives. Rather than building broad platforms, these companies are zeroing in on deep domain challenges and embedding AI agents where judgment, context and autonomy matter most. Instead of retrofitting yesterday’s SaaS models, HOAi focuses on a labor-intensive, highly contextual domain: homeowner association management. That clarity of focus enables the company to design agentic systems with three core components: cognitive reasoning engines, seamless integration with existing workflows and a flexible orchestration layer for agents. By targeting labor spend rather than IT budgets, startups such as HOAi create new categories of digital workers that operate alongside humans. This shift unlocks budgets that are 10 to 20 times larger than traditional enterprise IT, according to Haoyu Zha, founder and chief executive officer of HOAi.

To distill the lessons from HOAi and similar innovators, here are five keys to building a successful agentic AI startup, according to Zha (a sketch of the three-component design follows the list):

Go vertical in nuanced markets: Specialized agents can capture untapped value in industries with unique operational needs.

Follow the labor spend, not the IT: Labor budgets are significantly larger than IT budgets and far less saturated.

Empower decisions over tasks: Build agents that enhance human judgment, not just automation. Decision intelligence is the new strategic edge.

Rethink software, go agentic: Don’t retrofit software-as-a-service blueprints. Design end-to-end workflows with autonomous, context-aware agents from the ground up.

Visibility fuels viability: In a crowded market, discovery matters. Build brand awareness early or risk being invisible, regardless of how advanced your tech is.
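As a rough illustration of the three components the article names (a reasoning engine, workflow integration and an orchestration layer), here is a hypothetical sketch. None of these class names or behaviors come from HOAi’s actual system; they only show how the pieces could fit together while keeping humans in the loop for judgment calls.

```python
# Hypothetical sketch of the three-component agentic design described above.
# All names and logic are illustrative, not HOAi's implementation.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Task:
    description: str
    context: dict

class ReasoningEngine:
    """Decides what to do; in practice this might wrap an LLM or a planner."""
    def decide(self, task: Task) -> str:
        # Placeholder decision logic for illustration only.
        return "escalate" if task.context.get("requires_judgment") else "auto_resolve"

class WorkflowIntegration:
    """Adapter into an existing system of record (ticketing, ERP, etc.)."""
    def execute(self, task: Task, action: str) -> None:
        print(f"{action}: {task.description}")

class Orchestrator:
    """Routes tasks to agents and escalates judgment calls to a human."""
    def __init__(self, engine: ReasoningEngine, workflow: WorkflowIntegration,
                 human_review: Callable[[Task], None]):
        self.engine, self.workflow, self.human_review = engine, workflow, human_review

    def handle(self, task: Task) -> None:
        action = self.engine.decide(task)
        if action == "escalate":
            self.human_review(task)  # enhance human judgment, don't replace it
        else:
            self.workflow.execute(task, action)

orchestrator = Orchestrator(ReasoningEngine(), WorkflowIntegration(),
                            human_review=lambda t: print(f"needs human: {t.description}"))
orchestrator.handle(Task("Approve HOA vendor invoice", {"requires_judgment": True}))
orchestrator.handle(Task("Send dues reminder", {}))
```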
Databricks’ acquisition of Neon offers enterprises the ability to deploy AI agents at scale by rapidly spinning up databases programmatically, without coupling storage and compute, through a serverless autoscaling approach to PostgreSQL
Databricks announced its intent to acquire Neon, a leading serverless Postgres company. Neon’s serverless PostgreSQL approach separates storage from compute, making it developer-friendly and AI-native. It also enables automated scaling, as well as branching, in an approach similar to how the Git version control system works for code. Amalgam Insights CEO and Chief Analyst Hyoun Park noted that Databricks has been a pioneer in deploying and scaling AI projects. Park explained that Neon’s serverless autoscaling approach to PostgreSQL is important for AI because it allows agents and AI projects to grow as needed without artificially coupling storage and compute needs together. He added that for Databricks this is useful both for agentic use cases and for supporting the custom models it has built over the last couple of years following its MosaicML acquisition.

For enterprises looking to lead the way in AI, this acquisition signals a shift in infrastructure requirements for successful AI implementation. What is particularly insightful is that the ability to rapidly spin up databases is essential for agentic AI success. The deal validates that even advanced data companies need specialized serverless database capabilities to support AI agents that create and manage databases programmatically. Organizations should recognize that traditional database approaches may limit their AI initiatives, while flexible, instantly scalable serverless solutions enable the dynamic resource allocation that modern AI applications demand. For companies still planning their AI roadmap, this acquisition signals that database infrastructure decisions should prioritize serverless capabilities that can adapt quickly to unpredictable AI workloads. This would transform database strategy from a technical consideration into a competitive advantage in delivering responsive, efficient AI solutions.
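To show what “spinning up databases programmatically” can look like, here is a hedged sketch of an agent creating an ephemeral Postgres branch over HTTP. The endpoint shape follows Neon’s public REST API as generally documented, but treat the exact paths and payloads as assumptions; `PROJECT_ID` and the `NEON_API_KEY` environment variable are placeholders, not values from the article.

```python
# Hedged sketch: an AI agent provisioning an ephemeral Postgres branch.
# Endpoint paths and payloads are assumptions based on Neon's public API
# docs; PROJECT_ID and NEON_API_KEY are placeholders.
import os
import requests

API = "https://console.neon.tech/api/v2"
PROJECT_ID = "your-project-id"  # placeholder
HEADERS = {
    "Authorization": f"Bearer {os.environ['NEON_API_KEY']}",
    "Content-Type": "application/json",
}

def create_agent_branch(name: str) -> dict:
    """Spin up an isolated copy-on-write branch for one agent task."""
    resp = requests.post(
        f"{API}/projects/{PROJECT_ID}/branches",
        headers=HEADERS,
        json={"branch": {"name": name}},
    )
    resp.raise_for_status()
    return resp.json()

# Each agent task gets its own throwaway database state; the branch can be
# deleted when the task finishes, keeping storage and compute decoupled.
branch = create_agent_branch("agent-task-123")
print(branch)
```

The point of the pattern, echoing Park’s comment, is that database creation becomes a cheap, programmatic step inside an agent’s loop rather than a provisioning project.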
Fed’s study confirms widespread use and acceptance of ACH; 60% of businesses used standard ACH in 2024, up from 48% a year earlier, while 56% used Same Day ACH, an increase from 45% in 2023
The number of businesses using both standard and Same Day ACH grew significantly from 2023 to 2024, a new Federal Reserve report found. Sixty percent said they use standard ACH, up from 48% a year earlier, and 56% reported using Same Day ACH, an increase from 45% in 2023. Additionally, 47% of businesses said they encourage the use of ACH. One study respondent, identified as a “very large diversified service business,” told researchers, “We are using Same Day ACH more—it’s a good value for the price.” Still, even as both forms of ACH continue to gain usage, check use in fact rose from 68% to 73%. It was highest among small (83%) and very small (78%) firms. “One key takeaway is that checks are unlikely to disappear completely in the near future—a trend to monitor,” researchers noted. “Nacha’s own figures show that ACH volume is rising,” said Michael Herd, Nacha Executive Vice President, ACH Network Administration. “Given this widespread use and acceptance of ACH, plus the increasing amount of check fraud, the industry needs to focus on why businesses of any size are still writing and receiving checks.” When it comes to pain points for business payments, high costs/fees was the top issue, cited by 48%. Speed tied with security issues for a distant second, each cited by 32%.
Capital One Auto Refinance division uses ‘Swiss cheese’ approach to fraud prevention – a combination of risk prevention software and alternative data used to verify transactions
Capital One is using a “Swiss cheese” approach, in which a combination of risk prevention software and alternative data is used to verify transactions, Head of Auto Refinance Allison Qin said. “You have to have a multilayered approach. One slice might have a hole in it, but if you have 20 slices stacked up, you’re less likely to make it through the stack of cheese,” Qin said. The lender uses Capital One credit card transaction history and biometric data in its proprietary fraud prevention models, she said. Auto lenders’ total estimated loss exposure from fraud reached $9.2 billion in 2024, a 16.5% year-over-year rise, according to risk management platform Point Predictive’s March 25 report. “Fraud is continuously evolving and getting harder to spot, so it’s imperative that dealers and lenders work together to solve [industry fraud],” Qin said.

Like lenders, dealerships are implementing multiple fraud prevention systems. Morgan Automotive Group, with more than 75 retail locations in the state, uses an “eyes wide open” approach in which dealers are vigilant about identifying scams, Justin Buzzell, finance vice president of the group, said. For Morgan Automotive Group, those protections include (a sketch of this layered screening appears after this item):

A red flags check, which looks at customer identification;

A Department of Highway Safety and Motor Vehicles check;

A synthetic fraud check, which looks for mixes of real and fake information;

A biometric scan; and

Video records of all interactions with customers to show “we’ve done everything we could.”

“If you pass all of that, we’ll sell you a car,” Buzzell said. Dealerships and lenders agree that notifying each other about fraudulent encounters helps the industry; however, there’s no easy place to do that yet, West American Loan Chief Executive and President Sean Murphy said. If a centralized portal, similar to e-contracting platform RouteOne, allowed dealers and lenders to share potential fraud signs, industry players could work together to stop scams, Murphy said.
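As a rough illustration of the “Swiss cheese” layering Qin and Buzzell describe, the sketch below chains the checks listed above so an applicant must pass every layer. The check names mirror the article; the data model and pass/fail logic are entirely hypothetical.

```python
# Illustrative sketch of stacked fraud-prevention layers: each check can
# have "holes," but an applicant must pass all of them. Logic is hypothetical.
from typing import Callable, NamedTuple

class Applicant(NamedTuple):
    name: str
    id_verified: bool
    dmv_record_ok: bool
    synthetic_score: float  # 0.0 = clearly real, 1.0 = clearly synthetic
    biometric_match: bool

# Stacking independent layers shrinks the chance fraud slips through all of them.
LAYERS: list[tuple[str, Callable[[Applicant], bool]]] = [
    ("red flags / ID check",  lambda a: a.id_verified),
    ("DMV check",             lambda a: a.dmv_record_ok),
    ("synthetic fraud check", lambda a: a.synthetic_score < 0.5),
    ("biometric scan",        lambda a: a.biometric_match),
]

def screen(applicant: Applicant) -> bool:
    for name, check in LAYERS:
        if not check(applicant):
            print(f"{applicant.name}: failed {name}")
            return False
    print(f"{applicant.name}: passed all layers")
    return True

screen(Applicant("Jane Doe", True, True, 0.1, True))
```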
Tokenized deposits and stablecoins are vying for share of the $120 billion cross-border friction market, with both expected to hold ground as competing, interoperable rails that are increasingly invisible to end users and governed by evolving standards
BIS expects limited-scale Agorá pilots by early 2026; J.P. Morgan plans to open Kinexys to third-party banks later this year; and Circle is lobbying U.S. regulators to let federally chartered banks hold USDC as cash equivalents. The race is now about reach. The prize is enormous: McKinsey puts annual cross-border friction at US$120 billion. Trim even a third of that, and the savings rival the revenue of a top-ten global bank. No wonder both sides are spending heavily on standards bodies, custody integrations and developer toolkits.

Real-world proofs of concept are multiplying. Under Project Guardian, MAS is exploring smart contracts that bundle both legs of an SGD-USD swap so that settlement is instantaneous and counterparty risk diminishes. Kinexys now supports conditional logic, such as releasing payment when IoT sensors confirm delivery, while Circle’s new network lets a multinational pay suppliers on Solana, collect receipts on Stellar and sweep surplus funds to a regulated custodian in New York. Trade-finance platforms are testing tokenized deposits as real-time collateral for letters of credit.

Whether banks or fintech issuers capture the lion’s share will depend on who can scale liquidity, satisfy regulators and embed programmable dollars into everyday commerce first. The quiet war for the money pipes is already under way, and while consumers may never see the plumbing, the savings, or losses, will flow straight through corporate balance sheets.
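To clarify what “bundling both legs” of a swap buys you, here is a simplified sketch of all-or-nothing settlement. It mimics what a smart contract would enforce on-chain; the account and balance model is entirely hypothetical and is not Project Guardian’s actual contract code.

```python
# Simplified illustration of bundling both legs of an SGD-USD swap so
# settlement is atomic: either both legs move or neither does, removing
# the window of counterparty risk. The ledger model is hypothetical.
class AtomicSwap:
    def __init__(self, balances: dict[str, dict[str, float]]):
        self.balances = balances  # party -> {currency: amount}

    def settle(self, party_a: str, party_b: str,
               sgd_amount: float, usd_amount: float) -> bool:
        a, b = self.balances[party_a], self.balances[party_b]
        # Check both legs up front; if either would fail, neither settles.
        if a.get("SGD", 0) < sgd_amount or b.get("USD", 0) < usd_amount:
            return False
        # Both legs execute together, so there is never a moment when one
        # party has paid and the other has not.
        a["SGD"] -= sgd_amount; b["SGD"] = b.get("SGD", 0) + sgd_amount
        b["USD"] -= usd_amount; a["USD"] = a.get("USD", 0) + usd_amount
        return True

ledger = AtomicSwap({"bank_sg": {"SGD": 1_000_000}, "bank_us": {"USD": 750_000}})
assert ledger.settle("bank_sg", "bank_us", 1_000_000, 740_000)
print(ledger.balances)
```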
Integration challenges, including the need for an asynchronous, multi-step transaction processing flow to support an additional layer of authentication, are a key factor behind low adoption of EMV 3DS
While EMV 3DS has many benefits, adoption may be slow in regions where EMV 3DS is not mandatory. Reasons may include the following (a sketch of the asynchronous flow follows the list):

Data inconsistency. The quality of merchant data provided in EMV 3DS plays a critical role in issuer fraud detection. Merchants may be reluctant to share more data and may decide to provide a minimum set of data elements, excluding optional ones. In some cases the data provided is not accurate, causing issues in fraud engines.

Approval rates and cardholder friction. Shopping cart abandonment has been one of the major reasons EMV 3DS adoption is low. Many enhancements have been added to the protocol from EMV 3DS 1.0 to EMV 3DS 2.x so that the cardholder is challenged only when needed.

Complexity of integration. EMV 3DS integration is complex and adds an additional authentication flow before authorization, resulting in higher implementation costs. Most systems are built around a synchronous authorization request and response; EMV 3DS is a major change because it requires an asynchronous transaction processing flow with multiple steps.

Liability shift. EMV 3DS is designed to help with fraud. However, determining if, how and when a liability shift occurs for merchants is not simple and depends on several factors. Payment network and local regulatory requirements should be checked for specific use cases to assess any applicable liability shifts. Relevant factors include:

Region and payment network. It is important to be familiar with payment network rules for EMV 3DS usage.

Merchant category code (MCC). Not all MCCs are eligible for a liability shift.
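To illustrate the integration-complexity point, here is a hedged sketch of why EMV 3DS turns one synchronous authorization call into an asynchronous, multi-step flow. AReq/ARes and CReq/CRes are the specification’s message names, and transStatus values such as “Y” and “C” come from the spec; every function below is a hypothetical stand-in, not a real 3DS server SDK.

```python
# Hedged sketch of the asynchronous, multi-step EMV 3DS flow, in contrast
# to a single synchronous authorization call. All functions are
# hypothetical stand-ins; only the message names follow the spec.
import asyncio

async def send_areq(txn: dict) -> dict:
    """Step 1: authentication request (AReq/ARes) to the issuer's ACS."""
    await asyncio.sleep(0.1)  # network round trip
    return {"transStatus": "C"}  # "C" = challenge required

async def run_challenge(txn: dict) -> dict:
    """Step 2 (only if challenged): CReq/CRes exchange with the cardholder."""
    await asyncio.sleep(0.1)  # cardholder completes an OTP/biometric step
    return {"transStatus": "Y"}  # "Y" = authenticated

async def authorize(txn: dict, auth_result: dict) -> str:
    """Step 3: the familiar authorization call, now carrying 3DS results."""
    await asyncio.sleep(0.1)
    return "approved" if auth_result["transStatus"] == "Y" else "declined"

async def pay(txn: dict) -> str:
    ares = await send_areq(txn)          # asynchronous step 1
    if ares["transStatus"] == "C":
        ares = await run_challenge(txn)  # asynchronous step 2, user-paced
    return await authorize(txn, ares)    # step 3

print(asyncio.run(pay({"amount": 42.00})))
```

The challenge step is paced by the cardholder, not the server, which is why systems built around a single blocking authorization call need real rework to support EMV 3DS.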