Warehouse club retailer Sam’s Club, a division of Walmart, is now delivering hot Member’s Mark baked pizzas for online orders at most stores across the U.S. and will have the offering available from all stores by the end of May 2025. The pizzas cost $8.98 and come in pepperoni, cheese and four-meat flavors. Online pizza orders can be combined with other items and are eligible for express delivery, arriving less than three hours after an order is placed. Pizzas are also still available for in-store or curbside orders. According to Sam’s Club, the launch of pizza delivery is the latest step in a bigger shift to providing shoppers digital ease in ways that save time, add value and deepen customer connections, such as its new digital-first store format. “When we talk about innovation, it’s not just about what’s new — it’s about what makes life easier for our members,” said Kurt Hess, group director, operations and implementation at Sam’s Club. “Pizza delivery is a perfect example.”
IBM’s two-pronged approach to modern application management involves automating applications with AI and managing them through observability, aided by AI-generated problem summaries in plain English to simplify triage
AI, observability and automation at scale are converging to redefine how modern applications are built, monitored and optimized. IBM Corp.’s approach is two-pronged: automating applications with AI and creating a conducive environment, through observability, to manage them. “We’re focused on both those things at the same time, simultaneously,” said Chris Farrell, group product manager of Instana observability at IBM. “One of the things that we’re doing is putting AI into the observability aspect of managing the applications. We have recently released integration with watsonx to create summarizations of problems in plain English so that anyone can get a summarization and print it out.” Central to IBM’s approach is the integration of AI into observability tooling, particularly through Instana and its connection with watsonx. The combination enables AI-generated problem summaries in plain English, simplifying issue triage for both technical and non-technical teams. Additionally, IBM is taking steps toward AI-based remediation: with watsonx, problems can be detected and suggestions, or even automated actions, can be triggered to resolve them. This shortens the time between incident detection and resolution, enhancing uptime and operational efficiency, according to Farrell.
Sifflet’s AI-native data observability platform replaces manual triage, alert sprawl, and static rule sets with context-aware automation to help data teams scale data quality and reduce incident response times
Sifflet, the AI-native data observability platform, has shared an early look at its upcoming system of AI agents designed to help modern data teams scale data quality and reliability, reduce incident response times, and stay ahead of complexity. The new agents extend Sifflet’s core observability capabilities with a new layer of intelligence: Sentinel analyzes system metadata to recommend precise monitoring strategies; Sage recalls past incidents, understands lineage, and identifies root causes in seconds; Forge suggests contextual, ready-to-review fixes grounded in historical patterns. Sifflet’s existing functionality already helps customers handle these workloads; the new AI agents go a step further, replacing manual triage, alert sprawl, and static rule sets with context-aware automation that augments human teams. “Rather than relying on static monitoring, these agents bring memory, reasoning, and automation into the fold, helping teams move from alert fatigue to intelligent, context-aware resolution,” said Sanjeev Mohan, founder of SanjMo and former VP Analyst at Gartner. The agentic system is fully embedded in Sifflet’s AI-native platform and will soon be available to select customers in private beta.
JPMorgan Chase’s AI coding assistant has delivered “a 10% to 20% productivity increase”; Gen AI is able to clearly demonstrate the value of data modernization efforts
While most banks hesitated on generative AI, JPMorgan Chase led early adoption with three major back-office use cases boosting employee productivity. Chase’s strategy emphasizes learn-by-doing training, rigorous ROI measurement, and preparing data infrastructure for firm-wide AI integration across 450+ proofs of concept. Chase’s active Gen AI use cases are currently concentrated in the back office, focusing on efficiency and productivity plays rather than on customer-facing deployments, according to Katie Hainsey, Managing Director and Head of AI/ML and Data & Analytics for Digital, Marketing, and Operations at JPMorgan Chase.
- The bank’s call center employees handle millions of customers annually across a variety of functions, including Customer Service, Fraud & Claims, Home Lending, Wealth Management and Collections. Employees on these teams dedicate a significant amount of time to learning Chase’s policies and documentation. To cut down on this resource- and time-intensive process, the firm launched EVEE Intelligent Q&A, a Gen AI-powered tool that allows specialists to ask questions and receive concise answers. The Gen AI solution integrates with existing tools for call center employees and has improved efficiency, call resolution times, and employee and customer satisfaction, per Hainsey. “One of the great use cases where we’re using Gen AI is to be able to better equip our agents with the information to answer customer inquiries; they can chat with the interface to ask a question and get an answer,” she said.
- Released in the summer of 2024, the LLM Suite is the bank’s proprietary generative AI platform that acts as a knowledge base for the firm’s employees. The resource can also be used for content and idea generation, as well as for querying specific documents and PowerPoints, according to Hainsey. The LLM Suite has been widely adopted at Chase, with 200,000 employees onboarded within the first eight months of its launch. JPMC also launched a coding assistant that it says has been playing a significant role in improving the firm’s efficiency in software and technology development. Coding assistants like these help firms cut down on development time and may also contribute to cost savings, as repetitive and mundane tasks are handled by the assistant. “I see a lot of employees utilizing code creation and code conversion through LLMs. We have seen a 10% to 20% productivity increase,” she added.
- Chase’s tools like the LLM Suite and EVEE Intelligent Q&A are the result of a firm-wide Gen AI-friendly outlook and a strategy that is readying the bank to evolve more and more proofs of concept into live features.
- Employee training: Chase is taking a “learn by doing” approach to Gen AI. Hainsey says the firm wants these tools in the hands of its employees, believing there is no better way to learn than by actually using them. This approach is also evident in the wide rollout of the LLM Suite.
- Measuring ROI: The bank is reported to have 450 proofs of concept in the works, a number expected to climb to a thousand next year. With three major initiatives already underway and many more lined up, Chase is focusing on developing clear and concrete KPIs and goals to analyze the success of each project. “We are setting very clear goals of success and KPIs for each one of these rollouts. We also have very good experimentation, so we can actually measure the incremental benefits by giving the tool to some agents, and setting up test and control groups. We compare these results with clear metrics of success, and it helps us learn what’s working and what’s not working and what we need to do to drive adoption,” she said.
- Planning for firm-wide integration and data readiness: While the firm has bet big on Gen AI and launched use cases spanning the firm’s entire employee base, Hainsey is now working on building a holistic strategy that integrates Chase’s Gen AI philosophy end to end. At the moment, she is also focusing on two major strategy pieces the firm wants to enact to unlock more value from its Gen AI initiatives: 1) Data readiness: Gen AI is new, but data modernization efforts are not. Gen AI, however, is able to clearly demonstrate the value of data modernization efforts in ways that weren’t possible before. For Chase, this means that a deep focus on data products, and on unstructured and structured data, has become non-negotiable. 2) Changing behaviors: As people get more used to Gen AI, their expectations and the questions they ask may evolve with time. As a leader at Chase, Hainsey has to prepare for this change in user behavior and ensure that the models keep pace with users, backed by the right data and a process pipeline to support the change.
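The test-and-control experimentation Hainsey describes comes down to comparing an outcome metric between agents who have the tool and agents who don’t. The sketch below is illustrative only, with made-up numbers; it is not Chase’s methodology or data, just the standard two-group lift calculation with a Welch-style t statistic:

```python
from statistics import mean, stdev
from math import sqrt

def lift_and_t(test, control):
    """Relative lift of the test group over control, plus a Welch-style t statistic."""
    lift = (mean(test) - mean(control)) / mean(control)
    se = sqrt(stdev(test) ** 2 / len(test) + stdev(control) ** 2 / len(control))
    t = (mean(test) - mean(control)) / se
    return lift, t

# Hypothetical daily tickets resolved per agent, with and without the assistant
with_assistant = [52, 55, 49, 58, 54, 57, 51, 56]
without_assistant = [45, 47, 44, 48, 46, 43, 49, 45]

lift, t = lift_and_t(with_assistant, without_assistant)
print(f"incremental lift: {lift:.1%}, t = {t:.2f}")  # ~17.7% lift in this toy data
```

The point of the control group is that the lift is measured against agents working in the same period, so seasonal effects and workload swings cancel out of the comparison.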
USAA is working closely with IBM to support its gen AI strategy, using watsonx and the Granite model family to extract structured elements from complex documents
USAA is applying generative artificial intelligence across core operations to pull strategic insight from unstructured data, according to Ramnik Bajaj, senior vice president, chief data, analytics and AI at USAA. Its approach prioritizes scale, trust and adaptability across the enterprise. Rather than treating AI as an isolated experiment, USAA is building toward an enterprise-wide model that balances innovation with governance. Centralizing its data and gen AI capabilities has helped the company increase development velocity and model reliability, according to Bajaj. “I call it first-class data now that it is possible to do those types of analytics,” he said. “Having all of those in one place in the enterprise … really supercharges our ability to move fast and move responsibly. With a technology like gen AI, it’s important to also learn our way into the right architectural patterns that can give us not just speed to market, but also reliability [and] accuracy, preserving trust and … privacy. All of those core data management functions tie very well into that AI genre.” USAA doesn’t greenlight AI for AI’s sake. Each proposed use case goes through a structured evaluation process to ensure it will generate measurable business value, whether through revenue, efficiency, risk reduction or improved member experience, according to Bajaj. USAA is working closely with IBM to support its gen AI strategy, using watsonx and the Granite model family to extract structured elements from complex documents. This collaboration is key to turning claims data and underwriting materials into usable, trustworthy inputs for AI, according to Bajaj.
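Extracting structured elements from claims and underwriting documents, as described above, typically means prompting a model for structured output and then validating it before it reaches downstream systems. The sketch below is hypothetical: the article does not describe the watsonx or Granite APIs, the field names are invented for illustration, and the model call is replaced with a canned response:

```python
import json

# Hypothetical schema for a claims document; real field sets would differ
REQUIRED_FIELDS = {"claim_id": str, "loss_date": str, "amount": float}

def validate_extraction(raw: str) -> dict:
    """Parse model output and enforce the expected schema -- a cheap guard
    for turning free-form model text into trustworthy structured inputs."""
    record = json.loads(raw)
    for field, ftype in REQUIRED_FIELDS.items():
        if not isinstance(record.get(field), ftype):
            raise ValueError(f"missing or mistyped field: {field}")
    return record

# Stand-in for a model response; a real pipeline would call the LLM here
model_output = '{"claim_id": "CLM-1042", "loss_date": "2025-03-14", "amount": 1820.50}'
print(validate_extraction(model_output))
```

Validation like this is one way the “reliability, accuracy, preserving trust” goals Bajaj mentions get enforced in practice: malformed extractions fail loudly instead of silently polluting downstream data.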
United Bankers Bank partnered with Pidgin, a real-time payments platform provider, to create a bridge for a network of small banks to easily offer instant payment services via FedNow, RTP or the ACH
United Bankers Bank, a correspondent financial institution serving more than 1,000 community banks across the U.S., recognized that smaller banks often lack the internal resources of larger institutions for in-house development or complex integrations. To address this, UBB partnered with Pidgin, a real-time payments platform provider, to provide the necessary infrastructure for its network of banks to offer instant payment services via FedNow, RTP or the Automated Clearing House (ACH). The result has since earned it recognition from American Banker as an Innovation of the Year. Pidgin is designed as a secure, real-time payments platform that acts as the “connective tissue and plumbing” for financial institutions. It enables banks to offer instant payment services by routing transactions through FedNow, RTP and ACH, providing a comprehensive solution for diverse payment needs. The platform offers 24/7/365 access, ensuring payments can be sent and received at any time, with funds posted directly to the customer’s account for immediate availability. “The UBB–Pidgin partnership directly addresses many of the barriers community banks face — simplifying integration, reducing costs and providing the support smaller institutions often lack,” Perry said. A key security feature of Pidgin is that it is designed to keep funds within the financial institution, rather than routing them through a third-party holding account or virtual wallet. This approach enhances security, reduces interbank settlement risk and strengthens customer retention by maintaining the bank as the primary financial custodian, according to the fintech. Pidgin supports a wide range of transaction types for both consumers and businesses, including peer-to-peer (P2P) transfers, consumer-to-business (C2B) payments for services or e-commerce, business-to-consumer (B2C) disbursements like immediate payroll or insurance payouts, and business-to-business (B2B) payments for on-demand supplier invoices.
The platform also facilitates government and municipal payments, such as instant tax payments or refunds. By supporting both “receive” and “send” capabilities, Pidgin allows banks to fully leverage real-time payments for competitive advantage. United Bankers Bank, as a correspondent institution, provides its network of community banks with access to technologies without competing with its member banks. UBB initiated a pilot program for faster payments with Pidgin approximately one year before the official launch of the FedNow service. This involved thorough testing with a customer bank, including liquidity management transfers and other real-time transaction capabilities. The preparatory phase allowed UBB to integrate the solution seamlessly, enabling it to go live with Pidgin the day after the FedNow service officially launched. Since going live, UBB has observed a consistent increase in the number of partner banks participating in the FedNow service through the Pidgin platform, growing by 5%-10% each month. UBB expects this trajectory to accelerate as more institutions recognize the benefits of the new real-time rail. One thing the UBB-Pidgin partnership does not solve, according to Perry, is the demand gap.
In markets with older or rural customer bases, customers may not feel the urgency of real-time payments, according to Perry. As such, the pressure on these customers’ banks to adopt remains low, even if the technology is now within reach. “Nevertheless, for small banks that want to modernize their payments infrastructure and stay competitive, real-time payments would simply not be possible without this kind of collaboration,” Perry said. The UBB-Pidgin project underscores the critical role of partnerships with fintechs in enabling traditional financial institutions, particularly smaller ones, to adopt advanced technologies efficiently and cost-effectively. The initiative also serves as a blueprint for how correspondent banks can effectively empower smaller financial institutions to adapt to the rapidly evolving payments landscape — namely, with a fintech partner. The UBB-Pidgin partnership also gives community banks a way to stay relevant in their local economies, offering modern financial services that meet the growing demands of both consumers and businesses.
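The multi-rail routing described above (FedNow, RTP, ACH) can be pictured as a fallback chain: prefer the fastest rail the receiving bank supports, within each rail’s transaction limits. The sketch below is purely illustrative; the rail names are real, but the limits and selection logic are invented and do not describe Pidgin’s actual routing:

```python
# Illustrative per-rail transaction caps; real limits vary by institution and network
RAIL_LIMITS = {"FedNow": 500_000, "RTP": 1_000_000, "ACH": None}  # None = no cap modeled
RAIL_PRIORITY = ["FedNow", "RTP", "ACH"]  # instant rails first, ACH as the batch fallback

def choose_rail(amount: float, receiver_rails: set) -> str:
    """Pick the fastest rail the receiving bank supports that can carry the amount."""
    for rail in RAIL_PRIORITY:
        limit = RAIL_LIMITS[rail]
        if rail in receiver_rails and (limit is None or amount <= limit):
            return rail
    raise ValueError("no eligible rail for this payment")

print(choose_rail(12_000, {"FedNow", "ACH"}))   # FedNow
print(choose_rail(750_000, {"FedNow", "ACH"}))  # ACH: over the FedNow cap modeled here
```

A fallback chain like this is why supporting all three rails matters for a correspondent bank: a payment that cannot ride an instant rail still settles, just more slowly.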
J.D. Power’s survey reveals that 41% of customers cite family and friends using a different P2P transfer account as the most likely reason to switch P2P brands for both sending and receiving money
According to new J.D. Power data, network effects, security and ease of use play a large role in determining which “additional” brands consumers are using. Customers say the most likely reason to switch P2P brands for both sending and receiving money is family and friends using a different P2P transfer account (41%). Security concerns (27% for sending money, 25% for receiving) were also among the top reasons. How banks integrate Zelle into their mobile and electronic platforms has a large effect on satisfaction. Zelle integration is largely customizable, so how and where Zelle’s features appear varies from bank to bank. Capital One’s P2P customer experience, for example, is enhanced by strong discoverability from the home screen, a pay/move screen featuring a Zelle-centric money movement experience, and a final send screen that displays the recipient’s information to reconfirm money is being sent to the right person. While P2P users are steadfastly loyal to their primary brand, competing providers have a real opportunity to expand their customer base by turning existing users into advocates. Many users are receptive to opening secondary accounts to ensure they can send money across their entire social network. This means an incumbent, or even a new disruptor, doesn’t need to break brand loyalty to make meaningful gains. Sometimes, all it takes is one friend or family member requesting a transfer via another service, and suddenly that competitor has gained a new user. As brands build out their platforms, it is incumbent on them to understand what differentiates the top performers.
BCG report: Incumbent banks are losing their grip on industry growth; capital-light noninterest income grew by 1.8% in absolute terms, but the amount generated per asset decreased by 18%
Traditional banks are losing their grip on industry growth, according to a new report from Boston Consulting Group (BCG). The global banking industry has grown at a compound annual growth rate (CAGR) of 4% over the past five years, but traditional banks are ceding the most valuable ground to fintechs, digital attacker banks, private credit funds, and nonbank market makers. Incumbent banks have relied on balance-sheet-driven net interest income to contribute roughly 85% of the growth. Yet they struggle to generate capital-light noninterest income: while it grew by 1.8% in absolute terms, the relative amount generated per asset decreased by 18%. The report reveals that these shifts pose a structural challenge to traditional banks and calls for bold transformation and a rethinking of the relationships that link banks, regulators, and wider society. Looking at leading banks, the report notes three patterns that the market rewards: scale (not size), specifically in terms of domestic market leadership; the ability to generate a superior share of fee income; and market-leading productivity. The report also highlights four strategic approaches that today’s banking leaders pursue: front-to-back digitization; customer centricity; focused business models; and M&A champions. Banks can take more than one of these approaches, but all require strong digital capabilities. “Banks need to look boldly at their business portfolio and make hard calls to focus on fewer areas where they can win,” said Andreas Biffar, a BCG managing director and partner and a coauthor of the report. “Simplification of products and processes with comprehensive front-to-back digitization is a nonnegotiable element in the current context.” In addition, successful strategic implementation of AI could be a game changer, although many banks still struggle on this front. According to the report, banks need to adopt a vigorous and focused approach to AI implementation. 
If AI has not yet delivered for all banks, that may be due more to challenges in scaling and lack of holistic adoption by employees and customers than to issues with the technology. As agentic AI and machine voice emerge as even more powerful productivity levers, winners will take effective action to incorporate them. Nevertheless, AI alone may not be sufficient. Much of the potential value may be captured by nonbank players, which are currently better positioned to benefit from its applications.
Elastic’s vector database is the first to offer hybrid search capabilities for Microsoft’s Semantic Kernel, allowing users to combine multiple search techniques and run them at scale for enhanced retrieval quality
Elastic has announced the availability of its hybrid search capabilities for Microsoft’s Semantic Kernel project, making Elastic’s vector database the first to feature this capability. Users can now combine multiple search techniques when using Elasticsearch with Semantic Kernel, enhancing information retrieval and delivering the most relevant results to queries. “What sets Elastic apart is our powerful hybrid search capabilities, which our customers can run at vast scale,” said Ken Exner, chief product officer at Elastic. “As the first vector database to bring hybrid search capability to Microsoft Semantic Kernel, our .NET users benefit from significantly improved retrieval quality, making it easier for them to build enterprise-grade AI applications.”
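Hybrid search generally means merging a lexical ranking (e.g., BM25) with a vector (kNN) ranking of the same query. One common merging scheme is reciprocal rank fusion (RRF); the article doesn’t specify Elastic’s exact method, so treat this as a generic sketch of the technique rather than Elastic’s implementation:

```python
def rrf(rankings, k: int = 60):
    """Reciprocal rank fusion: score each doc by the sum of 1/(k + rank)
    across all input rankings, then sort by that fused score."""
    scores = {}
    for ranking in rankings:
        for rank, doc in enumerate(ranking, start=1):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

bm25_hits = ["doc_a", "doc_b", "doc_c"]    # lexical (keyword) ranking
vector_hits = ["doc_c", "doc_a", "doc_d"]  # semantic (kNN) ranking
print(rrf([bm25_hits, vector_hits]))       # docs in both lists rise to the top
```

The appeal of rank-based fusion is that it needs no score normalization: BM25 scores and vector similarities live on different scales, but ranks are directly comparable.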
