Tailor, a headless ERP platform for modern retail businesses, announced an additional close of its Series A round, bringing the total raised to $22 million. The new investors include JIC Venture Growth Investments (JIC VGI), a Japanese government-backed investment fund, and New Enterprise Associates (NEA). Y Combinator, which participated in Tailor’s seed round, also increased its investment. This second close reflects growing global demand for flexible, API-first business systems that help enterprises move beyond the limitations of monolithic ERPs. “Tailor’s platform serves an increasingly complex supply chain landscape and we believe Tailor has the potential to rethink the ERP systems that power global commerce and operational agility,” said Andrew Schoen, Partner, Technology Investing Team at NEA. Tailor enables mid-market and enterprise companies to build and evolve their own ERP stack with speed and flexibility. Its composable, headless architecture decouples the data and logic layer from the user interface, allowing for highly customizable workflows and easy integration with best-of-breed SaaS tools. With Tailor, companies can: Orchestrate cross-system workflows with customizable modules for inventory, purchasing, fulfillment, finance, and more; Replace or integrate with legacy systems without re-architecting core infrastructure; Give developers and AI agents programmatic access to business logic and operational data; Deliver internal tools or customer-facing experiences with custom UIs
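The cross-system orchestration described above can be pictured as a small event-driven workflow in which decoupled modules react to shared events. Everything below (the `Workflow` class, event names, module handlers, and the reorder threshold) is an illustrative assumption, not Tailor's actual API:

```python
# Illustrative sketch of headless, composable ERP orchestration.
# The Workflow class and event names are hypothetical, not Tailor's API.
from dataclasses import dataclass


@dataclass
class Event:
    name: str
    payload: dict


class Workflow:
    """Routes events between decoupled modules (inventory, purchasing, ...)."""

    def __init__(self):
        self._handlers: dict[str, list] = {}

    def on(self, event_name: str, handler) -> None:
        self._handlers.setdefault(event_name, []).append(handler)

    def emit(self, event: Event) -> None:
        for handler in self._handlers.get(event.name, []):
            handler(event)


inventory = {"sku-1": 10}
purchase_orders: list[str] = []


def decrement_stock(event: Event) -> None:
    # Fulfillment module: shipping an order reduces on-hand stock.
    inventory[event.payload["sku"]] -= event.payload["qty"]


def reorder_if_low(event: Event) -> None:
    # Purchasing module: reorder when stock dips below an arbitrary threshold.
    sku = event.payload["sku"]
    if inventory[sku] < 5:
        purchase_orders.append(sku)


wf = Workflow()
wf.on("order.shipped", decrement_stock)
wf.on("order.shipped", reorder_if_low)

wf.emit(Event("order.shipped", {"sku": "sku-1", "qty": 7}))
print(inventory["sku-1"], purchase_orders)  # 3 ['sku-1']
```

Because the data/logic layer is just events and handlers, a custom UI or an AI agent could emit the same events programmatically, which is the kind of decoupling the announcement emphasizes.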
Fund managers like Vanguard and State Street looking to tap into higher-cost private markets, selling funds that charge higher fees to boost revenue and offset the years-long slide toward near-zero fees
Vanguard Group grew into a $10 trillion financial colossus by pioneering simple, ultralow-cost investing. Its wildly popular index funds proved that people don’t need expensive portfolio managers to pick their investments. These days, the company’s most exciting new product is a striking departure from that playbook—a foray into the world of private markets, where investors pay steep fees for access to complex deals that promise high returns. Wall Street is feverishly embracing private markets, and Vanguard, like other giant money managers, wants a foothold in this booming business. A new fund it is developing with Blackstone and Wellington Management will offer a mix of public and private assets. State Street, another big fund manager, joined with Blackstone rival Apollo Global Management to create a 401(k) target-date fund with approximately 10% exposure to private markets. BlackRock is aiming to launch something similar next year. The companies say they want to democratize investing, letting people place bets in areas like private equity and private credit that have long been restricted to pensions, endowments and plugged-in elites. With fewer publicly traded companies to invest in, people need to put money in private markets, they say. The investment firms have something to gain for themselves in this shift. After years of driving down fees, retail fund managers are nearing a wall as what they charge approaches zero. The chance to sell more funds that can charge higher fees would boost revenues for the likes of Vanguard and State Street, which pioneered exchange-traded funds. “They have been preaching low fees and competition, and now they’ve come around to the value of alternative investments in client portfolios,” said Morgan Stanley analyst Michael Cyprys. The companies that create and manage the private funds—Blackstone, Apollo, KKR and others—are hitting their own wall as their private-equity funds, which own unlisted companies, stall.
They have focused on private credit, which involves issuing loans to those same sorts of companies, to generate growth. The trillions in savings held by retail fund managers look like enticing fuel for their businesses. “Everyone’s looking at Vanguard and saying ‘wow, they have a tremendous pile of gold, I wonder if they’d let us have a small slice’,” said Bob Brinker, publisher of a mutual-fund newsletter.
Margarita Finance launches agentic stablecoin technology with 20% APY yield-bearing token, allowing users to access institutional DeFi investment strategies through a single token purchase
Margarita Finance launched its agentic stablecoin technology to an invite-only user base this week, introducing a DeFAI protocol that automates AI-based institutional investment management through yield-bearing stablecoins. The company’s new protocol autonomously builds and issues institutional investment products, wraps them into proprietary vaults with automated monitoring, and makes them investable through yield-bearing stablecoins. The first curated offering, SOL20, provides a 20% expected return (APY) to users, powered by AI-based options trading. This way, DeFi users get access to institutional trading strategies usually reserved for Wall Street hedge funds. The DeFAI protocol operates by creating custom investment strategies, packaging them into monitored vaults, and wrapping them into easily accessible stablecoins that are backed by the underlying yield-generating assets. This approach allows users to access institutional DeFi investment strategies through a single token purchase. Margarita Finance’s end-to-end value capture model now encompasses both consulted custom investment products and curated yield-bearing stablecoins, providing users with multiple entry points into AI-managed DeFi strategies.
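One simplified way to picture a yield-bearing stablecoin is a vault whose token redemption value tracks assets per token as strategy yield accrues. The sketch below is a toy model under that assumption, not Margarita Finance's actual protocol, and it compounds a nominal rate daily, which is a simplification of how an APY is quoted:

```python
# Toy model of a yield-bearing token backed by a vault of
# yield-generating assets. Not Margarita Finance's actual mechanics.
class YieldVault:
    def __init__(self, assets: float, token_supply: float):
        self.assets = assets            # underlying assets (USD notional)
        self.token_supply = token_supply

    def accrue(self, rate: float, days: int) -> None:
        # Daily compounding of the strategy's nominal annual rate.
        self.assets *= (1 + rate / 365) ** days

    def token_value(self) -> float:
        # Each token redeems for its pro-rata share of vault assets.
        return self.assets / self.token_supply


vault = YieldVault(assets=1_000_000, token_supply=1_000_000)
vault.accrue(rate=0.20, days=365)
print(round(vault.token_value(), 4))  # roughly 1.22 after a year
```

The "single token purchase" framing in the announcement maps to this: the buyer holds one token whose value reflects whatever the underlying strategy earns, without interacting with the strategy directly.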
Walmart’s second freestanding, 3D-printed store in Huntsville with 16-foot concrete walls to serve as the online grocery pick-up and delivery location
Walmart has partnered with 3D concrete printing company Alquist 3D and general contractor FMGI to complete construction of its second freestanding, 3D-printed store addition. Working with the two firms, Walmart printed the 16-foot concrete walls of the structure, which will serve as an extension of the grocery pickup area in Walmart’s supercenter in Huntsville, Ala. Set to open the week of May 5, the completed Huntsville addition will serve as the retailer’s online grocery pick-up and delivery location as part of an overall store remodel. Other companies working on Walmart’s Huntsville commercial 3D printing project included Sika USA, which supplied customized concrete mixes formulated to address varying environmental conditions. In addition, Alquist’s robotics partner RJC Technology furnished robotic systems designed to achieve high-precision printing with reduced labor requirements. “In a commercial construction world that pays so much attention to project timelines and costs, our work with Walmart shows that 3D printing isn’t just a novelty – it’s an innovation ready to scale for retail and other industries,” said Patrick Callahan, CEO of Alquist 3D. “This second project clearly demonstrates how retail expansions can be faster, more cost-effective and less wasteful, paving the way for broader adoption for large-scale commercial builds.”
Payment processors looking at platformization to offer an end-to-end product stack adjacent to payments such as advanced fraud prevention, network tokens, real-time account updates, and acceptance rate enhancement tools
“We’re seeing a shift where businesses are now looking for a payment processor that is more inclusive of a product stack, so a one-stop shop for everything,” Justin Downey, vice president of product at Maverick Payments, said. “Payment processors are looking for services that are adjacent to payments. That could be advanced fraud prevention, network tokens, real-time account updater, other tools that can increase the acceptance rate while reducing fraud,” Downey said. He highlighted the quest for a “frictionless checkout experience” — the new gold standard for merchants and consumers alike — describing it as “something that truly makes it easy for customers to submit payments.” The future Downey envisions, and the picture of the present he has painted, is neither purely competitive nor fully collaborative. It’s both. Processors will need to be architects — building unique, defensible intellectual property at their core — as well as curators, integrating complementary services to offer breadth and agility. The platformization trend means processors are stretching beyond payments into tangentially related domains — sometimes encroaching on territory once exclusive to FinTechs or even banks. “Payment processors are expanding into areas that are close to payments, but not exactly payments, like financial services, alternative payment methods, embedded finance,” Downey said. “Processors are in this unique position where, generally, they have a very strong distribution network, and they’re expanding into new product offerings that they can offer to their businesses, all as a one-stop shop. That’s a win-win for everybody,” he added.
Community banks and credit unions can enable extensibility through an internally built, custom middleware system, or by using external vendors with capabilities that stand on top of existing core systems
Through extensible capabilities, community banks and credit unions can punch above their weight by connecting to modern third-party apps and features — including quicker onboarding and transaction capabilities — without swapping out their core systems. Extensible systems allow FIs to integrate with third-party apps with minimal friction, enabling easier access to account data and quicker pivots, says Christian Ruppe, SVP and chief innovation officer at Fitzgerald, Georgia-based Colony Bank. Yet despite these benefits, many are falling behind on adoption. According to Ryan Siebecker, a forward-deployed engineer at Narmi, a banking software firm, extensibility can be achieved through an internally built, custom middleware system, or by working with outside vendors, including Narmi, whose systems operate in parallel with core systems. Other FIs that work with vendors — including Colony Bank and Grasshopper Bank — say using outside partners with capabilities that stand on top of existing core systems allows them to maintain lean internal operations without sacrificing the quality of the integrations. Luther Liang, SVP of product at Grasshopper Bank, said that by working with a vendor, the bank didn’t have to hire additional staff to manage software integrations enabled by extensibility. Colony Bank is starting to see results two years after it began its extensibility rollout. It has enabled three major use cases: a modern account opening solution; an app that improves call center efficiency by allowing call center reps to co-browse with customers; and a client data visualization tool. Colony’s core provider “charges us per integration, and so now we’re not having to pay per integration — we have one integration, and we pay for that,” says Ruppe.
“If you do it right, you can make it make sense immediately, but the long term is where you really win.” While Colony Bank might not be looking to compete with the top megabanks, “we know our communities better than they do…we can then provide technology that is specific to those customers,” he says.
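The "one integration instead of many" economics Ruppe describes can be sketched in miniature: a single middleware layer holds the lone billed connection to the core, and third-party apps register against the middleware rather than against the core directly. The class and app names below are hypothetical stand-ins, not Narmi's or Colony Bank's actual stack:

```python
# Hedged sketch of extensibility middleware sitting on top of a legacy core.
# CoreBankingSystem and the registered apps are hypothetical examples.
class CoreBankingSystem:
    """Legacy core: imagine each direct integration is billed separately."""

    def get_account(self, account_id: str) -> dict:
        return {"id": account_id, "balance": 1250.00}


class ExtensibilityMiddleware:
    """Holds the single paid core integration; apps plug in up here."""

    def __init__(self, core: CoreBankingSystem):
        self._core = core
        self._apps = {}

    def register_app(self, name: str, handler) -> None:
        # Adding an app touches only the middleware, never the core contract.
        self._apps[name] = handler

    def handle(self, app_name: str, account_id: str):
        account = self._core.get_account(account_id)  # one shared call path
        return self._apps[app_name](account)


mw = ExtensibilityMiddleware(CoreBankingSystem())
mw.register_app("account_opening", lambda acct: f"opened review for {acct['id']}")
mw.register_app("data_viz", lambda acct: {"balance_chart_point": acct["balance"]})

print(mw.handle("data_viz", "acct-42"))  # {'balance_chart_point': 1250.0}
```

The design point is that new use cases (account opening, co-browsing, data visualization) each become a registration call against the middleware, so per-integration core fees are paid once rather than per app.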
Meta’s new Llama API to use Cerebras ultra-fast inference tech that would allow developers build apps that require chaining multiple LLM calls while offering generation speeds up to 18X faster than traditional GPU-based solutions
Meta announced a partnership with Cerebras Systems to power its new Llama API, offering developers access to inference speeds up to 18 times faster than traditional GPU-based solutions. What sets Meta’s offering apart is the dramatic speed increase provided by Cerebras’ specialized AI chips. The Cerebras system delivers over 2,600 tokens per second for Llama 4 Scout, compared to approximately 130 tokens per second for ChatGPT and around 25 tokens per second for DeepSeek, according to benchmarks from Artificial Analysis. This speed advantage enables entirely new categories of applications that were previously impractical, including real-time agents, conversational low-latency voice systems, interactive code generation, and instant multi-step reasoning — all of which require chaining multiple large language model calls that can now be completed in seconds rather than minutes. The Llama API represents a significant shift in Meta’s AI strategy, transitioning from primarily being a model provider to becoming a full-service AI infrastructure company. By offering an API service, Meta is creating a revenue stream from its AI investments while maintaining its commitment to open models. The API will offer tools for fine-tuning and evaluation, starting with the Llama 3.3 8B model, allowing developers to generate data, train on it, and test the quality of their custom models. Meta emphasizes that it won’t use customer data to train its own models, and models built using the Llama API can be transferred to other hosts—a clear differentiation from some competitors’ more closed approaches. Cerebras will power Meta’s new service through its network of data centers located throughout North America, including facilities in Dallas, Oklahoma, Minnesota, Montreal, and California. By combining the popularity of its open-source models with dramatically faster inference capabilities, Meta is positioning itself as a formidable competitor in the commercial AI space.
For Cerebras, this partnership represents a major milestone and validation of its specialized AI hardware approach.
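The speed gap matters most for chained calls, because sequential steps multiply per-call latency. A quick back-of-envelope calculation makes this concrete; the tokens-per-second figures are the Artificial Analysis benchmarks quoted above, while the 5-step, 500-token workload is an arbitrary example:

```python
# Why tokens/sec dominates multi-step agent workflows: total generation
# time for a sequential chain scales linearly with per-step token count.
def chain_latency_seconds(steps: int, tokens_per_step: int,
                          tokens_per_sec: float) -> float:
    """Total generation time for a sequential chain of LLM calls."""
    return steps * tokens_per_step / tokens_per_sec


steps, tokens = 5, 500  # hypothetical 5-step agent, 500 tokens per step
gpu_time = chain_latency_seconds(steps, tokens, 130)       # GPU-class serving
cerebras_time = chain_latency_seconds(steps, tokens, 2600)  # Llama 4 Scout on Cerebras

print(f"GPU: {gpu_time:.1f}s, Cerebras: {cerebras_time:.1f}s, "
      f"speedup: {gpu_time / cerebras_time:.0f}x")
# GPU: 19.2s, Cerebras: 1.0s, speedup: 20x
```

Under these assumptions a workflow that takes roughly 19 seconds of pure generation on GPU-class serving finishes in about a second, which is the "seconds rather than minutes" shift the announcement describes for interactive and agentic use cases.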
Anthropic’s new feature update to enable Claude to incorporate data from SaaS applications into its prompt responses while its Research tool to allow preparing detailed reports about user-specified topics with more thorough analysis
Anthropic updated Claude with a feature called Integrations that will enable the chatbot to access data from third-party cloud services. The company rolled out the capability alongside an enhanced version of Research, a tool it introduced last month. The latter feature enables Claude to prepare detailed reports about user-specified topics. Research can now perform the task more thoroughly than before. The new Integrations capability will enable Claude to incorporate data from software-as-a-service applications into its prompt responses. If customers wish to connect Claude to an application for which a prepackaged integration isn’t available, they can build their own. Anthropic estimates that the process takes as little as 30 minutes. According to the company, developers can further speed up the workflow by using a set of tools that Cloudflare introduced in March to ease such projects. Claude’s new connectors are powered by MCP (Model Context Protocol), a data transfer technology that Anthropic open-sourced. It provides software building blocks that reduce the amount of work involved in connecting an LLM to external applications. OpenAI, Anthropic’s top competitor, rolled out MCP support to its Agents SDK last month. Anthropic added MCP to Claude immediately after open-sourcing the technology last year. Until now, however, the chatbot only supported connections to applications installed on the user’s computer, which limited the feature’s usefulness.
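Conceptually, an MCP-style connector exposes an application's data and actions as named, described tools that a model can discover and invoke. The toy registry below illustrates only that idea; it is not the actual MCP protocol or Anthropic's SDK, and the CRM tool it registers is a made-up example:

```python
# Conceptual sketch of the "expose app capabilities as callable tools"
# pattern behind MCP-style connectors. Illustrative only, not the real spec.
class ToolServer:
    """Registers an app's capabilities as discoverable, described tools."""

    def __init__(self):
        self._tools = {}

    def tool(self, name: str, description: str):
        def register(fn):
            self._tools[name] = {"fn": fn, "description": description}
            return fn
        return register

    def list_tools(self):
        # What a model would see when deciding which tool to call.
        return [{"name": n, "description": t["description"]}
                for n, t in self._tools.items()]

    def call(self, name: str, **kwargs):
        return self._tools[name]["fn"](**kwargs)


crm = ToolServer()


@crm.tool("search_tickets", "Search support tickets by keyword")
def search_tickets(query: str):
    tickets = [{"id": 1, "subject": "billing error"},
               {"id": 2, "subject": "login issue"}]
    return [t for t in tickets if query in t["subject"]]


print(crm.call("search_tickets", query="billing"))
# [{'id': 1, 'subject': 'billing error'}]
```

The point of standardizing this shape is the one the article makes: once an app speaks the common protocol, any compliant chatbot or agent can use it without bespoke integration work.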
Salesforce’s new benchmark for tackling ‘jagged intelligence’ in CRM scenarios shows leading agents succeed less than 65% of the time at function-calling for the use cases of three key personas: service agents, analysts, and managers
To tackle “jagged intelligence,” one of AI’s most persistent challenges for business applications (the gap between an AI system’s raw intelligence and its ability to consistently perform in unpredictable enterprise environments), Salesforce revealed several new benchmarks, models, and frameworks designed to make future AI agents more intelligent, trusted, and versatile for enterprise use. Among them is the SIMPLE dataset, a public benchmark featuring 225 straightforward reasoning questions designed to measure how jagged an AI system’s capabilities really are. Perhaps the most significant innovation is CRMArena, a novel benchmarking framework designed to simulate realistic customer relationship management scenarios. It enables comprehensive testing of AI agents in professional contexts, addressing the gap between academic benchmarks and real-world business requirements. The framework evaluates agent performance across three key personas: service agents, analysts, and managers. Early testing revealed that even with guided prompting, leading agents succeed less than 65% of the time at function-calling for these personas’ use cases. Among the technical innovations announced, Salesforce highlighted SFR-Embedding, a new model for deeper contextual understanding that leads the Massive Text Embedding Benchmark (MTEB) across 56 datasets. A specialized version, SFR-Embedding-Code, was also introduced for developers, enabling high-quality code search and streamlining development. Salesforce also announced xLAM V2 (Large Action Model), a family of models specifically designed to predict actions rather than just generate text. These models start at just 1 billion parameters—a fraction of the size of many leading language models. To address enterprise concerns about AI safety and reliability, Salesforce introduced SFR-Guard, a family of models trained on both publicly available data and CRM-specialized internal data.
These models strengthen the company’s Trust Layer, which provides guardrails for AI agent behavior. The company also launched ContextualJudgeBench, a novel benchmark for evaluating LLM-based judge models in context—testing over 2,000 challenging response pairs for accuracy, conciseness, faithfulness, and appropriate refusal to answer. Salesforce unveiled TACO, a multimodal action model family designed to tackle complex, multi-step problems through chains of thought-and-action (CoTA). This approach enables AI to interpret and respond to intricate queries involving multiple media types, with Salesforce claiming up to 20% improvement on the challenging MMVet benchmark.
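A function-calling success rate of the kind CRMArena reports can be illustrated with a minimal grading harness: compare each agent-emitted call (function name plus arguments) against a gold call and aggregate per persona. The cases, function names, and exact-match rule below are illustrative assumptions, not Salesforce's actual benchmark:

```python
# Minimal sketch of per-persona function-calling evaluation.
# Test cases and the exact-match grading rule are illustrative only.
from collections import defaultdict


def score(cases):
    """cases: list of (persona, predicted_call, expected_call) tuples."""
    hits, totals = defaultdict(int), defaultdict(int)
    for persona, predicted, expected in cases:
        totals[persona] += 1
        if predicted == expected:  # exact match on function name and args
            hits[persona] += 1
    return {p: hits[p] / totals[p] for p in totals}


cases = [
    ("service_agent", ("lookup_case", {"id": 7}), ("lookup_case", {"id": 7})),
    ("service_agent", ("close_case", {"id": 7}), ("escalate_case", {"id": 7})),
    ("analyst", ("run_report", {"q": "churn"}), ("run_report", {"q": "churn"})),
    ("manager", ("approve", {"req": 3}), ("approve", {"req": 4})),
]
print(score(cases))
# {'service_agent': 0.5, 'analyst': 1.0, 'manager': 0.0}
```

Real harnesses typically grade more leniently (for example, tolerating argument-order or formatting differences), but even this strict toy version shows how a sub-65% aggregate rate can emerge from per-persona scores.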
AI agents could usher in a paradigm of DeFAI wherein a blockchain-powered, verifiable trust-centric model could enable secure, free and compliant AI interactions between autonomous agents across DeFi ecosystems
As AI agents take on more responsibility, and especially as the convergence between crypto and TradFi accelerates, worries around transparency and market manipulation will grow. DLT offers a solution. The Identity Management Institute reported that companies integrating blockchain identity systems have already cut fraud by 40% and identity theft by 50%. Applying these guardrails to AI-driven finance can counter manipulation and promote fairness. Moreover, the use of DLTs with fair ordering is growing rapidly, ensuring transactions are sequenced fairly and unpredictably, addressing MEV (maximal extractable value) concerns and promoting trust in decentralized systems. A blockchain-powered, trust-centric model could unlock a new paradigm, “DeFAI”, in which autonomous agents can operate freely without sacrificing oversight. Open-source protocols like ElizaOS, which have blockchain plugins, are already enabling secure and compliant AI interactions between agents across DeFi ecosystems. As AI agents take on more complex roles, verifiable trust becomes non-negotiable. Verifiable compute solutions are already being built by firms like EQTY Lab, Intel and Nvidia to anchor trust on-chain. DLT ensures transparency, accountability and traceability. This is already in motion; on-chain agents are now operating that offer services ranging from trade execution to predictive analytics.
