RavenDB, a multi-model NoSQL document database trusted by developers and enterprises worldwide, announced the launch of its AI Agent Creator, a first-to-market feature fully integrated into the database that reduces the time required to build AI agents from weeks, months, or even years to just days. The AI Agent Creator runs agents natively inside the database, giving developers secure, direct access to operational data. By keeping agents close to the data and automating integration, RavenDB turns months of uncertainty into days of reliable, context-aware AI delivery. Unlike scripted bots or AI-assisted chatbots limited to generic knowledge, the AI Agent Creator lets developers deploy intelligent agents with built-in guardrails: developers define the scope in which each agent can operate on behalf of specific users, while seamlessly connecting to existing validation, authorization, and business logic. To give developers stronger control, enhanced safety, and greater precision over how data and operations are accessed, the large language model (LLM) follows a zero-trust, default-deny approach: no data or operation is accessible unless explicitly approved. When an end user submits a request in natural language, RavenDB invokes the agent to process the request, exposing only the tools and actions within the scope the developer defined. RavenDB then orchestrates the entire flow, calling into existing business logic to perform approved operations. This process produces accurate, personalized responses without ever exposing the full database, moving data to external servers, writing complex code, or compromising security. The feature supports all LLMs and uses smart caching that summarizes agent memory and history, reducing redundant requests for reasoning-intensive tasks. This significantly cuts AI spend and optimizes costs without sacrificing accuracy, making it a critical efficiency tool in agentic AI workflows.
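The zero-trust, default-deny scoping described above can be illustrated with a minimal sketch. This is a hypothetical toy registry, not RavenDB's actual API: the class name, methods, and the example tool are all invented for illustration. The key property is that an agent can only invoke tools that were explicitly approved; everything else is denied by default.

```python
# Hypothetical sketch of a default-deny tool registry (NOT RavenDB's actual API).
class AgentScope:
    def __init__(self):
        self._allowed = {}  # tool name -> callable; nothing is reachable by default

    def allow(self, name, func):
        """Explicitly approve a tool for the agent's scope."""
        self._allowed[name] = func
        return self

    def invoke(self, name, *args, **kwargs):
        # Zero trust: any tool not explicitly approved is denied.
        if name not in self._allowed:
            raise PermissionError(f"tool '{name}' is not in the agent's scope")
        return self._allowed[name](*args, **kwargs)


# Illustrative tool standing in for existing business logic.
def get_order_status(order_id):
    return {"id": order_id, "status": "shipped"}


scope = AgentScope().allow("get_order_status", get_order_status)
print(scope.invoke("get_order_status", 42))  # approved tool runs
try:
    scope.invoke("delete_order", 42)         # unapproved tool is denied
except PermissionError as e:
    print(e)
```

The design choice mirrored here is that approval is an explicit, per-tool act by the developer; the agent never enumerates or reaches operations outside its allowlist.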
Cognition, maker of the natural-language AI coding agent Devin, acquires Windsurf to pair Devin’s multi-agent task execution with Windsurf’s IDE and multi-model support for faster enterprise AI coding, serving clients including Goldman Sachs and Citi
Cognition has raised more than $400 million at a $10.2 billion valuation, and its acquisition of Windsurf has paid off handsomely in terms of boosting revenue. Following the March 2024 debut of its AI coding agent Devin, Cognition quickly gained traction among AI-forward developers looking to give natural language instructions and have Devin generate code suggestions for them automatically with minimal human intervention. In July 2025, Cognition acquired Windsurf’s remaining team and tech for an undisclosed sum (estimated to be $250 million). Now, it turns out that deal has paid off quite well for Cognition. “Before acquiring Windsurf, Cognition’s Devin [annual recurring revenue] ARR grew from $1M ARR in September 2024 to $73M ARR in June 2025, as usage increased exponentially. Our growth remained efficient throughout, with total net burn under $20M across the company’s entire history. Our acquisition of Windsurf more than doubled our ARR. More importantly, it gave us the complete product suite for AI coding.” Writing on X, Jeff Wang, CEO of Windsurf, said “our customers can benefit from the opportunities of both products having synergies between local and cloud agents,” and envisioned a future of “enabling engineers [to] manage an army of agents to build technology faster.” In addition, since acquiring Windsurf, major customers such as Goldman Sachs, Citi, Dell, Cisco, Palantir, Nubank, and Mercado Libre are now using the combined platform, according to the company. Analysts note that Cognition’s strategy — pairing Devin’s multi-agent task execution with Windsurf’s integrated development environment (IDE) and multi-model support — directly aligns with these evolving priorities. With $400 million in fresh capital and backing from prominent investors, Cognition signals that it has the financial strength to continue scaling Devin and Windsurf rather than facing near-term consolidation or shutdown risk.
Claude can now create and edit files, including Excel spreadsheets, Word documents, PowerPoint decks and PDFs, directly within Claude.ai and the desktop app, as Anthropic focuses on enterprise productivity
Anthropic launched a feature preview that allows users of its artificial intelligence (AI) assistant Claude to create and edit files, including Excel spreadsheets, Word documents, PowerPoint decks and PDFs, directly within Claude.ai and the desktop app. Per the announcement, the new capability represents a significant expansion of Claude’s functionality, moving beyond text-based interactions to hands-on document creation and manipulation. For financial services teams, the update could streamline workflows that traditionally require switching between multiple applications. Users can now build spreadsheets with formulas and variance calculations, generate financial models with scenario analysis, convert invoices into structured data and produce formatted reports or presentations from raw notes, the company said. The feature eliminates the friction of moving between Claude and productivity software, allowing teams to complete complex analytical tasks within a single interface. The file-creation feature runs in a secure computing environment where Claude can write code and execute programs to deliver finished outputs. Beyond spreadsheets, the company said Claude can clean data, perform statistical analysis and convert documents across formats, such as turning a PDF report into a slide deck. The file creation capability complements Anthropic’s broader financial services strategy.
Webisoft’s AI-powered no-code platform builds and deploys Web3 apps with wallet integration and prebuilt contracts for NFTs, DAOs, DeFi, and identity
Webisoft has launched Web3Fast, an AI-powered platform that makes blockchain application development accessible to a wide range of users, including startups, enterprises, and independent creators. Web3Fast is designed to accelerate Web3 adoption without requiring deep technical knowledge. The platform offers a no-code/low-code environment supported by an intelligent AI assistant, allowing users to design and deploy on-chain applications, integrate wallets, and utilize prebuilt smart contract templates for common use cases like NFTs, DAOs, tokenization, decentralized finance, and digital identity. The platform enables projects to move from concept to launch in a fraction of the traditional time. Web3Fast is currently in beta for selected partners, with a wider rollout expected in the coming quarter. The platform is chain-agnostic, security-focused, and business-ready, making it suitable for experimentation and enterprise adoption at scale.
BBVA chooses Ripple’s Metaco‑born custody to run in‑house, MiCA‑compliant crypto services for retail clients in Spain, starting with bitcoin and ether trading and secure storage
Ripple is expanding its banking partnerships in Europe through a new agreement with BBVA in Spain. The deal will see BBVA integrate Ripple’s digital asset custody technology into its recently launched retail service for trading and holding bitcoin and ether. The move comes as European banks adapt to the Markets in Crypto-Assets (MiCA) regulation, which sets a framework for offering digital asset services across the EU. “Now that MiCA is established, the region’s banks are emboldened to launch the digital asset offerings that their customers are asking for,” said Cassie Craddock, Ripple’s managing director for Europe. Ripple Custody was born out of the blockchain firm’s acquisition of Swiss crypto custody specialist Metaco, which had signed up BBVA. Francisco Maroto, BBVA’s head of digital assets, said the integration allows the bank to “directly provide an end-to-end custody service” with the security customers expect from a major financial institution. The partnership extends Ripple’s prior work with BBVA, which already uses its custody technology in Switzerland and Turkey. For Ripple, Spain represents another foothold in Europe’s regulated digital asset market. Ripple holds more than 60 regulatory licenses globally. The deal signals a gradual shift in how traditional banks approach crypto. Instead of relying on third-party providers, institutions like BBVA are opting to build in-house services using established infrastructure providers.
KuCoin’s $ART token brings AI‑driven luxury RWA valuation and staking governance to tokenized art, watches, wine and cars
KuCoin is set to list the $ART token, native to LiveArt, on September 9, 2025; the token is an ERC-20 deployed on the Base network. The token will enable the tokenization of a $10 trillion luxury asset market using AI, aiming to make high-value collectibles accessible to a broader audience. The tokenization market is estimated to be worth $16 billion by 2030, indicating significant growth potential. LiveArt’s integration of AI and blockchain technology addresses traditional real-world asset (RWA) market challenges, such as subjective valuation and lack of transparency. Machine learning models will help in real-time asset valuation and market trend prediction, providing fair and accurate pricing. $ART staking will enable token holders to vote on asset curation and protocol upgrades, fostering a more democratic and decentralized governance structure. Deployment on Base allows for quick and efficient transactions, enhancing the token’s utility and appeal to investors. However, investors should remain cautious due to potential initial price volatility and regulatory oversight.
Only about six stablecoins exceed $1B in circulation; for the subscale majority, 5% reserve yields barely fund compliance and security, putting payment rails, treasury and customer balances at risk
As USD-pegged stablecoins increasingly flood the marketplace competing for users, liquidity splinters across platforms and may prevent most from ever achieving operationally effective adoption. As a result, countless stablecoins, whether privately issued or state-backed, are coming face to face with a brutally simple business model challenge: if circulation doesn’t reach scale, issuers cannot generate enough revenue from their reserve assets to cover operational costs. For financial executives evaluating payment integrations or treasury strategies, this dynamic should set off alarms. The situation grows more precarious when yields fall, as they eventually will. The fragmentation of the stablecoin market matters because liquidity is self-reinforcing. A coin that’s widely used in trading pairs, accepted across exchanges and integrated into payment rails attracts more users. The inverse is also true: illiquid stablecoins languish in obscurity, unable to build the velocity required for sustainable revenue. For CFOs and payment firms, this creates risk at the integration layer. Choosing to accept or settle in a subscale stablecoin exposes the firm to the possibility of abrupt issuer retrenchment, de-pegging events or simply a business model that fails quietly when reserves can no longer fund operations. Many firms are weighing whether to settle cross-border payments, manage treasury liquidity or offer customer accounts denominated in stablecoins. The irony is that the stability stablecoins promise depends less on cryptography or regulatory oversight than on something far more mundane: scale economics. Without sufficient user base and circulation, the math simply doesn’t work. Even if reserves are fully collateralized, an issuer unable to fund compliance or security is a weak link.
At the same time, holding multiple stablecoins might spread regulatory risk but can also dilute liquidity and expose firms to subscale issuers with fragile business models. The viability of a stablecoin can be as much a question of sustainable business modeling as it is of collateral management.
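The scale-economics argument above reduces to simple arithmetic. The sketch below uses the 5% reserve yield cited in the headline; the $10M annual operating budget is an illustrative assumption, not a figure from the article.

```python
# Back-of-the-envelope reserve economics: at a given yield, how much
# circulation does a stablecoin issuer need to cover fixed operating costs?
RESERVE_YIELD = 0.05          # 5% annual yield on reserves (from the headline)
ANNUAL_OPEX = 10_000_000      # assumed compliance + security + ops budget

def reserve_revenue(circulation):
    """Annual revenue earned on fully reserved circulation."""
    return circulation * RESERVE_YIELD

def breakeven_circulation(opex=ANNUAL_OPEX, y=RESERVE_YIELD):
    """Circulation at which reserve income just covers operating costs."""
    return opex / y

for circ in (100e6, 1e9, 10e9):
    rev = reserve_revenue(circ)
    print(f"${circ / 1e9:>5.1f}B in circulation -> ${rev / 1e6:,.0f}M/yr revenue")

print(f"breakeven: ${breakeven_circulation() / 1e6:,.0f}M in circulation")
```

Under these assumptions, a $100M coin earns only $5M a year against a $10M cost base, while a $1B coin earns $50M, which is why the $1B threshold in the headline roughly separates viable issuers from subscale ones.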
Itaú Asset launched a dedicated crypto division to scale regulated products, adding to a Bitcoin ETF, index fund, and pension fund totaling $156M in assets
Brazil’s largest private bank Itaú has established a specialized unit within its asset management arm, Itaú Asset, to develop cryptocurrency investment products, according to a report by Valor Globo. Itaú Asset has appointed João Marco Cunha, formerly managing director at crypto asset manager Hashdex (AUM $1.5bn), to head the new division. The move signals the institution’s commitment to building out its cryptocurrency offerings as demand grows in the Brazilian market. The bank currently manages three regulated cryptocurrency products with combined net assets of R$850 million ($156M). These include a Bitcoin exchange-traded fund, the Itaú Bitcoin Index unit trust fund and the Itaú Flexprev Bitcoin pension fund. Through its “íon Itaú” investment platform, customers can also directly trade ten cryptocurrencies, including Bitcoin, Ethereum and the USDC stablecoin.
Chinese researchers add hybrid parallelism to Q2Chemistry simulations — batch‑buffered overlap and dependency‑aware gate contraction — delivering speedups and HPC scalability on CPU and GPU simulators
Quantum simulation is crucial for developing practical quantum algorithms, as limitations in current hardware necessitate robust classical methods for testing and refinement. Researchers from the University of Science and Technology of China have developed a scalable approach to simulating quantum circuits within the Q2Chemistry software package, delivering substantial performance gains on both conventional CPUs and powerful GPUs. This research demonstrates a significant leap forward in simulation speed and portability, consistently outperforming existing open-source simulators across a range of quantum circuit designs and paving the way for more complex algorithm development. Key technologies underpinning these advancements include multi-core CPU parallelization, distributed computing, and the use of tensor network methods to efficiently represent quantum states. State vector simulation alongside techniques like matrix product states is employed to balance accuracy and computational cost, enabling researchers to tackle increasingly complex quantum systems. The work significantly enhances the performance of full-amplitude quantum circuit simulation within Q2Chemistry, enabling accurate and efficient simulations of complex quantum circuits. The team implemented Batch-Buffered Overlap Processing, a multi-buffering strategy that partitions quantum state amplitudes into smaller batches, and Staggered Multi-Gate Parallelism, a two-dimensional thread block strategy for GPU execution. Together, these optimizations let researchers explore quantum chemistry applications with greater efficiency and accuracy.
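To make the batching idea concrete, here is a minimal toy sketch of full-amplitude state vector simulation that applies a single-qubit gate while walking the amplitude pairs in fixed-size batches. This is not Q2Chemistry code: the function, the batch size, and the serial loop are illustrative assumptions; in the actual Batch-Buffered Overlap scheme each batch would be a unit of buffered transfer and compute that can overlap with its neighbors.

```python
import math

def apply_single_qubit_gate(state, gate, qubit, batch_size=4):
    """Apply a 2x2 gate to `qubit` of a full state vector in place,
    processing amplitude pairs in fixed-size batches (a toy analogue of
    batch-buffered processing; here the batches simply run serially)."""
    stride = 1 << qubit
    # Each pair couples an index with qubit=0 to its partner with qubit=1.
    base_indices = [i for i in range(len(state)) if not (i & stride)]
    for start in range(0, len(base_indices), batch_size):
        for i in base_indices[start:start + batch_size]:
            a0, a1 = state[i], state[i | stride]
            state[i] = gate[0][0] * a0 + gate[0][1] * a1
            state[i | stride] = gate[1][0] * a0 + gate[1][1] * a1
    return state

h = 1 / math.sqrt(2)
H = [[h, h], [h, -h]]            # Hadamard gate
state = [1.0] + [0.0] * 7        # |000> on 3 qubits (8 amplitudes)
apply_single_qubit_gate(state, H, qubit=0)
print(state[:2])                 # both amplitudes equal 1/sqrt(2)
```

The point of partitioning `base_indices` into batches is that each batch touches a bounded slice of memory, which is what lets a real simulator stage batches through buffers and overlap data movement with gate arithmetic.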
D‑Wave releases developer toolkit and demo to accelerate quantum AI exploration, enabling seamless ML integration and practical training of restricted Boltzmann machines for generative AI
D-Wave Quantum recently released a collection of offerings to help developers explore and advance quantum artificial intelligence (“AI”) and machine learning (“ML”) innovation, including an open-source quantum AI toolkit and a demo. Available now for download, the quantum AI toolkit enables developers to seamlessly integrate quantum computers into modern ML architectures. The demo shows how developers can use the toolkit to experiment with using D-Wave™ quantum processors to generate simple images, reflecting what D-Wave believes is a pivotal step in the development of quantum AI capabilities. By releasing this new set of tools, D-Wave aims to help organizations accelerate the use of annealing quantum computers in a growing set of AI applications. The quantum AI toolkit, part of D-Wave’s Ocean™ software suite, provides direct integration between D-Wave’s quantum computers and PyTorch, an ML framework widely used to train and create deep learning models. The toolkit includes a PyTorch neural network module for using a quantum computer to build and train ML models known as restricted Boltzmann machines (“RBMs”). Used to learn patterns and connections from complex data sets, RBMs are employed for generative AI tasks such as image recognition and drug discovery. Training RBMs with large datasets can be a computationally complex and time-consuming task that could be well-suited for a quantum computer. By integrating with PyTorch, D-Wave’s new toolkit aims to make it easy for developers to experiment with quantum computing to address computational challenges in training AI models. “With this new toolkit and demo, D-Wave is enabling developers to build architectures that integrate our annealing quantum processors into a growing set of ML models,” said Dr. Trevor Lanting, chief development officer at D-Wave.
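For readers unfamiliar with RBMs, the sketch below is a minimal classical RBM trained with one-step contrastive divergence (CD-1) on two toy patterns, written in plain Python for self-containment. It is not D-Wave's toolkit: the toolkit's PyTorch module instead delegates the expensive model-sampling step to an annealing quantum processor, which is exactly the step the inner loops here approximate classically.

```python
import math
import random

random.seed(0)

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

class RBM:
    """Minimal classical restricted Boltzmann machine trained with CD-1
    (illustrative sketch only; not D-Wave's PyTorch module)."""
    def __init__(self, n_visible, n_hidden):
        self.W = [[random.gauss(0, 0.1) for _ in range(n_hidden)]
                  for _ in range(n_visible)]
        self.b = [0.0] * n_visible   # visible biases
        self.c = [0.0] * n_hidden    # hidden biases

    def hidden_probs(self, v):
        return [sigmoid(self.c[j] + sum(v[i] * self.W[i][j]
                for i in range(len(v)))) for j in range(len(self.c))]

    def visible_probs(self, h):
        return [sigmoid(self.b[i] + sum(h[j] * self.W[i][j]
                for j in range(len(h)))) for i in range(len(self.b))]

    def cd1_step(self, v0, lr=0.1):
        # Positive phase: hidden activations driven by the data.
        ph0 = self.hidden_probs(v0)
        h0 = [1.0 if random.random() < p else 0.0 for p in ph0]
        # Negative phase: one reconstruction step (mean-field probabilities).
        v1 = self.visible_probs(h0)
        ph1 = self.hidden_probs(v1)
        # Gradient step on weights and biases.
        for i in range(len(v0)):
            for j in range(len(ph0)):
                self.W[i][j] += lr * (v0[i] * ph0[j] - v1[i] * ph1[j])
        for i in range(len(v0)):
            self.b[i] += lr * (v0[i] - v1[i])
        for j in range(len(ph0)):
            self.c[j] += lr * (ph0[j] - ph1[j])

rbm = RBM(n_visible=4, n_hidden=2)
data = [[1, 1, 0, 0], [0, 0, 1, 1]]
for _ in range(200):
    for v in data:
        rbm.cd1_step(v)

# After training, reconstructions should resemble the training patterns.
recon = rbm.visible_probs(rbm.hidden_probs([1, 1, 0, 0]))
print([round(p, 2) for p in recon])
```

The negative phase (sampling `h0` and reconstructing `v1`) is what dominates training cost on large datasets, which is why D-Wave positions quantum annealers as a sampler for that phase.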
