Cache-to-Cache (C2C): Direct Semantic Communication Between Large Language Models via KV-Cache Fusion

Can large language models collaborate without sending a single token of text? A team of researchers from Tsinghua University, Infinigence AI, The Chinese University of Hong Kong, Shanghai AI Laboratory, and Shanghai Jiao Tong University says yes. Cache-to-Cache (C2C) is a new communication paradigm in which large language models exchange information through their KV-Cache rather than…

Read More

Studio Ghibli and other Japanese publishers want OpenAI to stop training on their work

A Japanese trade organization representing publishers like Studio Ghibli wrote a letter to OpenAI last week, calling for the AI giant to stop training its AI models on their copyrighted content without permission. Studio Ghibli, the animation studio behind films like “Spirited Away” and “My Neighbor Totoro,” has been especially impacted by OpenAI’s generative AI…

Read More

Lawmakers say stolen police logins are exposing Flock surveillance cameras to hackers

Lawmakers have called on the Federal Trade Commission to investigate Flock Safety, a company that operates license plate-scanning cameras, for allegedly failing to implement cybersecurity protections, leaving its camera network exposed to hackers and spies. In a letter sent by Sen. Ron Wyden (D-OR) and Rep. Raja Krishnamoorthi (D-IL, 8th), the lawmakers urge FTC chairman…

Read More

Anyscale and NovaSky Team Releases SkyRL tx v0.1.0: Bringing a Tinker-Compatible Reinforcement Learning (RL) Engine to Local GPU Clusters

How can AI teams run Tinker-style reinforcement learning on large language models using their own infrastructure with a single unified engine? The Anyscale and NovaSky (UC Berkeley) teams have released SkyRL tx v0.1.0, which gives developers a way to run a Tinker-compatible training and inference engine directly on their own hardware, while keeping the same…

Read More

a16z pauses its famed TxO Fund for underserved founders, lays off staff

Andreessen Horowitz is pausing its Talent x Opportunity (TxO) fund and program, according to four sources familiar with the matter, including more than one founder in the program. The firm announced TxO in 2020 to support founders who do not have access to traditional venture networks. Many of TxO’s participants were women and minorities who, overall, receive very slim amounts of venture…

Read More

How to Build Supervised AI Models When You Don’t Have Annotated Data

One of the biggest challenges in real-world machine learning is that supervised models require labeled data—yet in many practical scenarios, the data you start with is almost always unlabeled. Manually annotating thousands of samples isn’t just slow; it’s expensive, tedious, and often impractical. This is where active learning becomes a game-changer. Active learning is a…
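To make the idea concrete, here is a minimal sketch of an active-learning loop with uncertainty sampling, independent of the linked article's method: a toy nearest-centroid classifier on synthetic 1-D data starts from two labeled seed points, and each round asks an "oracle" (standing in for a human annotator) to label only the point the model is least certain about. All names and the dataset are illustrative assumptions.

```python
import random

random.seed(0)

# Synthetic 1-D dataset: class 0 clusters near 0.0, class 1 near 10.0.
neg = [(random.gauss(0.0, 1.0), 0) for _ in range(50)]
pos = [(random.gauss(10.0, 1.0), 1) for _ in range(50)]
data = neg + pos

labeled = [neg[0], pos[0]]                    # seed set: one example per class
unlabeled = [x for x, _ in neg[1:] + pos[1:]] # labels hidden from the learner
oracle = dict(data)                           # stands in for a human annotator

def centroids(pairs):
    # "Fit" the toy model: per-class mean of the labeled points so far.
    by_class = {}
    for x, y in pairs:
        by_class.setdefault(y, []).append(x)
    return {c: sum(v) / len(v) for c, v in by_class.items()}

def margin(x, cents):
    # Gap between distances to the two nearest centroids:
    # a small gap means the model is uncertain about x.
    d = sorted(abs(x - c) for c in cents.values())
    return d[1] - d[0]

for _ in range(10):                           # 10 rounds of querying the oracle
    cents = centroids(labeled)
    x = min(unlabeled, key=lambda p: margin(p, cents))
    unlabeled.remove(x)
    labeled.append((x, oracle[x]))            # annotate only the chosen point

cents = centroids(labeled)
predict = lambda x: min(cents, key=lambda c: abs(x - cents[c]))
acc = sum(predict(x) == y for x, y in data) / len(data)
print(f"labeled {len(labeled)} of {len(data)} points, accuracy {acc:.2f}")
```

The point of the loop is the selection rule: instead of labeling all 100 points, the learner spends its small annotation budget on the examples nearest the decision boundary, where a label is most informative.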

Read More