Redis Expands AI Strategy in India, Unveils LangCache and Acquires Decodable

Photo: Sameer Dhamane, Business Leader, BFSI India, Redis, with Rowan Trollope, CEO of Redis
Bengaluru, September 17: Redis, the world’s fastest data platform, announced a significant expansion of its artificial intelligence strategy at its flagship event, Redis Released 2025, unveiling new tools, a strategic acquisition, and India-focused initiatives.
On his maiden visit to India as Redis CEO, Rowan Trollope outlined the company’s vision for AI, highlighting the strategic acquisition of real-time data platform Decodable and the public preview of LangCache, a semantic caching solution designed to slash large language model (LLM) costs by up to 70%.
“As AI enters its next phase, the challenge isn’t proving what language models can do; it’s giving them the context and memory to act with relevance and reliability,” Trollope said. “Our investment in Decodable will enable developers to build and expand pipelines that convert data into actionable context, making Redis the essential memory layer for intelligent agents.”
India’s fast-growing AI ecosystem formed a key focus of Trollope’s address. “India is not only a fast-growing market for Redis, it is helping to shape the future of AI. With one of the world’s largest startup ecosystems and over 17 million developers, the scale and ambition here is unmatched,” he said, adding that Redis is working to make AI-powered systems more affordable, responsive, and reliable for enterprises and startups across the country.
LangCache: Lower costs, faster responses
Redis’ LangCache, now in public preview, enables chatbots and AI agents to reuse responses to semantically similar LLM queries, cutting latency and drastically reducing token usage. According to the company, the service can lower API costs by up to 70%, deliver 15x faster response times, and improve consistency in outputs.
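The announcement does not detail LangCache’s managed API, but the underlying idea of semantic caching can be illustrated with a minimal sketch: before calling the LLM, compare the embedding of an incoming prompt against cached prompts and return the stored answer when similarity clears a threshold. The embed_fn and call_llm callables below are placeholders for an embedding model and an LLM API call, not part of any Redis interface.

```python
import numpy as np

class SemanticCache:
    """Minimal in-process sketch of semantic caching (not the LangCache API).

    Stores (prompt embedding, response) pairs and serves a cached response
    when a new prompt lands close enough in embedding space.
    """

    def __init__(self, embed_fn, similarity_threshold=0.9):
        self.embed_fn = embed_fn          # placeholder: any text -> vector function
        self.threshold = similarity_threshold
        self.entries = []                 # list of (embedding, response) pairs

    @staticmethod
    def _cosine(a, b):
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def lookup(self, prompt):
        query_vec = self.embed_fn(prompt)
        best_score, best_response = 0.0, None
        for vec, response in self.entries:
            score = self._cosine(query_vec, vec)
            if score > best_score:
                best_score, best_response = score, response
        return best_response if best_score >= self.threshold else None

    def store(self, prompt, response):
        self.entries.append((self.embed_fn(prompt), response))


def answer(prompt, cache, call_llm):
    """Serve from the cache when possible; otherwise call the LLM and cache the result."""
    cached = cache.lookup(prompt)
    if cached is not None:
        return cached                     # cache hit: no tokens spent, lower latency
    response = call_llm(prompt)           # placeholder for a real LLM API call
    cache.store(prompt, response)
    return response
```

In a sketch like this, every cache hit avoids a paid LLM call entirely, which is the mechanism behind the cost and latency figures Redis cites; a production service would also handle eviction, vector indexing, and threshold tuning.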
Broader AI integrations
Redis also announced new integrations with AI frameworks such as AutoGen and Cognee, as well as enhancements for LangGraph, helping developers simplify memory management, planning, and reasoning for AI agents.
Additional updates include hybrid search enhancements for unifying text and vector results, support for int8 quantized embeddings for faster, memory-efficient search, and major performance improvements in Redis 8.2.
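The int8 quantization mentioned above trades a small amount of precision for roughly a 4x smaller memory footprint per vector. The snippet below is a generic symmetric-quantization example in Python to show the trade-off, not Redis’ internal scheme.

```python
import numpy as np

def quantize_int8(vec: np.ndarray):
    """Symmetric int8 quantization: map float32 values into [-127, 127].

    Returns the quantized vector and the scale needed to approximately
    reconstruct the original values. Illustrative only.
    """
    scale = float(np.max(np.abs(vec))) / 127.0
    if scale == 0.0:
        scale = 1.0
    q = np.clip(np.round(vec / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize_int8(q: np.ndarray, scale: float) -> np.ndarray:
    return q.astype(np.float32) * scale

# A 768-dimensional float32 embedding occupies 3,072 bytes; its int8 form occupies 768.
embedding = np.random.rand(768).astype(np.float32) - 0.5
q, scale = quantize_int8(embedding)
approx = dequantize_int8(q, scale)
print(embedding.nbytes, q.nbytes)   # 3072 vs 768 bytes
```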
India in Redis’ global roadmap
Redis has steadily expanded its footprint in India, including investments in a 105,000-square-foot R&D facility in Bengaluru and the launch of its India Data + AI Academy. The company said its tools will help India’s enterprises and startups—where cost optimization is a pressing need—scale AI deployments more efficiently.
With its latest strategy, Redis aims to cement its role not just as a data platform but as an infrastructure backbone for the AI era, bridging the gap between LLMs and the real-time, persistent memory needed for enterprise-grade AI applications.