VidaPaper
    Business and Finance

    Redis bets on India’s AI momentum

    Redis unveils LangCache for advancing AI innovation for enterprises and startups, to cut LLM costs up to 70%
By News Writer | September 17, 2025 | 3 Mins Read

Redis, the world’s fastest data platform, today announced a major expansion of its AI strategy at Redis Released 2025. During his maiden visit to India as CEO of Redis, Rowan Trollope outlined the company’s AI strategy and announced new tools and capabilities for the company’s platform, along with the strategic acquisition of Decodable, underscoring India’s growing role in Redis’ global innovation roadmap. He emphasised India’s AI-led innovation and its role as a hub for engineering talent, enterprise adoption, and customer growth.

    While addressing the media, Redis CEO Rowan Trollope announced the acquisition of real-time data platform Decodable, the public preview of Redis’ new LangCache service, and several other improvements to Redis for AI that make it easier for developers to build agents with reliable, persistent memory. Together, these moves accelerate Redis’ evolution from the fastest in-memory data store to an essential infrastructure layer for AI, delivering the context and memory that intelligent agents depend on.

    “As AI enters its next phase, the challenge isn’t proving what language models can do; it’s giving them the context and memory to act with relevance and reliability,” said Rowan Trollope, CEO of Redis. “As technology becomes ever more reliant on LLMs, the strategic investment we made in Decodable’s platform will make it easier for developers to build and expand data pipelines and convert that data into context within Redis, so it’s fast and always available in the right place at the right time.”

“India is not only a fast-growing market for Redis, it is also helping to shape the future of AI. With one of the world’s largest startup ecosystems, and millions of developers building intelligent applications, India represents the kind of scale, ambition, and innovation where Redis thrives. As enterprises and startups here embrace AI at unprecedented speed, our focus is on giving them the context, memory, and real-time infrastructure their systems need to be more capable, responsive, and reliable,” Trollope added.

Redis also announced the public preview of LangCache, a fully managed semantic caching service that cuts latency and token usage for LLM-dependent applications by as much as 70%, along with several updates to its AI infrastructure tools, including hybrid search enhancements and integrations with the agent frameworks AutoGen and Cognee. For India, home to the world’s third-largest startup ecosystem and a fast-growing developer community of over 17 million developers, where cost optimization and scalability are crucial, LangCache helps build more affordable AI-powered experiences for chatbots, agents, and enterprise applications.

    LangCache public preview

LangCache is Redis’ fully managed semantic caching solution that stores LLM responses and serves them again for semantically similar calls from chatbots and agents, saving round-trip latency and drastically cutting token usage.

    The performance and cost improvements are substantial:

    1. Up to 70% reduction in LLM API costs, especially in high-traffic applications
    2. 15x faster response times for cache hits compared to live LLM inference
    3. Improved end-user experience with lower latency and more consistent outputs

    LangCache is in public preview today.
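LangCache’s own API is not shown in the announcement, but the general idea of semantic caching it describes can be sketched in a few lines: embed each prompt, and on a new request return a stored response if a previously answered prompt is similar enough. The toy bag-of-words embedding and the 0.8 threshold below are illustrative assumptions only; a production cache like LangCache would use a real embedding model and a vector store such as Redis.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: bag-of-words token counts. Stand-in for a real
    embedding model, used here only to illustrate the mechanism."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class SemanticCache:
    """Returns a cached LLM response when a new prompt is semantically
    close enough to one already answered, instead of calling the LLM."""
    def __init__(self, threshold=0.8):
        self.threshold = threshold   # minimum similarity for a cache hit
        self.entries = []            # list of (embedding, response)

    def get(self, prompt):
        query = embed(prompt)
        best, best_score = None, 0.0
        for emb, response in self.entries:
            score = cosine(query, emb)
            if score > best_score:
                best, best_score = response, score
        return best if best_score >= self.threshold else None

    def put(self, prompt, response):
        self.entries.append((embed(prompt), response))

cache = SemanticCache(threshold=0.8)
cache.put("what is the capital of France", "Paris")

# A near-duplicate phrasing hits the cache; an unrelated prompt misses,
# and only the miss would go on to a live LLM call.
print(cache.get("what is the capital of France?"))  # cache hit
print(cache.get("explain quantum entanglement"))    # miss -> None
```

Every hit skips a full LLM round trip, which is where the latency and token savings the announcement cites come from; the threshold trades hit rate against the risk of returning a response for a question that only looks similar.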

    New agent integrations and agent memory

It’s now easier to use Redis with existing AI frameworks and tools. Redis’ ecosystem integrations let developers store their data the way they want, without writing custom code. New integrations with AutoGen and Cognee, plus enhancements to LangGraph, expand how developers can use Redis’ scalable, persistent memory layer for agents and chatbots.

    Build with:

    1. AutoGen as your agent framework, with Redis as the fast-data memory layer, building from existing templates
    2. Cognee to simplify memory management with built-in summarization, planning, and reasoning using Redis as your backbone
    3. LangGraph with new enhancements to improve your persistent memory and make your AI agents more reliable

    © 2025 ManipalBlog Media. Designed by ManipalBlog.