Power up your LLMs: Gemini CLI and Model Context Protocol (MCP)

Published August 29, 2025, 16:00
Level up your LLM game with Model Context Protocol (MCP)! Learn how MCP allows your Large Language Models to interact with traditional computer programs, expanding their functionality. This tutorial shows you how to build an MCP server, connect it to Gemini CLI, and use it to retrieve additional information, take actions on your behalf, or even generate videos.
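Under the hood, an MCP server is a JSON-RPC 2.0 endpoint that advertises tools a client such as Gemini CLI can call. As a rough stdlib-only sketch of that handshake (the `roll_dice` tool and its schema are hypothetical; a real server would use the official MCP SDK and speak the full protocol over stdio):

```python
import json

# Hypothetical tool registry: one tool, described with a JSON Schema input.
TOOLS = [
    {
        "name": "roll_dice",
        "description": "Roll an n-sided die.",
        "inputSchema": {
            "type": "object",
            "properties": {"sides": {"type": "integer"}},
            "required": ["sides"],
        },
    }
]

def handle_message(raw: str) -> str:
    """Dispatch one JSON-RPC 2.0 request and return the serialized response."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        # The client asks which tools this server offers.
        result = {"tools": TOOLS}
    else:
        result = {}
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})

# A client like Gemini CLI would send this request over the server's stdin:
request = json.dumps({"jsonrpc": "2.0", "id": 1, "method": "tools/list"})
response = json.loads(handle_message(request))
print(response["result"]["tools"][0]["name"])  # roll_dice
```

The model never calls tools directly; the client lists them, the model picks one, and the client relays the `tools/call` request to the server.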

GitHub repositories for video creation:
Genmedia MCP server → goo.gle/4c4bWMh
Genmedia MCP server integration with Gemini CLI → goo.gle/3NK0Pig

Chapters:
0:00 - What is Model Context Protocol (MCP)?
0:55 - Building an MCP Server
2:18 - Connecting Gemini CLI to your MCP Server
3:21 - Connecting Gemini CLI to Linear using OAuth
6:55 - Veo 3 MCP Server
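The connection steps in the chapters above come down to registering servers in Gemini CLI's settings file. A sketch of the `mcpServers` section of `~/.gemini/settings.json`, with hypothetical server names, paths, and URL (the `command` entry launches a local stdio server; for a remote server registered by URL, Gemini CLI handles the OAuth sign-in flow):

```json
{
  "mcpServers": {
    "my-local-server": {
      "command": "python",
      "args": ["server.py"]
    },
    "linear": {
      "httpUrl": "https://example.com/mcp"
    }
  }
}
```

After restarting Gemini CLI, `/mcp` lists the configured servers and the tools they expose.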


Subscribe to Google for Developers → goo.gle/developers

Speaker: Luke Schlangen
Products Mentioned: Google AI, Gemini, Generative AI