Power up your LLMs: Gemini CLI and Model Context Protocol (MCP)

Published August 29, 2025, 16:00
Level up your LLM game with Model Context Protocol (MCP)! Learn how MCP allows your Large Language Models to interact with traditional computer programs, expanding their functionality. This tutorial shows you how to build an MCP server, connect it to Gemini CLI, and use it to retrieve additional information, take actions on your behalf, or even generate videos.
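At its core, MCP lets a model call tools over JSON-RPC 2.0: the client asks the server which tools exist (`tools/list`) and then invokes one (`tools/call`). The sketch below, in plain Python with no SDK, shows that request/response shape; the `get_weather` tool, its schema, and the canned reply are illustrative placeholders, not from the video's repository.

```python
import json

# Hypothetical in-memory tool registry; the tool name and schema are
# illustrative, not taken from the video's code.
TOOLS = {
    "get_weather": {
        "description": "Return a canned weather report for a city.",
        "inputSchema": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    }
}

def handle_request(raw: str) -> str:
    """Answer one MCP-style JSON-RPC 2.0 request (tools/list or tools/call)."""
    req = json.loads(raw)
    if req["method"] == "tools/list":
        result = {"tools": [{"name": n, **meta} for n, meta in TOOLS.items()]}
    elif req["method"] == "tools/call":
        args = req["params"]["arguments"]
        # A real server would dispatch on req["params"]["name"]; this sketch
        # has only one tool, and its answer is hard-coded.
        text = f"It is sunny in {args['city']}."
        result = {"content": [{"type": "text", "text": text}]}
    else:
        return json.dumps({"jsonrpc": "2.0", "id": req["id"],
                           "error": {"code": -32601, "message": "method not found"}})
    return json.dumps({"jsonrpc": "2.0", "id": req["id"], "result": result})
```

A real MCP server would use the official SDK and speak over stdio or HTTP, but every exchange reduces to messages like these.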

Chapters:
0:00 - What is Model Context Protocol (MCP)?
0:55 - Building an MCP Server
2:18 - Connecting Gemini CLI to your MCP Server
3:21 - Connecting Gemini CLI to Linear using OAuth
6:55 - Veo 3 MCP Server
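Connecting Gemini CLI to a local MCP server (the 2:18 chapter) comes down to registering it under `mcpServers` in the CLI's `settings.json`. A minimal sketch, assuming a Python server script; the server name, command, and path are placeholders:

```json
{
  "mcpServers": {
    "my-mcp-server": {
      "command": "python",
      "args": ["server.py"]
    }
  }
}
```

Once registered, Gemini CLI launches the server as a subprocess, lists its tools, and can call them during a session.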

Subscribe to Google for Developers → goo.gle/developers

GitHub repository for video creation: github.com/GoogleCloudPlatform...

Speaker: Luke Schlangen
Products Mentioned: Google AI, Gemini, Generative AI