Writing Our First Classifier - Machine Learning Recipes #5

Published June 8, 2016
Learn how to write your first classifier from scratch, a key milestone for newcomers to machine learning. In 8 steps, we'll walk through writing the classifier: we start with the code from episode #4, comment out the classifier we imported, and code up a simple replacement using a scrappy version of k-Nearest Neighbors. Along the way, we'll look at the pros and cons of this approach so you get a feel for how a classifier works under the hood. A sketch of the kind of classifier we build is shown below.
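For reference, here is a minimal sketch of a "scrappy" nearest-neighbor classifier along the lines of what the episode builds. The class and helper names (ScrappyKNN, euc) are illustrative, not necessarily the exact names used on screen; the only assumptions are a fit/predict interface compatible with the episode #4 pipeline and Euclidean distance via SciPy.

```python
from scipy.spatial import distance


def euc(a, b):
    # Straight-line (Euclidean) distance between two feature vectors.
    return distance.euclidean(a, b)


class ScrappyKNN:
    """Bare-bones 1-nearest-neighbor classifier with a fit/predict
    interface, so it can drop into the pipeline from episode #4."""

    def fit(self, X_train, y_train):
        # "Training" just memorizes the training data.
        self.X_train = X_train
        self.y_train = y_train

    def predict(self, X_test):
        # Label each test row with the label of its closest training example.
        return [self.closest(row) for row in X_test]

    def closest(self, row):
        # Linear scan over the training set to find the nearest neighbor.
        best_dist = euc(row, self.X_train[0])
        best_index = 0
        for i in range(1, len(self.X_train)):
            dist = euc(row, self.X_train[i])
            if dist < best_dist:
                best_dist = dist
                best_index = i
        return self.y_train[best_index]
```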

Chapters:
0:00 - Intro
0:00 - Outline
0:56 - Step 1: Comment out imports
1:22 - Step 2: Implement a class
1:38 - Step 3: Understand the interface
2:08 - Step 4: Get pipeline working
3:14 - Step 5: Intro to k-NN
4:38 - Step 6: Measure distance
6:13 - Step 7: Implement nearest neighbor algorithm
7:22 - Step 8: Run pipeline
7:47 - Pros and cons of this algorithm
8:24 - Wrap up
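
To see where Steps 4 and 8 ("get pipeline working" / "run pipeline") fit in, here is a hedged usage sketch assuming the Iris dataset and scikit-learn pipeline from episode #4 (the episode predates the model_selection module, so older code may import train_test_split from sklearn.cross_validation instead):

```python
from sklearn import datasets
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

# Load Iris and hold out half the data for testing, as in episode #4.
iris = datasets.load_iris()
X_train, X_test, y_train, y_test = train_test_split(
    iris.data, iris.target, test_size=0.5)

clf = ScrappyKNN()          # drop-in replacement for the imported classifier
clf.fit(X_train, y_train)
predictions = clf.predict(X_test)

# Fraction of test labels predicted correctly.
print(accuracy_score(y_test, predictions))
```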

You can follow twitter.com/random_forests for updates on new episodes, and have fun!

Here's our playlist → goo.gl/KewA03

Subscribe to Google for Developers → goo.gle/developers

#ML #machinelearning