Amazon Web Services
Published April 12, 2018, 16:00
For more information on Greengrass ML Inference, please visit - amzn.to/2HeP7b9.
AWS Greengrass ML Inference is now generally available. Since we launched our preview at re:Invent, we have added feature enhancements to improve your experience while using AWS Greengrass ML Inference. We have made it easier for you to deploy and run machine learning models on your IoT devices. In addition to Apache MXNet, AWS Greengrass ML Inference now includes a pre-built TensorFlow package so you don’t have to build or configure the ML framework for your device from scratch. These ML packages support Intel Atom, NVIDIA Jetson TX2, and Raspberry Pi devices.
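To give a sense of what this looks like in practice, here is a minimal sketch of a Greengrass Lambda function that runs local inference with the pre-built TensorFlow package. The model path, tensor names, and MQTT topic are illustrative assumptions, not values from this announcement; in a real deployment the model is attached to the Greengrass group as a machine learning resource and mounted at a path you configure.

```python
# Sketch: local TensorFlow (1.x) inference inside a Greengrass Lambda.
# MODEL_PATH, tensor names, and the MQTT topic are assumptions for illustration.
import json
import numpy as np
import tensorflow as tf          # provided by the pre-built Greengrass TF package
import greengrasssdk

client = greengrasssdk.client('iot-data')

MODEL_PATH = '/greengrass-machine-learning/tf/frozen_model.pb'  # assumed ML resource mount path

# Load the frozen graph once at container start so every invocation reuses it.
graph = tf.Graph()
with graph.as_default():
    graph_def = tf.GraphDef()
    with tf.gfile.GFile(MODEL_PATH, 'rb') as f:
        graph_def.ParseFromString(f.read())
    tf.import_graph_def(graph_def, name='')
session = tf.Session(graph=graph)

def function_handler(event, context):
    # Expect a flat list of input features in the triggering message.
    features = np.asarray(event['features'], dtype=np.float32).reshape(1, -1)
    # 'input:0' and 'output:0' are placeholder tensor names for this sketch.
    result = session.run('output:0', feed_dict={'input:0': features})
    client.publish(topic='ml/inference/result',
                   payload=json.dumps({'prediction': result.tolist()}))
```

Because the framework package is already on the device, the Lambda only needs the model artifact and the handler above; the same pattern applies to the Apache MXNet package with its own loading API.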