
AWS DeepLens, a ready-to-use camera specifically designed for deep learning, is now available in Amazon's European stores. The camera costs 249 euros and is meant to turn the many theoretical examples of deep learning into practical experience.

With DeepLens, Amazon is bringing a deep learning camera to the European market. AWS DeepLens is designed to let developers get models running via TensorFlow and Caffe with less than ten minutes of start-up time.

The main motivation for AWS is to put its machine learning tools into practice with this camera. On the hardware side, the 4-megapixel camera can record 1080p video, and a 2D microphone array is available. The device runs on an Intel Atom processor with 8GB of RAM and 16GB of internal storage (expandable via microSD).

The DeepLens camera has been available in the US since last year and, according to The Register, is getting a small revision for the European market. Among other things, it gets USB 3.0 instead of USB 2.0. The camera now also supports the Intel RealSense depth sensor, as well as the Intel Movidius Neural Compute Stick 2. Surprisingly, the camera still doesn't have a microphone on board, although an audio input is available if you want to connect an external one.

Teachers, students and developers

The camera runs Ubuntu 16.04, AWS Greengrass Core and optimized versions of the MXNet and Intel clDNN libraries. The device has 100 gigaflops of computing power, enough to process HD images in real time. Amazon has published a number of use cases on its website to give an idea of the various possible applications of DeepLens.

With DeepLens, AWS targets teachers, students and developers. AWS DeepLens integrates with SageMaker and AWS Lambda, and Amazon has added computer vision models built with Gluon, model imports from SageMaker, and tools to optimize models for the device.
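In practice, inference on the device runs as a Lambda function deployed through AWS Greengrass. As a rough illustration, the sketch below shows what such a function can look like using the awscam module that ships on the device; the model path, model type and input size are placeholder assumptions that depend on the model you actually deploy.

```python
# Minimal sketch of a DeepLens inference step, assuming the on-device
# awscam module. MODEL_PATH, MODEL_TYPE and INPUT_SIZE are placeholders
# that depend on the model deployed to the device.
import awscam
import cv2

MODEL_PATH = '/opt/awscam/artifacts/my_model.xml'  # hypothetical artifact path
MODEL_TYPE = 'classification'
INPUT_SIZE = (224, 224)                            # assumed input resolution

def infer_once():
    # Load the optimized model onto the device's GPU.
    model = awscam.Model(MODEL_PATH, {'GPU': 1})

    # Grab the most recent frame from the camera stream.
    ret, frame = awscam.getLastFrame()
    if not ret:
        raise RuntimeError('Failed to read a frame from the camera')

    # Resize the frame to the model's expected input and run inference.
    resized = cv2.resize(frame, INPUT_SIZE)
    parsed = model.parseResult(MODEL_TYPE, model.doInference(resized))

    # Return the highest-probability prediction.
    return max(parsed[MODEL_TYPE], key=lambda p: p['prob'])
```

Typically, a function like this runs in a loop inside the Greengrass Lambda and publishes its predictions to an AWS IoT topic, from where they can be picked up by other AWS services.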
