
The natural language processing (NLP) SmartVision application continuously detects keywords spoken by the user and showcases keyword-based dynamic switching between multiple vision tasks as well as keyword-based changes to the display properties.
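As a rough illustration of that control flow (a minimal sketch, not the released app sources; the names AppState, VisionTask, and on_keyword, the task list, and the keyword-to-action mapping are all assumptions for illustration):

```cpp
// Illustrative sketch of keyword-based dispatch between vision tasks and
// display properties. Names and mappings are hypothetical, not from the
// released NLP-SmartVision sources.
#include <iostream>
#include <string>

enum class VisionTask { FaceDetect, ObjectDetect, PlateDetect };

struct AppState {
    VisionTask task = VisionTask::FaceDetect;
    bool boxes_on = true;   // example display property: draw bounding boxes
};

// Called each time the keyword-spotting model reports a detected keyword.
void on_keyword(AppState& state, const std::string& kw) {
    if (kw == "left")        state.task = VisionTask::FaceDetect;   // switch vision task
    else if (kw == "right")  state.task = VisionTask::ObjectDetect;
    else if (kw == "up")     state.task = VisionTask::PlateDetect;
    else if (kw == "on")     state.boxes_on = true;                 // change display property
    else if (kw == "off")    state.boxes_on = false;
    else std::cout << "unmapped keyword: " << kw << "\n";
}

int main() {
    AppState state;
    for (const std::string kw : {"right", "off", "up"}) {
        on_keyword(state, kw);
    }
    std::cout << "task=" << static_cast<int>(state.task)
              << " boxes_on=" << state.boxes_on << "\n";
}
```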
No, the application does not require any experience in FPGA design.
This application is free of charge from AMD.
The NLP-SmartVision application has been primarily designed and tested with onsemi's AR1335 image sensor. To add a different MIPI sensor, you will have to update the design and the application.
The application should work with most USB microphones.
The NLP-SmartVision application works only with the 10 pre-defined keywords. You can train the model with custom keywords and modify the application using the released app sources.
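For illustration only, a retrained keyword-spotting model typically just needs its output indices mapped to a matching label list; the 10-entry label list and the argmax helper below are assumptions, not the released app sources, and you would substitute the labels your own model was trained on:

```cpp
// Illustrative sketch: map the keyword-spotting model's output index to a
// keyword string. The label list is an example; replace it with your own
// custom keywords (same size as the model's output) after retraining.
#include <algorithm>
#include <array>
#include <cstddef>
#include <iostream>

constexpr std::array<const char*, 10> kLabels = {
    "yes", "no", "up", "down", "left", "right", "on", "off", "stop", "go"};

const char* keyword_from_scores(const std::array<float, 10>& scores) {
    auto best = std::max_element(scores.begin(), scores.end());
    return kLabels[static_cast<std::size_t>(best - scores.begin())];
}

int main() {
    std::array<float, 10> scores{};   // pretend these came from the KWS model
    scores[5] = 0.9f;                 // highest score at index 5
    std::cout << keyword_from_scores(scores) << "\n";   // prints "right"
}
```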
Kria™ adaptive System-on-Module (SOM) devices from AMD play an important role in electric drive control. They can optimize performance, help a motor run more efficiently, reduce power consumption, mitigate noise, cut vibration, and detect potential failures before they happen. Download our new motor control eBook to learn more!
Learn all about adaptive SOMs, including examples of why and how they can be deployed in next-generation edge applications, and how smart vision providers benefit from the performance, flexibility, and rapid development that can only be achieved by an adaptive SOM.
Demand for robotics is accelerating rapidly. Building a robot that is designed to be safe and secure and can operate alongside humans is difficult enough, and getting all of these technologies working together can be even more challenging. Complicating matters is the addition of machine learning and artificial intelligence, which makes it even harder to keep up with computational demands.
Roboticists are turning toward adaptive computing platforms, which offer lower latency and deterministic, multi-axis control with built-in safety and security features on an integrated, adaptable platform that is expandable for the future. Read the eBook to learn more.