Ctrl-Labs Gets USD28 Million From Google’s Venture Capital Arm GV And Amazon’s Alexa Fund For Neural Interfaces

Artificial Intelligence News

New York-based startup Ctrl-labs, which is building a device that translates electrical muscle impulses into digital signals, has raised a total of USD28 million in a funding round for its neural-interface work. The round was led by Google’s venture capital arm, GV, with participation from Amazon’s Alexa Fund, Spark Capital, Lux Capital, Matrix Capital, Breyer Capital, and Fuel Capital. With this financing round, the company’s total funding reached USD67 million.

Commenting on the raise, Ctrl-labs CEO Thomas Reardon stated that the capital will support the company’s recently opened research and development lab in the San Francisco area, as well as its commercial partnerships. The fresh capital will also be used to develop and distribute Ctrl-labs’ developer kit, named Ctrl-kit, he added. The company introduced Ctrl-kit at Slush in Helsinki, Finland in December. The developer kit is currently in preview for select partners and is expected to begin shipping by March this year. As Reardon noted, the company developed Ctrl-kit to give the industry’s most ambitious minds the tools they need to re-imagine the relationship between humans and machines.

The final version of Ctrl-kit will be a single piece of hardware, but it won’t be a completely self-contained affair. The developer kit has to be tethered to a PC for some processing, though it is designed to reach the point where the computational overhead is low enough to run on wearable system-on-chips. Ctrl-kit uses differential electromyography (EMG) to translate mental intent into action, specifically by measuring changes in electrical potential caused by impulses traveling from the brain to hand muscles. It works independently of muscle movement; producing a brain-activity pattern that Ctrl-labs’ system can detect requires no more than the firing of a neuron down an axon.

EMG devices draw on the cleaner, clearer signals from motor neurons and are consequently limited only by the precision of the software’s machine learning model and the snugness of the contacts against the skin. Ctrl-labs expects its early adopters to build with Ctrl-kit in areas including video games, particularly virtual reality games.
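To make the EMG-to-intent pipeline concrete, here is a minimal, purely illustrative sketch of the general idea: window a multi-channel EMG signal, extract a simple amplitude feature per channel, and map the feature vector to a gesture label. Everything here is a hypothetical stand-in; Ctrl-labs has not published its model, and the channel counts, feature choice (RMS), nearest-centroid classifier, and gesture names below are assumptions for illustration only.

```python
import math

def rms_features(window):
    """Root-mean-square amplitude per EMG channel.

    `window` is a list of channels, each a list of voltage samples.
    RMS is a common EMG feature: stronger motor-neuron firing
    produces higher signal energy on the corresponding channel.
    """
    return [math.sqrt(sum(s * s for s in ch) / len(ch)) for ch in window]

def classify(features, centroids):
    """Nearest-centroid decoder, standing in for a real ML model."""
    def sq_dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(centroids, key=lambda label: sq_dist(features, centroids[label]))

# Toy "calibration": per-gesture centroids of RMS features (made up).
centroids = {
    "rest":  [0.05, 0.05],   # low activity on both channels
    "pinch": [0.60, 0.10],   # channel 0 strongly active
}

# A window where channel 0 fires strongly and channel 1 is quiet.
window = [[0.5, -0.7, 0.6, -0.5], [0.1, -0.1, 0.05, -0.05]]
print(classify(rms_features(window), centroids))  # → pinch
```

A production decoder would replace the centroid lookup with a trained model and run continuously over streaming windows, which is the kind of per-sample overhead the article says currently requires a tethered PC.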