Artificial intelligence drives innovative breakthroughs, and car-cockpit interaction modes usher in a period of change

From the “Internet of Things” that dominated discussion in 2017 to the “artificial intelligence” boom at the start of 2018, the ICT industry appears broadly optimistic about the development of artificial intelligence in 2018. On the capital side, statistics from Whale Quasi, a big-data platform for venture capital, show that in China's capital market in 2016 and 2017, investment cases in the artificial intelligence (hereinafter AI) segment were concentrated most heavily in computer vision, deep learning, autonomous driving, and natural language processing, indicating that both visual and language-based AI interaction are valued by capital.

In a sense, artificial intelligence can be described as intelligent interaction between machines and people (as in service robots) and between machines and the environment (as in autonomous driving). The change in interaction mode is therefore one indicator of how far artificial intelligence has developed. Among scenarios where machine-to-person interaction places comparatively low computing demands, the evolution of human-vehicle interaction in the car cockpit is especially clear: from the push-button interaction of traditional mechanical instruments, to the touch-screen interaction of today's popular virtual (digital-screen) instruments, and onward to voice interaction, gesture recognition, and facial recognition. According to statistics, the global automotive instrument market was worth approximately US$7.7 billion in 2016, up 9% from 2015, and is expected to reach US$9.5 billion by 2020, leaving further room for growth. On the time axis, each new interaction mode has moved from emergence to market maturity faster than the last: the transition from mechanical to virtual instruments took decades, whereas voice, gesture, and facial interaction have taken shape in just a few years, even while virtual instruments are still heating up. In other words, touch interaction on virtual instruments is still in its market growth phase, yet the next several interaction modes have already emerged, to the industry's amazement.

Fujitsu's (FUJITSU, 2018 Munich Shanghai Electronics Show, Hall E4, Booth 4318) 3D full virtual instrument solution has already been adopted in mass production. The solution, built with the Fujitsu Triton Cluster SDK, uses the Triton-C as its main chip: a graphics processor dedicated to full virtual instrumentation from Socionext, a product line for which Fujitsu acts as agent, equipped with a powerful 2D graphics engine. It supports multi-layer display with up to 8 layers, 6 channels of video input, and 3 independent outputs, and includes graphics safety features designed for instrument clusters. Furthermore, touch-interactive virtual instruments are becoming the hub of cockpit functions and a bridge to the next wave of interaction modes. For example, Tesla, long labelled a "disruptor," has eliminated the instrument panel and center-console buttons across the Model 3, leaving only a 15-inch touchscreen that ties together ADAS and connectivity functions. In short, the digitization of human-vehicle interaction has reached an unprecedented level, well beyond what traditional human-vehicle interaction could achieve.
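To make the multi-layer compositing idea concrete, below is a minimal Python sketch of how an instrument-cluster compositor might budget and order its display layers. It is purely illustrative and does not use the actual Fujitsu/Socionext SDK; the class names and the rule that safety-critical layers render on top are assumptions, with only the 8-layer limit taken from the figures above.

```python
# Hypothetical layer model for a virtual instrument cluster (not the Fujitsu SDK).
from dataclasses import dataclass, field
from typing import List

@dataclass
class Layer:
    name: str
    z_order: int           # higher values are drawn later, i.e. closer to the viewer
    safety_critical: bool  # telltales/warnings that must never be occluded

@dataclass
class Display:
    name: str
    layers: List[Layer] = field(default_factory=list)

    def add_layer(self, layer: Layer, max_layers: int = 8) -> None:
        # the hardware described above composites up to 8 layers per output
        if len(self.layers) >= max_layers:
            raise ValueError(f"{self.name}: layer budget ({max_layers}) exceeded")
        self.layers.append(layer)

    def render_order(self) -> List[str]:
        # draw non-critical layers by z-order, then force safety-critical ones on top
        ordered = sorted(self.layers, key=lambda l: (l.safety_critical, l.z_order))
        return [l.name for l in ordered]

cluster = Display("instrument_cluster")
cluster.add_layer(Layer("background", 0, False))
cluster.add_layer(Layer("dials", 1, False))
cluster.add_layer(Layer("rear_camera", 2, False))
cluster.add_layer(Layer("telltales", 3, True))
print(cluster.render_order())  # ['background', 'dials', 'rear_camera', 'telltales']
```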

In terms of voice interaction, statistics from the venture-capital big-data platform Whale Quasi show that in 2016 the number of "natural language processing" investment cases in the Chinese market exceeded 100, second only to "computer vision." This seems to have foreshadowed the blowout of speech recognition applications in 2017, and the industry chain of software, algorithms, sensors, chips, and products around voice interaction has confirmed it. Familiar intelligent voice assistants such as Alexa, Google Assistant, and Cortana need no introduction, while sensor manufacturers such as Infineon, Bosch (BOSCH, 2018 Munich Shanghai Electronics Show, Hall E4, Booth 4400), and ams (2018 Munich Shanghai Electronics Show, Hall E4, Booth 4512) have launched high-performance MEMS microphones aimed at far-field voice applications. Mobile processor giant Qualcomm, having already won a Jaguar Land Rover in-vehicle platform order for its Snapdragon 820A chip, has also launched two Qualcomm Home Hub platforms supporting Google Android Things, based on the Qualcomm SDA624 and SDA212 systems-on-chip (SoC) respectively. Qualcomm's deployment in voice recognition and interaction can thus be described as comprehensive: the Snapdragon series platforms support almost all mainstream voice services, including Alibaba's AI voice service, Amazon Alexa, Baidu's DuerOS platform, Google Assistant, and Microsoft's Cortana virtual assistant. MediaTek has likewise pushed hard in the voice-chip battle.
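As a rough illustration of the front end of such a far-field voice pipeline, the Python sketch below frames a microphone signal and applies a simple energy-based voice-activity gate before a wake-word stage. Real products add multi-microphone beamforming and trained keyword-spotting models; all function names, frame sizes, and thresholds here are illustrative assumptions, not any vendor's API.

```python
# Energy-based voice-activity gating as a stand-in for a far-field voice front end.
import numpy as np

def frame_signal(x: np.ndarray, frame_len: int = 400, hop: int = 160) -> np.ndarray:
    # split a 16 kHz signal into 25 ms frames with a 10 ms hop
    n_frames = 1 + max(0, (len(x) - frame_len) // hop)
    return np.stack([x[i * hop : i * hop + frame_len] for i in range(n_frames)])

def voice_activity(frames: np.ndarray, threshold_db: float = -35.0) -> np.ndarray:
    # flag frames whose energy exceeds a fixed threshold (very crude VAD)
    energy_db = 10 * np.log10(np.mean(frames ** 2, axis=1) + 1e-12)
    return energy_db > threshold_db

# synthetic input: 1 s of near-silence followed by 1 s of a louder 440 Hz tone
sr = 16000
t = np.arange(sr) / sr
signal = np.concatenate([0.001 * np.random.randn(sr), 0.3 * np.sin(2 * np.pi * 440 * t)])
active = voice_activity(frame_signal(signal))
print(f"{active.sum()} of {len(active)} frames would be passed to the wake-word stage")
```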

Compared with the rapid expansion of the speech recognition market, gesture recognition has been slower to reach applications, but major players are still innovating. Bosch Sensortec, a subsidiary of Bosch (BOSCH, 2018 Munich Shanghai Electronics Show, Hall E4, Booth 4400), introduced the new BML050 interactive laser-projection micro-scanner during the 2017 Munich Shanghai Electronics Show; combined with time-of-flight (ToF) technology, it enables advanced sensing applications such as distance measurement, 3D scanning, and in-air gesture control. With advanced speckle suppression and precise control of the scanning mirrors and laser diodes, the Bosch Sensortec solution delivers excellent projection quality, and its native laser color space significantly exceeds industry standards such as Adobe RGB, setting a new benchmark for related applications. Sony (SONY, sponsor of the 2018 Automotive Technology Day), the world's largest manufacturer of CMOS image sensors, completed its acquisition of the Belgian company SoftKinetic SA and then developed an automotive-grade 3D gesture-recognition image sensor in cooperation with Melexis, further extending ToF sensor technology into automotive use. Sony's gesture-recognition technology, called DepthSense CARlib, can recognize the driver's gestures inside the car and map them to operations on the screen, so the driver does not need to physically touch the on-board display, improving driving safety.
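The Python sketch below illustrates the basic idea behind ToF-based in-air gesture control: isolate the nearest blob in each depth frame, track its horizontal motion, and classify a left or right swipe. It is not the Bosch or Sony API; the distance cutoff, pixel counts, and thresholds are all illustrative assumptions.

```python
# Toy swipe classifier over a sequence of ToF depth frames (values in millimetres).
from typing import List, Optional
import numpy as np

def hand_centroid_x(depth: np.ndarray, near_mm: float = 600.0) -> Optional[float]:
    mask = (depth > 0) & (depth < near_mm)    # keep pixels closer than ~60 cm
    if mask.sum() < 50:                       # too few near pixels: no hand in view
        return None
    return float(np.nonzero(mask)[1].mean())  # mean column index of the hand blob

def classify_swipe(frames: List[np.ndarray], min_travel_px: float = 40.0) -> str:
    xs = [x for x in (hand_centroid_x(f) for f in frames) if x is not None]
    if len(xs) < 2:
        return "none"
    travel = xs[-1] - xs[0]
    if travel > min_travel_px:
        return "swipe_right"
    if travel < -min_travel_px:
        return "swipe_left"
    return "none"

# synthetic test: a near "hand" patch moving left to right across 240x320 frames
frames = []
for cx in range(40, 280, 40):
    f = np.full((240, 320), 1500.0)           # background at 1.5 m
    f[100:140, cx - 20 : cx + 20] = 400.0     # hand patch at 40 cm
    frames.append(f)
print(classify_swipe(frames))                 # expected: swipe_right
```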

Speaking of facial recognition, the iPhone X is impossible to avoid. The introduction of Face ID on this phone triggered consumers' enthusiasm for 3D facial recognition, and the same trend brought the first near-infrared camera image sensor built on an Imager-SOI substrate from STMicroelectronics (2018 Munich Shanghai Electronics Show, Hall E4, Booth 4104), as well as color/ambient-light sensors from ams (2018 Munich Shanghai Electronics Show, Hall E4, Booth 4512). Thanks to its acquisition of Heptagon and its entry into the iPhone X supply chain, ams grew its third-quarter 2017 revenue by 79% over the same period of 2016; it has also reached a partnership with an optics manufacturer to jointly develop imaging solutions for 3D sensing applications, further expanding the emerging opportunities for 3D sensor imaging systems beyond mobile devices and smartphones and into the automotive world.

It is true that, as the industry generally believes, facial recognition will first see a significant jump in shipments in the smartphone sector in 2018. However, it has comparable market potential in automotive cockpit applications, because face recognition not only provides identity authentication, as fingerprint recognition does, but can also add a degree of safety protection during driving. For example, facial recognition can detect whether the driver is fatigued or unable to concentrate. The FZI Research Center for Information Technology, a research institute affiliated with the Karlsruhe Institute of Technology in Germany, is reportedly using facial recognition to build a driver condition monitoring system that, combined with embedded and sensor technology, monitors the driver's vital-sign data while driving.
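One widely used fatigue cue of this kind is the eye aspect ratio (EAR) computed from facial landmarks: it drops sharply when the eyes close, so sustained low values suggest drowsiness. The Python sketch below assumes a separate face-landmark detector (not shown) has already produced six points per eye; it is a generic illustration, not the FZI system, and the threshold and timing values are assumptions.

```python
# Drowsiness cue from eye aspect ratio (EAR) over successive frames.
from typing import List
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    # eye: six (x, y) landmarks around one eye, in the common 68-point ordering
    v1 = np.linalg.norm(eye[1] - eye[5])   # vertical distances
    v2 = np.linalg.norm(eye[2] - eye[4])
    h = np.linalg.norm(eye[0] - eye[3])    # horizontal distance
    return (v1 + v2) / (2.0 * h)

def fatigue_alert(ear_per_frame: List[float],
                  ear_threshold: float = 0.21,
                  min_closed_frames: int = 48) -> bool:
    # alert when the eyes stay closed for roughly 1.5 s at 30 fps
    closed = 0
    for ear in ear_per_frame:
        closed = closed + 1 if ear < ear_threshold else 0
        if closed >= min_closed_frames:
            return True
    return False

# EAR for a hand-made "open eye" landmark set
open_eye = np.array([[0, 2], [2, 3], [4, 3], [6, 2], [4, 1], [2, 1]], dtype=float)
print(round(eye_aspect_ratio(open_eye), 2))   # ~0.33 for this open-eye shape

# synthetic check: open eyes (EAR ~0.30) followed by a two-second closure (~0.15)
ears = [0.30] * 60 + [0.15] * 60
print(fatigue_alert(ears))   # True: sustained closure detected
```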

It is worth mentioning that, according to CB Insights, Google's latest research patent uses optical sensors to examine the face and skin and can detect abnormalities in blood-flow velocity and other vital-sign data, from which it infers the user's health status, such as the risk of cardiovascular disease. This may become another breakthrough for facial recognition.

As the above shows, the interaction mode of the car cockpit is transitioning from digitization to intelligence. In edge-computing scenarios where the AI computing demand is far lower than that of autonomous driving, breakthroughs in sensors, processor chips, and related components are expected to enable more advanced AI applications. It is not hard to imagine that human-vehicle interaction, and human-computer interaction more broadly, will eventually combine multiple interaction modes organically. At that point, high-performance, high-reliability sensors and processor chips, as the core of edge AI, will usher in a new round of shipment opportunities.

Push-in Terminal Block

With push-in connection technology, you can connect conductors easily and directly, without tools, with insertion forces reduced by 50%. It is suitable for a wide range of applications. Rigid conductors or conductors with ferrules from 0.25 mm² can be inserted straight into the conductor shaft; the contact spring opens automatically and provides the required contact force against the current bar.

When connecting smaller conductors from 0.14 mm², use a standard screwdriver to press the orange button and actuate the contact spring.

Push-in connections have been tested and certified to various standards, for example vibration resistance in accordance with railway standard DIN EN 50155, and shock and corrosion resistance in accordance with current shipbuilding registers.


Wonke Electric Co., Ltd., https://www.wkdq-electric.com
