
Driving contactless digital transformation
AI semantic edge computing chips

Giving smart appliances, smart stores, consumer electronics, and mobile devices the ability to interact through natural-language voice control. We welcome inquiries about cooperation!

AI semantic voice control chips will replace touch interfaces and drive enterprise digital transformation

Human-computer interaction interface

The human-machine interface can be controlled through natural-language voice, the LineBot menu interface, or typed text dialogue. Other interactive interfaces can also be supported.

AI semantic engine deployed on edge devices

The engine can be deployed on edge or terminal devices, replacing touch with semantic voice control to avoid transmitting bacteria and viruses, and enabling devices to interact with people in natural language.

Supports chips from various brands

Ubestream can deploy its semantic engine on microcontrollers with ultra-low power consumption at the edge, giving IoT devices a lightweight power footprint on resource-constrained platforms.

AI semantic (NLP/NLU) STT (ASR)/TTS engine on edge devices

A research report from the National Development Council points out that over the next three years, global AI will move from the cloud to the edge (AI from Cloud to Edge). AI training will remain the domain of cloud AI, but AI inference will shift from the cloud to edge devices; in other words, AI-capable edge and terminal devices will become mainstream. In the future, edge and terminal devices will be able to respond with AI without connecting to the cloud or relying on GPU computing power. The key is AI algorithm lightweighting technology, which allows AI edge computing to run on micro servers at the edge, on terminal devices, or even in chips.

Ubestream has successfully developed AI semantic edge computing technology that embeds lightweight AI semantic algorithms into the low- and mid-range, low-power chips used in consumer electronics. In the digital transformation driven by the post-pandemic era, AI semantic voice control chips will replace touch interfaces in air conditioning, TV and audio, and lighting and security control for smart homes and smart hotels; multi-function ordering kiosks for smart stores and smart restaurants; autonomous or driverless vehicles for smart traffic and transportation; mobile devices such as phones, earphones, and wearable watches and wristbands; and interactive devices such as toys, video games, robots, and even virtual idols built with XR technology.

AI semantic (NLU/NLP) chip applications

How does NLP work?

NLP entails applying algorithms to identify and extract natural-language rules so that unstructured language data is converted into a form that computers can understand.
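
To make that conversion concrete, here is a minimal rule-based sketch of the idea (it is not Ubestream's engine; the intent names, slot rule, and parse_utterance helper are illustrative assumptions): an unstructured command string is reduced to an intent label plus a numeric slot value that device firmware can act on.

    /* Minimal sketch (not Ubestream's engine) of "unstructured text to
     * structured form": a command string becomes an intent plus a slot value. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <ctype.h>

    typedef struct {
        const char *intent;   /* e.g. "set_temperature" */
        int         value;    /* extracted slot value, -1 if none */
    } command_t;

    /* Tiny rule-based parser: detect a known intent keyword, then take the
     * first number in the utterance as its slot value. */
    static command_t parse_utterance(const char *text)
    {
        command_t cmd = { "unknown", -1 };

        if (strstr(text, "temperature"))    cmd.intent = "set_temperature";
        else if (strstr(text, "turn on"))   cmd.intent = "power_on";
        else if (strstr(text, "turn off"))  cmd.intent = "power_off";

        for (const char *p = text; *p; ++p) {
            if (isdigit((unsigned char)*p)) { cmd.value = atoi(p); break; }
        }
        return cmd;
    }

    int main(void)
    {
        command_t cmd = parse_utterance("please set the temperature to 26 degrees");
        printf("intent=%s value=%d\n", cmd.intent, cmd.value);  /* intent=set_temperature value=26 */
        return 0;
    }
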
The AI semantic engine powered by Ubestream Inc. has the following characteristics on edge devices:
(1) Offline deployment on edge devices: Speech-to-text (ASR) for a specific retrained command set and text-to-speech (TTS) can be deployed quickly on edge devices. Restricting the command set is essential on resource-constrained microcontroller systems.
(2) AI trending toward low-power microcontrollers: Ubestream can deploy its semantic engine on microcontrollers with ultra-low power consumption at the edge, giving IoT devices a lightweight power footprint on resource-constrained platforms.
(3) Smart home & smart city: Technologies such as STT, TTS, and the semantic engine turn IoT devices into smart appliances for the smart home and, eventually, the smart city.
(4) Contactless: Users simply talk to the IoT device; it replies and acts without being touched, helping to prevent public infection such as COVID-19 (a minimal end-to-end flow is sketched after this list).
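
The four characteristics above come together in a single on-device flow: offline speech-to-text over a restricted command set, intent matching, a device action, and a spoken confirmation. The sketch below walks through that flow with stubbed ASR/TTS and hypothetical function names (edge_asr_listen, nlu_match_intent, device_apply, edge_tts_say); it only illustrates the shape of the contactless pipeline, not Ubestream's actual firmware API.

    /* Self-contained sketch of a contactless voice-control flow:
     * offline ASR -> NLU intent -> device action -> TTS confirmation.
     * The ASR and TTS are stubbed so the example runs on a PC. */
    #include <stdio.h>
    #include <string.h>

    /* Stub "ASR": pretend the microphone heard one fixed utterance. */
    static const char *edge_asr_listen(void) { return "turn on the light"; }

    /* Stub "TTS": print instead of synthesizing speech. */
    static void edge_tts_say(const char *reply) { printf("TTS: %s\n", reply); }

    /* NLU: map the transcript to an intent code from a restricted command set. */
    static int nlu_match_intent(const char *text)
    {
        if (strstr(text, "turn on the light"))  return 1;
        if (strstr(text, "turn off the light")) return 0;
        return -1;  /* unknown command */
    }

    /* Device action: a print here; on a real MCU this would toggle a GPIO. */
    static void device_apply(int intent) { printf("light -> %s\n", intent ? "ON" : "OFF"); }

    int main(void)
    {
        const char *text = edge_asr_listen();     /* 1. local speech-to-text, no cloud */
        int intent = nlu_match_intent(text);      /* 2. text -> structured intent      */
        if (intent < 0) {
            edge_tts_say("Sorry, I did not understand.");
            return 0;
        }
        device_apply(intent);                     /* 3. contactless device action      */
        edge_tts_say("Done.");                    /* 4. spoken confirmation            */
        return 0;
    }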

Driving contactless digital transformation

In the digital transformation driven by the post-pandemic era, AI semantic voice control chips will replace touch interfaces in scenarios such as smart homes, smart hotels, smart stores, and smart transportation.

Ubestream's embedded Semantic-AI-on-Edge chipset technology holds a leading position among global AI companies and has established technical barriers to entry. It currently supports natural-language dialogue in Chinese and English, with Japanese planned for the future. Negotiations and cooperation with well-known domestic and international corporate groups are already under way.

Why use an AI semantic voice control chip?

01

AI semantic voice control chips will replace touch interfaces; their safety and hygiene fit the digital transformation driven by the post-pandemic era.

Applicable to products such as air conditioning, TV and audio, and lighting and security control in smart homes and smart hotels, as well as multi-function ordering kiosks in smart stores and smart restaurants.

02

Adding AI semantic chips to human-machine interfaces and wearable devices delivers a better user experience and increases user stickiness

Autonomous or driverless vehicles in smart traffic and transportation; mobile devices such as phones, earphones, and wearable watches and wristbands; and interactive devices such as toys, video games, robots, and even XR-based virtual idols can all add an embedded AI natural-semantics chip as a co-processor, giving these devices natural-language interactive voice control without connecting to cloud AI or using GPU computing power.

03

The global AI technology and application market will move from the cloud to the edge; positioning early secures competitive advantage and business opportunities

A research report from the National Development Council points out that the global AI technology and application market will move from the cloud to the edge. Enterprises that position themselves early in AI semantic voice control applications can gain a first-mover advantage, seize business opportunities, and increase market share.

Are you interested in our AI semantic voice control technology?

We welcome inquiries about business cooperation, technology adoption, strategic investment, and more.