Visited "BIG-i" in Shenzhen



I visited NXROBO corp. with a Japanese tour group and saw a demo of BIG-i. BIG-i is a communication robot with one big eye. It is still under development and might be on the market within a year.



It is intended for use at home, like Amazon's Echo. The owner can give the robot voice commands such as "switch the TV on" or "turn the lights off." It has both voice recognition and voice synthesis. This kind of robot is often called a "communication robot."
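To make the idea concrete, here is a minimal sketch of the command-matching step such a robot might perform after speech recognition converts audio into text. This is purely illustrative: the command table, action names, and matching rule are my assumptions, not BIG-i's actual implementation.

```python
# Hypothetical keyword-based command matcher for a home robot.
# Real products use far more robust natural-language understanding;
# this only shows the basic idea of mapping an utterance to an action.

COMMANDS = {
    ("switch", "tv", "on"): "tv_on",
    ("turn", "lights", "off"): "lights_off",
}

def match_command(utterance):
    """Return an action name if all keywords of some command appear."""
    words = set(utterance.lower().split())
    for keywords, action in COMMANDS.items():
        if all(k in words for k in keywords):
            return action
    return None

print(match_command("please switch the TV on"))   # tv_on
print(match_command("turn the lights off now"))   # lights_off
```

In a real system the recognized text would come from a speech-recognition engine, and the returned action would drive a smart-home controller.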

There are several communication robots on the market. Pepper from SoftBank is very famous in Japan. TAPIA, unibo, Robi, BUDDY, and Sota are less famous and smaller than Pepper.

 

There is a survey about the preferred height for a communication robot.
The ranking of the requested sizes is:
(1) 120cm (about the same as a seven-year-old child) --- 33%
(2) 50cm (a newborn baby) --- 29%
(3) 20cm --- 24%
(4) 160cm (an adult) --- 12%

https://www.m2ri.jp/news/detail.html?id=30


Pepper's height is 120cm, so perhaps SoftBank did a preliminary survey of its own. My robot's height is 80cm, which falls between (1) and (2), so it is not bad. I am not sure, but BIG-i also looks to be around 80cm tall.



According to the same survey, the most requested shape is a "human-like" shape, i.e. a humanoid. We could say Pepper is a human-shaped robot even though it has no legs; it uses wheels to move around.

BIG-i does not adopt a human shape. It has no arms or legs, and it uses wheels to move around. Its only moving parts are the wheels and the eyelid (or is it the eyeball?).



There are good and bad sides to this strategy. The more moving parts a robot has, the more failures it will suffer. With so few moving parts, BIG-i should cost less to build and have fewer problems.

Pepper has many servo motors in its arms and fingers, but there are not many things Pepper can actually do with them. They are mostly used for gestures: Pepper expresses emotions with them.

BIG-i seems to try to express emotions with its big eyeball. Of course, it is not a simple circle; it takes on various shapes and movements. The eye is a very important part for expressing emotion. For example, some Japanese cartoonists never let their assistants draw the eyes; they always draw the eyes themselves. Pepper has LED eyes, and regardless of the direction you view it from, Pepper seems to be facing you. Obviously the eye is an important element.

If BIG-i can achieve the same degree of emotional expression as Pepper, I would call it a success, since this approach is cheaper to implement and less prone to mechanical trouble.

BIG-i has the same problem as Pepper: speech recognition makes many errors. I think anyone who has interacted with Pepper will agree that it is difficult to use in practice. However, speech recognition technology is improving thanks to deep learning and is getting closer to human-level accuracy. By the time BIG-i reaches the market, it might have become something quite different.








