Synaptics makes its business out of the way people interact with devices. The company develops touchscreens, touchpads, display components, and fingerprint recognition technology, so it has to think about the forefront of human-computer interaction. For example, Synaptics has just announced fingerprint recognition and authentication technology that works even through touchscreen glass.
That means we may no longer need the home button on smartphones. Is this a good idea? Rick Bergman, CEO of Synaptics, and his engineers have to weigh such questions. You can invent a new user-interface technology, but if people don't like it, or don't want to learn it, it has no market prospects.
VentureBeat interviewed Bergman at CES 2017. Here is an edited transcript of the interview:
Synaptics CEO Rick Bergman
It seems there is a lot of innovation in your area, and a lot has been happening.
Rick Bergman: There are three areas: touch (technology), display drivers, and fingerprint (recognition). The market never stands still, and we keep finding ways to innovate and add value. As you've probably noticed, OLED screens are becoming a big trend. There's also authentication: almost no smartphones had it three years ago, while this year perhaps 700 million or 800 million smartphones will ship with fingerprint readers.
You just announced glass-based fingerprint recognition, and so have some startups, such as Vkansee. I wonder if this is a fiercely competitive field, with many startups trying to enter.
Bergman: With fingerprinting, everyone sees the market trend, and many companies want into this field. Glass solutions in particular have different requirements from earlier solutions. No one has yet come up with an equivalent solution, at least as far as I know. Sonavation, a Florida company, is the only one I know of trying to develop fingerprint recognition under glass.
It's interesting because it enables a new kind of device. We're all used to a home button: hold it down and see if it recognizes you. Now you can get rid of that button and let the whole sheet of glass read the fingerprint.
Bergman: That's certainly a goal. Samsung has gone with a bezel-less design. Every square millimeter of the phone's front can be display. If you can get rid of the home button, you reduce the chance of moisture or dust getting in, and you reduce cost at the system level. A single uninterrupted display also has visual benefits.
Synaptics' technology recognizes pressure, so you can draw thin or thick lines.
Is there a way to introduce something new and get people to adapt to it? If you get rid of the home button, it seems there could be strong protests. Even if it's not a good idea, people have gotten used to it.
Bergman: It's a security blanket for many people. Both Samsung and Apple have their iconic home buttons. We may not really get rid of it. It may disappear as a physical button, but you can still keep it there electronically. With the right haptic feedback it will feel the same, even though it's no longer a physical button.
Now we're seeing combinations of facial recognition and fingerprint recognition, and fingerprint recognition in cars. It seems two-factor authentication matters for some applications.
Bergman: Yesterday we announced a two-factor authentication technology that can be tuned toward convenience or toward security. For example, if you're skiing and really don't want to take off your gloves, you can use face recognition instead to read email or texts. Using both factors is for security; many banking applications already require that.
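Bergman's gloves example amounts to policy-driven factor selection: low-risk actions accept a single convenient factor, high-risk ones demand both. A minimal sketch of that idea (all names and action categories here are hypothetical illustrations, not a Synaptics API):

```python
# Hypothetical convenience-vs-security policy: which biometric factors
# unlock which actions. Action names and factor names are made up.

LOW_RISK = {"read_email", "read_text"}    # any single factor suffices
HIGH_RISK = {"bank_transfer"}             # two factors required

def authenticate(action, factors_passed):
    """factors_passed: set of factors the user verified, e.g. {"face"}."""
    if action in LOW_RISK:
        # Face alone is enough, so gloves can stay on.
        return len(factors_passed) >= 1
    if action in HIGH_RISK:
        # Banking-style apps require both fingerprint and face.
        return {"fingerprint", "face"} <= factors_passed
    return False  # unknown actions are denied by default

print(authenticate("read_email", {"face"}))                    # True
print(authenticate("bank_transfer", {"face"}))                 # False
print(authenticate("bank_transfer", {"face", "fingerprint"}))  # True
```

The point of the sketch is only that "two-factor" need not mean "two factors every time"; the policy decides when one convenient factor is acceptable.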
Could fingerprints replace keys for cars?
Bergman: We haven't yet seen fingerprint technology substitute for the key. We see interest in fingerprint technology in two areas, and I believe it will grow. First, because of the iPhone and other phones, people have become accustomed to fingerprint recognition. It feels very natural. It can be as simple as driver settings: you, your spouse, or anyone else can have separate settings the system recognizes. Likewise, as the vehicle becomes a rolling commerce interface, fingerprint recognition will grow. For example, you order a meal at McDonald's and approve the transaction right from the vehicle.
Would the car have a fingerprint button?
Bergman: Different OEMs have different opinions. It might be near the steering wheel or on the center console. More and more people want their vehicles to have all the features of their phones. If you pay $50,000 for a vehicle, you don't want it equipped with lagging consumer technology; consumers want the most advanced technology. That's good for us, because being the leader in the consumer space gets us in the door of the automotive industry.
Synaptics believes you will use your fingerprint to start the car, or to confirm who is driving.
How do you see the combination of haptic feedback and touch? In general, do people want that combination?
Bergman: Almost all phones use some degree of haptic feedback, though it's usually just an actuator under the home button. You feel like you pressed a button, but you didn't; the click doesn't really exist.
With technologies like virtual reality emerging, people say they want haptic feedback. I don't know if you've looked at this, or if it's still a long way off.
Bergman: No, AR and VR are among our areas of greatest interest. Devices like Gear VR have a low-end touchpad. Our interest in VR is in the display. As you can see from our demo, we're developing high-resolution displays. For VR you have two displays, which brings a lot of interesting opportunities.
Fingers are now represented in the VR world in many forms. Oculus Touch, for example, will represent your fingers and show what they're doing, without your touching anything. But you never get any real tactile feedback. I've seen people developing ultrasonic haptic feedback that "blows" sensations at your hand through many small speakers.
Bergman: You can use ultrasound, or you can use infrared and visible light. Many companies are developing 3D gestures, but so far we haven't done anything in that area. We don't develop haptic feedback directly; we work with partners such as Immersion. We created a reference design to improve the touch experience. It's a potential growth area, but we currently have no plans to develop haptic technology ourselves.
Is force sensing the next potential area of innovation?
Bergman: We've been providing force sensing for several years, and we're starting to see phones launch with such features. What stalled it for a while was high implementation cost. It's still not free, but it no longer needs a separate chip or a separate sensor. We've already seen Xiaomi and other manufacturers interested in it.
One handwriting application is very interesting: you can draw thin or thick lines according to how hard you press.
Bergman: That's the third dimension. We're working hard on force sensing and preparing a solution. If you were writing on paper, using force would be completely natural. It's a more natural way to operate, but it will take people some time to adapt.
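The handwriting demo described above boils down to mapping a pressure reading onto a stroke width. A minimal sketch, with made-up width constants rather than anything from Synaptics' actual solution:

```python
# Hypothetical mapping from sensed finger pressure to stroke width:
# press harder, draw a thicker line. Constants are illustrative.

MIN_WIDTH_PX = 1.0   # lightest touch: thin line
MAX_WIDTH_PX = 8.0   # hardest press: thick line

def stroke_width(pressure):
    """pressure: normalized force reading, nominally in [0.0, 1.0]."""
    p = max(0.0, min(1.0, pressure))  # clamp noisy sensor values
    return MIN_WIDTH_PX + p * (MAX_WIDTH_PX - MIN_WIDTH_PX)

print(stroke_width(0.0))   # 1.0 px, a light touch
print(stroke_width(0.5))   # 4.5 px, medium pressure
print(stroke_width(1.0))   # 8.0 px, a hard press
```

A real driver would filter and calibrate the raw sensor values, but the "third dimension" Bergman describes is exactly this extra input axis.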
Do you think of yourselves as user-interface visionaries?
Bergman: That's how the company was born, and we're still looking for opportunities to expand in this area. As a U.S. company we're somewhat unique in providing components for smartphones. Other manufacturers also supply smartphone components, but we're unique in human-machine interfaces. No company in this area is very close to us.
Would you say you have a dominant position in this area, whether in touch technology specifically or in the larger user-interface space?
Bergman: We want to lead the larger HMI space. We have an advantage in touch, and display drivers are the second area we've entered. We also keep watching other fields such as voice and audio.
It seems voice control is gaining momentum, especially at this CES. Google Assistant and Amazon Alexa are being integrated into devices such as Nvidia's Shield. Voice control seems to make many products better, or easier to use. How do you think voice control will work alongside other user interfaces?
Bergman: It's a good addition. I have an Amazon device, and it's great for getting the weather or other information without having to type anything. Smart homes will take off in the coming years. Whether it's adding voice or adding a display, all of this will enhance the home experience.
I've interviewed the science advisor for Minority Report. He designed the computer interface in the movie that could be operated in mid-air, and he later turned the movie's idea into a commercial product. Now you can set up three or four monitors in a room and use gestures to move documents or images. It looks like gestures will also have a place in the user-interface world.
Bergman: Yes, though there's a limit. People don't like holding their arms up in the air; they get very tired. It looks cool, and for a while there was a lot of discussion about this feature on laptops, but ultimately it didn't take off. Even reaching over the keyboard to touch the screen is tiring. Back to your question: people want a real keyboard for its tactile feedback, and they don't want to do this in the air.
There's a science to understanding people's behavior and psychology, such as what they want and what they can't stand.
Bergman: At Synaptics, back to your earlier question, we think of ourselves as a human-machine interface company. We have a dedicated human-machine interface team. They study exactly what you mentioned: will end users actually be willing to adopt it? We ask that question ourselves, and sometimes OEMs ask it of us. We once considered replacing the touchpad with 3D gestures, but we found it wasn't feasible.
I've noticed in many products that most users remember only a few things, while power users like and remember more. I know some gestures on the touchpad, but I can't remember which do what with two fingers, three fingers, or four fingers. Do you design for the general population, while features for the core user group come along at the same time?
Bergman: We've found that educating users is really hard. For example, phones had a hover function for a while, but it required too much relearning. Things that don't need to be learned are good; force sensing is one example. It's a genuinely natural experience. You can argue a feature saves people a lot of time, but educating users is still difficult.
I met Tobii at CES. The eye-tracking company is trying to bring eye tracking to games. Some operations are quite easy, but some are difficult.
Bergman: If you'd come here three years ago, we had a demo with Tobii. On a laptop with a touchpad, it's a natural extension: if you look at the corner of the screen, you want the cursor to move there. But it was too strange for people, and costly to implement, so it never took off.
One interesting use seems to be when it really speeds up a task. For example, you look at a zombie and it explodes.
Bergman: One feature that seemed promising in that demo: if you have a lot of pictures, the one you look at pops up right away. Or if you're browsing a content-dense webpage, you just look at a detail and it automatically expands. That's a very cool feature.
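The photo-grid demo Bergman describes is, at its core, a gaze hit test: map the tracked gaze point to whichever on-screen rectangle contains it, then expand that item. A minimal sketch (the layout and names are hypothetical, not the Tobii demo's actual code):

```python
# Hypothetical gaze hit test: find which photo rectangle the user is
# looking at, so the UI can pop it up. Coordinates are screen pixels.

def gaze_target(gaze_xy, photos):
    """photos: dict of name -> (x, y, width, height) rectangles."""
    gx, gy = gaze_xy
    for name, (x, y, w, h) in photos.items():
        if x <= gx < x + w and y <= gy < y + h:
            return name  # this photo would expand
    return None  # gaze is between photos; nothing happens

# Two photos side by side in a 200x100 strip:
photos = {"beach": (0, 0, 100, 100), "dog": (100, 0, 100, 100)}
print(gaze_target((150, 50), photos))  # dog
print(gaze_target((250, 50), photos))  # None
```

Real gaze data is noisy, so a production version would smooth the gaze point and require a short dwell time before triggering, to avoid expanding everything the eye sweeps across.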
Do you think this will be common in VR?
Bergman: Oh, yes, it will be. VR will never really take off until eye tracking is solved. Latency is why people get motion sickness. If you're only tracking head movement, you're already too late; you need to track the eyes. You'll see eye tracking in all the AR and VR headsets. That may be why Facebook acquired the eye-tracking company Eye Tribe.
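The latency point can be made concrete with back-of-the-envelope arithmetic: during a head turn, the rendered world lags reality by roughly angular velocity times motion-to-photon latency. A sketch with illustrative numbers (not measurements from any specific headset):

```python
# Back-of-the-envelope: how far the rendered image lags a turning head.
# error (degrees) = head angular velocity (deg/s) * latency (s)

def angular_error_deg(head_velocity_deg_s, latency_ms):
    return head_velocity_deg_s * (latency_ms / 1000.0)

# A brisk 100 deg/s head turn:
print(angular_error_deg(100, 50))  # 5.0 deg of lag at 50 ms latency
print(angular_error_deg(100, 20))  # 2.0 deg of lag at 20 ms latency
```

Even a few degrees of lag is easily visible, which is why cutting the tracking-to-display pipeline's latency, including by tracking the eyes directly, matters so much for comfort.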
Is this a very broad market for you, or is it a field with a potentially large market?
Bergman: Of course. We're very happy to be in human-machine interfaces because there are so many possibilities. The challenge is finding the right places to invest. In a difficult market, it's not easy to keep up our competitiveness and grow our share in display drivers, fingerprint recognition, and all these other areas.
It seems different companies are targeting different specialties: some target the eyes, others the fingers.
Bergman: You're right, but not many of them will succeed. There are many other challenges. Compared with PCs and phones, there's still a long way to go. What you want is large-scale market penetration. The key is to stand at the forefront, but not get too far ahead.