1
Department of Food Science, School of Food Engineering, University of Campinas, Campinas, São Paulo, Brazil
2
School of Interactive Arts and Technology, Simon Fraser University, Surrey, British Columbia, Canada
Corresponding author details:
Ivan Abdo Aguilar
School of Interactive Arts and Technology
Simon Fraser University, 102 Avenue, V3T 0A3
British Columbia, Canada
Copyright: © 2019 Aguilar JGS, et al. This is an open-access article distributed under the terms of the Creative Commons Attribution 4.0 International License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
Technology is constantly evolving; new discoveries have become more specialized, and ever
deeper levels of interaction between the real and the virtual worlds have been reached. Nowadays, people
are increasingly interested in this immersion of sensations and feelings that technology can
provide. Besides being fundamental to life, the act of eating brings people together, causing
various interactions between them. Eating brings unique sensations that involve different
emotions and feelings in people, which can be leveraged to create greater experiences
with the use of human-computer interactions. Technology has brought the virtual world
closer to the real world in order to enhance culinary experiences: by creating digital
foods or enhancing real ones; by developing devices and programs capable of invoking
or measuring sensations; by running immersion tests in which a participant is placed
in a controlled virtual environment closely resembling the real one; and by
measuring and evaluating emotions and responses. This review presents the relationships
between humans, computers and food, called human-food interactions, focusing on the use
of computational technologies, exploration of human senses, and digital interactions in food
experience design, showing the future challenges that need to be overcome.
Keywords: HCI; Virtual food; Virtual reality; Multisensory experience; Virtual senses; Consumer
Human-food interaction (HFI) research has emerged as an area of great interest in the related field of human-computer interaction (HCI) in the last few years [1]. Digitizing techniques exist to bring the objects and elements of the real world into the virtual world, and this can be done in many ways: using specialized sensors and scanners, photo imaging, and computer-assisted design software, among others [2-4].
The virtualization process covers stages of modeling, simulation, optimization, and dynamic studies. Several industries have benefited from such activities, but the food industry still lags in using the potential that virtualization offers as a tool [5]. With the advancement of technology, there will someday be a new kind of dinner table, where digital and computational aspects are linked to design, science, technology and food engineering, providing new sensations from a fully digitized world [6].
The senses are responsible for external daily experiences [7,8]. Conducting experiments
involving the senses is an obstacle in research because of the difficulty of creating
a suitable environment that replicates the complexity of the factors present in the
real environment, since laboratory tests are performed in controlled settings or in front
of a computer [7]. Some senses still cannot be fully used for interaction with
technology, posing a challenge to be overcome. Interaction through the senses is led
mainly by vision and hearing; touch can also be used, but taste and smell remain great
challenges [9].
To better understand the relationships between computers and human beings, some definitions need to be established. HCI is the study of how computers, and derived technologies, influence the interactions and relationships humans have with computers in their everyday activities [10]. HFI research (HCI research focused specifically on food) has emerged and attracted significant interest in better understanding the influence that technology has on the interactions and relationships between humans and food. Some topics of interest in HFI are: health and ecological sustainability of food production and consumption; social and cultural interactions around food; food practices (for example, eating, disposal, growing, cooking); consumer experience and product evaluation; food safety; and entertainment [1,11-17].
The robust perception humans have of the world is based upon the processing, integration, analysis, combination and interrelation of the different human senses. Multiple senses are used to explore the environment and perceive information. Through the senses, sight, hearing, touch, and smell, it is possible to experience external stimuli [18]. Multisensory interactions can affect a person’s performance of everyday tasks and can lead to the creation of new immersive experiences between environments, products, services and users, which can be used to engage audiences, convey meaning, and enhance the overall user experience [19-23].
The systems used to evaluate these interrelations can work with one or multiple input forms and interactions between them. Unimodal systems use only one form of input, for instance, speech, vision, mouse, keyboard or pen. On the contrary, multimodal systems function in a more robust way, integrating different input modes together, aiming to provide users with specific tools to control output information, such as using multiple input forms to interact with visualizations and multimedia content [24]. The understanding of communication and interaction between humans and technology can be reached using multimodal interactions, through the combination of human’s natural capabilities of communicating via speech, facial expressions, touch, gesture, among others [18,25].
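The unimodal/multimodal distinction can be illustrated with a minimal sketch (the event types and fusion rule below are invented for illustration and do not come from the cited systems), in which a command is only complete once speech and gesture have both contributed:

```python
from dataclasses import dataclass

@dataclass
class InputEvent:
    modality: str   # e.g. "speech", "gesture"
    payload: str    # recognized content

def fuse(events):
    """Naive multimodal fusion: pair a spoken verb with a pointed-at target.

    A unimodal system would act on one event alone; here the command is
    only complete once both modalities have contributed.
    """
    speech = next((e.payload for e in events if e.modality == "speech"), None)
    gesture = next((e.payload for e in events if e.modality == "gesture"), None)
    if speech and gesture:
        return f"{speech} -> {gesture}"
    return None

command = fuse([InputEvent("speech", "delete"), InputEvent("gesture", "file_42")])
```

Real systems replace this pairing rule with time-windowed, probabilistic fusion, but the structural point is the same: the output is derived from the combination of modes, not from any single one.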
Cross-modal systems represent interaction methods where one or more senses influence the perception of another sense, for example, the effect of a food's color on its taste/flavor and smell. Another form of cross-modal interaction is a form of synesthesia, a perceptual illusion, where a sensory mode (stimulated by the virtual environment) is perceived to stimulate another mode (not stimulated by the virtual environment) [26,27].
Research has shown that stimulating certain senses can lead to perception changes in others. Bi-directional influence on perception has been demonstrated to occur between the senses of: taste and smell [28,29]; taste and vision [30,31]; taste and touch [32,33]; taste and hearing [34,35]; smell and vision [36,37]; smell and touch [38,39]; smell and hearing [40,41]; vision and touch [42,43]; vision and hearing [44,45]; touch and hearing [46,47]; and a mixture of senses together [48-50].
Research has also revealed how these cross-modal stimulations
can also be applied to new technologies and devices in food-related
experiments to induce perception changes. For instance, the sense of touch, in the
case of weight, alters the perception of taste [51]; the sound of food texture was
shown to influence food taste [52]; and induced vision and smell influenced the
perception of taste through a visual and olfactory display system [53].
There are several types of technologies capable of promoting HCI, and they can be applied in different research fields. In the food area, a tendency to study HFI can be observed, promoting, mainly, the immersion of people in a virtual world completely closed off from the real world and providing new sensations and feelings. This section presents some examples of these technologies and their recent uses within HFI.
Virtual reality
Virtual reality (VR) is a complete immersion technology in which the user is connected to a computational device that simulates computer-generated environments, whether realistic or entirely virtual. While immersed in such an environment, the user cannot see the real world around them. These environments can be used for simulation, training, entertainment, education and evaluation, among other purposes [54,55].
Immersive VR is a promising method of immersing people in an almost real environment involving the greatest number of senses. Such environments provide many similarities with the real world, allowing researchers to constrain experimental factors to obtain empirical data [7].
Project Nourished is an example of VR use that provides a complete state of immersion. During a meal, multiple devices are used to enhance the dining experience, for instance: a VR headset, to visualize a simulated environment and change the aesthetics of food; an aromatic diffuser, to smell the virtual food; headphones, to transmit a mimicked chewing sound and vibration from the mouth to the ear; food utensils and a glass cup, present in both the real and virtual worlds, for interacting with virtual food and beverages; and 3D-printed food, used to give a real taste and feel to the virtual food. This project allows for a different eating experience with the virtual use of all the senses, giving the option of ingesting, or not, actual caloric food, and the option of creating one's own virtual food based, for example, on a fictional movie scene [56].
Augmented reality
Augmented reality (AR), a system which combines real and virtual elements, supports real-time interaction and requires precise alignment and synchronization between virtual three-dimensional elements and the real-world environment. AR seeks to establish an environment which seamlessly integrates the real world with the virtual world, where augmented information is superimposed onto images and videos of the real world (captured through a camera). Though AR is most commonly used visually, it can be used to augment all five of the human senses [54,57].
Several studies apply AR to the study of foods, mainly through the sense of vision. These studies aim to reduce food consumption by modifying food intake and the eating experience, with the potential to assist in the prevention and reduction of diseases related to food consumption, such as obesity, hypertension, high cholesterol, heart disease and diabetes [53,58-60].
AR applications are constantly changing and being used more frequently in retail environments [61,62]. For example, AR applications can present information, such as nutritional facts and reviews, and include gaming elements that bring entertainment when a product is scanned by a mobile device [61]. Other uses include: personalization of the consumer experience, where customers can create and interact with content personalized to them; socialization, where consumers share products with others by taking photos of AR content, which can serve as AR marketing; accessibility, where consumers without technical skills can create their own product; and novelty, where the technology being “something new and different” can attract early adopters [57].
Developed by Narumi et al. [53], Meta Cookie is a prototype that exemplifies the use of AR with food. Using an AR marker, Meta Cookie combines a head-mounted display (HMD) with an olfactory display, enabling the user to view different cookies (chocolate, strawberry, tea, etc.) and providing a distinctive aroma that varies with the cookie chosen. Results showed that users experienced a change in taste, even though they were all eating just a plain cookie.
An AR mobile grocery shopping application was developed by Ahn et al. [63] to provide real-time customized recommendations of health products and warnings about specific products relevant to health concerns, such as milk or nut allergies, low sodium, low fat and general caloric intake.
Inamo is an interactive restaurant which uses AR technology to provide the visitor with full control of the gastronomic experience through the projection of the menu directly on the table surface. The technology also allows the visitor to configure the temperature, make drawings on the table, learn about the local neighborhood, play games and view a live chef camera feed [64].
Virtual Lemonade is a multimodal AR gustation system used to transmit flavor (color and pH value in this case) from a real beverage to a simulated beverage in another location. The system contains three stages: capturing the color and pH value of the original beverage, transmitting the captured information, and simulating the beverage in a different location (using water, LEDs and controlled electronic pulses to augment real-world sourness sensations) [65].
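The three stages of such a system can be sketched as a simple pipeline; the pH-to-pulse and color handling below are invented for illustration and are not the calibration actually used in [65]:

```python
import json

def capture(beverage):
    """Stage 1: sense the source beverage's color and pH."""
    return {"rgb": beverage["rgb"], "ph": beverage["ph"]}

def transmit(sample):
    """Stage 2: serialize the captured properties for the remote location."""
    return json.dumps(sample)

def simulate(message):
    """Stage 3: reproduce the drink remotely with LEDs (color) and electrical
    pulses on the tongue (sourness). The pulse mapping is a made-up linear
    rule: lower pH (more sour) -> higher stimulation level."""
    sample = json.loads(message)
    sourness = max(0.0, min(1.0, (7.0 - sample["ph"]) / 7.0))
    return {"led_rgb": sample["rgb"], "pulse_level": round(sourness, 2)}

# Hypothetical lemonade: pale yellow, pH 2.6.
lemonade = {"rgb": (250, 240, 120), "ph": 2.6}
out = simulate(transmit(capture(lemonade)))
```

The design point is that only a small property vector crosses the network; the sensory reconstruction happens entirely at the receiving end.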
Meißner et al. [66] studied user behavior in a virtual supermarket using eye-tracking technology. Their results showed that participants’ product decisions were greatly influenced by the use of AR information and that participants deemed the use of this information helpful and desirable for future shopping experiences.
Quick Response code
A quick response code (QR code) is a two-dimensional matrix symbol used for encoding and storing data (up to 7,089 numeric characters). The symbol was invented in 1994 by Denso Corporation, one of Toyota’s major group companies, for use in the production control of automotive parts, but it was later recognized as an ISO international standard (ISO/IEC 18004) and spread into other fields. Today, mobile phones can capture a QR code with their camera, decode it and present the information stored inside. QR code scanning is now widely known, and readers are available across multiple mobile platforms [6,67-69].
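As an illustration of how densely a QR symbol packs data, numeric mode in ISO/IEC 18004 encodes digits in groups of three, ten bits per group (seven bits for a trailing pair, four for a single digit), after a four-bit mode indicator and a character count. A minimal sketch of the data segment only (error correction and masking omitted):

```python
def qr_numeric_bits(digits: str, count_bits: int = 10) -> str:
    """Encode a digit string as a QR numeric-mode bitstream (data segment
    only). count_bits=10 applies to symbol versions 1-9."""
    bits = "0001"                                   # numeric-mode indicator
    bits += format(len(digits), f"0{count_bits}b")  # character count
    for i in range(0, len(digits), 3):
        group = digits[i:i + 3]
        width = {3: 10, 2: 7, 1: 4}[len(group)]     # bits per group size
        bits += format(int(group), f"0{width}b")
    return bits

# "01234567" is the worked encoding example used in the standard.
stream = qr_numeric_bits("01234567")
```

Eight digits compress to 41 bits here, versus 64 bits for plain 8-bit bytes, which is why numeric payloads reach the 7,089-character ceiling.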
According to Schöning et al. [6], food can be improved through computational techniques. An example of this is the German QR cookie (QKies). The main idea of this product is to print QR codes on cookies for digital marketing; a QKies QR code links to a specific site and can deliver the desired message [70].
3D printing
Food layered manufacture (FLM), also known as 3D food printing or food fabrication, is relatively new, expensive and difficult. Food printing is based on Additive Manufacturing (AM), also known as 3D printing, which joins materials, layer by layer, to create physical 3D objects [71].
The use of various foods in printing is still a challenge. Food printing can modify the shape, texture and flavor of food, changing the experience of cooking through different formats and printed foods [16]. According to Schöning et al. [6], advancements in 3D printing technology help in the development of digitally produced foods. Some uses for food printing are: food fabrication; multiscale design and creation of edible food structures; customized food forms and flavors; the personal food factory; and consumer-designed food fabrication in a domestic, user-controlled experimentation environment [71].
The CandyFab project develops machines that inexpensively print objects from pure sugar, using a process of melting sugar grains together with hot air. Production is based on the basic principle of any 3D printer: stacking solid two-dimensional printed layers [72].
Chocolate can also be printed. The technology company, Choc Edge, a pioneer in 3D chocolate printing, has developed Choc Creator, which creates 3D edible chocolate models. Like CandyFab, Choc Creator uses traditional coordinate system technology, similar to plastic 3D printing. An idea is transformed into a 3D model and a code is generated. This code is loaded into the printer and the chocolate object is printed out layer by layer [73].
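Both machines reduce a 3D model to stacked two-dimensional toolpaths. A simplified sketch of slicing a cylinder into circular, G-code-style layers (the command format and parameter values are illustrative, not Choc Creator's actual output):

```python
import math

def slice_cylinder(radius, height, layer_height=1.0, segments=12):
    """Approximate a cylinder as stacked circular toolpaths, one per layer."""
    commands = []
    layers = int(height / layer_height)
    for n in range(layers):
        z = round((n + 1) * layer_height, 2)
        commands.append(f"; layer {n + 1}")
        commands.append(f"G1 Z{z}")                  # lift to the next layer
        for k in range(segments + 1):                # +1 closes the loop
            a = 2 * math.pi * k / segments
            x = round(radius * math.cos(a), 2)
            y = round(radius * math.sin(a), 2)
            commands.append(f"G1 X{x} Y{y}")         # extrude along the circle
    return commands

gcode = slice_cylinder(radius=10, height=5)
```

Whether the extruded material is thermoplastic, sugar or tempered chocolate, the slicing step is the same; only the deposition physics differ.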
The ‘Insects au gratin’ project is a collaborative design project which uses insect flour to print a paste mix, used as a building medium, and combines it with other food products to shape food structures and create a sustainable source of food [74].
Another company specializing in food printing is Bocusini, providing new experiences to consumers, with personalized prints that can range from simple names to busts to cake toppers and are made with different food materials: pasta, chocolate, marzipan, cassis and fondant [75].
Humans perceive food texture and flavor based on feel, sound, trigeminal stimulation, appearance, smell and taste [76]. The senses, or sensory systems, are responsible for all external information and perceptions, mainly through vision and hearing [8], which have dominated the field of HCI studies. Taste and smell are two particularly difficult senses to study; being chemical senses, they make direct contact with the neural substrates of emotion and memory [77,78]. Memories evoked by smell are often emotionally more potent than those evoked by other senses [9]. Researchers have recently begun to focus on the study of touch, taste and smell [78].
The physical senses of humans encompass touch, sight and hearing, while taste and smell are chemical senses. The study of taste and smell (sensory study) is challenging, as they are neurologically interrelated and subjective [21]. Digital stimulation of the chemical senses has two “routes”: without the use of chemicals, by electrically and thermally stimulating the taste buds; and through the use of chemicals, which can be released under digital control. To transmit the senses of smell and taste from the computer to the human, both routes may be used [79].
Sense of sight
According to Ramic-Brkic and Chalmers [80], visual perception has become an increasingly important tool in the field of computer graphics. Research in this area explores knowledge of the human visual system to render and display three-dimensional graphics.
Sakurai et al. [59] studied how the amount of food affects eating behavior, using a tabletop system capable of displaying virtual dishes around food and altering its apparent volume. According to the research results, the amount of food consumed and the perceived volume of food placed on the dish can be influenced by the size of the dish.
The effect of AR in guiding the amount of food served on a dish was evaluated by Rollo et al. [81]. Participants were divided into three test groups: one with no information (control); one with verbal information about the size of the portion; and one that used ServAR, an AR tool created to assist in assembling virtual portions using a tablet. According to participants, the tool was easy to use and useful. The authors concluded that the use of AR improved accuracy in estimating standard serves compared to the other conditions, and the tool demonstrated potential for guiding the serving of food.
Eye-tracking and electroencephalogram sensor technologies have been used to measure and understand human behaviors and the visual attention of consumers during their shopping experience. This has been used to understand how to stimulate sales, assist marketing decisions, guide the design and presentation of products, and help consumers make decisions [66,82].
Sense of hearing
The sense of hearing (auditory system) is a prominent social sense human beings have and can be employed to cue visual attention, create ambience and emotion, provide information on the environment and surrounding areas which are not in the immediate range, and assist other senses in establishing a feeling of presence and immersion [83,84].
According to Carvalho et al. [85], the use of sound is able to enhance the tasting experience. Using chocolate as a taste stimulus, the authors observed that sound was able to modulate the flavor of the food, adding significant hedonic value during sensory analysis.
The perception of sound is one of the main senses explored at The Fat Duck restaurant in the UK, where the sound of the sea is served along with the seafood dishes. As customers enjoy their dinner, the sound of waves crashing and seagulls flying, delivered through an iPod, transports their memories and sensations of the sea to the dining table [86].
The Chewing Jockey technology uses a light sensor to detect jaw movements and release a specific sound, and a microphone placed at the jaw can also amplify the sounds of the bite. This type of technology can be used to relate a food to its brand, for example, where the consumer hears a sound attached to that brand while experiencing the food [52].
Sense of touch
The sense of touch (the haptic sense) is among the most used in daily human interactions. The use of the hand or finger for haptic interfaces is one of the main research fields in VR [87].
One of the first studies to demonstrate the effectiveness of tactile augmentation as a simple, safe and inexpensive technique for physically touching and tasting virtual objects was performed by Hoffman et al. [88]. The authors conducted a pilot study in which participants, located in an immersive environment (a virtual kitchen), physically bit a virtual chocolate bar that carried the physical properties of a real one. Participants described the experiment as more realistic and fun when physically biting a virtual chocolate bar than when merely imagining biting one in the virtual environment.
Iwata et al. [87] developed a food simulator as a haptic interface for biting. The simulator records a force profile from a person’s bite on real food and can then reproduce the bite force representing the texture of that food. The equipment has a mechanical configuration that fits in the mouth and an integrated force sensor, with auditory and chemical displays for multimodal taste sensations. Tests were carried out with two virtual foods: a virtual rice cracker accompanied by a biting sound, and a virtual gummy candy with a chemical substance produced from the five basic tastes. For the virtual rice cracker, 21% of participants recognized the virtual food without any instruction, 66% recognized it after being told it was a virtual rice cracker, and 13% did not recognize it. For the virtual gummy candy, 40% of participants recognized the virtual food without any instruction, 56% recognized it after being told it was a virtual gummy candy, and 4% did not recognize it.
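The force-profile idea can be illustrated with two toy models (the shapes and coefficients are invented for illustration, not taken from [87]): a brittle cracker resists until it fractures and the force then collapses, while a gummy resists elastically throughout the bite:

```python
def bite_force(food: str, displacement: float) -> float:
    """Toy force-displacement profiles for a haptic bite display.

    displacement: jaw closure from 0.0 (open) to 1.0 (fully closed).
    Returns a normalized resistive force in [0, 1].
    """
    if food == "cracker":
        # Brittle: force ramps up, then collapses after fracture at 0.3.
        return displacement / 0.3 if displacement < 0.3 else 0.1
    if food == "gummy":
        # Elastic: force grows smoothly with compression (Hooke-like).
        return 0.8 * displacement
    raise ValueError(f"unknown food: {food}")

# Sample the cracker profile across a full bite.
profile = [round(bite_force("cracker", d / 10), 2) for d in range(11)]
```

A haptic interface like Iwata's plays such a profile back through an actuator as the jaw closes, so the sudden force drop is what the user perceives as "crunch".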
The Straw-like User Interface allows the user to experience the sensation of drinking in a virtual environment. Sensations are created from actual data collected through an ordinary straw attached to the system. The elements that create the sensation are the change of pressure in the mouth generated when food blocks the straw, the sound of collisions and friction of the food, and the vibrations that accompany the sound [89].
A system called the “Electric Food Texture System” was developed by Niijima and Ogawa [90] to simulate chewing foods of different textures using electrical muscle stimulation (EMS). The system consists of a database of real food textures, a part that provides the electrical stimuli, and a part for bite detection. There is no real food in the user’s mouth; however, because of the electrical stimulation, users feel as if they are eating real food. The authors concluded that EMS was useful for presenting the elastic texture of foods (gummy candy and chocolate), but not for presenting harder textures (potato chips and rice crackers).
Sense of smell
Smell (the olfactory system) is a primary chemical sense [80]. The sense of smell can evoke feelings; differentiate products, stores and brands; and impact the psychological state of the consumer.
Olfactory perception is composed of intensity estimation, qualitative description, and hedonic tone [91]. Smells have been integrated into VR systems by means of olfactory displays, which generate scented air from odor materials with the desired components and concentrations, presenting the user with a more realistic experience [92]. Olfactory displays are categorized into two types: ubiquitous, where a device (not attached to the user in any way) transmits scent into an entire room or directly to the user’s nose by tracking their body and facial movements; and wearable, where a device is attached to the user (e.g. on an HMD) for closer scent transmission and a personal experience [93]. Olfactory displays, however, are difficult to load and are limited in the number of scents that can be stored, created and issued, in their delivery methods, and in the distance over which they can diffuse [8,92].
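A core control problem in such displays is delivering a target mixture: given per-channel odorant reservoirs, the device must decide how long to open each valve to reach the desired component concentrations. A sketch under a simple linear dilution assumption (the odorants, concentrations and the linear model itself are hypothetical):

```python
def valve_times(target_ppm, reservoir_ppm, airflow_s=1.0):
    """Per-channel valve open-times so each odorant reaches its target
    concentration, assuming the delivered concentration scales linearly
    with the fraction of the airflow window the valve is open (a
    deliberate simplification of real diffusion dynamics)."""
    times = {}
    for odorant, want in target_ppm.items():
        stock = reservoir_ppm[odorant]
        times[odorant] = airflow_s * min(1.0, want / stock)
    return times

# Hypothetical two-component "lemon" scent.
times = valve_times({"citral": 5.0, "limonene": 2.0},
                    {"citral": 50.0, "limonene": 40.0})
```

The limitations noted above show up directly in this model: each stored reservoir is one dictionary key, and nothing here can subtract an already-emitted scent from the air.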
Odor simulation in VR environments can offer users a means to evaluate products. By using virtual prototyping and user evaluation in these environments, industries can reduce the need to develop real prototypes. The use of scents has been shown to be an important factor in product evaluation and consumer intention, even for products not traditionally associated with odors, which can give a product a competitive advantage. The research was conducted with users evaluating products in a virtual multisensory environment, using an HMD with an olfactory display, and presented results similar to those obtained in real environments, showing that it is possible to evaluate products in VR environments with the use of olfactory displays [92].
State-of-the-art immersive VR technology coupled with an odor system was studied by Ischer et al. [7]. Their “Brain and Behavioral Laboratory-Immersive System” is easy to use and control and provides an immersive, interactive, three-dimensional environment capable of limiting cross-contamination between odors, which can also be linked to images and sounds.
Sense of taste
The sense of taste (the gustatory system) can detect and distinguish between five different stimuli (bitter, salty, sweet, sour and umami) and is a very difficult response to convert virtually [78,94]. It is a multimodal sensation built from chemical substances and involves the senses of smell, hearing and touch. Sound is simple to synthesize virtually and smell can be delivered through vaporization, so the biggest problem is converting the taste sensation [87]. Methods to simulate taste sensations digitally are divided into chemical and non-chemical approaches [95].
Virtual food uses electronics to emulate the feel and taste in the mouth of a real-world meal. This technology would be useful for providing sensory inputs and improving gastronomic experiences [96].
Ranasinghe et al. [97] studied the use of electricity to explore taste sensation as a digital medium. A multimodal bottle and spoon control system was used to trigger different taste sensations while users eat or drink, using visual stimulation (superimposing color through alternating LEDs) and tongue stimulation (controlled electrical pulses). The authors created an alternative way to make drinks taste better, enhancing sweetness, saltiness, bitterness and sourness without changing the content or using flavoring ingredients such as sugar, thereby preventing the excessive use of chemicals and benefitting nutrition.
The Digital Lollipop was created to digitally simulate the taste sensation through electrical stimuli on the human tongue. Through the manipulation of magnitude, frequency and polarity (properties of electric currents), it is possible to create different types of stimuli [98].
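The parameter space can be sketched as a lookup from an intended sensation to a (magnitude, frequency, polarity) triple; the values below are placeholders for illustration, not the calibrated settings reported in [98]:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Stimulus:
    magnitude_ua: int   # current magnitude in microamperes
    frequency_hz: int   # pulse frequency
    polarity: str       # "anodal" or "cathodal"

# Placeholder mapping for illustration only; real devices are calibrated
# per user and per electrode placement.
TASTE_STIMULI = {
    "sour":   Stimulus(magnitude_ua=150, frequency_hz=200, polarity="anodal"),
    "salty":  Stimulus(magnitude_ua=50,  frequency_hz=100, polarity="anodal"),
    "bitter": Stimulus(magnitude_ua=80,  frequency_hz=50,  polarity="cathodal"),
}

def configure(taste: str) -> Stimulus:
    """Return the stimulation settings for a requested taste sensation."""
    return TASTE_STIMULI[taste]
```

Framing the device this way makes the three current properties the entire digital representation of a taste, which is what allows a sensation to be stored, transmitted and replayed.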
Based on the study of Cruz and Green [99], which showed that heating the tip of the tongue produces perceived sweet taste sensations, several other studies were carried out. These studies showed that thermal stimulation of the tongue or nose produced sweet sensations without the need for chemicals such as sugar in beverages [100-102].
Multisensory studies
Sester et al. [103] evaluated the effect of the environment on food choices, in particular on beverages. The authors developed two virtual environments simulating a bar, one with blue furniture (cold) and another with wooden furniture (warm), both with visual and musical stimuli projected on the wall. In a first study (46 participants), the videos projected on the wall proved sufficient to influence declared drink choices. In a second study (120 participants), videos were used to evaluate the robustness of the method; participants had to choose one of five beers, and it was confirmed that the choices were made according to the environment.
Bangcuyo et al. [104] studied the use of immersive techniques in the hedonic sensory evaluation of coffee. At each session, 50 participants evaluated five coffee samples in a traditional sensory analysis booth and then in a virtual coffeehouse. The tests were repeated one month apart. The preference for coffees differed according to the condition in which they were evaluated: the power of the hedonic scale was improved by inclusion in the contextual (virtual) environment, since the environment influenced participants’ evaluations.
The consumer acceptability of coffee, as affected by situational and involvement conditions, was studied by Kim et al. [105]. The authors wanted to understand the effect of evoking sensations in different environments on consumers and evaluated two approaches with 200 participants: a simulated coffee shop and cognitive evocation (using, or not, written phrases). For the simulated condition, the room was made up to look like a common coffee shop. For the evocation condition, subjects were given a sheet of paper with evocation phrases and were instructed to read it and imagine the situation. After two samples were evaluated, the results showed that consumers’ tastes were influenced by the factors studied, mainly by the environment, which had the greater effect.
Hathaway and Simons [106] evaluated the taste of four brands of soft chocolate chip cookies in two tests involving 49 participants. The first test was performed in the traditional way, and the second in two virtual environments contextualized with audio-visual and olfactory cues related to baking cookies in a domestic kitchen. The first virtual environment offered mixed immersion, with information displayed on a computer screen and on a phone and aroma dispersed in booths, while the second was totally immersive, with information presented on video walls, loudspeakers, and aroma dispersed from hidden sources. In this study, the hedonic data were more discriminating and reliable, and consumer acceptance improved, in the totally immersive environment. The ability to resolve differences in cookie preference and the engagement of participants were improved in the virtual environments.
A virtual food buffet where consumers could serve themselves a meal (carrots, pasta and chicken) was studied by Ung et al. [107]. Thirty-four participants took part in the study, serving themselves two meals, one at a virtual buffet and another at a fake food buffet. The researchers observed that nutritional behavior, in terms of the amount of energy (in kJ) in the served food, did not differ between the two, suggesting that the virtual food buffet is a useful research method.
Despite the research progress in HCI and HFI [15,78,79,127], much work remains to be done before the use of technology with food can be incorporated into industry product creation and evaluation workflows and brought to the consumer’s dinner table. The multisensory experiences in HCI are not completely elucidated [78]; the main senses studied, from the aspect of both input and output (feedback) mechanisms, are the visual, auditory and tactile. The relationships between HFI and the five senses have been able to influence the user’s food perception; nevertheless, they require several improvements to overcome their limitations, as each sense has multiple unique limitations. Technology for digitizing the chemical senses is still being explored, lags far behind that for the physical senses, and has many obstacles to surpass. Some of the limitations for the five senses are summarized in Table 1 and explained below.
Smell: Generation, storage and diffusion of scents; synchronization, matching the intensity, duration, and distribution rate of scents with presented content; availability and access to olfactory displays; and scent alteration, removing an emitted scent in the environment prior to emitting a new scent [79,93,108].
Hearing: Lack of integration with other senses, since the focus is only on the user’s perception of food texture while chewing or drinking, which may limit the effect on enjoyment because only part of the eating experience is altered; foods without distinct sounds, since not all foods can be enhanced by sound feedback, as the chewing sounds of certain foods (e.g. soft foods like pasta) are not distinct or pronounced enough to augment; and configuration, since devices work only with preconfigured foods, whereas systems should recognize what the user is eating and automatically filter and augment the appropriate sound [52,109].
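The configuration limitation can be illustrated as a per-food filtering step applied to the live chewing signal. The following sketch is a minimal, hypothetical illustration, not an implementation of any cited system: the food labels, gain values and the moving-average filter are all assumptions introduced here for clarity.

```python
import numpy as np

# Hypothetical per-food augmentation profiles: how strongly to boost
# the high-frequency ("crunch") component of the chewing signal.
FOOD_PROFILES = {
    "cracker": {"crunch_gain": 3.0},   # crisp food: emphasize crunch
    "pasta":   {"crunch_gain": 0.0},   # soft food: little to augment
}

def moving_average(x, window):
    """Crude low-pass estimate of the signal via a sliding mean."""
    kernel = np.ones(window) / window
    return np.convolve(x, kernel, mode="same")

def augment_chewing(signal, food, window=8):
    """Boost the high-frequency residual of a chewing signal.

    The residual (signal minus its moving average) approximates the
    crunch component; it is scaled by the food's profile and added back.
    """
    profile = FOOD_PROFILES.get(food, {"crunch_gain": 0.0})
    residual = signal - moving_average(signal, window)
    return signal + profile["crunch_gain"] * residual

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    chew = rng.normal(0.0, 1.0, 1024)        # stand-in for a microphone signal
    loud = augment_chewing(chew, "cracker")  # boosted crunch
    soft = augment_chewing(chew, "pasta")    # left unchanged
```

A full system would replace the lookup table with automatic food recognition, which is exactly the open configuration problem noted above.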
Taste: Small range of sensations; experiments can generate and evaluate certain taste perceptions (e.g. sourness), but this range must be extended to multiple sensations (e.g. sweetness, saltiness, bitterness) at varying intensities. There is also a lack of studies of foods with different viscosities, elasticities, adhesiveness, pH values and temperatures, and of mixtures of solids and liquids consumed at the same time (e.g. eating and drinking consecutively, or eating a soup containing pieces of solid food) [65].
Vision: Overwhelming perceived food proportions; if the perceived food size is drastically increased, eating can become overwhelming for participants, not only from a perceived-quantity point of view but also because the food seems unable to physically fit inside the mouth for a bite. Eating methods are restricted to using the hands or preconfigured cutlery. Environment restrictions also arise: when shrinking the apparent food size, the environment behind the food must be shown, which is problematic when the background is not a controlled real-world or virtual environment, as it is difficult to render what is behind the food without distortion [58].
Touch: Hardness range; studies have been limited to relatively hard and soft foods (e.g. crackers and cheese), and this range must be extended to foods with intermediate hardness or a mixture of hard and soft components. Vibration locations; vibrations are felt on the teeth, but real food is also felt on the lips and tongue. Display shapes; devices have flat surfaces, but foods have varied surface shapes. Lack of integration with other senses; more studies are needed on releasing chemical sensations (taste) upon biting [52,87]. Electrical muscle stimulation devices that simulate biting have been developed, but they still lack precise control over varying the electrical signals during the course of a meal as the food changes shape and hardness [90]. Studies involving direct contact with the tongue and lips have had difficulty providing pleasurable experiences [110]. Though there are studies on lip stimulation [111,112], they are still in their early stages and need to be explored further in food experiments.
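The missing fine control over electrical muscle stimulation can be framed as a mapping from the food’s remaining hardness to a stimulation amplitude that is updated at each bite. The sketch below is purely illustrative: the amplitude range, the linear mapping and the fixed softening rate per bite are assumptions made here, not parameters reported in [90].

```python
# Illustrative sketch: adapt EMS amplitude to food hardness across bites.
# All constants (amplitude range, hardness decay per bite) are assumptions.

MIN_AMPLITUDE_MA = 2.0    # weak but perceptible stimulation
MAX_AMPLITUDE_MA = 10.0   # strong jaw-resistance stimulation

def ems_amplitude(hardness):
    """Map normalized food hardness (0 = soft, 1 = hard) to an EMS amplitude."""
    hardness = max(0.0, min(1.0, hardness))
    return MIN_AMPLITUDE_MA + hardness * (MAX_AMPLITUDE_MA - MIN_AMPLITUDE_MA)

def simulate_meal(initial_hardness, bites, softening=0.25):
    """Return the EMS amplitude for each bite as the food softens.

    The food is assumed to lose a fixed fraction of its hardness per bite,
    a deliberately simple stand-in for a real food-breakdown model.
    """
    amplitudes = []
    hardness = initial_hardness
    for _ in range(bites):
        amplitudes.append(ems_amplitude(hardness))
        hardness *= (1.0 - softening)
    return amplitudes

if __name__ == "__main__":
    for bite, amp in enumerate(simulate_meal(1.0, 5), start=1):
        print(f"bite {bite}: {amp:.1f} mA")
```

A real system would need to sense the food state rather than assume a decay schedule, which is the control gap the text describes.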
An overall concern in HFI research is that high-fidelity multisensory virtual environments have been used in a variety of applications; however, achieving this type of environment is costly. In an attempt to lower these costs, designers have considered the influence of simple and multisensory modalities [80]. Food choice is influenced by multiple determinants [113] and offers a rich multisensory experience, so it is worth exploring the interactions caused by the act of eating within the virtual environment [15] while also trying to reduce the associated costs.
For future studies, haptics is a field in constant development that can be further incorporated into HFI for virtual food sensing and interaction, and for assisting multimodal interactions with other senses. Studies have shown methods by which volume and texture can be felt and interacted with, whether through controllers with actuators [114], ultrasound [115], in-air feedback [116] or wearable glove devices [117,118]. Altering the perceived shape of objects through the sense of vision has also been studied [119], for example: changing perceived size and curvature, and therefore touch [120]; changing perceived weight [121]; and altering perceived hardness [122,123]. With the advancement of VR and AR technologies and their recent affordability through consumer devices, e.g. HTC Vive, Oculus Rift, Google Cardboard and Microsoft HoloLens [124,125], a mixture of these technologies could be further incorporated into food research for sensation simulation.
In this area of study, where new discoveries appear every day, the challenges to be solved are constant and frequently test the researcher, for instance: determining which technologies and sensory experiences to use and their intended effect on people; creating new multisensory designs; producing designs that take into account the relationships between the senses; and understanding the limitations when multi-sense information is monitored simultaneously [126,127]. The creation of immersive environments similar to real environments, and the development of methods that facilitate the collection of qualitative and quantitative parameters in order to improve sensory stimulation, are further examples of what needs to be studied and improved in this area of knowledge involving computers, food and people.
Table 1: Some limitations of research involving the human senses
Advances in the study of sensory digitization and virtual food production can change the way people interact with technology: how they eat, how they prepare food and what they feel. These technologies can be used by food brands and in food stores to present products in an interactive and fun way and to modify consumer perception of those products. Immersive consumer studies, despite being difficult to perform, have been carried out in virtual environments increasingly close to real-world environments; such environments facilitate consumer perception and modify opinions about the product, since they generate specific emotions, resulting in progressively more realistic and substantive reports. Studies involving computational technology, whether using computers, tablets or smartphones, can improve and promote these aspects of consumer behavior, representing an emerging technology capable of changing the feelings, sensations and choices of consumers.
The authors declare that they have no conflict of interest.
Copyright © 2020 Boffin Access Limited.