Future of Artificial Intelligence and Senses: Multisensory Messages in the Tech Industry

Paulina Sajna-Kosobucka
Information Architect, Communication & Media PhD Student
Automatic Summary

Understanding Multisensory Messages in Technology

Can multisensory messages be designed within the tech industry? Do they already exist? And how do they relate to the future of artificial intelligence? In what follows, we examine existing and potential forms of multisensory messages and how they could open a new level of communication within the tech industry, particularly in artificial intelligence.

A Brief Introduction

The author, a Ph.D. student at Nicolaus Copernicus University in Toruń, shares her journey into the promising area of multisensory messages. Synesthesia, a condition in which stimulation of one sense automatically triggers perceptions in one or more additional senses, sparked her interest in the topic. She saw the possibility of transferring the concept of synesthesia into technology, bringing multisensory messages into our everyday digital experiences.

From Past to Present: Multisensory Messages in Different Fields

  • Synesthesia in literature: the technique is used as a metaphorical combination of different senses, such as sight plus hearing, in expressions like "velvet look" or "red laughter".
  • Applications in advertising and the arts: prominent figures such as the composer Franz Liszt and the painter Vincent van Gogh drew on their synesthetic experiences in their works.

The Era of Synesthesia in the Tech Industry

These examples lead to the idea of developing multisensory messages within the tech industry. How can we incorporate sight, sound, touch, taste, and smell into our digital experiences using artificial intelligence?

This is where information architecture plays a prominent role: the art and science of designing information environments so that users can understand them, both online and offline. It rests on four groups of systems (organization, labeling, navigation, and search), and an effective information architecture is the foundation of well-structured multisensory messages.
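As a purely illustrative sketch of how those two ideas could meet in code (the type names and fields below are assumptions, not an existing standard or tool), a multisensory message might be modeled as an information object with a label, a place in the navigation structure, search keywords, and one rendering per sensory channel:

```typescript
// Hypothetical sketch: a multisensory message as an information-architecture
// object. All names and fields are illustrative assumptions.

type SensoryChannel = "sight" | "hearing" | "touch" | "smell" | "taste";

interface SensoryRendering {
  channel: SensoryChannel;
  payload: string;      // e.g. an image id, an audio cue, a vibration pattern id
  description: string;  // plain-language fallback for accessibility
}

interface MultisensoryMessage {
  label: string;                  // labeling system: what the message is called
  path: string[];                 // navigation system: where it lives in the structure
  keywords: string[];             // search system: how it can be found
  renderings: SensoryRendering[]; // one entry per sense the message uses
}

// Example: a "battery low" notification expressed through three channels.
const batteryLow: MultisensoryMessage = {
  label: "Battery low",
  path: ["system", "notifications", "power"],
  keywords: ["battery", "power", "charge"],
  renderings: [
    { channel: "sight", payload: "icon:battery-red", description: "Red battery icon" },
    { channel: "hearing", payload: "tone:descending", description: "Descending two-note tone" },
    { channel: "touch", payload: "vibration:short-short", description: "Two short vibrations" },
  ],
};
```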

How to Produce Multisensory Information

Several tools, such as TwoTone, can turn data into sound, and oscilloscope-style visualizers can turn sound into graphics. There are also applications such as Seeing AI and Voice Dream Scanner that turn visuals into sound by describing or reading aloud what the camera or scanner captures.
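To make the idea of sonification concrete, here is a minimal, hypothetical sketch (not taken from any of the tools above; the pitch range and mapping are assumptions) that turns a list of numbers into tones using the browser's standard Web Audio API, following the convention that larger values get higher tones:

```typescript
// Minimal data sonification sketch using the standard Web Audio API.
// Assumption: it runs in a browser after a user gesture (autoplay policies).

function sonify(values: number[], noteSeconds = 0.3): void {
  const ctx = new AudioContext();
  const min = Math.min(...values);
  const max = Math.max(...values);

  values.forEach((value, i) => {
    // Map each value linearly onto a pitch range (220 Hz to 880 Hz):
    // bigger numbers produce higher tones.
    const t = max === min ? 0.5 : (value - min) / (max - min);
    const frequency = 220 + t * (880 - 220);

    const osc = ctx.createOscillator();
    const gain = ctx.createGain();
    osc.frequency.value = frequency;
    gain.gain.value = 0.2; // keep the volume gentle
    osc.connect(gain).connect(ctx.destination);

    const start = ctx.currentTime + i * noteSeconds;
    osc.start(start);
    osc.stop(start + noteSeconds * 0.9);
  });
}

// Example: "play" a small dataset as a rising-then-falling melody.
sonify([3, 5, 9, 14, 9, 5, 3]);
```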

Touching and Smelling Messages

Apart from sight and hearing, touch is another critical sense that could be incorporated into multisensory messages: haptic feedback, vibration patterns, and "digital tattoos" (which can signal, for example, battery level or an incoming message) are being explored. Smelling messages is already possible to a degree through devices such as Scentee, a smartphone attachment that emits scents on demand.
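As a small illustrative sketch (an assumption for this text, not a feature of any product mentioned here), the standard Vibration API available in many mobile browsers can already encode a simple message, such as a battery warning, as a touch pattern:

```typescript
// Hypothetical haptic "message" sketch using the standard Vibration API.
// Works only on devices and browsers that support navigator.vibrate.

type HapticMessage = "battery-low" | "message-received";

// Each pattern alternates vibration and pause durations in milliseconds.
const hapticPatterns: Record<HapticMessage, number[]> = {
  "battery-low": [400, 150, 400],               // two long pulses
  "message-received": [100, 80, 100, 80, 100],  // three short pulses
};

function sendHapticMessage(message: HapticMessage): boolean {
  if (!("vibrate" in navigator)) {
    return false; // no haptics available; fall back to another sense
  }
  return navigator.vibrate(hapticPatterns[message]);
}

// Example: signal a low battery through touch alone.
sendHapticMessage("battery-low");
```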

Tasting Messages

Developments in taste technology, such as the digital lollipop and spoon tech, use electrodes to stimulate the taste buds: the digital lollipop simulates different taste sensations, while spoon tech makes the taste buds more sensitive, which could help people with taste impairments.

Final Thought

Despite these fascinating developments and prospects, it is clear that there is a long journey ahead before truly multisensory messages exist. Using multiple senses to convey information may seem like a whole new level of complexity, but it also has the potential to revolutionize how we perceive and interact with technology.

Advances in this area can significantly help users with sensory impairments and open new territory for artificial intelligence and technology at large.

In conclusion, multisensory messages are the future of artificial intelligence. Research on multisensory texts within the tech industry is still underway, pointing to a promising turn in technology. Exploring this niche is invigorating, especially given its potential to radically transform interfaces and user experiences.


Video Transcription

I am a Ph.D. student at Nicolaus Copernicus University in Toruń, Poland, in the field of communication and media studies, and it's an honor for me to take part in the Women Tech Global Conference this year. I will speak about multisensory messages in the tech industry. Are they possible to create, or not yet? We will think about it. This is definitely a future of artificial intelligence, but first let's have a look at how it works right now. Let me introduce myself. I started as a technician in administration, where I dealt with databases and processing information. To keep working with data, I started studying journalism, because there I could write a thesis about data journalism tools. That was really exciting, because I noticed how we can turn data into sound, which is incredible. I really liked the topic, and it is already a multisensory message, because we receive the data through sight and hearing as well. Staying with information, I then started studying information architecture, and I was also involved in student radio, because sound is transmitted through radio waves. How it all works was very interesting to me, and radio is one solution for sharing data.

So it was all interconnected. Now I'm doing a Ph.D. in communication and media studies with a focus on global media. But why did I become interested in multisensory messages in the tech industry? Because of synesthesia. This is a phenomenon related to multiplied neuronal connections.

I have synesthesia myself, and maybe you have it too, I don't know, but I see colors and shapes of music, and I can also see colors of temperature. I also tend not to watch horror films or very dramatic movies, because when I look at someone suffering I almost feel it myself, which is very difficult. But seeing the colors and shapes of music and so on is a different story.

That part is also nice, I think, but with those movies it's not really comfortable. We still don't have technology that would show you how something feels when you don't feel it yourself, but such devices are going to be designed. Let's have a look at how it looked in the past. In literature, synesthesia was used as a stylistic, rhetorical device: a metaphorical combination of different senses, for example sight plus hearing, in one expression, like "velvet look" or "red laughter". And since I am from Poland, I have to mention that it was also used in the modernist period of Polish literature, which was called Young Poland. It is still used today, for example in advertisements, and in art generally. The composer Franz Liszt also had synesthesia, and when he composed his music he really saw the colors and shapes of that music, which let him create beautiful melodies. Or Vincent van Gogh as a painter: here we have The Starry Night, a painting that shows how he perceived his senses, how he felt everything. So synesthesia helped him in creating his paintings.

Duke Ellington and Olivier Messiaen, for example, also had synesthesia, and it really helped them in composing music. Now, information architecture, the field I am studying, is the art and science of designing information environments to be understandable for users online and offline. By online we can understand websites, social media, applications, and so on. Offline can mean software, but not only that: also hospitals, shopping malls, city halls, and other places, every area where we need to find information faster. Wayfinding is enabled by well-designed information architecture. Richard Saul Wurman was the first to use the terms information architecture and information architect, in 1976. Information architecture is based on four groups of systems: organization, labeling, navigation, and search systems. To describe them briefly: labeling means proper naming, for example the names of hyperlinks need to refer to the content they lead to. And with search systems, if we were able to search by other senses, that might make it easier for users to understand the information. But it's not only about search; it's also about navigation, organization, and so on.

With navigation, for example, I mean tag clouds or alphabetical indexes, and there are already solutions for people who have visual disabilities, but it's still a work in progress. Well-designed information is a success for its creators, that is obvious, and in my opinion there is no more enjoyable and effective UI design than one that integrates all possible senses. So everything is interconnected: how to produce multisensory information and how to organize it well so that it is understandable and helpful.

For example, as I already mentioned about data sonification, turning data into sounds, there is a tool called TwoTone. It takes you on a journey via a guide that lets you create spoken data; you can choose a language, a gender, and a voice, so you can set it to what you really like to hear. There are many, many options for turning data into sounds. Creating visuals from sounds is also possible, for example with an oscilloscope-style application. It refers to the oscilloscope, a type of electronic test instrument that graphically displays varying electrical voltages as a two-dimensional plot of one or more signals as a function of time. So we can get visualizations of sounds. Voice Dream Scanner and Seeing AI, in turn, turn visuals into sounds. With Seeing AI, for example, we can use the device's camera to identify people and objects, and then the application describes in audio what is there.

Voice Dream Scanner, for example, can also scan documents so that you can listen to them using built-in text-to-speech. Kauna and SoundSpectrum are music visualizers; Kauna can capture audio from a microphone as well. And every application with audio description can also be helpful for people with impairments. This is still a somewhat limited combination of sight and hearing; the first medium that combined sight and hearing was television, which is obvious. But how do we understand data sonification? For example, louder, stronger, higher sounds can mean a bigger number, and quieter, thinner, lower sounds can mean a smaller number. This approach wouldn't tell us the exact value of the data, but when we just want to convey the general magnitude, not the exact figure, it can be helpful. Still, the user receiving it would need instructions beforehand. This is just what we have now, and maybe simply reading the data would be even easier, so for now it is an interesting solution to have a little fun with. It still has to be improved.

Is it possible to touch messages? Yes, of course: here we have the term data physicalization, and we are dealing with 3D printing. When we touch something we can say what we feel, but it can still be a problem to convey the exact value of the data. It is a little different in the case of Braille, because there the information is transmitted in the Braille alphabet. Haptic feedback, and vibrations of many kinds, can also give us information, but it's not really obvious when we don't know what it means. Digital tattoos, for example, can also give us information about the battery or the receipt of a message, but it is quite limited as a way of receiving information, and it is still only touch, not a combination of other senses. So through those devices we don't really get the ability to imagine what it is like to have synesthesia. What about smelling messages? There is an application, Scentee, and it works when you have a special device.

The application is of course paired with the device, and then you can simply emit smells by technology. There are also LCD monitors with a system of scent-based image enrichment. This was created in Japan and, to my knowledge, it worked only there, because it was not really popular in other countries. It was based on special fragrance granules and four fans directing the air stream to the right place on the screen, and it generated one scent at a time. Because of those limitations I think it was not very popular, but it is still being improved, and maybe in the future such LCD monitors with scents will also become popular. Here we could have sight, hearing, and smell together, so it is a little bit like synesthesia, although not really the same. It is also interesting to think about a historic cinema show: in 1960, during the presentation of the film Scent of Mystery, more than 30 fragrances adapted to different scenes were emitted in the cinema, so the audience could receive more information during the show.

We also have 5D and 6D cinemas, where we can feel all kinds of different stimuli during the show, but I am not sure whether that could be treated like synesthesia. Artificial noses and nostrils are also already being used in real life: they are connected to your body and, through electrodes, they can transmit a scent sensation so that you can feel it. And this is how the Scentee device looks: you control it remotely and it emits smells, which is also nice. What about tasting messages? There is something like taste technology, and we have the digital lollipop and spoon tech. Those technologies are very interesting because spoon tech, for example, makes the taste buds more sensitive, which is helpful for people who have problems with taste.

There are some diseases connected with such problems with the taste buds. The digital lollipop, in turn, simulates different tastes of food using electrodes. Here it's shown how it works; it's not my video, I just wanted to show three parts of a New Scientist video. First you press a button that corresponds to a taste, for example bitter, sweet, or salty. Then, through the cable and the little contacts that you place on your tongue, it stimulates the taste buds, and you can check how many tastes can be combined and have a little bit of fun, I think. Because if we wanted to transmit information through taste with such a device, I don't know if it is really possible.

It is just manipulating the taste buds, and that's it. And the spoon tech looks like this: it excites the taste buds, as it's written here. OK. I have been aware of my own synesthesia for a long time. Maybe you are a synesthete too; if not, in the future technology may be able to help you experience what it's like to be a synesthete, I hope. Here are my ideas for multisensory messages in the future. For example, messages in the form of scent or taste could be generated on the basis of images. It's not really possible yet, because with sounds it was a little different: sound is transmitted over a distance using radio waves, while with scent and taste information there is a problem with transmitting it. And what about neurotechnology solutions? As far as neurotechnology goes, positron emission tomography and functional magnetic resonance imaging showed, as early as 1996, that in the case of some synesthetes the visual areas in the brain were activated in response to a sound stimulus.

That allows researchers to check what will happen after using transcranial magnetic stimulation to manipulate the brains of a research group's members. But we still need to remember the ethics of our research. I'm not a specialist in this field, but I think it is very interesting that we can manipulate our neuronal connections in a way that lets us feel how synesthesia works. I have also heard that some drugs can induce synesthesia, just to see how it is; I have never tried it, because I have synesthesia, so I don't need to. What about applications for synesthetes? I am thinking of something like dating apps, for example, where people could find other people with the same sensory experience and talk in colors. It seems very improbable, but it would be interesting. And a robot translating senses, like translating languages, would be a nice idea, just to show how we can communicate through colors, for example. To summarize: the only multisensory messages that exist in tech today are those based on vision and hearing, as I told you.

Synesthesia and information architecture are strongly intertwined, as together they improve the design of information: synesthesia deals with perception, and information architecture deals with organizing the information. Maybe that would be helpful for users who have impairments.

Digital synesthesia is a niche in the tech industry. Multisensory messages are the future of artificial intelligence; this is my opinion. Research on multisensory texts is still underway, and this is what I also wanted to indicate: there are many ideas, also for showing people who don't have synesthesia what it is like. Maybe it would be helpful for access to information. Just to conclude, I will refer to the designer Frank Chimero: "People ignore design that ignores people." So if we had a solution that engaged all of our senses, that would be really nice and helpful, and I think it will happen one day. OK, thank you so much for joining me and having me. I see we have some questions, and here you have the sources; if you want to take a screenshot, of course, that would be awesome, and if you would like to contact me, please do. I am also sorry that I was in a bit of a hurry, and sorry for my language, but I am still learning English. Let's have a look. "Very interesting topic, and glad I joined, thank you. Interested to understand whether the sensory data is at the concept stage, or do you know of companies that are already using any of this technology?"

"Just wondering about a wider use case: could these technologies work for, let's say, the neurodivergent community, to help in their jobs?" I don't know, actually, whether companies are already using this technology; I haven't done the research to say which companies use it, for example. But yes, it would be really helpful if sensory data were used at work in general. That is what I can say. OK, thank you, and thank you for the feedback, thank you so much. Oh, I see: "Is there a commercial use of the digital lollipop?" I also don't know, because it's not my research area in general; I just wanted to introduce you to the topic, and I need to do some more research on it. Sorry that I can't be more helpful. Thank you. I'm wondering if we still have some time. Yes, I'm from Poland. Hello, it's very nice to meet you all. You can find me on LinkedIn, or you can just write to me and we can connect and talk a little bit more about these technologies. Thank you. I think that's enough for now, and if you want to stay in touch, just let me know. OK, bye bye.