How human can AI really become?

An essay on a possible philosophical classification

When can we consider an AI to be "human"?

INTRODUCTION

Erich Fromm, an outstanding thinker of the 20th century, left us a reflection on the essential aspects of being human in his work "To Have or to Be". This essay explores the question of whether artificial intelligence (AI) can ever be human by combining Fromm's philosophy with Richard David Precht's thoughts from "AI and the Meaning of Life". The fundamental question is to what extent human decisions are made logically or emotionally.

I. Fromm's "To Have or to Be" in the age of AI:

Fromm held the view that true fulfillment lies in being, not in having, and that the true identity of human beings is to be found primarily in non-material "being" rather than in material "having". If we consider AI as a tool, the question arises as to whether machines can ever understand and replicate this "being".

II. The humanity of AI according to Precht:

In his book, Precht argues for an ethical approach to AI development. He recognizes the superiority of AI in many areas, but doubts that machines will ever reach the consciousness and depth of human life. The integration of moral principles and values remains a challenge for the creation of human-like AI.

III. Emotions and intuition in humans:

Fromm emphasized the role of emotions in humans, which often underlie intuitive decisions. The question of the humanity of AI leads to considerations of whether machines can ever understand emotions and make intuitive decisions. Precht emphasizes that it is unlikely that AI can develop true empathy, which is an essential part of human intuition.

IV. Logic vs. intuition: the mix of human decisions:

The question of how much human decision-making is based on logic and how much on intuition is complex. Psychological studies show that many decisions are not purely logical, but are also influenced by emotional factors. This poses a challenge for the development of AI, which must not only think logically but also understand emotional nuances.

V. The future of human-like AI:

Against the background of Fromm's philosophy and Precht's views, the question of the possibility of a human-like AI remains open. The challenges lie not only in technological development, but also in the integration of moral and emotional aspects. An AI can make logical decisions, but humanity requires more than logic - it requires empathy, love and a deeper connection to life.

Conclusion:

Erich Fromm's "To Have or To Be" provides a critical lens through which to view humanity in the context of AI. Richard David Precht's views emphasize the ethical aspects of AI development and the unlikelihood of a fully human-like AI. The question of the relationship between logical and intuitive decisions in human action extends the discussion into the field of psychology. All in all, the human being remains a complex being that goes beyond purely logical thinking and presents AI developers with a challenging, perhaps unsolvable, task.

PwC study sees cloud gaming on the rise

Cloud gaming is picking up speed, or at least that is what PwC claims in a recent study.


Since the future of the Metaverse lies in the cloud, and its technical framework is almost identical to that of gaming, I find this claim very exciting.
The issue of latency is mission-critical for both.
Unfortunately, PwC draws the wrong conclusion:
network bandwidth is presented as the limiting factor.

❗️Latency has nothing to do with bandwidth❗️

The decisive factor is the network architecture. What is needed is a high-performance edge infrastructure in the networks.

An example: the rendering of elaborate 3D spaces must not take place in a cloud data center 6,000 kilometers away, but in physical proximity to the user.
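The physics behind this argument can be sketched with a back-of-the-envelope calculation. The speed of light in optical fiber and the distances used below are illustrative assumptions, not measured values from the PwC study:

```python
# Light in optical fiber travels at roughly 200,000 km/s (about 2/3 of c),
# so pure propagation delay puts a hard floor under latency that no
# amount of bandwidth can remove.
FIBER_SPEED_KM_PER_S = 200_000

def round_trip_ms(distance_km: float) -> float:
    """Best-case round-trip propagation delay in milliseconds."""
    return 2 * distance_km / FIBER_SPEED_KM_PER_S * 1000

# Distant cloud data center vs. a nearby edge site (illustrative distances):
far = round_trip_ms(6000)  # ~60 ms before a single frame is even rendered
near = round_trip_ms(50)   # ~0.5 ms, leaving the latency budget for rendering
```

Interactive applications are usually considered playable below roughly 50 ms of total round-trip time, which is why the 6,000 km scenario fails regardless of how much bandwidth is available.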

Such infrastructure will be a key locational advantage in the Metaverse in the future.

Secondary markets also benefit from AI and Metaverse

Following Elon Musk's announcement that his AI startup will rely on processors from Nvidia, the shares of the graphics processor manufacturer from Santa Clara, USA, are soaring.

On April 18, 2023, the Nvidia share reached its highest closing price of the past 12 months. Quite a few smaller and larger suppliers in the industry are posting similar gains. This shows that AI and the Metaverse form a multi-billion market, even beyond the generally visible companies such as OpenAI, Microsoft, Meta, etc.

Cloud and communications technology in particular play an extremely important role. Companies that do not have the right partners on board early on and cannot scale in time will be unable to serve the newly emerging markets.
The AI hype and the Metaverse are very closely related. AI will give the Metaverse the momentum it needs to scale and adapt dynamically to new business models.

This is not about the AI-based animation of an avatar in the Metaverse, which then conducts training sessions as a 3D chatbot. No, it is about nothing less than the organic further development of the Metaverse itself. The traditional business of developing virtual spaces will be completely transformed. Instead, we are talking about dynamic platforms that, based on AI, can create requirement-specific virtual worlds at the push of a button in just a few seconds, which can be changed and adapted again just as quickly at any time.

Volumetric streaming

The future of immersion in the metaverse

The Metaverse is meant to provide an immersive and interactive experience that blurs the lines between reality and fiction. But how can the Metaverse be made even more realistic and vivid? One possible answer is volumetric streaming.

Volumetric streaming is a technology that makes it possible to capture, process and transmit three-dimensional objects and people in real time. It captures not only the surfaces, but also the depth and volume of the scene. The result is a holographic representation that can be viewed from any angle. Volumetric streaming opens up new possibilities for displaying content in the metaverse, such as:

  • Live events: You can participate in concerts, sporting events or other events as if you were there. It is possible to change the perspective, move freely or interact with other spectators.
  • Social interaction: This is about meeting other users of the Metaverse as if they were physically present. You can see and hear their facial expressions, gestures and body language. Virtual objects can be shared or manipulated.
  • Education and training: Streamed objects can be used to learn from experts or participate in simulations that present realistic scenarios. For example, you can practice a medical procedure or perform training on a complex machine.

However, volumetric streaming is not without its challenges. High bandwidth and computing power are required to ensure high quality and a good experience. In addition, privacy and copyright issues need to be addressed when it comes to capturing and distributing people and objects. Finally, ethical and social aspects must be considered, such as the impact of volumetric streaming on perceptions of reality and identity.
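The bandwidth challenge can be made concrete with a rough estimate of the raw data rate of an uncompressed volumetric stream. All numbers below (points per frame, bytes per point, frame rate) are illustrative assumptions:

```python
# Rough estimate of the raw bit rate of an uncompressed volumetric
# point-cloud stream. Every value here is an assumption for illustration.
points_per_frame = 1_000_000   # dense capture of a single person
bytes_per_point = 15           # 3 x float32 position + 3 x uint8 color
frames_per_second = 30

raw_bytes_per_second = points_per_frame * bytes_per_point * frames_per_second
raw_gbit_per_second = raw_bytes_per_second * 8 / 1e9
# ~3.6 Gbit/s uncompressed -- far beyond typical consumer connections,
# which is why aggressive compression and edge-side processing are essential.
```

Even with optimistic compression ratios, this is why volumetric streaming pushes both network and GPU infrastructure to their limits.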

Volumetric streaming is an exciting and promising technology that has the potential to revolutionize the metaverse. It offers new opportunities for content creation and consumption that enable immersive and interactive experiences. However, it also requires careful development and regulation to avoid potential risks and abuses. Volumetric streaming is thus not only a technical challenge, but also a cultural and social one.

Links:

https://learn.microsoft.com/en-us/windows/mixed-reality/develop/mixed-reality-cloud-services
https://8i.com/
https://volucap.com/
https://doobmeta.eu/de/startseite/

Empathy in the metaverse in times of ChatGPT

A lot has been written recently about the topic of empathy in the metaverse. Primarily, the question was whether empathy can be experienced or felt in the metaverse. I am firmly convinced that this is possible and that it happens, consciously or unconsciously, when working in the metaverse.

However, this question takes on a new quality with the increasing capabilities of artificial intelligence (AI). Can an AI be empathic, and what impact does this have on virtual encounters in the metaverse? Specifically, the issue arises in a situation where one avatar is controlled by a natural person and its counterpart by an AI. One can approach this issue on two levels: one is a purely neurological approach, the other a more ethical one. This question was already raised by John Wheeler (1) in his considerations. The prerequisite for the existence of empathy is not only the biochemical process; it also depends very much on our understanding of the "I" as human beings.

The question now arises whether the use of AI-controlled avatars in the metaverse creates a completely new situation. Superficially, nothing changes in the basic statement. In the metaverse, however, with the use of photorealistic avatars, further components come into play. Through immersion, i.e. the mental "diving into" the virtual world, and the objectively natural behavior an AI-controlled avatar may display, something like a "mock empathy" can be conveyed. This is also the conclusion of Andrew McStay (2) in his article "It from Bit", published in October 2022, on the moral problem of an AI-controlled avatar: while AI is able to reproduce large parts of empathy, it remains incomplete in significant respects. Aspects such as responsibility, solidarity and community are missing.

In my opinion, these aspects must be taken into account when we think about ChatGPT and similar systems and their use in the metaverse. Basically, this development offers huge opportunities and the potential to create free space for areas where direct human-to-human interaction is necessary. But in the ethical evaluation of the development, we are just at the beginning, and we should conduct this discussion at least as forcefully as we think about new business models with AI.

(1) Wheeler, J.: Information, Physics, Quantum: The Search for Links. Proceedings of the 3rd International Symposium on the Foundations of Quantum Mechanics, Tokyo (1989). https://philpapers.org/archive/WHEIPQ.pdf. Accessed 3 Oct 2022
(2) McStay, A.: Replica in the Metaverse: the moral problem with empathy in 'It from Bit'. AI Ethics (2022). https://doi.org/10.1007/s43681-022-00252-7

Zen meditation in the metaverse as a driver of innovation

What do Zen and meditation have to do with innovation and New Work?

Quite a lot...

Innovation arises from inspiration and creativity. In order to be able to fully develop one's creative power, one must be "at peace" with oneself. It is therefore first necessary to find one's own "brand core" in order to then unfold one's creative potential untainted by foreign influences and constraints. Zen meditation can be a great help here. After all, Zen is all about (re)finding one's core.

The Japanese Kouji Miki founded his Zen School more than 10 years ago after some personal setbacks. There he brings together people from a wide variety of backgrounds (managers, monks, scientists...).

During the lockdown, he migrated his Zen School to the Metaverse and did so with great success. His conclusion is that the willingness to open up is much greater in the Metaverse than in real life.

A statement that Pamela Buchwald and I can confirm from countless events in the Metaverse.

The metaverse and meditation are innovation drivers!

zenschoolVR: True innovation comes from within - NewWorkStories.com

Pain management through virtual reality

In healthcare, a whole range of new use cases are currently emerging around the topic of metaverse and virtual reality. One approach that was scientifically investigated very early on is the area of pain management for chronic pain. One approach is the so-called distraction therapy. Here, VR is used to stimulate specific brain regions. In particular, these are the prefrontal cortex and the somatosensory cortex, which are involved in pain processing. In addition, further studies have found that VR can also stimulate the ventral striatum, the brain's reward center.

A 2019 study by Brennan Spiegel (MD, MSHS, director of Health Services Research at Cedars-Sinai) supports this approach. He compared a group of 61 chronic-pain patients who received VR therapy three times a week with a control group of 59 chronic-pain patients. The VR group experienced reductions in pain levels of up to 25%.

Here is the direct link to the study:

https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0219115

Less bandwidth and better graphics through texture baking

When we talk about cloud-based metaverse applications like T-Systems' Magenta Metaverse, interactivity and immersion are key!

Bandwidth and GPU-powered cloud data centers are technical basics.
But almost more important are the architecture and design of the spaces. Here we can learn a lot from the gaming industry.

👉 One of the most common techniques is the so-called "texture baking".

In simple terms, this means that you transfer the high-resolution textures of a static 3D model to a game-ready model with fewer polygons.
The result is a high-quality but interactive object with less computational and bandwidth overhead.
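The core idea can be sketched in a few lines. A real bake projects detail between two different UV layouts; in this deliberately simplified sketch, both models are assumed to share the same UV layout, so the bake reduces to averaging blocks of the high-resolution texture into a smaller one (all sizes are illustrative):

```python
import numpy as np

def bake_texture(high_res: np.ndarray, target_size: int) -> np.ndarray:
    """Bake a square high-res texture down to target_size x target_size
    by averaging pixel blocks (assumes size is a multiple of target_size)."""
    block = high_res.shape[0] // target_size
    return high_res.reshape(
        target_size, block, target_size, block, -1
    ).mean(axis=(1, 3))

# Illustrative example: a 1024x1024 RGB source baked down to 256x256,
# i.e. 1/16 of the texture memory and bandwidth for the game-ready model.
hi = np.random.rand(1024, 1024, 3).astype(np.float32)
lo = bake_texture(hi, 256)
```

Production tools additionally bake normal and ambient-occlusion maps so that lighting detail from the high-poly model survives on the low-poly mesh, but the memory-saving principle is the same.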

As a first demonstrator, we tried this out with amazing results at our #virtualinnovationcenter! (Above without and below with texture baking) 🚀

It is very important to optimize industry cases to ensure maximum customer experience without specific client hardware.

Empathy in the metaverse through haptic feedback

Empathy in the metaverse is a very important topic when we focus on social interaction and mental health.

There are typically 5 senses that allow us to receive empathy:

  1. Acoustic stimuli
  2. Visual stimuli
  3. Smell stimuli
  4. Taste stimuli
  5. Haptic stimuli

1 and 2 can be transferred to the metaverse without any problems.

👉 Now, California-based startup emerge.io is launching a tabletop device that renders virtual objects and interactions as ultrasonic waves in the air, making them tangible.

Technologies like these will provide groundbreaking support for "haptic stimuli" and open up a broad field of new empathy-driven applications in the metaverse.

#NewWork #MentalHealth #Empathy #DigitalHealth #Metainnovator

About the author

Andreas Droste studied electrical engineering with a focus on digital control engineering / automation technology at the University of Duisburg. 

He began his career at Deutsche Telekom in 1992 in the founding team of DeTeSystem. From 1994 to 1999, he was head of advertising for interactive media and direct marketing in Deutsche Telekom's corporate communications department. In this role, he managed all of the Group's e-commerce activities from 1996. From 1999 to 2002, he was responsible for setting up and managing Deutsche Telekom's newly created "Competence Center Internet." After various consulting functions in the areas of travel, transport & logistics (including the introduction of the Toll Collect system), he joined the T-Systems Innovation Center in 2010. There he was responsible for international innovation activities and selected innovation initiatives with major T-Systems customers.

Andreas Droste works as an independent author, speaker and conceptual consultant in the field of technologically innovative application scenarios in the areas of New Work and Mental Health.