Martin Luther answered questions live on Reformation Day

Luther speaks from the pulpit

On October 31, 2023, the YouTube channel EKiRInternet premiered a photorealistic, AI-controlled, three-dimensional Luther avatar. Using modern AI algorithms, a painting of the reformer was converted into a photorealistic representation, and the avatar answered questions as if Martin Luther were speaking today.¹ The result is an avatar that behaves like a real person and can interact in virtual space.

These interactions took place on the XRHuman platform in the metaverse and were broadcast live on YouTube.¹ Using ChatGPT technology, the Luther avatar was able to answer questions from the audience in real time; the AI was programmed to answer as Martin Luther might have.¹
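The technical details of the project have not been published, but the basic pattern of a persona-constrained chatbot on top of a large-language-model API can be sketched in a few lines. The following Python snippet is purely illustrative; the model name, prompt wording and function are my own assumptions, not details of the EKiR/XRHuman implementation:

```python
# Illustrative sketch of a persona-constrained chatbot (NOT the actual
# EKiR/XRHuman implementation, which has not been published).
# Requires the openai package and a configured API key.
from openai import OpenAI

client = OpenAI()

SYSTEM_PROMPT = (
    "You are Martin Luther, the 16th-century reformer. Answer questions "
    "from a present-day audience in his voice and from his theological "
    "perspective, briefly and understandably."
)

def ask_luther(question: str) -> str:
    """Send one audience question and return the avatar's answer."""
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # assumed model; any chat-capable model works
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": question},
        ],
    )
    return response.choices[0].message.content

print(ask_luther("What would you nail to a church door today?"))
```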

Ralf Peter Reimann (Internet Officer of the Protestant Church in the Rhineland) and I initiated and implemented the cooperation project with the metaverse platform XRHuman.

We see great potential in making historical figures accessible to a broad target group through the use of AI and providing new impetus, including for the church.

Up to 150 people took part in the live chat and over 100 questions were answered.

Sources:
(1) Ask Martin Luther your questions on Reformation Day - presse.ekir.de.
https://presse.ekir.de/presse/D89F4DDA59924D37B70A56665710E09C/stell-martin-luther-am-reformationstag-deine-fragen.
(2) Live video of the event with Luther on YouTube.
https://www.youtube.com/live/uBwCHNYvgRY?si=fO_aXVeSNdpPzC4R

Trauma therapy with AI support: opportunities and challenges

Traumatic experiences can lead to serious psychological consequences, such as post-traumatic stress disorder (PTSD). The treatment of trauma disorders requires individual and professional support by psychotherapists. But how can artificial intelligence (AI) support or complement trauma therapy? What are the benefits and risks involved?

AI in the diagnosis and prevention of trauma sequelae

One possible field of application for artificial intelligence is supporting the diagnosis of mental illnesses. AI-based models drawing on various parameters could indicate the direction in which more in-depth diagnostics might be useful and thus facilitate diagnosis, for example by analyzing speech patterns, facial expressions, gestures or physiological data.

AI could also help identify traumatized individuals at an early stage and offer preventive measures. For example, AI-supported apps or chatbots could provide those affected with information, tips or exercises for coping with trauma symptoms. Such digital interventions could offer low-threshold, anonymous access to psychological help.

AI in the therapy of trauma sequelae

AI could also be used in the therapy of trauma sequelae itself, for example as a complement or alternative to conventional psychotherapy. Several methods are conceivable, such as:

- Virtual Reality (VR): VR makes it possible to recreate traumatic situations in a controlled and safe environment to provide exposure therapy. The VR environment could be adapted by AI to the individual needs and reactions of the patient.

- Avatar therapy: Avatar therapy is a form of conversational psychotherapy in which patients interact with a virtual counterpart controlled by AI. This counterpart could, for example, represent a person connected with the traumatic experience, with whom the patient can enter into a dialogue in order to process what happened.

- AI-based software: AI-based software could support therapy for trauma sequelae, for example, by providing personalized feedback, recommendations, or reminders. It could also facilitate documentation and evaluation of therapy.

Ethical issues and challenges

However, the use of AI in trauma therapy also raises ethical issues and challenges that need to be considered. Some of these are:

- Data protection and security: Processing sensitive data about traumatic experiences requires a high level of protection against misuse or unauthorized access. Both technical and legal measures must be taken to protect the privacy and autonomy of patients.

- Quality and effectiveness: The quality and effectiveness of AI-based interventions must be scientifically tested and evaluated before they can be applied in practice. This must also take into account possible side effects or harm that could result from faulty or inappropriate AI.

- Trust and relationship: The relationship between patient and therapist is an essential factor in the success of trauma therapy. Trust, empathy and respect play an important role. How can such a relationship be established and maintained with an AI? How can an AI complement human interaction without replacing or endangering it?

Conclusion

AI offers many opportunities to enhance or expand trauma therapy. However, the ethical aspects and challenges associated with the use of AI in this sensitive area must also be considered. Interdisciplinary collaboration and critical discourse are therefore needed to explore and responsibly shape the opportunities and risks of AI in trauma therapy.

Quantum networks create more immersion

Quantum physics becomes the key to a truly immersive metaverse

This is a bold thesis that many are currently putting forward, but it may be relevant in a completely different way than assumed.

The superficial argument in this context is that quantum computing, by virtue of its sheer computing power, will bring a gigantic push towards realism and thus immersion.

However, I personally (and some studies support this) believe that after a certain point, more realism does not necessarily bring more immersion.

Especially in collaborative applications, another factor plays a much more important role:

👉 It is latency

In my post yesterday I mentioned that immersion essentially takes place in the intuitive part of our brain, which works almost in real time. Our conscious mind, on the other hand, needs up to 300 milliseconds before it can trigger a movement or other interaction.

If we now picture this in a classic collaboration application, whether in the metaverse or in 2D, network latencies of 0.5 to several seconds are added on top of these 300 milliseconds.
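A rough budget calculation illustrates the point; the 300 ms and 0.5 s are the figures from above, while the rendering share is an assumed value:

```python
# Rough latency budget for one interaction in a classic cloud collaboration
# setup. 300 ms and 500 ms come from the post; the rendering share is assumed.

conscious_reaction_ms = 300   # time until the conscious mind triggers an action
network_round_trip_ms = 500   # 0.5 s (and often more) in classic setups
rendering_ms = 30             # assumed client-side rendering/display share

total_ms = conscious_reaction_ms + network_round_trip_ms + rendering_ms
print(f"Perceived response time: ~{total_ms} ms")  # ~830 ms, far from real time
```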

👉 Precisely this problem can be solved by quantum networks:

Here, the states of photons are "entangled". Interestingly, this coupling persists even over large distances: if the state of one photon changes, the state of the other changes in parallel, effectively in real time. This is also referred to as quantum teleportation.

Such a network has already been tested, for example, between Shanghai and Beijing over a distance of more than 4000 km.

If such networks were built globally, avatars in Munich, for example, could really communicate in real time with avatars in Sydney and Los Angeles and come very close to 100% immersion.

Metaworking reduces greenhouse gases

The metaverse can make an important contribution to reducing global warming.

This is confirmed by a Cornell University study.

👉 In the USA alone, the targeted use of metaverse technologies could reduce greenhouse gas emissions by more than 10 gigatons by 2050

As glad as we are that in-person meetings, events and business trips are possible again after the pandemic, we should also have learned that many of them are dispensable and that the metaverse has good alternatives in store. A good mix of both worlds already helps here.

Collaboration in the metaverse as a hybrid approach is just one example. Such meetings can be arranged more spontaneously than physical meetings and still create a "virtual" proximity.

Spatial partitioning is not the solution for scaling the metaverse


Current metaverse platforms are very limited in terms of the number of concurrent users that can interact in a space.

Depending on the platform, only 40-100 avatars can interact in one space!

One technological approach intended to break through this limitation is so-called spatial partitioning.
Simply put, the three-dimensional world is broken down into 2D tiles, which are served per user by synchronized cloud servers and then reassembled into a 3D world. In theory, such a platform can be scaled indefinitely.
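To make the idea concrete, here is a minimal, purely illustrative Python sketch of grid-based partitioning with simple neighbour-based interest management; the tile size, names and single-grid layout are assumptions for illustration, not how any specific platform works:

```python
# Illustrative grid-based spatial partitioning: the world is cut into square
# tiles, each tile is owned by one server instance, and a client only
# subscribes to the tiles around its own avatar. All values are assumptions.
from collections import defaultdict

TILE_SIZE = 50.0  # metres per tile edge (assumed)

def tile_of(x: float, z: float) -> tuple[int, int]:
    """Map a world position to its tile coordinate."""
    return (int(x // TILE_SIZE), int(z // TILE_SIZE))

def tiles_of_interest(x: float, z: float, radius: int = 1) -> set[tuple[int, int]]:
    """Tiles a client must subscribe to: its own tile plus its neighbours."""
    tx, tz = tile_of(x, z)
    return {(tx + dx, tz + dz)
            for dx in range(-radius, radius + 1)
            for dz in range(-radius, radius + 1)}

# Distribute avatars to tiles (each tile would be handled by one server)
avatars = {"alice": (12.0, 48.0), "bob": (260.0, 40.0), "carol": (55.0, 51.0)}
tile_population = defaultdict(list)
for name, (x, z) in avatars.items():
    tile_population[tile_of(x, z)].append(name)

print(dict(tile_population))          # which server owns which avatars
print(tiles_of_interest(12.0, 48.0))  # tiles alice's client needs updates from
```

The partitioning itself is the easy part; keeping the servers that own adjacent tiles consistent with each other is where it gets hard, which is exactly the synchronization problem described below.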

In the meantime, however, it has become apparent that the problem lies precisely in the synchronization of these servers (see the recent article from MetaGravity).

But exactly this scalability is necessary if we are to speak of a true metaverse in the future rather than merely 100*X cloned avatar islands. This does not only apply to games in the metaverse; for events such as trade fairs and conferences, too, an unlimited number of concurrent users is absolutely necessary.

It remains to be seen what technological developments there will be here in the near future, and who the drivers will be.

New Work in the metaverse

In stores from June 8

The release date is set!

Based on concrete use cases, the book presents the relevance of the metaverse as a technological driver for New Work. It is intended as an inspiration for taking the first practical steps in the metaverse today. The technology is there. What is needed now are creative ideas and the courage to start something new.


The book will be available for order at all major booksellers starting Thursday.
It will be available as a softcover and eBook:

ISBN Softcover: 978-3-347-81797-5

ISBN e-book: 978-3-347-81806-4

Further information and multimedia background material is available here:

https://newworkmeta.drostenet.de

PwC study sees cloud gaming on the rise

Cloud gaming is picking up speed; at least that is what PwC claims in a recent study.


Since the future of the Metaverse is in the cloud, and the technical framework is almost identical to that of gaming, I think this statement is very exciting.
The issue of latency is mission-critical for both.
Unfortunately, PwC comes to the wrong conclusion:
There, network bandwidth is presented as the limiting factor.

❗️Latency has nothing to do with bandwidth❗️

The decisive factor is the network architecture. What is needed is a high-performance edge infrastructure in the networks.

An example: the rendering of elaborate spaces must take place not in a cloud data center 6,000 kilometers away, but in physical proximity to the user.
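A back-of-the-envelope calculation shows why distance, not bandwidth, dominates here; light in fibre travels at roughly 200 km per millisecond, and routing and processing, which add further delay, are ignored:

```python
# Pure light-propagation delay in optical fibre, ignoring routing, queuing
# and processing, which add further delay in practice.

FIBER_KM_PER_MS = 200.0  # roughly 2/3 of the speed of light in vacuum

def round_trip_ms(distance_km: float) -> float:
    """Round-trip propagation time for a given one-way distance."""
    return 2 * distance_km / FIBER_KM_PER_MS

print(round_trip_ms(6000))  # ~60 ms just for the glass, before any processing
print(round_trip_ms(50))    # ~0.5 ms from a nearby edge data centre
```

No amount of extra bandwidth removes those 60 ms; only moving the rendering closer to the user does.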

Such infrastructure will be a key locational advantage in the Metaverse in the future.

Empathy in the metaverse in times of ChatGPT

A lot has been written recently about empathy in the metaverse. The primary question was whether empathy can be experienced or felt in the metaverse at all. I am firmly convinced that this is possible and that it happens, consciously or unconsciously, when working in the metaverse.

However, this question takes on a new quality with the increasing capabilities of artificial intelligence (AI). Can an AI be empathic, and what impact does this have on virtual encounters in the metaverse? Specifically, the issue arises in situations where one avatar is controlled by a natural person and the other by an AI. One can approach this question on two levels: one is purely neurological, the other more ethical. John Wheeler (1) already touched on this question in his reflections. The prerequisite for the existence of empathy is not only the biochemical process; it also depends very much on our understanding of ourselves as human beings.

The question now arises whether the use of AI-controlled avatars in the metaverse creates a completely new situation. Superficially, nothing changes in this basic statement. In the metaverse, however, with its photorealistic avatars, further components come into play. Through immersion, i.e. mentally "diving into" the virtual world, and the behavior of an AI-controlled avatar, which may appear perfectly natural, something like a "mock empathy" can be conveyed. This is also the conclusion Andrew McStay (2) reaches in his article on the moral problem with empathy, published in October 2022 ("It from Bit"). His conclusion is that while AI is able to reproduce large parts of empathy, it remains incomplete in significant respects: aspects such as responsibility, solidarity and community are missing.

In my opinion, these aspects must be taken into account when we think about ChatGPT and similar systems and their use in the metaverse. Basically, this development offers huge opportunities and the potential to free up space for areas in which direct human-to-human interaction is essential. But in the ethical evaluation of this development we are just at the beginning, and we should conduct this discussion at least as vigorously as we think about new business models with AI.

(1) Wheeler, J.: Information, Physics, Quantum: The Search for Links. Proceedings of the 3rd International Symposium on the Foundations of Quantum Mechanics, Tokyo (1989). https://philpapers.org/archive/WHEIPQ.pdf. Accessed 3 Oct 2022
(2) McStay, A.: Replica in the Metaverse: the moral problem with empathy in 'It from Bit'. AI Ethics (2022). https://doi.org/10.1007/s43681-022-00252-7

Less bandwidth and better graphics through texture baking

When we talk about cloud-based metaverse applications like T-Systems' Magenta Metaverse, interactivity and immersion are key!

Bandwidth and GPU-powered cloud data centers are technical basics.
But almost more important are the architecture and design of the spaces. Here we can learn a lot from the gaming industry.

👉 One of the most common techniques is the so-called "texture baking".

In simple terms, this means that you transfer the high-resolution textures of a static 3D model to a game-ready model with fewer polygons.
The result is a high-quality but interactive object with less computational and bandwidth overhead.
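As a purely illustrative sketch (not the exact pipeline used for our demonstrator), a selected-to-active diffuse bake can be scripted in Blender's Python API roughly like this; the object names, resolution and bake parameters are placeholder assumptions:

```python
# Hedged sketch of a selected-to-active diffuse bake in Blender (bpy, Cycles).
# Assumes the low-poly object already has UVs and a node-based material.
import bpy

high = bpy.data.objects["HighPolyModel"]   # detailed source object (placeholder)
low = bpy.data.objects["GameReadyModel"]   # low-poly target object (placeholder)

# Create the bake target image and make it the active image node in the
# low-poly material, which is where Cycles writes the baked result.
img = bpy.data.images.new("baked_diffuse", width=2048, height=2048)
mat = low.active_material
mat.use_nodes = True
node = mat.node_tree.nodes.new("ShaderNodeTexImage")
node.image = img
mat.node_tree.nodes.active = node

# Source selected, target active: bake detail from high-poly onto low-poly.
bpy.ops.object.select_all(action='DESELECT')
high.select_set(True)
low.select_set(True)
bpy.context.view_layer.objects.active = low

bpy.context.scene.render.engine = 'CYCLES'
bpy.ops.object.bake(
    type='DIFFUSE',
    pass_filter={'COLOR'},        # colour only, no baked-in lighting
    use_selected_to_active=True,
    cage_extrusion=0.02,
    margin=4,
)

img.filepath_raw = "//baked_diffuse.png"
img.file_format = 'PNG'
img.save()
```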

As a first demonstrator, we tried this out with amazing results at our #virtualinnovationcenter! (Above without and below with texture baking) 🚀

It is very important to optimize industry use cases in this way to ensure the best possible customer experience without requiring special client hardware.

Empathy in the metaverse through haptic feedback

Empathy in the metaverse is a very important topic when we focus on social interaction and mental health.

There are typically five senses through which empathy can reach us:

  1. Acoustic stimuli
  2. Visual stimuli
  3. Smell stimuli
  4. Taste stimuli
  5. Haptic stimuli

Stimuli 1 and 2 can be transferred to the metaverse without any problems.

👉 Now the California-based startup emerge.io is launching a tabletop device that uses ultrasonic waves in the air to make virtual objects and interactions tangible.

Technologies like these will provide groundbreaking support for "haptic stimuli" and open up a broad field of new empathy-driven applications in the metaverse.

#NewWork #MentalHealth #Empathy #DigitalHealth #Metainnovator