
The sense of touch may soon be added to the virtual gaming experience, thanks to an ultrathin wireless patch that sticks to the palm of the hand. The patch simulates tactile sensations by delivering electronic stimuli to different parts of the hand in a way that is individualized to each person’s skin.

Developed by researchers at City University of Hong Kong (CityU) with collaborators and described in the journal Nature Machine Intelligence (“Encoding of tactile information in hand via skin-integrated wireless haptic interface”), the patch has implications beyond virtual gaming, as it could also be used for robotics surgery and in prosthetic sensing and control.

‘Haptic’ gloves that simulate the sense of touch already exist, but they are bulky and wired, hindering the immersive experience in virtual and augmented reality settings. To improve the experience, researchers led by CityU biomedical engineer Yu Xinge developed an advanced wireless haptic interface system called ‘WeTac’.

The most famous Star Trek-inspired technology is the cell phone itself: Captain Kirk’s communicator inspired the folks at Motorola to make the first handheld mobile device in 1973. Star Trek: The Original Series (popularly called TOS) from the 1960s also inspired video conferencing. But things really started to amp up when Star Trek: The Next Generation (aka TNG) premiered in 1987, with Sir Patrick Stewart in the lead. It became one of the most syndicated shows on television, which is how I discovered it in mid-90s India on the Star network. It fundamentally impacted my life, inspiring me to become the technology writer I am today.

But beyond my own story, the show heralded technological concepts that are becoming increasingly real. The LCARS computer on the Galaxy-class USS Enterprise-D is essentially the foundation of what Google is today. Google’s former head of search, Amit Singhal, often said that the company is “trying to build the Star Trek computer”.

Haptic holography promises to bring virtual reality to life, but a new study reveals a surprising physical obstacle that will need to be overcome.

A research team at UC Santa Barbara has discovered a new phenomenon that underlies emerging holographic haptic displays, and could lead to the creation of more compelling virtual reality experiences. The team’s findings are published in the journal Science Advances.

Holographic haptic displays use phased arrays of ultrasound emitters to focus ultrasound in the air, allowing users to touch, feel and manipulate three-dimensional virtual objects in mid-air using their bare hands, without the need for a physical device or interface. While these displays hold great promise for use in various application areas, including augmented reality, virtual reality and telepresence, the tactile sensations they currently provide are diffuse and faint, feeling like a “breeze” or “puff of air.”
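The focusing trick behind these displays can be sketched in a few lines: each emitter in the phased array is driven with a phase offset that cancels the propagation phase to a chosen point in the air, so all the waves arrive in phase there and interfere constructively. The following is an illustrative sketch only, not the hardware's actual implementation; the array geometry and 40 kHz frequency are assumptions typical of airborne ultrasound.

```python
import numpy as np

C_AIR = 343.0      # speed of sound in air, m/s
FREQ = 40_000.0    # typical airborne-ultrasound frequency, Hz (assumed)
k = 2 * np.pi * FREQ / C_AIR  # wavenumber, rad/m

def focus_phases(emitters, focal_point):
    """Emission phase (radians) per emitter so that all waves
    interfere constructively at `focal_point`."""
    d = np.linalg.norm(np.asarray(emitters, float) - np.asarray(focal_point, float), axis=1)
    # Each wave accumulates phase k*d in flight; emit with -k*d to cancel it.
    return (-k * d) % (2 * np.pi)

# 8x8 emitter grid with 10 mm pitch, focal point 20 cm above the centre
xs = np.arange(8) * 0.01 - 0.035
grid = np.array([(x, y, 0.0) for x in xs for y in xs])
phases = focus_phases(grid, (0.0, 0.0, 0.20))
```

Steering the focal point over time is then just a matter of recomputing the phases, which is how such displays trace shapes on the skin.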

Meta’s AR glasses could be launched in 2027.

Mark Zuckerberg’s Meta Platforms is doubling down on its virtual reality (VR) products and plans to fold in augmented reality (AR) experiences, as it looks to define its position in the technology industry in the years ahead. Thousands of employees of Meta’s Reality Labs division were recently presented with a roadmap for the company’s products, which was then shared with The Verge.


VR, AR, and neural interfaces

Although Zuckerberg has spoken mainly of the metaverse the company would build as the future of the internet, Meta now seems to have taken its foot off the pedal on building the metaverse itself, focusing instead on improving the tools.

Coming out later this year is the Meta Quest 3, the flagship product from the company. It is expected to be twice as powerful but half the thickness of its predecessor—the Quest 2. Meta has sold more than 20 million Quest headsets so far, so the Quest 3 sales will be a benchmark to determine if customers are interested in these products.

Priced at $400, Quest 3 will also feature front-facing cameras that will make it less immersive than its predecessors but add the ability to deliver mixed reality experiences to users. Meta is hopeful that this will prompt users to keep the headsets on for longer and plans to ship 41 new apps and games with this headset.

Sensing a hug from each other via the internet may be a possibility in the near future. A research team led by City University of Hong Kong (CityU) recently developed a wireless, soft e-skin that can both detect and deliver the sense of touch, and form a touch network allowing one-to-multiuser interaction. It offers great potential for enhancing the immersion of distance touch communication.

“With the rapid development of virtual and augmented reality (VR and AR), our visual and auditory senses are not sufficient for us to create an immersive experience. Touch communication could be a revolution for us to interact throughout the metaverse,” said Dr. Yu Xinge, Associate Professor in the Department of Biomedical Engineering (BME) at CityU.

While there are numerous haptic interfaces on the market that simulate touch in the virtual world, they provide either sensing or haptic feedback alone. The uniqueness of the novel e-skin is that it can perform both self-sensing and haptic reproducing functions on the same interface.

Are you ready to explore the future and imagine the incredible technological advancements that await us in the year 2090? From Neural Interfaces to Hypersonic Vactrains, Photosynthetic Humans, Fully Immersive Virtual Realities, Self-Healing Materials, and Genetic Enhancement, our world will look nothing like it does today.

Imagine controlling devices with your thoughts through Neural Interfaces, traveling from New York to Los Angeles in just 30 minutes with Hypersonic Vactrains, fueling your body with the sun with Photosynthetic Humans, immersing yourself in fully realistic virtual environments, repairing materials instantly with Self-Healing Materials, and enhancing your genetics for better physical and mental capabilities.
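Taking that 30-minute figure at face value, a quick back-of-the-envelope check shows the average speed such a vactrain would need. The roughly 3,940 km great-circle distance between New York and Los Angeles is an assumption, not a figure from the text.

```python
# Average speed required for a 30-minute New York to Los Angeles trip.
DISTANCE_KM = 3940    # assumed great-circle distance, km
TRIP_HOURS = 0.5      # 30 minutes

avg_speed_kmh = DISTANCE_KM / TRIP_HOURS   # average speed required, km/h
mach = avg_speed_kmh / 1225                # vs. sea-level speed of sound, ~1,225 km/h
```

That works out to nearly 7,900 km/h, well over Mach 6, which is why vactrain concepts run in near-vacuum tubes to sidestep air resistance.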

The future is truly exciting and we can’t wait to explore it with you! So hit that like and subscribe button and don’t forget to hit the bell icon to stay updated with our latest videos.

Let’s dive into the future together!

Thanks for stopping by at Future Tech Enthusiast!

Check out our website at: https://futuretechenthusiast.com/

Top 10 upcoming future technologies | trending technologies | 10 upcoming tech.

Future technologies are currently developing at an accelerated rate. Future technology ideas are being turned into reality at a very fast pace.

These innovative technologies will address global challenges and at the same time make life on this planet simpler. Let’s get started and have a look at the top emerging technologies of the future.


Chapters.
00:00 ✅ Intro.
00:23 ✅ 10. Genomics: Device to improve your health.
01:13 ✅ 09. New Energy Solutions for the benefit of our environment.
01:53 ✅ 08. Robotic Process Automation: Technology that automates jobs.
02:43 ✅ 07. Edge Computing to tackle limitations of cloud computing.
03:39 ✅ 06. Quantum Computing: Helping to stop the spread of diseases.
04:31 ✅ 05. Augmented reality and virtual reality: Now being employed for training.
05:05 ✅ 04. Blockchain: Delivers valuable security.
05:50 ✅ 03. Internet of things: So many things can connect to the internet and to one another.
06:40 ✅ 02. Cyber Security to improve security.
07:24 ✅ 01. 3D Printing: Used to create prototypes.

Here at Tech Buzzer, we ensure that you are continuously in touch with the latest updates and aware of the foundations of the tech industry. Thank you for being with us. Please subscribe to our channel and enjoy the ride.

If you haven’t developed a coping mechanism for the deeply human and heart-shattering experiences of grief and loss, the metaverse has something for you.

According to recent claims by the founder of Somnium Space, a top metaverse company, the launch of ChatGPT has accelerated the realization of one of his most ambitious and eccentric projects.

“Honestly, it is progressing at a much faster pace than everyone’s expectations.”



“The AI is growing at an extremely fast pace,” Artur Sychov, the CEO of Somnium Space, whose organization at present is working to develop a “Live Forever” mode for robot avatars in its “virtual reality world,” told Motherboard.

John Carmack — Doom creator, father of virtual reality, and premier disgruntled Meta employee — believes humanity is on the cusp of Artificial General Intelligence (AGI).

“I think that, almost certainly, the tools that we’ve got from deep learning in this last decade,” the famed programmer told Dallas Innovates, “we’ll be able to ride those to artificial general intelligence.”

A 3D mesh is a three-dimensional object representation made of different vertices and polygons. These representations can be very useful for numerous technological applications, including computer vision, virtual reality (VR) and augmented reality (AR) systems.
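The vertices-plus-faces layout described above can be sketched in a few lines: a shared vertex list, with each face storing indices into it, which is the structure most mesh formats and libraries use. This is a minimal illustrative example, not the Wi-Mesh system's representation.

```python
def _tri_area(a, b, c):
    """Area of triangle abc: half the magnitude of the cross product of two edges."""
    u = [b[i] - a[i] for i in range(3)]
    v = [c[i] - a[i] for i in range(3)]
    cx = u[1] * v[2] - u[2] * v[1]
    cy = u[2] * v[0] - u[0] * v[2]
    cz = u[0] * v[1] - u[1] * v[0]
    return 0.5 * (cx * cx + cy * cy + cz * cz) ** 0.5

class TriMesh:
    def __init__(self, vertices, faces):
        self.vertices = vertices  # [(x, y, z), ...]
        self.faces = faces        # [(i, j, k) indices into vertices, ...]

    def surface_area(self):
        # Sum the areas of all triangular faces
        return sum(_tri_area(*(self.vertices[i] for i in f)) for f in self.faces)

# Unit square in the z=0 plane built from two triangles sharing an edge
mesh = TriMesh(
    vertices=[(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)],
    faces=[(0, 1, 2), (0, 2, 3)],
)
```

Because faces share vertices by index, deforming a human mesh (as Wi-Mesh does) only requires updating vertex positions; the face topology stays fixed.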

Researchers at Florida State University and Rutgers University have recently developed Wi-Mesh, a system that can create reliable 3D human meshes, representations of humans that can then be used by different computational models and applications. Their system was presented at the Twentieth ACM Conference on Embedded Networked Sensor Systems (ACM SenSys ‘22), a conference focusing on computer science research.

“Our research group specializes in cutting-edge Wi-Fi sensing research,” Professor Jie Yang at Florida State University, one of the researchers who carried out the study, told Tech Xplore. “In previous work, we have developed systems that use Wi-Fi signals to sense a range of human activities and objects, including large-scale human body movements, small-scale finger movements, sleep monitoring, and daily objects. Our E-eyes and WiFinger systems were among the first to use Wi-Fi sensing to classify various types of daily activities and finger gestures, with a focus on predefined activities using a trained model.”