
The James Webb Space Telescope keeps finding galaxies that shouldn’t exist, a scientist has warned.

Six of the earliest and most massive galaxies that NASA’s breakthrough telescope has observed so far appear bigger and more mature than they should be, given how early in the universe’s history they formed, researchers have warned.

The new findings build on previous research where scientists reported that despite coming from the very beginnings of the universe, the galaxies were as mature as our own Milky Way.

OpenAI’s breakthrough consistency model could feed into image understanding to make GPT-4 multimodal, promising next-generation improvements in human-computer interaction, human-robot interaction, and assistive technology for people with disabilities. Microsoft has already released a predecessor to GPT-4 image understanding with Visual ChatGPT, which is much more limited in its abilities.


Researchers at Texas A&M University have discovered a novel circuit element, the meminductor, marking a significant advance in circuit theory.

In an electrical circuit, circuit elements play a crucial role in managing the flow of electricity. The resistor, capacitor, and inductor are the traditional circuit elements, while the memristor and memcapacitor are more recent additions discovered in the past 15 years. These newer components, known as “mem-” versions of the classical elements, have voltage and current characteristics that depend on the history of voltage or current applied over time, giving them memory-like properties.
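The memory-like behavior of a “mem-” element can be illustrated with a toy memristor model (a sketch for intuition only, not the device or model from the Texas A&M study): the element’s resistance depends on the total charge that has flowed through it, so the same applied voltage produces different currents depending on the element’s history.

```python
def memristor_resistance(charge, r_on=100.0, r_off=16000.0, q_max=1e-4):
    """Toy linear-drift model: resistance as a function of accumulated charge.

    The internal state w moves from 0 (high resistance) toward 1 (low
    resistance) as charge flows through the device.
    """
    w = max(0.0, min(1.0, charge / q_max))  # clamp internal state to [0, 1]
    return r_off - (r_off - r_on) * w

# Apply the same 1 V at three moments in the device's history.
# Because charge accumulates, resistance falls and current rises each time:
# the element "remembers" the current that has already passed through it.
charge = 0.0
dt = 1e-6
currents = []
for _ in range(3):
    r = memristor_resistance(charge)
    i = 1.0 / r            # Ohm's law at this instant, V = 1 V
    currents.append(i)
    charge += i * dt       # accumulated charge shifts the internal state

print(currents)            # strictly increasing sequence
```

An ordinary resistor would return the same current all three times; the history dependence shown here is exactly the property that distinguishes the “mem-” family of elements.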

Dr. H. Rusty Harris, an Associate Professor at the Department of Electrical and Computer Engineering at Texas A&M University, has made a significant breakthrough in circuit elements with the discovery of a new component called the meminductor.

West Australian researchers have developed a breakthrough method to measure brain fluid pressure in humans, which may reduce the vision damage experienced by astronauts on long-haul space flights.

A cross-disciplinary team from the Lions Eye Institute and the International Space Centre at The University of Western Australia has developed a clever technique to measure the pressure of the brain fluid. The study was published in the Nature Portfolio journal npj Microgravity.

Cambridge University researchers developed a novel robotic hand that works with minimal finger actuation.

In a significant breakthrough, researchers at the University of Cambridge have designed an energy-efficient robotic hand that can grasp a variety of objects with minimal finger actuation, according to a study published on April 11 in Advanced Intelligent Systems.

By relying on passive wrist movement and tactile sensors embedded in its ‘skin,’ the 3D-printed hand can carry out complex movements, paving the way for low-cost, energy-efficient robots with more natural and adaptable behavior.

In recent years, cancer therapies have often fallen short of expectations, with tumors developing resistance to medication. One such example is alpelisib, a drug approved for use in Switzerland as a treatment for advanced breast cancer.

However, a research group at the Department of Biomedicine of the University of Basel has made a breakthrough in understanding the reasons behind this resistance, publishing their findings in the journal Cell Reports Medicine.

For patients suffering from advanced and metastatic breast cancer, effective treatment options are limited. The PI3K signaling pathway is frequently overactive in breast cancer due to mutations that encourage tumor growth.

The GPT phenomenon and the future of humanity in the face of advances in Artificial Intelligence.

The Age of Artificial Intelligence is an increasingly present reality in our daily lives. With the rise of technologies such as Natural Language Processing (NLP) and Artificial Neural Networks (ANN), the possibility of creating machines capable of performing tasks that were previously exclusive to humans has emerged.

One of these technologies is the Generative Pre-trained Transformer, better known as GPT. It is a Large Language Model (LLM) developed by OpenAI.

The most beautiful Japanese humanoid robots

The world of robots is evolving at an unprecedented rate, and Japanese companies have produced remarkable innovations in humanoid robots. These robots carry a host of new technological upgrades that will prove increasingly important as time passes.


Each project will get up to $600,000 over two years to continue developing the concepts.

NASA’s Innovative Advanced Concepts (NIAC) program has chosen six research teams to receive Phase II funding.

“NASA’s story is one of [the] barriers broken, and technologies transformed to support our missions and benefit all of humanity,” said NASA Administrator Bill Nelson.


(Image: Christopher Morrison/NASA)



With long-term memory, language models could be even more specific – or more personal. MemoryGPT gives a first impression.

Right now, interaction with language models is confined to single sessions, e.g. in ChatGPT to a single chat. Within that chat, the language model can, to some extent, take the context of earlier input into account when generating new text and replies.

In the currently most powerful version of GPT-4, the context window spans up to 32,000 tokens, roughly 50 pages of text. This makes it possible, for example, to chat about the contents of a long paper, or for developers to query a larger code base for new solutions. The context window is an important building block for the practical use of large language models, an innovation made possible by Transformer networks.
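The practical consequence of a fixed context window is that older conversation history must be dropped once the budget is exceeded. The sketch below illustrates the idea with a crude word-count stand-in for a real tokenizer (actual models use subword tokenizers such as BPE, and the function names here are hypothetical):

```python
def count_tokens(text):
    # Crude approximation: real tokenizers (e.g. BPE) split text into
    # subword units, not whitespace-separated words.
    return len(text.split())

def trim_to_context_window(messages, max_tokens=32000):
    """Keep the most recent messages whose combined size fits the window."""
    kept, total = [], 0
    for msg in reversed(messages):       # walk history newest-first
        t = count_tokens(msg)
        if total + t > max_tokens:
            break                        # everything older is dropped
        kept.append(msg)
        total += t
    return list(reversed(kept))          # restore chronological order

history = [
    "hello there",
    "tell me about transformers",
    "transformers use attention",
]
print(trim_to_context_window(history, max_tokens=7))
```

This truncation is precisely what a long-term memory layer such as the one MemoryGPT demonstrates tries to work around: instead of discarding old messages, relevant pieces are stored and retrieved later.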