Archive for the ‘supercomputing’ category: Page 66

Apr 16, 2019

Optimizing network software to advance scientific discovery

Posted in categories: mathematics, particle physics, supercomputing

High-performance computing (HPC)—the use of supercomputers and parallel processing techniques to solve large computational problems—is of great use in the scientific community. For example, scientists at the U.S. Department of Energy’s (DOE) Brookhaven National Laboratory rely on HPC to analyze the data they collect at the large-scale experimental facilities on site and to model complex processes that would be too expensive or impossible to demonstrate experimentally.

Modern science applications often require a combination of aggregated computing power, high-speed networks for data transfer, large amounts of memory, and high-capacity storage capabilities. Advances in HPC hardware and software are needed to meet these requirements. Computer and computational scientists and mathematicians in Brookhaven Lab’s Computational Science Initiative (CSI) are collaborating with physicists, biologists, and other domain scientists to understand their data analysis needs and provide solutions to accelerate the scientific discovery process.

Read more

Apr 15, 2019

Even more frightening than military AI: an AI President of the Republic?

Posted in categories: government, military, robotics/AI, supercomputing

A recent survey by the IE University in Madrid reveals that one in four Europeans would be ready to put an artificial intelligence in power. Should we be concerned for democracy or, on the contrary, welcome Europeans’ confidence in technology?

Europeans ready to elect an AI?

According to the study in question, about one in four of the 25,000 Europeans surveyed would be prepared to be governed by an AI. It is worth noting that there are significant variations between countries: while the European average is around 30%, respondents in the Netherlands are much more open to having a government run by a supercomputer (43%) than those in France (25%). “The idea of a pragmatic machine, impervious to fraud and corruption” is one of the reasons that seems most compelling to the interviewees. Added to this are the possibilities that machine learning would enable: the AI described would be able to improve by studying and selecting the best political decisions in the world… It would then be able to make better decisions than existing politicians.

Continue reading “Even more frightening than military AI: an AI President of the Republic?” »

Apr 10, 2019

Human Brain/Cloud Interface

Posted in categories: biotech/medical, education, internet, nanotechnology, Ray Kurzweil, robotics/AI, supercomputing

The Internet comprises a decentralized global system that serves humanity’s collective effort to generate, process, and store data, most of which is handled by the rapidly expanding cloud. A stable, secure, real-time system may allow for interfacing the cloud with the human brain. One promising strategy for enabling such a system, denoted here as a “human brain/cloud interface” (“B/CI”), would be based on technologies referred to here as “neuralnanorobotics.” Future neuralnanorobotics technologies are anticipated to facilitate accurate diagnoses and eventual cures for the ∼400 conditions that affect the human brain. Neuralnanorobotics may also enable a B/CI with controlled connectivity between neural activity and external data storage and processing, via the direct monitoring of the brain’s ∼86 × 10⁹ neurons and ∼2 × 10¹⁴ synapses. Subsequent to navigating the human vasculature, three species of neuralnanorobots (endoneurobots, gliabots, and synaptobots) could traverse the blood–brain barrier (BBB), enter the brain parenchyma, ingress into individual human brain cells, and autoposition themselves at the axon initial segments of neurons (endoneurobots), within glial cells (gliabots), and in intimate proximity to synapses (synaptobots). They would then wirelessly transmit up to ∼6 × 10¹⁶ bits per second of synaptically processed and encoded human–brain electrical information via auxiliary nanorobotic fiber optics (30 cm³) with the capacity to handle up to 10¹⁸ bits/sec and provide rapid data transfer to a cloud-based supercomputer for real-time brain-state monitoring and data extraction. A neuralnanorobotically enabled human B/CI might serve as a personalized conduit, allowing persons to obtain direct, instantaneous access to virtually any facet of cumulative human knowledge. Other anticipated applications include myriad opportunities to improve education, intelligence, entertainment, traveling, and other interactive experiences. A specialized application might be the capacity to engage in fully immersive experiential/sensory experiences, including what is referred to here as “transparent shadowing” (TS). Through TS, individuals might experience episodic segments of the lives of other willing participants (locally or remote) to, hopefully, encourage and inspire improved understanding and tolerance among all members of the human family.

“We’ll have nanobots that… connect our neocortex to a synthetic neocortex in the cloud… Our thinking will be a… biological and non-biological hybrid.”

— Ray Kurzweil, TED 2014
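
As a rough sanity check, the bandwidth figures quoted in the abstract imply an average of only a few hundred bits per second per synapse. The short sketch below is back-of-the-envelope arithmetic using only the numbers above, not a calculation from the paper itself.

```python
# Back-of-the-envelope check of the bandwidth figures quoted in the abstract.
# All inputs come from the text above; the per-synapse and per-neuron rates
# are simple averages, not results from the paper.

neurons   = 86e9    # ~86 x 10^9 neurons
synapses  = 2e14    # ~2 x 10^14 synapses
brain_bps = 6e16    # ~6 x 10^16 bits/s of synaptically encoded information
link_bps  = 1e18    # ~10^18 bits/s capacity of the proposed fiber-optic link

print(f"average per synapse: {brain_bps / synapses:,.0f} bits/s")   # ~300 bits/s
print(f"average per neuron:  {brain_bps / neurons:,.0f} bits/s")    # ~700,000 bits/s
print(f"link headroom:       {link_bps / brain_bps:.1f}x")          # ~16.7x
```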

Continue reading “Human Brain/Cloud Interface” »

Apr 5, 2019

Getting a big look at tiny particles

Posted in categories: biotech/medical, nuclear energy, quantum physics, supercomputing

At the turn of the 20th century, scientists discovered that atoms were composed of smaller particles. They found that inside each atom, negatively charged electrons orbit a nucleus made of positively charged protons and neutral particles called neutrons. This discovery led to research into atomic nuclei and subatomic particles.

An understanding of these particles’ structures provides crucial insights about the forces that hold matter together and enables researchers to apply this knowledge to other scientific problems. Although electrons have been relatively straightforward to study, protons and neutrons have proved more challenging. Protons are used in medical treatments, scattering experiments, and fusion energy, but nuclear scientists have struggled to precisely measure their underlying structure—until now.

In a recent paper, a team led by Constantia Alexandrou at the University of Cyprus modeled the location of one of the subatomic particles inside a proton, using only the basic theory of the strong interactions that hold matter together rather than assuming these particles would act as they had in experiments. The researchers employed the 27-petaflop Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility (OLCF) and a method called lattice quantum chromodynamics (QCD). The combination allowed them to map these particles on a grid and calculate their interactions with high accuracy and precision.
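
Lattice QCD itself is far too heavy to show here, but its core idea, replacing continuous spacetime with a discrete grid so that field quantities can be computed numerically, can be illustrated with a toy example. The sketch below places a simple scalar field on a small 4D lattice and evaluates a finite-difference kinetic term; it is only an illustration of the “map on a grid” idea, not the QCD calculation that was run on Titan.

```python
import numpy as np

# Toy illustration of the "fields on a grid" idea behind lattice QCD.
# A real lattice-QCD calculation evolves quark and gluon fields on a 4D
# spacetime lattice; here a scalar field is placed on a small 4D grid and a
# discretized (finite-difference) kinetic term is evaluated.

L = 8                                  # lattice sites per dimension
a = 0.1                                # lattice spacing (arbitrary units)
rng = np.random.default_rng(0)
phi = rng.normal(size=(L, L, L, L))    # field value at every lattice site

# Forward difference (phi(x + a*mu) - phi(x)) / a in each of the 4 directions,
# with periodic boundaries, summed into a kinetic-energy-like quantity.
kinetic = 0.0
for mu in range(4):
    dphi = (np.roll(phi, -1, axis=mu) - phi) / a
    kinetic += 0.5 * np.sum(dphi ** 2) * a ** 4

print(f"{L**4} lattice sites, toy kinetic term = {kinetic:.2f}")
```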

Read more

Mar 31, 2019

Supercomputers help supercharge protein assembly

Posted in categories: biotech/medical, supercomputing

Red blood cells are amazing. They pick up oxygen from our lungs and carry it all over our body to keep us alive. The hemoglobin molecule in red blood cells transports oxygen by changing its shape in an all-or-nothing fashion. Four copies of the same protein in hemoglobin open and close like flower petals, structurally coupled to respond to each other. Using supercomputers, scientists are just starting to design proteins that self-assemble to combine and resemble life-giving molecules like hemoglobin. The scientists say their methods could be applied to useful technologies such as pharmaceutical targeting, artificial energy harvesting, ‘smart’ sensing and building materials, and more.
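
The “all-or-nothing” shape change described above is classically quantified with the Hill equation. The sketch below is a textbook illustration, not code from the study: it compares hemoglobin-like cooperative binding (Hill coefficient of roughly 2.8) with a hypothetical non-cooperative single-site binder.

```python
import numpy as np

def hill_saturation(p_o2, p50, n):
    """Fractional O2 saturation from the Hill equation."""
    return p_o2 ** n / (p50 ** n + p_o2 ** n)

p_o2 = np.array([10.0, 26.0, 40.0, 100.0])   # O2 partial pressure (mmHg)
p50 = 26.0                                    # pressure at half-saturation

cooperative     = hill_saturation(p_o2, p50, n=2.8)  # hemoglobin-like
non_cooperative = hill_saturation(p_o2, p50, n=1.0)  # single binding site

for p, c, s in zip(p_o2, cooperative, non_cooperative):
    print(f"pO2 {p:5.1f} mmHg  cooperative {c:.2f}  non-cooperative {s:.2f}")
```

The cooperative curve stays low at lung-to-tissue pressure differences and then rises steeply, which is the switch-like behavior the excerpt describes.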

Read more

Mar 20, 2019

Supercomputer sheds light on how droplets merge

Posted in categories: 3D printing, climatology, supercomputing

Scientists have revealed the precise molecular mechanisms that cause drops of liquid to combine, in a discovery that could have a range of applications.

Insights into how droplets merge could help make 3D printing technologies more accurate and may help improve the forecasting of thunderstorms and other weather events, the study suggests.

Read more

Mar 20, 2019

A surprising, cascading earthquake

Posted in categories: physics, supercomputing

The Kaikoura earthquake in New Zealand in 2016 caused widespread damage. LMU researchers have now dissected its mechanisms, revealing surprising insights into earthquake physics with the aid of simulations carried out on the supercomputer SuperMUC.

The 2016 Kaikoura earthquake (magnitude 7.8) on the South Island of New Zealand is among the most intriguing and best-documented seismic events anywhere in the world – and one of the most complex. The earthquake exhibited a number of unusual features, and the underlying geophysical processes have since been the subject of controversy. LMU geophysicists Thomas Ulrich and Dr. Alice-Agnes Gabriel, in cooperation with researchers based at the Université Côte d’Azur in Valbonne and at Hong Kong Polytechnic University, have now simulated the course of the earthquake with an unprecedented degree of realism. Their model, which was run on the Bavarian Academy of Sciences’ supercomputer SuperMUC at the Leibniz Computing Center (LRZ) in Munich, elucidates the dynamic reasons for such an uncommon multi-segment earthquake. This is an important step towards improving the accuracy of earthquake hazard assessments in other parts of the world. Their findings appear in the online journal Nature Communications.

Continue reading “A surprising, cascading earthquake” »

Mar 14, 2019

Why modern enterprises need to adopt cognitive computing for faster business growth in a digital economy

Posted in categories: business, economics, robotics/AI, supercomputing

Cognitive computing (CC) technology revolves around making computers adept at mimicking the processes of the human brain, in essence making them more intelligent. Even though the phrase cognitive computing is used synonymously with AI, the term is closely associated with IBM’s cognitive computer system, Watson. IBM Watson is a supercomputer that leverages AI-based disruptive technologies such as machine learning (ML), real-time analysis, and natural language processing to augment decision making and deliver superior outcomes.

Read more

Mar 7, 2019

Physicists Used Supercomputers to Map the Bone-Crushing Pressures Hiding Inside Protons

Posted in categories: physics, supercomputing

If you shrank yourself down and entered a proton, you’d experience among the most intense pressures found anywhere in the universe.

Read more

Mar 5, 2019

Scientists use machine learning to identify high-performing solar materials

Posted in categories: engineering, robotics/AI, solar power, supercomputing, sustainability

Finding the best light-harvesting chemicals for use in solar cells can feel like searching for a needle in a haystack. Over the years, researchers have developed and tested thousands of different dyes and pigments to see how they absorb sunlight and convert it to electricity. Sorting through all of them requires an innovative approach.

Now, thanks to a study that combines the power of supercomputing with machine learning and experimental methods, researchers at the U.S. Department of Energy’s (DOE) Argonne National Laboratory and the University of Cambridge in England have developed a novel “design to device” approach to identify promising materials for dye-sensitized solar cells (DSSCs). DSSCs can be manufactured with low-cost, scalable techniques, allowing them to reach competitive performance-to-price ratios.

The team, led by Argonne materials scientist Jacqueline Cole, who is also head of the Molecular Engineering group at the University of Cambridge’s Cavendish Laboratory, used the Theta supercomputer at the Argonne Leadership Computing Facility (ALCF) to pinpoint five high-performing, low-cost dye materials from a pool of nearly 10,000 candidates for fabrication and device testing. The ALCF is a DOE Office of Science User Facility.
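
The article does not include the team’s code, but the general shape of such a screening funnel, predicting a figure of merit for thousands of candidates and keeping only the best few for experiments, can be sketched generically. Everything below (identifiers, scores, the scoring model) is illustrative and not taken from the Argonne/Cambridge workflow.

```python
import random

# Generic high-throughput screening funnel, for illustration only. The real
# "design to device" workflow combined simulations on Theta, data mining,
# and experiments; this sketch shows only the candidate-ranking pattern.

random.seed(42)

# Hypothetical candidate dyes, each with a predicted figure of merit
# (e.g., a score combining light absorption with estimated cost).
candidates = [
    {"dye_id": f"dye-{i:05d}", "predicted_score": random.random()}
    for i in range(10_000)
]

# Rank every candidate and keep a short list for fabrication and testing.
shortlist = sorted(candidates, key=lambda c: c["predicted_score"], reverse=True)[:5]

for c in shortlist:
    print(f"{c['dye_id']}: predicted score {c['predicted_score']:.3f}")
```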

Continue reading “Scientists use machine learning to identify high-performing solar materials” »

Page 66 of 93