Archive for the ‘computing’ category

Jan 20, 2016

Quantum computing is coming — are you prepared for it?

Posted by in categories: business, computing, economics, quantum physics

Two weeks ago, I posted that a big announcement was coming; now it has officially arrived. The question is: will you be ready? Quantum computing is expected to be available within four years (by 2020). Everyone needs to start planning and getting budgets and resources in place for this massive transformation. It will be expensive and time-consuming, and there is a lot of preparation to do: businesses need to assess, plan, and position themselves to move to quantum quickly, because other countries (and hackers) will be on quantum as well, meaning more powerful networks and platforms with which to attack older systems. https://lnkd.in/baSZrBY


Quantum computing will change lives, society, and the economy, and a working system is expected to be developed by 2020, according to a leading figure in the world of quantum computing, who will speak tomorrow, Jan. 21, 2016, at the World Economic Forum (WEF) in Davos, Switzerland.

Professor O’Brien, Director of the Centre for Quantum Photonics at the University of Bristol and Visiting Fellow at Stanford University, is part of a European Research Council (ERC) Ideas Lab delegation that has been invited to speak at the annual meeting to the world’s industrial and political leaders, including Prime Minister David Cameron. The session will discuss the future of computing and how new fields of computer science are paving the way for the next digital revolution.

Continue reading “Quantum computing is coming -- are you prepared for it?” »

Jan 20, 2016

Graphene ‘optical capacitors’ can make chips that mesh biophysics and semiconductors

Posted by in categories: computing, materials, physics

Graphene’s properties make it a tantalizing target for semiconductor research. Now a team from Princeton has shown that flakes of graphene can work as fast, accurate optical capacitors for laser transistors in neuromorphic circuits.

Read more

Jan 20, 2016

Open-Source GPU Could Push Computing Power to the Next Level

Posted by in category: computing

Researchers at Binghamton University are using an open-source graphics processing unit (GPU) to push the devices’ performance and applications.

Binghamton University computer science assistant professor Timothy Miller, assistant professor Aaron Carpenter, and graduate student Philip Dexter, along with co-author Jeff Bush, have developed Nyami, a synthesizable GPU architectural model for general-purpose and graphics-specific workloads. This marks the first time a team has taken an open-source GPU design and run a series of experiments on it to see how different hardware and software configurations would affect the circuit’s performance.

According to Miller, the results will help other scientists make their own GPUs and push computing power to the next level.

Read more

Jan 19, 2016

The US Military Wants a Chip to Translate Your Brain Activity Into Binary Code

Posted by in categories: biotech/medical, computing, engineering, military, neuroscience, supercomputing

It’s been a weird day for weird science. Not long after researchers claimed victory in performing a head transplant on a monkey, the US military’s blue-sky R&D agency announced a completely insane plan to build a chip that would enable the human brain to communicate directly with computers. What is this weird, surreal future?

It’s all real, believe it or not. Or at least DARPA desperately wants it to be. The first wireless brain-to-computer interface actually popped up a few years ago, and DARPA has worked on various brain chip projects over the years. But there are shortcomings to existing technology: According to today’s announcement, current brain-computer interfaces are akin to “two supercomputers trying to talk to each other using an old 300-baud modem.” They just aren’t fast enough for truly transformative neurological applications, like restoring vision to a blind person. This would ostensibly involve connecting a camera that can transmit visual information directly to the brain, with the implant translating the data into neural language.

To accomplish this magnificent feat, DARPA is launching a new program called Neural Engineering System Design (NESD) that stands to squeeze some characteristically bonkers innovation out of the science community. In a press release, the agency describes what’s undoubtedly the closest thing to a Johnny Mnemonic plot-line you’ve ever seen in real life. It reads:

Read more

Jan 19, 2016

Quantum Weirdness Now a Matter of Time

Posted by in categories: computing, quantum physics, robotics/AI

Imagine a quantum computer or device that can perform “all possible operations” on its data all at once, rather than executing a series of operations one by one. With quantum-entanglement processing, our AI machines could truly outperform all of us. No longer fantasy or myth; it may soon be real.


Bizarre quantum bonds connect distinct moments in time, suggesting that quantum links — not space-time — constitute the fundamental structure of the universe.

Read more

Jan 19, 2016

Samsung announces mass production of next-generation HBM2 memory

Posted by in category: computing

Samsung announced today that its next-generation HBM2 memory is in mass production. 4GB stacks are available now, with 8GB expected by the end of the year.

Read more

Jan 18, 2016

EverLaw: Another Useful Artificial Intelligence Capability By @BobGourley | @CloudExpo #Cloud

Posted by in categories: business, computing, robotics/AI

With AI, why have attorneys or judges anymore? Frankly, AI is already proving to be among the most unbiased judges and decision-makers. AI can draft contracts, patent agreements, and similar documents better than most humans, and it will outperform humans in discovery work on cases. So in just three years we truly may not need judges and attorneys anymore.


Our list of Truly Useful Artificial Intelligence Tools You Can Use Today was out of date the minute we published it. We knew that would happen and are absolutely thrilled when we discover new capabilities that belong on this list. One we just learned about is EverLaw, provider of perhaps the world’s most advanced litigation platform, designed to be easy to use and programmed to leverage the most powerful technologies available, including cloud computing, mobile solutions and yes, artificial intelligence.

We found Everlaw and learned about their prediction engine and other key platform characteristics from an a16z blog post introducing a new investment. From a16z:

Continue reading “EverLaw: Another Useful Artificial Intelligence Capability By @BobGourley | @CloudExpo #Cloud” »

Jan 18, 2016

In Silico

Posted by in categories: biotech/medical, computing, life extension

https://youtube.com/watch?v=hWUqZmDBJLc

Insilico Medicine, which utilizes high-performance computing to combat aging and age-related diseases, has been selected for the NVIDIA GTC Contest Finals.

Read more

Jan 17, 2016

Smart robots could soon steal your job

Posted by in categories: computing, nanotechnology, quantum physics, robotics/AI

It seems my earlier post, which asked you to imagine a scenario where you must decide between getting a chip implant and waiting for a nanobot, is not so far-fetched after all. Nonetheless, there are careers that will not be replaced by robots, such as the work of artists and designers, and new careers and companies will be created throughout the AI and quantum evolution. https://lnkd.in/b5i5C-X


Think you are too smart to be replaced by a robot in your job? Think again.

Read more

Jan 17, 2016

IBM, U. of Michigan Creating Chatty Computer

Posted by in category: computing

Now everyone can eventually feel like their mother-in-law is always with them: a computer that never stops talking and always has an opinion on everything.


IBM and the University of Michigan are working on a conversational computing system that will transform human-machine communication.

Read more