Dec 20, 2020
Dozens of journalists’ iPhones hacked with NSO ‘zero-click’ spyware, says Citizen Lab
Posted by Quinn Sena in category: mobile phones
Citizen Lab said Saudi Arabia and the United Arab Emirates were likely behind the attacks.
Emerging and reemerging infections present an ever-increasing challenge to global health. Here, we report a nanoparticle-enabled smartphone (NES) system for rapid and sensitive virus detection. The virus is captured on a microchip and labeled with specifically designed platinum nanoprobes to induce gas bubble formation in the presence of hydrogen peroxide. The formed bubbles are controlled to make distinct visual patterns, allowing simple and sensitive virus detection using a convolutional neural network (CNN)-enabled smartphone system, without any optical hardware attachment for the smartphone. We evaluated the developed CNN-NES for testing viruses such as hepatitis B virus (HBV), hepatitis C virus (HCV), and Zika virus (ZIKV). The CNN-NES was tested with 134 ZIKV- and HBV-spiked and ZIKV- and HCV-infected patient plasma/serum samples. The sensitivity of the system in qualitatively detecting virus-infected samples at a clinically relevant virus concentration threshold of 250 copies/ml was 98.97%, with a confidence interval of 94.39 to 99.97%.
Smartphone systems can also benefit from recent, unprecedented advances in nanotechnology to develop diagnostic approaches. Catalysis is one of the most popular applications of nanoparticles because of their large surface-to-volume ratio and high surface energy (11–16). So far, numerous diagnostic platforms for cancer and infectious diseases have been developed by substituting enzymes such as catalase, oxidase, and peroxidase with nanoparticle structures (17–20). Here, we adopted the intrinsic catalytic properties of platinum nanoparticles (PtNPs) for gas bubble formation to detect viruses on-chip using a convolutional neural network (CNN)–enabled smartphone system.
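To make the classification step concrete, here is a minimal sketch (in PyTorch) of how a small CNN might label a camera crop of the chip's bubble region as virus-positive or virus-negative. This is not the authors' published model; the input size, layer widths, and class labels are illustrative assumptions.

```python
# Illustrative sketch only: a tiny CNN that classifies a bubble-pattern
# image crop as virus-negative (class 0) or virus-positive (class 1).
import torch
import torch.nn as nn

class BubblePatternCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64x64 -> 32x32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32x32 -> 16x16
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example: classify one 64x64 RGB crop of the chip's bubble region.
model = BubblePatternCNN()
frame = torch.rand(1, 3, 64, 64)                  # placeholder camera crop
probs = model(frame).softmax(dim=1)
print("P(positive) =", probs[0, 1].item())
```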
A research group led by Prof. Chen Tao at the Ningbo Institute of Materials Technology and Engineering (NIMTE) of the Chinese Academy of Sciences (CAS), in cooperation with researchers from the Beijing Institute of Nanoenergy and Nanosystems of CAS, has developed a novel soft, self-healing, and adhesive human-machine interactive touch pad based on transparent nanocomposite hydrogels. The study was published in Advanced Materials.
With the rapid development of information technology and the Internet of Things, flexible and wearable electronic devices have attracted increasing attention. A touch pad is a requisite input device for mobile phones, smart appliances, and point-of-information terminals. Indium tin oxide (ITO) has been the dominant transparent conductive film for manufacturing commercial touch pads, but it has obvious shortcomings, such as fragility.
To improve the stretchability and biocompatibility of touch pads to allow their interaction with humans, the researchers at NIMTE developed highly transparent and stretchable polyzwitterion-clay nanocomposite hydrogels with transmittance of 98.8% and fracture strain beyond 1500%.
CSL’s Systems and Networking Research Group (SyNRG) is defining a new sub-area of mobile technology that they call “earable computing.” The team believes that earphones will be the next significant milestone in wearable devices, and that new hardware, software, and apps will all run on this platform.
“The leap from today’s earphones to ‘earables’ would mimic the transformation that we had seen from basic phones to smartphones,” said Romit Roy Choudhury, professor in electrical and computer engineering (ECE). “Today’s smartphones are hardly a calling device anymore, much like how tomorrow’s earables will hardly be a smartphone accessory.”
Instead, the group believes tomorrow’s earphones will continuously sense human behavior, run acoustic augmented reality, have Alexa and Siri whisper just-in-time information, track user motion and health, and offer seamless security, among many other capabilities.
Light-emitting diodes—LEDs—can do way more than illuminate your living room. These light sources are useful microelectronics too.
Smartphones, for example, can use an LED proximity sensor to determine if you’re holding the phone next to your face (in which case the screen turns off). The LED sends a pulse of light toward your face, and a timer in the phone measures how long it takes that light to reflect back to the phone, a proxy for how close the phone is to your face. LEDs are also handy for distance measurement in autofocus cameras and gesture recognition.
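As a rough illustration of that timing idea, here is a back-of-the-envelope sketch in Python that converts a measured round-trip time into a distance and applies a near-threshold. The 5 cm cutoff is an assumption for illustration, not any phone's actual firmware behavior.

```python
# Back-of-the-envelope sketch of the time-of-flight idea described above:
# the round-trip time of a reflected light pulse maps to distance.
SPEED_OF_LIGHT_M_PER_S = 299_792_458

def distance_from_round_trip(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_s / 2

def screen_should_turn_off(round_trip_s: float, near_threshold_m: float = 0.05) -> bool:
    """Treat anything closer than ~5 cm as 'phone held at the ear' (illustrative threshold)."""
    return distance_from_round_trip(round_trip_s) < near_threshold_m

# A ~5 cm gap corresponds to a round trip of roughly a third of a nanosecond.
print(screen_should_turn_off(0.33e-9))   # True
```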
One problem with LEDs: It’s tough to make them from silicon. That means LED sensors must be manufactured separately from their device’s silicon-based processing chip, often at a hefty price. But that could one day change, thanks to new research from MIT’s Research Laboratory of Electronics (RLE).
Recently, Google introduced Portrait Light, a feature on its Pixel phones that can be used to enhance portraits by adding an external light source not present at the time the photo was taken. In a new blog post, Google explains how they made this possible.
In their post, engineers at Google Research note that professional photographers discovered long ago that the best way to make people look their best in portraits is to use secondary flash units that are not attached to the camera. The photographer positions such a flash before photographing a subject, taking into account the direction the subject's face is pointing, the available ambient light, skin tone, and other factors. Google has attempted to capture those factors with its new portrait-enhancing software. The system does not require the phone's operator to use another light source; instead, the software pretends that another light source was there all along and lets the user choose the most flattering configuration for the subject.
The engineers explain they achieved this feat using two algorithms. The first, which they call automatic directional light placement, places synthetic light into the scene as a professional photographer would. The second algorithm is called synthetic post-capture relighting. It allows for repositioning the light after the fact in a realistic and natural-looking way.
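To give a feel for what "adding a synthetic directional light" means, here is a simple Lambertian relighting sketch in Python. This is not Google's Portrait Light pipeline, which uses machine-learned models to place the light and predict the relit image; here the per-pixel normal map, light direction, and strength are hand-picked assumptions.

```python
# Illustrative sketch: brighten the side of a face that a synthetic
# directional light would hit, using a simple Lambertian shading term.
import numpy as np

def add_directional_light(image: np.ndarray,
                          normals: np.ndarray,
                          light_dir: np.ndarray,
                          strength: float = 0.4) -> np.ndarray:
    """image: HxWx3 floats in [0, 1]; normals: HxWx3 unit surface normals."""
    light_dir = light_dir / np.linalg.norm(light_dir)
    shading = np.clip(normals @ light_dir, 0.0, None)       # HxW Lambert term
    relit = image + strength * shading[..., None] * image   # brighten lit pixels
    return np.clip(relit, 0.0, 1.0)

# Toy usage: a flat gray "portrait" whose normals all face the camera.
h, w = 4, 4
image = np.full((h, w, 3), 0.5)
normals = np.tile(np.array([0.0, 0.0, 1.0]), (h, w, 1))
relit = add_directional_light(image, normals, np.array([0.3, -0.2, 1.0]))
print(relit[0, 0])
```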
Being able to see, move, and exercise independently is something most of us take for granted. [Thomas Panek] was an avid runner before losing his sight due to a genetic condition, and he had to rely on other people and guide dogs to run again. After challenging attendees at a Google hackathon, Project Guideline was established to give blind runners (or walkers) independence from a cane, dog, or another human while exercising outdoors. Using a smartphone running line-following AI software and a pair of bone conduction headphones, users can be guided along a path with a line painted on it. You need to watch the video below to get a taste of just how incredible it is for the users.
Getting a wheeled robot to follow a line is relatively simple, but a running human is by no means a stable sensor platform. At the previously mentioned hackathon, developers put together a rough proof of concept with a smartphone, using its camera to recognize a painted line on the ground and provide left/right audio cues. As the project developed, the smartphone was attached to a waist belt and bone conduction headphones were used, which don’t affect audio situational awareness as much as normal headphones.
The shaking and side-to-side movement of running, along with varying light conditions and visual obstructions outdoors, made the problem more difficult to solve, but within a year the developers had completed successful running tests with [Thomas] on a well-lit indoor track and on an outdoor pedestrian path with a temporary line. For the first time in 25 years, [Thomas] was able to run independently.
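Here is a minimal sketch of the left/right cue idea described above, assuming each camera frame has already been reduced to the painted line's horizontal position. The function name, dead zone, and cue strings are hypothetical, not Project Guideline's actual code.

```python
# Illustrative sketch: map the detected line position in the frame to an
# audio steering cue for the runner.
def steering_cue(line_x: float, frame_width: int, dead_zone: float = 0.1) -> str:
    """line_x: detected line center in pixels; frame_width: image width in pixels."""
    offset = (line_x - frame_width / 2) / (frame_width / 2)  # -1 (far left) .. +1 (far right)
    if offset < -dead_zone:
        return "cue: move left"    # line appears left of center, so the runner drifted right
    if offset > dead_zone:
        return "cue: move right"   # line appears right of center, so the runner drifted left
    return "cue: on line"

# Example: line detected at pixel 500 in a 640-pixel-wide frame.
print(steering_cue(500, 640))      # cue: move right
```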
How bright is your flashlight? I only have the one on my phone because I’m completely unprepared for any sort of emergency situation. Well, presumably with the belief that it should be daylight all the time, the team over at Hacksmith Industries took it upon themselves to build a giant, 1,414,224 lumen flashlight. I can already imagine myself staring at it until I go blind.
Ever dream about never having to charge your phone?
Nikola Tesla couldn’t figure out the secret to ubiquitous wireless power. Now, more than 100 years later, are we any closer to pulling it off?