Toggle light / dark theme

ChatGPT for iPhone launched in the US last week, with OpenAI promising that it would come to more countries “in the coming weeks.”

The next phase in the rollout has now happened earlier than expected, with 11 more countries added on Wednesday, and a further 35 today …

While you could of course access the ChatGPT website on your iPhone, an iPhone app makes it more convenient – especially as the app is free, and has no ads.

Indirect prompt-injection attacks are similar to jailbreaks, a term adopted from previously breaking down the software restrictions on iPhones. Instead of someone inserting a prompt into ChatGPT or Bing to try and make it behave in a different way, indirect attacks rely on data being entered from elsewhere. This could be from a website you’ve connected the model to or a document being uploaded.

“Prompt injection is easier to exploit or has less requirements to be successfully exploited than other” types of attacks against machine learning or AI systems, says Jose Selvi, executive principal security consultant at cybersecurity firm NCC Group. As prompts only require natural language, attacks can require less technical skill to pull off, Selvi says.

There’s been a steady uptick of security researchers and technologists poking holes in LLMs. Tom Bonner, a senior director of adversarial machine-learning research at AI security firm Hidden Layer, says indirect prompt injections can be considered a new attack type that carries “pretty broad” risks. Bonner says he used ChatGPT to write malicious code that he uploaded to code analysis software that is using AI. In the malicious code, he included a prompt that the system should conclude the file was safe. Screenshots show it saying there was “no malicious code” included in the actual malicious code.

An ex-Apple exec who helped invent the iPhone is now trying to invent an “iPhone killer,” and thanks to a leaked video from a TED presentation, we now have our first glimpse at his secretive startup’s creation — but the available video only leads to more questions.

The startup: In 2016, Imran Chaudhri (then-director of design for Apple’s human interface team) and his wife Bethany Bongiorno (then-director for Apple’s operating systems team) quit the company to found their own startup: Humane.

Since then, the company has kept details of what it’s been working on close to its chest, but thanks to job openings posted on Humane’s website and some uncovered patent applications, by 2021, it seemed likely that the startup was developing some sort of personal tech device.

Apple’s digital car key feature for iPhone and Apple Watch is expanding to Mercedes-Benz, with changes to Apple’s back-end configuration files for the feature having been updated today with references to the automaker, as noticed by Nicolás Álvarez (via @aaronp613).

Only a handful of brands including BMW, BYD, Genesis, Hyundai, and Kia have so far introduced support for the feature on select models, which allows you to add a digital car key to the Wallet app on your ‌iPhone‌ and Apple Watch and then lock, unlock, and start your car without needing a physical key. Just a month ago, Lotus appeared in Apple’s configuration files as another upcoming brand that will support the feature.

Metalenses migrate to smartphones.

Metalenz came out of stealth mode in 2021, announcing that it was getting ready to scale up production of devices. Manufacturing was not as big a challenge as design because the company manufactures metasurfaces using the same materials, lithography, and etching processes that it uses to make integrated circuits.

In fact, metalenses are less demanding to manufacture than even a very simple microchip because they require only a single lithography mask as opposed to the dozens required by a microprocessor. That makes them less prone to defects and less expensive. Moreover, the size of the features on an optical metasurface are measured in hundreds of nanometers, whereas foundries are accustomed to making chips with features that are smaller than 10 nanometers.

In my work, I build instruments to study and control the quantum properties of small things like electrons. In the same way that electrons have mass and charge, they also have a quantum property called spin. Spin defines how the electrons interact with a magnetic field, in the same way that charge defines how electrons interact with an electric field. The quantum experiments I have been building since graduate school, and now in my own lab, aim to apply tailored magnetic fields to change the spins of particular electrons.

Research has demonstrated that many physiological processes are influenced by weak magnetic fields. These processes include stem cell development and maturation, cell proliferation rates, genetic material repair, and countless others. These physiological responses to magnetic fields are consistent with chemical reactions that depend on the spin of particular electrons within molecules. Applying a weak magnetic field to change electron spins can thus effectively control a chemical reaction’s final products, with important physiological consequences.

Currently, a lack of understanding of how such processes work at the nanoscale level prevents researchers from determining exactly what strength and frequency of magnetic fields cause specific chemical reactions in cells. Current cell phone, wearable, and miniaturization technologies are already sufficient to produce tailored, weak magnetic fields that change physiology, both for good and for bad. The missing piece of the puzzle is, hence, a “deterministic codebook” of how to map quantum causes to physiological outcomes.

The demo is clever, questionably real, and prompts a lot of questions about how this device will actually work.

Buzz has been building around the secretive tech startup Humane for over a year, and now the company is finally offering a look at what it’s been building. At TED last month, Humane co-founder Imran Chaudhri gave a demonstration of the AI-powered wearable the company is building as a replacement for smartphones. Bits of the video leaked online after the event, but the full video is now available to watch.

The device appears to be a small black puck that slips into your breast pocket, with a camera, projector, and speaker sticking out the top. Throughout the 13-minute presentation, Chaudhri walks through a handful of use cases for Humane’s gadget: * The device rings when Chaudhri receives a phone call. He holds his hand up, and the device projects the caller’s name along with icons to answer or ignore the call. He then has a brief conversation. (Around 1:48 in the video) * He presses and holds one finger on the device, then asks a question about where he can buy a gift. The device responds with the name of a shopping district. (Around 6:20) * He taps two fingers on the device, says a sentence, and the device translates the sentence into another language, stating it back using an AI-generated clone of his voice. (Around 6:55) * He presses and holds one finger on the device, says, “Catch me up,” and it reads out a summary of recent emails, calendar events, and messages. (At 9:45) * He holds a chocolate bar in front of the device, then presses and holds one finger on the device while asking, “Can I eat this?” The device recommends he does not because of a food allergy he has. He presses down one finger again and tells the device he’s ignoring its advice. (Around 10:55)

Chaudhri, who previously worked on design at Apple for more than two decades, pitched the device as a salve for a world covered in screens. “Some believe AR / VR glasses like these are the answer,” he said, an image of VR headsets behind him. He argued those devices — like smartphones — put “a further barrier between you and the world.”

Humane’s device, whatever it’s called, is designed to be more natural by eschewing the screen. The gadget operates on its own. “You don’t need a smartphone or any other device to pair with it,” he said.

Microsoft isn’t slowing down in its battle with Google for AI features.

Microsoft only just announced a round of new updates to its GPT-4-powered Bing Chat earlier this month, and it’s back today with some big improvements for mobile users. Just days after Google rebranded its AI tools for Docs and Gmail as Duet AI, Microsoft is now focused on mobile with contextual chat for Edge mobile, a Bing widget for iOS and Android, and even continuous Bing Chat conversations between mobile and desktop.

These new mobile-first features arrive just as Microsoft finishes rolling out its new image and video answers, restaurant bookings, and chat history features that were all announced earlier this month. Microsoft only launched Bing Chat nearly 100 days ago, and it’s showing no signs of slowing down with its AI announcements.

Starting today, you’ll now be able to start a Bing Chat conversation on a PC and then pick it up on mobile. Microsoft uses the example of asking Bing Chat to create a recipe for you on a PC and then asking it to provide a substitute on the mobile app when you’re at a grocery store and they’ve run out of an ingredient. This is starting to roll out today and will be available to all iOS and Android users within the next week.

The second big mobile change is contextual chat inside Microsoft Edge on iOS and Android. This is very similar to what exists in the Bing sidebar inside the desktop version of Edge, and it lets the mobile version of the browser read the context of a website you’re on. You can tap the Bing Chat icon at the bottom of Edge and ask it questions about a website you’re viewing or even ask it to summarize an article or document you’re reading.