Controlling VR with my mind: NextMind’s dev kit shows me a strange new world – CNET


The NextMind band, or what it looks like from the side that rests against the back of your head.

Scott Stein/CNET
This story is part of CES, where our editors will bring you the latest news and the hottest gadgets of the entirely virtual CES 2021.

In my Oculus Quest VR headset, I was in a room surrounded by large-brained aliens. Their heads flashed, white and black. I turned to one, staring at it. Soon enough, its head exploded. I looked at the others, making their heads explode. Then I looked at a flashing portal marker across the room and was gone. I did this without eye tracking. A band on the back of my head was sensing my visual cortex with electrodes.

I felt like I was living some sort of real-life virtual version of the David Cronenberg film Scanners. But in reality, I was trying a neural input device made by NextMind.


Before holiday break, I received a large black box with a small package inside. A black disc, with a headband. The disc was covered in small rubber-footed pads. NextMind's $399 developer kit, announced a year ago at CES 2020, aims at something many companies are striving for: neural inputs. The device reads the brain's signals to track attention, control objects and maybe do even more.

It's hard to understand the real potential and possibilities of neural input technology. Also, many of the startups in this space are doing different things. CTRL-Labs, a neurotechnology company acquired by Facebook in 2019, developed an armband that could send hand and finger inputs. Another company, Mudra, is making a wristband for Apple Watch later this year that also senses neural inputs on the wrist.

I wore an early version of the Mudra Band a year ago, and experienced how it could interpret my finger's movements, and even roughly measure how much pressure I was applying when I squeezed my fingers. Even more weirdly, Mudra's tech can work when you aren't moving your fingers at all. The applications could include assisting people who don't even have hands, like a prosthetic wearable.

NextMind's ambitions look to follow a similar assistive-tech path, while also aiming for a world where neural devices could possibly help improve accuracy with physical inputs -- or combine with a world of other peripherals. Facebook's AR/VR head, Andrew Bosworth, sees neural input tech emerging at Facebook within three to five years, where it could end up being combined with wearable devices like smart glasses.


Attaching the NextMind device onto an Oculus Quest 2 headband.

Scott Stein/CNET

My NextMind experience has been rough, but also mesmerizing. The dev kit has its own tutorial and included demos that can run on Windows or Mac, plus a Steam VR demo that I played back on the Oculus Quest with a USB-C cable. The compact Bluetooth plastic puck has a headband but can also unclip from that and attach directly onto the back of a VR headset strap with a little effort.

All of NextMind's experiences involve looking at large, subtly flashing areas of your screen, which can be "clicked" by focusing. Or staring. It was hard to tell how to make something activate, and I found myself trying to open my eyes more, or breathe, or concentrate. Sooner or later, the thing I was looking at would click. Out of a field of five or so on-screen flashing "buttons," this really did know what I was looking at. And again, there was no eye tracking involved at all; the device just rested on the back of my head.

Did it make me feel uncomfortable? Uncertain? Oh, yes. And as my kid came in and saw me doing this, and I showed him what I was doing, he was as astonished as if I had performed a magic trick.

NextMind's dev kit isn't meant for consumer devices yet. The Mudra Band, while launching as an Apple Watch accessory via crowdfunding site Indiegogo, is also experimental. I have no doubt we'll see more technology like this. At this year's virtual CES, there was even a "neural mouse" glove that aimed to improve reaction times by sensing click inputs a hair faster than even the physical mouse could receive. I didn't try that glove, but the idea doesn't sound far off from what companies like NextMind are imagining, either.

Right now, neural inputs feel like an imperfect attempt at creating an input, like algorithms searching for a way to do something I'd probably just do with a keyboard, a mouse or touchscreen instead. But, that was how voice recognition felt, once. And hand tracking. Right now, NextMind's demos really do work. I'm just trying to imagine what happens next. Whatever it is, I hope more exploding heads won't be a part of it.


Sensor maker Valencell expects more blood pressure-sensing wearables in 2022 – CNET


Valencell's PPG optical heart rate sensor: It's small, could eventually check for blood pressure and could fit in earbuds (or a watch bezel).

Valencell
This story is part of CES, where our editors will bring you the latest news and the hottest gadgets of the entirely virtual CES 2021.

It's 2021... where's my blood pressure smartwatch? Despite wearable health tech introducing features like blood oxygen measurement, ECG and even generalized changes in body temperature, blood pressure has proved a harder challenge. But sensor maker Valencell believes the possibilities are there, and its research suggests that calibration-free, finger-based blood pressure sensing on wearables could be here by 2022.

Samsung promised blood pressure capabilities on its recent smartwatches, but the feature needs calibration with a standard blood pressure cuff, and even then it still hasn't gotten clearance from the US Food and Drug Administration. Omron, a manufacturer of medical devices, has a blood pressure-monitoring watch, but it uses an inflatable wrist-cuff and is expensive.

Valencell develops the optical heart rate sensors that a number of other devices use, and has been working on evolving its photoplethysmography sensors, which use light and a sensor to measure blood activity, to also measure blood pressure. Last year, before the COVID-19 pandemic, the company believed that earbud heart-rate monitors could double as blood pressure-sensing devices in 2021. Plans changed, though.

Valencell President Steven LeBoeuf says that the capability in earbuds is still there, but that consumer earbud manufacturers aren't that interested (the medical hearing aid market, apparently, could be a better landing place for the tech). Instead, interest has been more towards finger and wrist health devices.

One earbud with Valencell's heart rate tech, made by Huami (the company behind the Amazfit brand), doesn't have the blood pressure feature. One likely reason is that manufacturers would still need to pursue FDA clearance and equivalent approvals in other countries, which takes a lot of time. Valencell is currently pursuing a 510(k) submission to the FDA for its finger-sensing blood pressure technology.

Valencell's making progress on wrist-based blood pressure, but its error rate is still plus or minus 13mm of mercury: "That's still not good enough," LeBoeuf says. Meanwhile, its finger-based measurements are at error levels closer to its earbud tech, which more closely approach the accuracy of blood pressure cuffs.

LeBoeuf still sees the medical device landscape as an easier path than mainstream wearables, referencing the success of standalone devices like AliveCor's ECG accessories.

"We think that on the blood pressure technology for wearables, it's much more likely to be adopted in use cases where the payment methodology ... is very well defined," LeBoeuf says, referring to devices that can be reimbursed by insurance, or where doctors can get paid. "This person needs to monitor their blood pressure for whatever reason, or is told to by their doctor. Now they could use a blood pressure cuff for this, but it's likely they won't even use a blood pressure cuff, if that's the case. So what can you do as an alternative to the blood pressure cuff?"

According to Valencell, its calibration-free blood pressure sensing still has margins of error too large to work on-wrist yet, but progress has been made since last year. The company's latest studies show it's feasible to use a finger to check for blood pressure with its tech, good enough to pursue eventual FDA clearance. Other companies are doing finger-based blood pressure sensing, even using phone cameras, but those systems still need calibration to work, and don't yet have FDA clearance either. Valencell's tech aims to work on its own, with no calibration setup needed. That's how it worked for me when I tested the company's in-ear blood pressure monitor last year at CES.

Valencell doesn't see this tech as replacing blood-pressure cuffs, though -- not yet. Instead, it could be a more frequent monitoring method between check-ins with a cuff, almost like the way the Apple Watch monitors possible atrial fibrillation, then recommends doing an ECG spot check.

A finger-scanning device wouldn't be such a bad idea. Valencell says it could incorporate the tech into finger-worn pulse oximetry readers that test for blood oxygen levels, which have become a popular home purchase during the pandemic. Those devices use similar optical tech, and could add on Valencell's new algorithms. The finger feels less invasive than an earbud, and less personal than a watch, which could make it a more easily shared family device. The company's recent survey found that a finger or pulse oximeter was the most preferred measurement spot for potential users (41%), followed by watch, then phone, then fitness band, with earbuds in last place (10%).

Valencell doesn't expect its tech to emerge in products in the US in 2021, but 2022 seems likely. In the meantime, your best bet if you have high blood pressure (like I do) is probably to take standard cuff measurements, and be patient. Or, look into cuffs that communicate directly with your doctor.

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.


My newest fitness tracker is a VR headset – CNET


The Oculus Quest 2 is a surprisingly good workout device. Just know its limits. (Start with Beat Saber.)

Scott Stein/CNET

I was on a health kick once, but 2020 challenged my fitness plans. Being at home for nearly a year means setting up a home fitness regimen, somehow. Apple launched its watch-connected Fitness Plus, and there are plenty of exercise bikes and Pelotons. Nintendo has its Ring Fit Adventure. But for me, I've found myself swinging with lightsabers and punching targets in VR.

The Oculus Quest has been a surprisingly capable fitness device since the day it launched, thanks mostly to Beat Saber. Moving around in VR can feel like an escape to a completely different space. When it's used for fitness, that means it can inspire me to let go a bit more, absorb myself in the activity and consequently work harder.

VR headsets seem like an inevitability for the next wave of fitness tech. VR fitness is already here, in a way. People have found ways to lose weight with VR workout regimens. There are downsides, though: The equipment isn't explicitly designed for exercising. Headsets can get sweaty fast, and most aren't designed to breathe well during workouts. My glasses fog up, sometimes. If I don't add some sort of protective rubberized eye cushion, the foam padding soaks with sweat, which is disgusting.

Facebook made a move to push VR further into the fitness zone with an app called Oculus Move late last year, which tracks motion and estimated calories in VR apps and games. It's like a systemwide fitness tracker. This type of app already existed via a sideloaded app called YUR, but Facebook made its own version. The concept demonstrates how the fitness tracker tech that's on your smartwatch or Fitbit could make the move to headsets more easily than you think.

In some ways, the idea's already here without a headset. Apple's Fitness Plus pairs with an Apple Watch and shows heads-up stats during workouts, but displays that heads-up info on a TV screen, iPhone or iPad. Oculus Move goes for a similar idea, projecting a heads-up display in VR that can float above my head, or down on the floor. 

Oculus Move's ring-filling feels Apple Watch-like, but the metrics are different. There are only two rings to fill: One is for total active minutes, and one is for estimated calories. The Oculus Quest measures headset and controller motion to calculate and estimate the numbers, and it's not perfect. Also, it calculates during any VR activity, which can get weird. My time playing a casual platform game in VR, such as Moss, somehow earns a few active minutes -- I guess because I'm moving. But the ring makes more sense for deliberately active fitness games and apps, such as Beat Saber, Supernatural and FitXR.


What Oculus Move looks like when playing Beat Saber: the readout floats in the air (or at your feet).

Screenshot by Scott Stein/CNET

The Oculus Move dashboard and its tracking goals show daily achievements, much like the Apple Watch Fitness app. And it works. It motivates me. I actually get going, trying to play long enough and exhaust myself to get those active minutes. The game becomes a workout.

If VR headsets were more fitness-friendly, and could pair more automatically with fitness trackers, maybe they could be the next big idea in home fitness equipment. I love using the Quest for exercise, but really, VR isn't optimized for fitness. It's possible to injure yourself by throwing your hand into a table (it doesn't have live collision warnings), or you could smack yourself in the head with a controller (I've done that many times). The headset should be lighter, too.

But I feel like I've seen the future of my home gym now. I don't want to go back.

9 great fitness apps to try on an Oculus Quest

Beat Saber: The starting point, and maybe your finishing point, for fitness in VR. It's music-rhythm lightsaber dancing, and you need to try it. Beat Saber is not only fantastic and perfectly tuned to lightning-quick reflexes, but it's also where most of your VR friends are most likely to play. Leaderboards and high-score challenges are a great way to set fitness goals -- I keep swapping high scores with my nephew, and it's exhausting. A multiplayer mode also works for live two-player matches, and there are a solid number of DLC music packs you can buy. The included game also has a lot of tracks (from mostly unknown artists) to play with.

FitXR: A more fitness-focused, boxing-style music-rhythm experience with separate workout packages to buy and timed workouts. There are also some in-game tracking metrics for estimated calorie burn.


What it feels like to do fitness in VR (of course, you can't see yourself). 

Supernatural

Supernatural: The most elaborate fitness experience on Oculus Quest feels like VR Peloton, with holographic videos of real trainers guiding you through routines (which involve you swiping at Beat Saber-like targets along with music). Supernatural pairs with the Apple Watch, showing heart rate and fitness stats. But it also requires a monthly subscription fee.

Pistol Whip: A music-rhythm shooting game that feels like The Matrix mixed with Dance Dance Revolution. A new update adds a story-based quest, and there are lots of levels to try. The activity level is pretty low-impact, though.

Eleven Table Tennis: This isn't quite cardio, but the realism of this ping-pong game is pretty intense at higher difficulty levels.

Tai Chi: A relaxed meditative movement game where you move your controllers around to match positions of glowing targets. Like Beat Saber, but slower and more focused.

OhShape: This clever dance game has you match shapes of cut-out figures to strike poses and keep playing. It makes you move.

Dance Central: Harmonix's dancing music game feels like a club where you can dance with people and try out moves to songs. It's tiring but also weirdly fun.

Thrill of the Fight: A complete boxing simulation, with Rocky-like thematic overtones.


What you need to know about VR fitness games

They generally use your hand motions, head movement, ducking and leaning

VR fitness is usually about standing in place and waving your arms around a lot, relying on motion tracking in the controllers. There's some leaning and ducking in VR headsets that have six degrees of freedom tracking, too. (Oculus Rift/Quest, HTC Vive, Valve Index, PlayStation VR and Microsoft's VR headsets, for example.)

You need some space 

A clear area at least 5 feet on each side, ideally. You'll be arm swinging and lunging, and you don't want to smash your hand into a chair or wall or another person. Make sure your VR system's room boundaries are set well beyond the space you need to be safe.

Your headset can get super sweaty

VR fitness isn't always a great match with the fabrics and lenses and straps that VR headsets use. It's not easy to clean a VR headset, either. If it gets gross and soggy, try removing the foam liner around the eyes and cleaning it gently. There are VR headset liners you can buy, too (I haven't gotten that serious yet).

You can use your own fitness tracker

Use your fitness tracker to start a stationary workout (or "other" workout), and you can record your heart rate and estimated calorie burn.

Many are music-rhythm games that rely on timing or specific movement goals

There's a common pattern in many of these games: whether it's swinging sabers or hitting colored blocks, they're often about timing and beat, like VR versions of Dance Dance Revolution.

Listen to your body

VR fitness games won't tell you if you're overextending yourself. Much like when I pulled a muscle in Ring Fit Adventure on the Switch, you need to remember to keep to your own pace, even if the game is screaming at you to do something. Start at the easiest setting and work your way up.

The information contained in this article is for educational and informational purposes only and is not intended as health or medical advice. Always consult a physician or other qualified health provider regarding any questions you may have about a medical condition or health objectives.


2020 forced us all to live online: Here’s how my virtual year went – CNET

Oculus Quest 2
Scott Stein/CNET

Every once in a while, back in a year called 2019, I'd run off every few weeks or so and spend time in a VR headset, or peek at some augmented reality device. I'd come back from demos imagining a world lived virtually. With all the pieces in play from so many companies, it seemed like the possibilities were almost here. In fact, at the beginning of 2020, I had a demo with one company, Spatial, imagining the future of remote work in augmented reality using smartglasses that made coworkers seem projected all around me. Seemed like science fiction at the time. 

Then 2020 happened.

On my last day in the office in 2020, I was attempting to film a video for Valve's upcoming VR game Half-Life: Alyx. I was setting up VR equipment in our studio. Then I realized, with rising COVID-19 cases everywhere, I'd better start working from home. I packed some gear in my bag. That was March 9, 2020.

Like an episode of The Twilight Zone, I was granted the dark wish of discovering what a world lived remotely, connecting only through tech, would be like. The answer: It isn't fun. While being at home pushed me into new workflows, introduced me to new types of art and let me set up elaborate VR play spaces in my own house, it was disruptive. And scary. And time just completely melted away.

When I look back at 2020, I'll always call it My Virtual Year. I haven't been to stores since March. I haven't been to museums. I haven't been to restaurants. I haven't seen friends, or family, except for a few hours here and there, separated by masks and distance. I haven't flown since last January. But I've been casting myself everywhere. Shooting videos from my living room; livestreaming for tech events; product reviews from my backyard. Always just me, and the people I connect with on my screens. I've been on the holodeck, permanently. 

I get restless. I get panicked. Sometimes I settle back in. I lose sense of time. Reading books helps. Hugging my family helps. But also, sometimes, connecting virtually with others helps. It's all worked, in some ways, a lot better than I expected. And a lot worse.

It's given me a totally new perspective on what we lost in 2020, and what tech didn't completely provide. But it also showed me so many glimpses of what's possible on the other side. This virtual year was the doorway to what I think will end up being many half-virtual years to come.


The Tempest, a VR theater experience that was on Oculus Quest this summer.

Tender Claws

VR started to creep into my everyday life

Virtual reality never became the hero to fix everyone's communication problems. Instead, Zoom swooped in. But at the end of 2020, I know people who have actually bought Oculus Quest 2 VR headsets. I no longer feel like my interest in VR is on a total island removed from everyone else I know. Like the first smartwatches, there's starting to be a bit of overlap.

The Oculus Quest, and Quest 2, have been my companions all year long. I've tested new games, demoed new worlds, been briefed on new products, attended experimental theater pieces, and increasingly, I've been exercising with it nearly every day. I play Beat Saber as a meditative escape, and the headset's added fitness tracker mode is starting to turn it into a floating workout space.

I don't use VR all the time, and I keep it in my office. But I dip in and out, curious what I can see next, what experiments will show me what's possible. I tried working in VR, casting all my laptop monitors into my headset. That's not quite ready for prime time yet. (But it's getting closer.)

Visionaries like Facebook's Michael Abrash see a big future in which VR will be our extended workspace. Business-targeted VR headsets with incredible resolution promise the same. It hasn't happened yet, but devices like the Quest 2 are far closer to being the accessories that could live in that world.

Meanwhile, PC VR kept slowly evolving, but not very much. Games like Star Wars Squadrons and Half-Life: Alyx -- which our sister site GameSpot honored as Game of the Year 2020 -- are fantastic, but the cables, setup and high prices of PC VR gear still make it feel like a weird niche hobby. I expect more headsets to connect cheaply and wirelessly, the way the Quest 2 does, and then some.


A virtual concert in Burning Man on AltspaceVR this year, with Diplo showing up as a holographic scan.

Microsoft

I went places in VR, almost

I saw the world of theater and live events slowly try to adjust to everyone being at home. The results were mixed. Some VR festivals had virtual screening rooms, and I had an amazing time exploring museum-like gallery spaces and seeing incredible experiences.

I never went to Burning Man before, but I did this year, putting on a Quest headset and entering a galaxy of virtual art spaces in an app called AltspaceVR. I saw a concert with Diplo, where he performed as a projected scan of himself in a crowd of cartoon people. I wandered out into the desert and floated into the air and saw massive, temple-like structures. I met briefly with friends.

VR theater pieces like escape rooms and an adaptation of The Tempest took me away from my home and my life, for an hour or so. Actors learning to work in VR led to some wild experiments. Some, like a multinight performance in VRChat created around the HBO show Lovecraft Country, were inspired but suffered from broken tools and laggy interfaces. I took part in an experimental VR theater show in VRChat called Finding Pandora X that made me part of a Greek chorus. Some of these moments felt captured in a special space, but all of them lacked real faces and eyes. In these VR worlds I was just a cartoon among cartoons.

But sometimes it felt like a sacred space. My nephew got a Quest 2 and he asked me to play games with him, so I bought him a cooking game called Cookout. We joined up and made sandwiches, and talked. We were avatars, and I couldn't see his face. But it was like we were hanging out together, for a little while.

I'd hoped to go to Disney this year, one of dozens of canceled plans. Instead of seeing Star Wars Galaxy's Edge in person, I tried ILMxLab's VR experience set in Batuu. It sometimes feels like a theme park, and at moments its incredible detail made me feel transported somewhere else. But it's more video game than park substitute. And it sometimes made me feel sadder about the real trips I've missed.


Zoom selfie, spring. It never got that much better.

Scott Stein/CNET

Our chat tools are evolving, but still broken

All. Day. Zoom. My kids on their remote schooling sessions, me at work, weekly chats with friends, and to relax, tickets to virtual theater experiences that also take place... on Zoom. This Full Zoom Year has gotten all of us a lot more familiar with screen-sharing, virtual backgrounds and the importance of mute buttons. But the awkwardness of it all never really went away.

Most of my time wasn't in VR: it was in various calls, video chats, AirPods in, iPad on, laptop swap, grabbing the right microphone, leaning over a camera.

I remember my meeting with Microsoft, discussing its Teams app experiment of putting people into virtual classrooms and auditoriums. That was months and months ago, and really, nothing much has changed in my life since then. I found my necessary tools, settled in and survived. But I can't say my workflow changed much after that. Microsoft's evolving take on video chat is a unique idea, but other apps didn't seem to evolve fast enough for anyone's needs. We used what worked.

Change is hard. But we've all had to do it. I learned new tools and set up home offices and classrooms, and upgraded my wireless network with an Eero mesh router (which I'm still not sure I set up right). 

I usually just settle down by my same window or blank wall, connect and do my best. It's not a perfect situation. I'd prefer apps that more easily blend and blur my background, and devices with much better cameras: Laptops are disappointing but well-placed; iPads are good but the camera's shifted too far to one side.

My favorite Zoom moments? A virtual Passover seder with my family. My weekly friend and work "drinks" days. A Zoom magic show at the Geffen Playhouse, called The Future, that involved props sent to me that I took out and used with the magician. Also, a puzzle-solving magic night called Inside The Box. Many of the others blur together.

I'd love for VR or future AR glasses to blend these Zooms into something more immersive. I could sit across from colleagues, or see my Mom next to me. VR headsets are magical, but are limited-use, don't connect to my Google and Apple apps, and are better suited right now for quick-dive nonrealistic things like games or art, or things where I'm trying to detach from everything else and immerse by myself for a bit. 

The promises of augmented reality keep growing, but the road to glasses is going to take a long time. Longer than this lockdown. There are apps that hint at what blending Zooms with 3D space will feel like, but they're totally experimental now. So far, my home life and my virtual lives have remained separate for the most part, except for clever games like Mario Kart Live Home Circuit, which let me shrink down and race through my cluttered home, or AirPods, which help me listen in on calls while making lunch for my kids.


Not tech at all. Just one of my 1 million food photos.

Scott Stein/CNET

My favorite moments of this year weren't really tech at all

The more I spent apart from real people, and only connecting online, the more I felt a strange panic at times. Did people know where I was? Did anyone share my feelings? Was I alone? Sometimes I'd feel anger, and resentment. Then I'd feel appreciative for what I had. The feelings fluctuated. I talked to a therapist. I tried to meditate.

I like cooking. Simply cleaning food, prepping, chopping. Doing dishes. Making things. Frying eggs. It makes the kids happy. I feel like I've made something real happen.

I like practicing magic. I do it once in a while, shuffling cards, trying a new coin move. Reading a book that can challenge my idea of how to perform. I like the way my hands can subtly move, with more nuance than any VR controller or computer keyboard.


Tele blends avatars and real faces in a chat app. Expect more of this.

Spatial

The future needs to glue my worlds together

Time accelerated, all of a sudden, from all this time in one place. The year ended. I wonder where it went. I'm sitting at home again, for another holiday where I'm not going anywhere. Hoping that the future, sometime soon, can be different for us. 

I see my kids hopping in and out of virtual classes, jumping upstairs to play online games with friends, and they've found some sort of pattern. But it doesn't replace what we lost.

My work from home, my connection with others remotely, doesn't replace what I've lost, either.

But there are ways all of these tools could expand the idea of how to connect. This year was a terrible pop-quiz field test of all of our VR, AR, remote connecting, gaming and online tools, and for me they earned a passing grade. But no one came close to acing anything. The very best I got was stuff that did well enough, worked without breaking, entertained and connected for a bit. 

VR doesn't know how to connect to the rest of my work apps and life yet. All my screens help me stay informed, but they tire me out. 

In the corners, there were immersive shows I went to that were full of creativity, brilliant ideas, hopes and dreams. I think there's a world ahead where those ideas could expand, with better tech, and just as extensions of live events and in-person experiences that we'll finally get to go back to again, someday.

Looking forward, 2021 and onwards might move on from this Virtual Year, but I don't think the tech experiments will go away. We laid a backbone for how a whole generation will connect to experiences together. Right now, it's a mess. In time, it'll probably get better. Every tech company got to see firsthand how their attempts at connection and community succeeded, and failed. I expect 2021 to be overflowing with promised solutions to our 2020 headaches, whether they come as a headset, a phone, a laptop, a router or something else entirely weird and new.

But all the tech in the world can't allow me to hug my mom again. Not yet, at least.


AirPods 3: Everything I hope Apple changes in the next-gen earbuds – CNET


The AirPods (left) and AirPods Pro (right). The original AirPods could adopt the Pro look next year.

Sarah Tew/CNET

Apple may have the new AirPods 3 ready for 2021. After a year spent trapped in my house and trying to get work done, AirPods have become a practically essential tool. Lots of other excellent Bluetooth earbuds exist, but AirPods are still impressively instant and have really good microphones. Plus, being able to swap from one ear to another has helped me extend battery life by one-earing my way through long meetings.

The second-gen AirPods are a year and a half old, while AirPods Pro are a year old, and have continued to evolve. Apple added spatial audio earlier this year, and the massively expensive AirPods Max arrived with a different proposition (I prefer small earbuds, not big headphones), so what will AirPods 3 bring? And what do I even want? For me, it's mostly about better connection.

AirPods Pro Lite buds could shift all AirPods to the Pro look

The latest reports say that the third-gen AirPods will look like the Pros, minus the active noise cancellation and spatial audio features. That means shorter stems and replaceable eartips in different sizes.

I appreciate the AirPod Pro design and its less visible look, but I have mixed feelings about the eartips. The deeper in-ear feel seals off for better noise-blocking, but I appreciate the easier and more pass-through-friendly design of the originals for casual call-monitoring. Maybe Apple will split the difference somehow.

AirPods Pro Lite could also end up being more expensive than the second-gen AirPods, which isn't ideal at all. Maybe they'd deliver better sound or better battery life to make the possible price bump worth it.


The AirPods Pro have pass-through audio and noise cancellation. Don't expect all that on entry-level AirPods, but hopefully strides can be made on Bluetooth connection.

Angela Lang/CNET

I want AirPods to device-swap even more effortlessly

Apple's latest iOS 14 and Big Sur OS updates have helped AirPods autoconnect to Apple devices faster, but I still find that I have to double-check that AirPods are paired during a Zoom or FaceTime. Sometimes I also encounter bugs that disconnect the AirPods entirely.

I don't just want to switch to whatever device I'm using, though: I want to actively switch as I multitask, or even blend sources. I look at my phone, laptop, even a TV at the same time. I'd love to tap-select to swap, or maybe even use my head orientation as a way for the AirPods to somehow sense what device I'm using. Apple's latest devices have U1 chips that can spatially recognize where other Apple devices are. New AirPods could maybe tap into this more, too. Keep in mind, even the expensive AirPods Max don't have a U1 chip, so less expensive AirPods aren't likely to, either... but it's an interesting thought.
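For the curious, the U1-style spatial awareness Apple already ships on iPhones is exposed to developers through the Nearby Interaction framework. Below is a minimal, hedged sketch of ranging another U1 device with that existing phone-side API; it's purely an illustration of what's possible today, not anything AirPods actually support, and it assumes the two devices have already exchanged discovery tokens over some other channel.

```swift
import NearbyInteraction

// Illustrative sketch only: ranging another U1-equipped device with the
// Nearby Interaction framework (iOS 14+). AirPods do not expose this today.
final class DeviceRanger: NSObject, NISessionDelegate {
    let session = NISession()

    // peerToken must come from the other device's NISession.discoveryToken,
    // exchanged over a channel of your choosing (Bluetooth, a server, etc.).
    func start(with peerToken: NIDiscoveryToken) {
        session.delegate = self
        session.run(NINearbyPeerConfiguration(peerToken: peerToken))
    }

    func session(_ session: NISession, didUpdate nearbyObjects: [NINearbyObject]) {
        // distance is reported in meters; direction is a unit vector when available.
        if let nearest = nearbyObjects.first, let meters = nearest.distance {
            print("Peer device is about \(meters) meters away")
        }
    }
}
```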

Or I'd even love a way to mix sources (a feed from my laptop, video from my iPhone). Apple already mixes ambient outside noise with one audio source via Transparency -- could it go further? Or could Apple adopt more intelligent audio analysis of ambient noise and adjust what you're listening to to boost treble or bass or spoken voices?


AirPods don't measure heart rate, but they should work even more closely with the Apple Watch, which does.

Sarah Tew/CNET

What about AirPods fitness features?

Apple just launched its own Fitness Plus workout subscription service, which requires an Apple Watch. AirPods are an important piece of the Apple Watch-music-workout triangle, but so far the AirPods' built-in tap controls are pretty limiting, and lack positional awareness.

Adding more shortcuts or tap gestures for workouts and activity monitoring could be smart. And would it be possible for AirPods to recognize when you're starting to run or exercise? The AirPods Pro have gyros and accelerometers that could unlock more movement and positional awareness. Next-gen AirPods Pro might push into this territory more, but hopefully the third-gen AirPods will, too.

It's unlikely that Apple will add actual health-tracking features, like step counting or heart rate, to the AirPods. But the Apple Watch and AirPods should form even more of a symbiotic pair: heart rate and fitness readings in your ears, maybe even deeper watch control through AirPods commands. And it should be noted that ears are a pretty good place to measure heart rate.


We'll know more about AirPods 3 next year, probably

I wouldn't hold off on getting AirPods now, but keep in mind that new models should be coming sometime before the end of next year. Battery life, audio performance and fitness are logical places to focus on. But I hope they find a way to be better, more reliable connect-to-everything wearables, too.


Google Stadia on iPad is my favorite way to play Cyberpunk 2077 – CNET


Stadia comes to iOS, and Cyberpunk 2077 is the killer app.

Scott Stein/CNET

I haven't played Cyberpunk 2077 on a PlayStation 5 or an Xbox Series X, or a PS4 or Xbox One, or a PC. Instead, I've been playing the game on a couple of iPads (and a TV, and laptops) using Google's streaming game service Stadia. Considering the outcry over dated console graphics and massive glitches and bugs, it feels like I made the right call. A year after Stadia made its debut, I think I've finally found its killer app.

Stadia has just arrived on iOS, making it the latest streaming game service that now works with iPhones and iPads via the browser (there's no official iOS app available yet). I played using Safari, because you can easily add a quick-launch shortcut to your desktop that starts up full-screen like a regular app, no browser borders getting in the way. I paired an Xbox One controller via Bluetooth, stood an iPad Pro up in a Magic Keyboard case, and I was set.

So far, I'm very pleased with my decision. No downloads, no physical console. In a weird way, this is exactly the sort of cyberpunk way of playing games that a cyberpunk game deserves.

I don't want to burn through a lot of storage space -- or endless updates

The last few months have reminded me of what next-gen consoles and PCs require for game downloads: 50GB and up. Cyberpunk 2077 takes up between 60 and 100GB of space for consoles. I'm already close to maxing storage on an Xbox Series X and PS5 I'm trying out. My home internet is 100Mbps. I downloaded a 170GB PC VR game last week; that was enough.

Even with two kids doing remote schooling, Stadia ran fine on my 100Mbps internet. Again, I wouldn't say it was perfect, but it was more than good enough to play and not get annoyed. I don't really expect more out of Cyberpunk 2077 in my life right now: For me, it's a story I want to dip into and follow. I want to engage with it casually. I don't have endless hours to stay glued to one big screen and console all the time. 


The iPad Pro display and speakers are fantastic for this game.

Scott Stein/CNET

Stadia is fast-starting and surprisingly nonbuggy...

On a big-screen 12.9-inch iPad Pro, Stadia was fantastic. The game loaded fast and I didn't have to wait hours for it to install. The game looked good enough to me, although it's possible you'd be disappointed if you were a PC gamer. Everything was plenty playable, and I took in the storyline and was able to play along with an Xbox controller paired.

It feels like the same experience as on a PC, Mac or a TV, except for the screen size limitations. It worked well on an iPad Air, too. Due to Cyberpunk 2077's pretty in-depth layout of small text and weird icons, 10 inches is about as small as I want to get. (Phones are out of the question for me, but Stadia works for Android phones and iPhones, too.)

I had moments where the streaming slowed down a bit, or controller syncing got a bit weird. Mostly, it was totally fine.

Read more: GameSpot's Cyberpunk 2077 review

...until it is buggy

Of course, the moment I finished writing this, I wasn't able to load Cyberpunk 2077 on Stadia. One afternoon, the game just got stuck on the title screen. I gave up and walked away and spent time with my kids instead. Wireless connections and streaming are an everyday puzzle in my Isolation House. (It's working fine, now.)


I love swapping screens

It's relaxing to drag Cyberpunk 2077 around to wherever I might be, and play there. You can do that with a gaming laptop, or you could locally stream from an Xbox Series X or PC. It's a lot easier on something like Stadia, though.

It often feels like streaming game services are striving for what streaming TV services do. I can do my binging of Netflix shows wherever, and know everything will pick up when I get to whatever device I have with me. Same thing with Stadia. Maybe it's being trapped at home for a year, knowing that my devices are scattered around, and I move from room to room, but Stadia seems to be a better fit for me now than it was a year ago. I don't love it for smaller games (just play it natively on the device it was designed for) or twitch-action games (where those occasional glitches have a much bigger impact on your enjoyment). But for a sprawling, story-based game like Cyberpunk 2077, it's great.

Stadia is my Cyberpunk spot (for now)

A year into Google's launch of Stadia, there are now several streaming competitors: Xbox cloud gaming via Xbox Game Pass, Nvidia GeForce Now and Amazon Luna. Stadia has a subscription option with a few included games, but most games have to be bought at full price. Stadia never appealed to me much a year ago, and Cyberpunk 2077, for all its crossovers with cyberpunk literature and films I've loved, isn't really my favorite game, either. But I don't feel any need to download a huge Cyberpunk game anywhere else in my life. I much prefer that Google host that stuff somewhere else and just let me stream it.


Myst has arrived in VR, and it’s a perfect fit – CNET


Get ready to explore the island again.

Cyan Worlds

Before escape rooms, before VR headsets, there was Myst. The 1993 CD-ROM game was, decades ago, my very favorite form of virtual reality. I clicked through it, mesmerized, haunted by the images, confused by the puzzles. I've played it over and over again on game consoles, and on the iPad and iPhone. It's back again in VR form, and this might be the best version yet.

Cyan Worlds, the company that Myst creators Rand and Robyn Miller started, still makes games, and many of them are VR-focused. Obduction, a PC virtual reality puzzle game, came out in 2016. Cyan also has a new VR game coming in the future, called Firmament. In the meantime, Myst has gotten a total VR makeover, launching as an Oculus Quest exclusive at first before moving to PC.

Is it worth your while? I've been playing for a few days on the Oculus Quest 2, and I've been pretty stunned by the game's visuals and immersive sound. It's still completely the same Myst game as before, and movement is handled by teleporting (sort of like a click, but not really). The already available 3D version of Myst, called RealMyst, had the same sort of translation from flat clickable images to a 3D world.

What's new here is the way my hands can reach out and control things. Instead of clicking a door handle, I can move a floating hand and grab the handle. Or turn a knob. Or pull a lever. It's very much like what other VR games already do well, and that escape-room-like VR experiences such as The Room VR excel at. It makes Myst feel more like a living world.

But it's not entirely a new experience. The static landscapes and click-and-move style of exploration are the same, which is mostly great. Sometimes I do miss the booklike flow of the original Myst, though. Much like the game itself was about a magic book whose pages opened to other worlds, the original Myst's gameplay moved like a magic picture book. Finding details and clicking on them was part of the strange wondrous experience. You never knew what you might find.


Looking up, or down, or around, is a different experience. Finding clues, I wondered how I'd write them down (I ended up peeking below my VR headset and jotting them on my phone). I'd love to have some sort of in-game notepad -- luckily, there's supposed to be a "journaling system" coming in an update, but I don't know how that will work.

One nice touch in this Myst is a puzzle randomizer that could make solving the game a new experience, even if you've done it before.

Myst takes up a lot of storage space on the Quest 2 (nearly 9.5GB), so keep that in mind. If you have a 64GB Quest or Quest 2, that's a whole bunch of storage it'll eat up. But if you've never played Myst before and you have an Oculus Quest, this is a pretty fantastic and meditative puzzle experience. It also makes me hunger for a brand-new Cyan VR game that could stretch out to even bolder, wilder places, too.

See also: The best VR games and experiences on Oculus Quest and Quest 2



Best Nintendo gifts that aren’t a Switch – CNET

This story is part of Holiday Gift Guide 2020, CNET's gift picks with expert advice, reviews and recommendations for the latest tech gifts for you and your family.

Nintendo doesn't have a new console this year like the PS5 or Xbox Series X, and it's entirely possible that the person you're thinking about already has a Switch. (If they don't, get a Switch.)

While the previously hard-to-find Nintendo Switch is back in stock now, there are other ideas for Nintendo fans you may not have thought of. Not games, exactly (for games, just buy them an eShop gift card and let them choose), but other, more unusual ideas from the House that Mario Built. 

Scott Stein/CNET

Lego has a whole line of interactive Mario block kits timed for the 35th anniversary of Super Mario Bros. on the NES, which work with a talking Mario figure that can bop on coded blocks, turning playsets into a sort of board game. The multiple kits pair with a connected app and don't work with Nintendo Switch, but can be a whole different type of holiday game: Get the base set to make sure you have the necessary Mario figure and then add expansions. Read our review.

Scott Stein/CNET

Nintendo's Switch-connected RC car game needs a lot of room in your home, but manages to turn your indoor space into Mario Kart tracks like magic. The karts are controlled by a Nintendo Switch, turning the kart's camera feed into a video game mix of real obstacles and imaginary cars and power-ups. The Kart kit includes one car and the necessary cardboard gates, but to race a friend you'll need another Switch and another kart, and the game doesn't currently work with online players. Read our review.

Nintendo

Nintendo does have one new handheld this year. The Game & Watch was Nintendo's original set of handhelds from way before the Game Boy, and Nintendo's made a brand-new one with three games loaded. There are perfect recreations of the NES Super Mario Bros. and Super Mario Bros.: The Lost Levels, plus a Game & Watch game called Ball. It's expensive for a handheld that just plays three games, but as a collector's item and piece of nostalgia, it's brilliant. Read our review.

Bridget Carey / CNET

Stuck indoors? Looking for something to get active? Nintendo's fitness accessory for the Switch arrived last year, but it's still the best workout for anyone with a Switch that docks into a TV (sorry, the Switch Lite won't work unless you buy extra Joy-Cons). The flexible Ring-Con controller and a leg strap work with a battle-adventure game that combines stretches, jogging in place, and various workout reps, plus a bunch of extra mini-game modes. Multiple family members can save their own profiles, too. Read our hands-on experience.

Sarah Tew/CNET

Don't sleep on the weird Labo: The cardboard-folding experiences combine elaborate papercraft with interactive games that you can play with the things you make. Making things can take hours and might require parental help, but if you're stuck inside looking for a big project to play with and have a TV-dockable Switch (not the Lite), these kits are still great. And they're often on sale. The Toy-Con Vehicle Kit (listed below) is a really fun set of driving/flying games with wheels and controllers you fold and build yourself. Read our review.

Óscar Gutiérrez/CNET

OK, here's a secret about Switches: They tend to multiply. Nintendo's sneaky Switch strategy, blending family sharing with personal handheld play, means it's not crazy to consider another Switch for someone else in the same house. I've already done it and the Switch Lite ($200) might be that second system you're looking for. But keep in mind that if you're a multi-Switch household, you're better off getting physical game cartridges that can be shared. Read our review of the Switch Lite.


One of our favorite Switch docks. You could always set one up in front of a second TV.

Sarah Tew/CNET

Or, get a Switch accessory

There are tons of useful extras to get Switch owners, from controllers to docks to Bluetooth headset adapters. Here are our favorites right now.


Apple’s secret weapon in AR is right in front of us – CNET

Sometime in the not-too-distant future, Apple will reportedly unveil an augmented- or mixed-reality headset. Apple hasn't discussed any headgear yet. But augmented reality is alive and well on the iPhone -- and it's getting better fast. 

Apple began its AR journey in 2017, making a splash with virtual Ikea furniture and realistic-looking outdoor Pokemon Go battles. This year, I've been standing on street corners scanning fire hydrants with Apple's new iPhone 12 Pro. I've mapped my house's interior. I've navigated lava rivers on my floors.

In many ways, Apple's depth-sensing lidar sensor on the latest iPhones and iPads, with its advanced 3D-scanning possibilities, feels like the backbone of the Apple headsets of the future.

Facebook, Microsoft and Magic Leap are already exploring goggles and glasses that aim to blend the virtual and real, with more headsets coming in the future using Qualcomm chips. But Apple's AR mission right now, according to Mike Rockwell, Apple's head of AR, and Allessandra McGinnis, its senior product manager for AR, is to make everything work better on the device you already have in your pocket. Layering AR onto real-world locations, popping up experiences automatically, building creative tools and developing assistive tech on top of AR's capabilities could, in the long run, produce the biggest killer apps.

"AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we've got to make sure that it is successful," Rockwell says. "For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort."

Rockwell and McGinnis also talked with me about what's different now compared to three years ago, and why phones matter so much for what comes next.


Apple's killer AR app: The phone you already have

Virtual reality headsets like the Oculus Quest 2, while continually improving in quality, are still not used by many people compared to phones. "Nobody is really talking about numbers" of VR headsets sold except Sony, which has sold 5 million PlayStation VR headsets so far, says senior consumer chip analyst Anshel Sag of Moor Insights, although "there's a high probability that [the Quest 2] could hit 5-6 million headsets sold in the first year." But even then, those VR headsets use apps that usually feel removed from the phones and computers we use every day. AR headsets still don't exist in any significant numbers yet, even years after Magic Leap and Microsoft's HoloLens promised an imminent future of mixed reality.

"It's been a pretty hard road for developers that are VR-only, or are trying to do AR-only experiences," Rockwell notes. "There just aren't that many [devices] out there." Meanwhile, Apple's sheer numbers of AR-enabled iPhones and iPads dating back to 2017 number in the hundreds of millions. "Even if you only appeal to a relatively small percentage, that's still a gigantic number."

Apple says there are already 10,000 AR-enabled iOS apps from 7,000 developers, with many focused on shopping or home improvement as a way to practically use AR at home. Practicality is exactly what Apple seems most intent on at the moment. "We wanted to provide a platform and ecosystem for developers where they could make a living," Rockwell says.

While the COVID-19 pandemic has shut down physical businesses and slowed travel for most people, home shopping using AR tools is a major part of Apple's focus right now. Much in the same way Google and Microsoft are pursuing ways to see things you might want to buy in 3D on your phone at home using phone-based AR tools, Apple's hook-ins to its Safari browser enabling pop-up AR shopping look to be stand-ins for going to stores. 


"Home Depot's found that people are two to three times more likely to convert when they view a product in AR than others that don't," McGinnis points out, citing numbers from Shopify and Build.com that show a greater likelihood to buy (94%) and a 22% lower rate of returns.

App developers including Adobe, which makes the AR creative app Aero for Apple's iOS, seem to see phone-based AR the same way. "Headsets are on our roadmap. None of them has reached the critical mass that makes sense for us to deploy," Adobe's head of AR, Stefano Corrazza, says as to why the company hasn't explored headset creative tools beyond acquiring Medium from Oculus: "Until we have an Apple or Google putting something out there in broad scale, it doesn't make a lot of sense for us to push it out."

In the meantime, smartphones like the new $999 iPhone 12 Pro can be primary creative tools, building up to headsets down the road. "Even with a headset, the phone will be where all the computation happens," Corrazza says. "And you can stream to the glasses potentially."

That's the same model Qualcomm is already building on for future AR/VR devices, too, but it could take years. In the meantime, there are phones. "It's going to be the primary device for a while for consuming," Corrazza says of the iPhone 12 Pro, "but also for scanning and 3D content, it's a very powerful machine." Adobe doesn't use 3D scanning tools on Aero yet, but may be exploring ways to incorporate those features down the road.

Lidar as a step towards AR as a creative tool

Apple's first steps in AR, alongside the iPhone 8, just recognized floors using the phone's motion sensors, gyros and built-in camera. Then it recognized walls and people. Lidar-enabled iPhones and iPads, which invisibly spray an array of infrared lasers from a small black circle near the rear cameras, go a significant step further by quickly meshing (mapping in 3D) a room's full dimensions. That also includes 3D objects and people in the space. It's an evolution of the type of tech that Google explored years ago with a line of depth-sensing Tango phones, but on a more advanced and widespread scale. Many early lidar-enabled apps like Polycam, 3D Scanner and Record 3D are very creative and 3D capture-focused, a big shift from the dinosaur-conjuring, game-playing AR apps back in 2017.

"That's part of the reason why we put this scanner on the device. We felt like it was a key technology that could open up an explosion of 3D assets that can be used for all kinds of things," Rockwell says. "It also opens the possibility of being able to start to scan environments in a way, and be able to make it easier to create 3D objects."

One of the largest repositories of 3D objects on the internet, Sketchfab, is already seeing an uptick after years of earlier explorations in 3D scanning. Sketchfab just hit 4 million subscribers and had its first profitable month since the service began in 2012.

But as Sketchfab's CEO Alban Denoyel says, he's been through previous times where he expected a boom in 3D objects. When VR headsets debuted in 2016 along with a couple of Google 3D-scanning Tango phones, there was a lot of hype. The market adoption didn't happen, though, leading to what Denoyel calls a "VR winter." It might finally be picking up now.

Snapchat is already exploring using lidar for AR effects that can put virtual things into the real world, and even has larger-scale experiments scanning whole city blocks. "We look at depth as very foundational," Snapchat's VP of the Camera Platform Eitan Pilipski says. 

Even with these possibilities, though, I find that learning to use these new tools can be daunting. Apple's own AR-creation tool, Reality Composer, and Adobe's 3D AR creative toolkit, Aero, are not necessarily apps you'll instantly download, and I still find them to be apps I avoid. The 3D-scanning apps I've tried so far are fascinating, but also experimental, and not always intuitive. Apple's largely put the world of 3D-scanning apps into developers' hands, while Apple's everyday core iOS tools don't incorporate these features much at all.

Apple's iOS-wide support for 3D objects does suggest a way that 3D things could eventually be shared like PDFs or photos. But in some ways, the creative tools for this future don't fully exist yet.
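That PDF-like sharing already has a system-level hook: any app can hand a USDZ model to AR Quick Look and get a full 3D and AR preview without building a viewer of its own. A minimal sketch, assuming a hypothetical "chair.usdz" file bundled with the app:

import UIKit
import QuickLook

// Minimal sketch: preview a bundled USDZ model with AR Quick Look,
// much like previewing a PDF or a photo. "chair.usdz" is a stand-in name.
final class ModelPreviewController: UIViewController, QLPreviewControllerDataSource {
    func showModel() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // AR Quick Look takes over from here, offering object and AR views.
        let url = Bundle.main.url(forResource: "chair", withExtension: "usdz")!
        return url as QLPreviewItem
    }
}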

The possibilities for photography could also be amazing, and Apple's own Camera app uses the iPhone 12 Pro's lidar to improve focus for night photos and portraits. But Apple doesn't incorporate AR into its Camera app or allow any 3D scanning yet; those ideas are left to developers to explore. Some apps, like DSLR Camera, already use the iPhone's lidar to build a layer of 3D depth information on top of photo data, placing 3D text into photos.

"The app is able to calculate the segmentation between the person and the background object," says Fulvio Scichilone, the creator of DSLR Camera. "The future plan for the AR portrait ... is to move, with the gyroscope or your finger, the frame of the picture."

people-detection-ios-lidar

People Detection recognizes people and measures distance, using AR tech. 

Scott Stein/CNET

Augmented reality as extended senses, and an accessibility tool

Apple sees discoverability as AR's killer app, but there's another huge opportunity arriving in accessibility. AR can literally extend one's senses. In the audio realm, Apple already offers hearing-assistance features through AirPods, and Facebook is exploring spatial audio for assistive hearing as well.

The same could come for assisting sight. Future vision-assistive products, like Mojo Vision's promised Mojo Lens augmented contact lenses, aim to be helpful tools for the vision-impaired. Apple could be taking a similar path with how AR on the iPhone, and on future devices, works as an assistive tool. Already, a new People Detection feature in iOS 14.2 uses Apple's AR tech and lidar to detect people and measure how far away they are, providing vision assistance on the newest iPhones.
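Apple hasn't published how People Detection is built, but the distance-measuring half of such a feature can be approximated with the lidar depth map ARKit provides. The sketch below only reads the depth, in meters, at the center of each frame; the real feature pairs this kind of data with actual person detection.

import ARKit

// Simplified sketch: read lidar depth (meters) at the center of each frame.
// This is only the distance-reading half of a people-detection feature.
final class DistanceReader: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) else { return }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics.insert(.sceneDepth)   // lidar depth map, iOS 14+
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let depthMap = frame.sceneDepth?.depthMap else { return }
        CVPixelBufferLockBaseAddress(depthMap, .readOnly)
        defer { CVPixelBufferUnlockBaseAddress(depthMap, .readOnly) }

        let width = CVPixelBufferGetWidth(depthMap)
        let height = CVPixelBufferGetHeight(depthMap)
        let rowBytes = CVPixelBufferGetBytesPerRow(depthMap)
        guard let base = CVPixelBufferGetBaseAddress(depthMap) else { return }

        // Depth values are 32-bit floats, in meters.
        let centerRow = base.advanced(by: (height / 2) * rowBytes)
            .assumingMemoryBound(to: Float32.self)
        let meters = centerRow[width / 2]
        print(String(format: "Whatever is at frame center is %.2f m away", meters))
    }
}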

That could just be the beginning. "There's a lot more we can do, especially related to our understanding of the environment that is around us," Rockwell says. "We can recognize people, but if you think about what a human being can understand about an environment, there's no reason that in the fullness of time a device can't have that level of understanding, too, and provide that to developers."

"We'll be working together with the blind and partially sighted communities to improve specifically on the people-detection side," adds McGinnis.

alicjakwade-allatanytime-02.png

A location-based AR art exhibition by Alicja Kwade in Acute Art.

Alicja Kwade/Acute Art

AR's future killer app: Being instant

Even though I cover AR all the time, I admit I forget to look for new AR apps when I use an iPhone or iPad in my daily life. Discovering what's new in the virtual world while I'm busy in the real one isn't a seamless process.

Rockwell sees the future of iPhone AR not as apps, but as quick-glance moments. "Something that you're dipping in and out of three, four, five, six times a day to do various things, and they're lightweight experiences," he explains. "The killer app is really that it's going to be used in a kind of regular basis all the time in these little ways that help you to do the things that you do today, that make them easier and faster."

The road to that involves App Clips, Apple's new iOS 14 feature that launches lightweight micro-apps on an iPhone without requiring a download. App Clips can be triggered by NFC tags or scannable codes placed in the real world. I could scan or tap, and suddenly bring up AR related to the place I'm in, such as a virtual menu or a museum exhibit brought to life.
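Under the hood, a scanned code or tapped NFC tag hands the App Clip an invocation URL, which the clip reads to decide what to show. A minimal SwiftUI sketch; the URL and "exhibit" parameter here are hypothetical, not from any real museum app:

import SwiftUI

// Minimal sketch of an App Clip reading its invocation URL, e.g.
// https://example.com/clip?exhibit=dinosaurs (hypothetical).
@main
struct MuseumClip: App {
    @State private var exhibitID: String?

    var body: some Scene {
        WindowGroup {
            Text(exhibitID.map { "Showing AR exhibit \($0)" } ?? "Scan a code to begin")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    // The system passes the physical trigger's URL via NSUserActivity.
                    guard let url = activity.webpageURL,
                          let items = URLComponents(url: url, resolvingAgainstBaseURL: false)?.queryItems
                    else { return }
                    exhibitID = items.first(where: { $0.name == "exhibit" })?.value
                }
        }
    }
}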

It also involves Apple's mapping efforts. Apple's new Location Anchors mean virtual AR objects can exist in real-life locations -- imagine seeing a virtual piece of art in Times Square -- shared by multiple people at the same time. 

"If it's in one of the areas that we have high-res mapping, which is quite a lot in the US ... if it's within one meter, you can place an experience," Rockwell says of Location Anchors, promising a better-than-GPS level of location-specific accuracy. Meanwhile, App Clips, which are triggered by particular QR codes or anchors in the real world, "can be down to centimeters of accuracy."

Both of these are still a work in progress for Apple's AR efforts: In a year of pandemic-induced isolation, fewer people have been out in the stores, museums and other public places where this type of location-based AR could emerge. But Apple sees both as crucial for people using AR on a daily basis.

"We knew we had to solve those problems in order for AR to become a mainstream experience -- I think we really are quite on the cusp of that for folks to have AR become something that is more a part of their everyday life," Rockwell says.

"My perception is that App Clips and Anchors will make a massive difference," Acute Art CEO Jacob De Geer says. Acute Art is an app that already hosts AR exhibits in real-world locations, but one of the current challenges to people finding this art is knowing it's there. "The main issue, not just in AR but everything in tech now is, 'Hey, how do you get people to download your app?'"

Another challenge for AR is that it isn't really any one thing. Is it 3D art? Is it a set of tools for spatially scanning the world and sensing everything better? Seen that way, maybe AR is invisible. Maybe it shares a philosophy with how Google sees AR, as a world-scanning tool.

"Often we hear people are using AR [apps] and don't know what they are," McGinnis says, referring to popular iPhone tools like Warby Parker's instant AR-enabled glasses try-on. "As it becomes more and more mainstream, it doesn't matter if you know it's AR or not. It matters that you have an amazing experience in your device."

The future groundwork is being laid now

Combine Apple's lidar-based 3D scanning, its increasingly capable AR tools for realistic visuals and the AirPods Pro's introduction of spatial audio, which can make things you're listening to sound like they're moving in 3D space, and it isn't hard to imagine a future Apple AR headset.

Apple won't comment on that. But in the meantime, the company is working on encouraging a groundswell of developers to make AR-ready apps. And whether or not a headset arrives anytime soon, more spatially aware iPhones and iPads will become world-scanning devices with possibilities of their own, maybe even for robotics, or as computer-vision-enabled cameras in unexpected places.

"These things are, kind of in the beginning, a delicate thing, and you have to have all of the elements there, all these ingredients, for them to be successful," Rockwell says. 

"A few years from now, it'll be one of those things where you kind of can't remember living without it, just like the internet," he adds. "You're going to feel like, wow, I'm using this on a regular basis ... it will become just integrated into our lives."


Apple’s secret weapon in AR is right in front of us – CNET

Sometime in the not-too-distant future, Apple will reportedly unveil an augmented- or mixed-reality headset. Apple hasn't discussed any headgear yet. But augmented reality is alive and well on the iPhone -- and it's getting better fast. 

Apple began its AR journey in 2017, making a splash with virtual Ikea furniture and realistic-looking outdoor Pokemon Go battles. This year, I've been standing on street corners scanning fire hydrants with Apple's new iPhone 12 Pro. I've mapped my house's interior. I've navigated lava rivers on my floors.

In many ways, Apple's depth-sensing lidar sensor on the latest iPhones and iPads, with its advanced 3D-scanning possibilities, feels like the backbone of the Apple headsets of the future.

Facebook, Microsoft and Magic Leap are already exploring goggles and glasses that aim to blend the virtual and real, with more headsets coming in the future using Qualcomm chips. But Apple's AR mission right now, according to Mike Rockwell, Apple's head of AR, and Allessandra McGinnis, its senior product manager for AR, is to make everything work better on the device you already have in your pocket. Layering AR with real-world locations and popping up experiences automatically, while making creative tools and developing assistive tech based on AR's capabilities, could, in the long run, become the biggest killer apps.

"AR has enormous potential to be helpful to folks in their lives across devices that exist today, and devices that may exist tomorrow, but we've got to make sure that it is successful," Rockwell says. "For us, the best way to do that is to enable our device ecosystem, so that it is a healthy and profitable place for people to invest their time and effort."

Rockwell and McGinnis also talked with me about what's different now compared to three years ago, and why phones matter so much for what comes next.

p1002916-3
Patrick Holland/CNET

Apple's killer AR app: The phone you already have

Virtual reality headsets like the Oculus Quest 2, while continually improving in quality, are still not used by many people compared to phones. "Nobody is really talking about numbers" of VR headsets sold except Sony, which has sold 5 million PlayStation VR headsets so far, says Senior Consumer Chip Analyst Anshel Sag, of Moor Insights, although "there's a high probability that [the Quest 2] could hit 5-6 million headsets sold in the first year." But even then, those VR headsets use apps that usually feel removed from the phones and computers we use everyday. AR headsets still don't exist in any significant numbers yet, even years after Magic Leap and Microsoft's HoloLens promised an imminent future of mixed reality.

"It's been a pretty hard road for developers that are VR-only, or are trying to do AR-only experiences," Rockwell notes. "There just aren't that many [devices] out there." Meanwhile, Apple's sheer numbers of AR-enabled iPhones and iPads dating back to 2017 number in the hundreds of millions. "Even if you only appeal to a relatively small percentage, that's still a gigantic number."

Apple says there are already 10,000 AR-enabled iOS apps from 7,000 developers, with many focused on shopping or home improvement as a way to practically use AR at home. Practicality is exactly what Apple seems most intent on at the moment. "We wanted to provide a platform and ecosystem for developers where they could make a living," Rockwell says.

While the COVID-19 pandemic has shut down physical businesses and slowed travel for most people, home shopping using AR tools is a major part of Apple's focus right now. Much in the same way Google and Microsoft are pursuing ways to see things you might want to buy in 3D on your phone at home using phone-based AR tools, Apple's hook-ins to its Safari browser enabling pop-up AR shopping look to be stand-ins for going to stores. 

Now playing: Watch this: Our in-depth review of the iPhone 12 and 12 Pro

13:48

"Home Depot's found that people are two to three times more likely to convert when they view a product in AR than others that don't," McGinnis points out, citing numbers from Shopify and Build.com that show a greater likelihood to buy (94%) and a 22% lower rate of returns.

App developers including Adobe, which makes the AR creative app Aero for Apple's iOS, seem to see phone-based AR the same way. "Headsets are on our roadmap, None of them has reached the critical mass that makes sense for us to deploy," Adobe's head of AR, Stefano Corrazza, says as to why the company hasn't explored headset creative tools beyond acquiring Medium from Oculus: "Until we have an Apple or Google putting something out there in broad scale, it doesn't make a lot of sense for us to push it out." 

In the meantime, smartphones like the new $999 iPhone 12 Pro can be primary creative tools, building up to headsets down the road. "Even with a headset, the phone will be where all the computation happens," Corrazza says. "And you can stream to the glasses potentially."

That's the same model Qualcomm is already building on for future AR/VR devices, too, but it could take years. In the meantime, there are phones. "It's going to be the primary device for a while for consuming," Corrazza says of the iPhone 12 Pro, "but also for scanning and 3D content, it's a very powerful machine." Adobe doesn't use 3D scanning tools on Aero yet, but may be exploring ways to incorporate those features down the road.

Lidar as a step towards AR as a creative tool

Apple's first steps in AR, alongside the iPhone 8, just recognized floors using the phone's motion sensors, gyros and built-in camera. Then it recognized walls and people. Lidar-enabled iPhones and iPads, which invisibly spray an array of infrared lasers from a small black circle near the rear cameras, go a significant step further by quickly meshing (mapping in 3D) a room's full dimensions. That also includes 3D objects and people in the space. It's an evolution of the type of tech that Google explored years ago with a line of depth-sensing Tango phones, but on a more advanced and widespread scale. Many early lidar-enabled apps like Polycam, 3D Scanner and Record 3D are very creative and 3D capture-focused, a big shift from the dinosaur-conjuring, game-playing AR apps back in 2017.

"That's part of the reason why we put this scanner on the device. We felt like it was a key technology that could open up an explosion of 3D assets that can be used for all kinds of things," Rockwell says. "It also opens the possibility of being able to start to scan environments in a way, and be able to make it easier to create 3D objects."

One of the largest repositories of 3D objects on the internet, Sketchfab, is already seeing an uptick despite years of previous explorations in 3D scanning before this. Sketchfab just hit 4 million subscribers and had its first profitable month since the service began in 2012.

But as Sketchfab's CEO Alban Denoyel says, he's been through previous times where he expected a boom in 3D objects. When VR headsets debuted in 2016 along with a couple of Google 3D-scanning Tango phones, there was a lot of hype. The market adoption didn't happen, though, leading to what Denoyel calls a "VR winter." It might finally be picking up now.

Snapchat is already exploring using lidar for AR effects that can put virtual things into the real world, and even has larger-scale experiments scanning whole city blocks. "We look at depth as very foundational," Snapchat's VP of the Camera Platform Eitan Pilipski says. 

Even with these possibilities, though, I find that learning to use these new tools can be daunting. Apple's own AR-creation tool, Reality Composer, and Adobe's 3D AR creative toolkit, Aero, are not necessarily apps you'll instantly download right away, and I still find them to be apps I avoid. The 3D-scanning apps I've tried so far are fascinating, but also experimental, and not always intuitive. Apple's largely put the world of 3D-scanning apps into developers' hands, while Apple's everyday core iOS tools don't incorporate these features much at all.

Apple's iOS-wide support for 3D objects does suggest a way that 3D things could eventually be shared like PDFs or photos. But in some ways, the creative tools for this future don't fully exist yet.

The possibilities for photography could also be amazing, and Apple's own Camera app uses the iPhone 12 Pro's lidar to improve focus for night photos and portraits. But Apple doesn't incorporate AR into its camera app or allow for any 3D scanning yet. Those ideas are left to developers to explore. Some apps, like DSLR Camera, already use the iPhone's lidar to create custom layers of 3D information on top of photo data, layering text in 3D into photos. 

"The app is able to calculate the segmentation between the person and the background object," says Fulvio Scichilone, the creator of DSLR Camera. "The future plan for the AR portrait ... is to move, with the gyroscope or your finger, the frame of the picture."

people-detection-ios-lidar

People Detection recognizes people and measures distance, using AR tech. 

Scott Stein/CNET

Augmented reality as extended senses, and an accessibility tool

Apple sees the killer app of AR being discoverability, but there's another huge opportunity arriving for accessibility. AR can literally extend one's senses. In the audio realm, Apple already uses AirPods as hearing aids, and Facebook is exploring spatial audio for assistive hearing as well.

The same could come for assisting sight. Future vision-assistive products like Mojo Lens' promised augmented contact lenses are aiming to be helpful tools for the vision-impaired. Apple could be taking a similar path with how AR on the iPhone, and future devices, work as assistive tools. Already, a new people-detection feature in iOS 14.2 uses Apple's AR and lidar to recognize distances from people, and uses that for vision assistance on new iPhones. 

That could just be the beginning. "There's a lot more we can do, especially related to our understanding of the environment that is around us," Rockwell says. "We can recognize people, but if you think about what a human being can understand about an environment, there's no reason that in the fullness of time a device can't have that level of understanding, too, and provide that to developers."

"We'll be working together with the blind and partially sighted communities to improve specifically on the people-detection side," adds McGinnis.

alicjakwade-allatanytime-02.png

A location-based AR art exhibition by Alicja Kwade in Acute Art.

Alicja Kwade/Acute Art

AR's future killer app: Being instant

Even though I cover AR all the time, I admit I forget to look for new AR apps when I use an iPhone or iPad in my daily life. Discovering what's new in the virtual world while I'm busy in the real one isn't a seamless process.

Rockwell sees the future of iPhone AR not as apps, but as quick-glance moments. "Something that you're dipping in and out of three, four, five, six times a day to do various things, and they're lightweight experiences," he explains. "The killer app is really that it's going to be used in a kind of regular basis all the time in these little ways that help you to do the things that you do today, that make them easier and faster."

The road to that involves App Clips, Apple's new way of having small micro-apps in iOS 14 that emerge on an iPhone without needing to download anything. App Clips can be triggered by NFC tags or scannable codes placed in the real world. I could scan or tap, and suddenly bring up AR related to the place I'm in, such as a virtual menu or a museum exhibit brought to life.

It also involves Apple's mapping efforts. Apple's new Location Anchors mean virtual AR objects can exist in real-life locations -- imagine seeing a virtual piece of art in Times Square -- shared by multiple people at the same time. 

"If it's in one of the areas that we have high-res mapping, which is quite a lot in the US ... if it's within one meter, you can place an experience," Rockwell says of Location Anchors, promising a better-than-GPS level of location-specific accuracy. Meanwhile, App Clips, which are triggered by particular QR codes or anchors in the real world, "can be down to centimeters of accuracy."

Both of these are still a work in progress for Apple's AR efforts: In a year of pandemic-induced isolation, it may be less likely that people have been in public places or in stores or museums where this type of location-based AR tech could emerge. But Apple sees them as crucial for people using AR on a daily basis. 

"We knew we had to solve those problems in order for AR to become a mainstream experience -- I think we really are quite on the cusp of that for folks to have AR become something that is more a part of their everyday life," Rockwell says.

"My perception is that App Clips and Anchors will make a massive difference," Acute Art CEO Jacob De Geer says. Acute Art is an app that already hosts AR exhibits in real-world locations, but one of the current challenges to people finding this art is knowing it's there. "The main issue, not just in AR but everything in tech now is, 'Hey, how do you get people to download your app?'"

Another challenge to AR is -- really, it's not any one thing. Is it 3D art? Is it a series of tools to spatially scan the world and sense everything better? In that way, maybe AR is invisible. Maybe it's a similar philosophy to how Google sees AR as a world-scanning tool.

"Often we hear people are using AR [apps] and don't know what they are," McGinnis says, referring to popular iPhone tools like Warby Parker's instant AR-enabled glasses try-on. "As it becomes more and more mainstream, it doesn't matter if you know it's AR or not. It matters that you have an amazing experience in your device."

The future groundwork is being laid now

Combine Apple's lidar-based 3D scanning, Apple's increasingly more capable AR tools for realistic visuals, plus the AirPod Pro's introduction of spatial audio, which can make things you're listening to sound like they're moving in 3D space, and it isn't hard to imagine a future Apple AR headset. 

Apple won't comment on that. But in the meantime, the company is working on encouraging a groundswell of developers to make AR-ready apps. And whether or not a headset arrives anytime soon, more spatially aware iPhones and iPads will transform the phones into world-scanning devices with their own possibilities. Or maybe even for robotics, or as computer-vision-enabled cameras in unexpected places. 

"These things are, kind of in the beginning, a delicate thing, and you have to have all of the elements there, all these ingredients, for them to be successful," Rockwell says. 

"A few years from now, it'll be one of those things where you kind of can't remember living without it, just like the internet," he adds. "You're going to feel like, wow, I'm using this on a regular basis ... it will become just integrated into our lives."

Let's block ads! (Why?)
