Electronics: #AR #VR #ARGlasses #AugmentedReality #VirtualReality #techtok #cftech


Digital Renaissance: AI And VR In The Gaming And Tech Industry

Posted by Technology Co., Ltd Shenzhen Mshilor

By Founder and CEO of Gameverse Interactive Corp, an industry-leading gaming company.

 

Artificial intelligence (AI) and virtual reality (VR) are sparking a renaissance in the gaming and tech industry. Advances in AI and VR are not just improving the industry; they are changing the way we play and creating a wave of innovation and expansion that is taking gaming further than anyone thought possible.

AI’s Role In Gameplay

Artificial intelligence cuts both ways: at best it is a tool for creating better gameplay and mechanics, and at worst it is something some players use to gain unfair advantages over others. With sufficiently advanced AI algorithms, you can build aimbots or infer where other players are through wallhacks and similar exploits, which is a real problem for developers trying to keep multiplayer games fun and balanced. On the other hand, AI can also help gamers become exceptionally good at skill-based games by offering guidance and insight. Learning algorithms can take information about a player and turn it into suggestions and training programs that teach players to master games and build better strategies, and AI can answer questions in real time and offer tips along the way.
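
To make the idea concrete, here is a minimal Python sketch of the kind of coaching loop described above; the stats, thresholds and tips are hypothetical placeholders, not a production system:

```python
from dataclasses import dataclass

@dataclass
class MatchStats:
    accuracy: float          # fraction of shots that hit, 0.0-1.0
    avg_reaction_ms: float   # average reaction time in milliseconds
    deaths_to_flank: int     # deaths caused by being flanked

def coaching_tips(history):
    """Aggregate recent matches and turn them into simple, rule-based training tips."""
    if not history:
        return ["Play a few matches so there is data to analyze."]
    n = len(history)
    avg_acc = sum(m.accuracy for m in history) / n
    avg_rt = sum(m.avg_reaction_ms for m in history) / n
    flanks = sum(m.deaths_to_flank for m in history) / n

    tips = []
    if avg_acc < 0.30:
        tips.append("Accuracy is low: warm up in an aim trainer before ranked play.")
    if avg_rt > 350:
        tips.append("Reactions are slow: try reflex drills or reduce input latency.")
    if flanks > 2:
        tips.append("You are flanked often: check the minimap and hold safer angles.")
    return tips or ["Fundamentals look solid: focus on strategy and map knowledge."]

# Example: three recent matches produce a short, personalized training plan.
recent = [MatchStats(0.25, 400, 3), MatchStats(0.28, 380, 2), MatchStats(0.31, 360, 4)]
for tip in coaching_tips(recent):
    print(tip)
```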

AI’s Impact On Game Development

The central paradigm shift I believe artificial intelligence is spurring in games is that it is changing the fabric of game development itself. Developers are using AI to streamline and speed up development, extending their workflows while cutting debugging time. By democratizing expertise, it lets beginner developers compete on the same playing field as top-tier studios and ushers in a new era of creativity and innovation. Procedural and generative content lets developers generate levels, textures and all kinds of special effects, creating larger-than-life worlds that are more vibrant, realistic and dynamic than before.
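
As a hedged illustration of procedural generation, the short Python sketch below carves a playable level with a random walk; real pipelines are far more sophisticated, and the grid size and step count here are arbitrary:

```python
import random

def generate_level(width=24, height=12, steps=160, seed=None):
    """Carve a connected cave out of solid rock using a random walk ("drunkard's walk")."""
    rng = random.Random(seed)
    grid = [["#"] * width for _ in range(height)]   # "#" = wall, "." = floor
    x, y = width // 2, height // 2
    for _ in range(steps):
        grid[y][x] = "."                            # carve floor at the current cell
        dx, dy = rng.choice([(1, 0), (-1, 0), (0, 1), (0, -1)])
        x = min(max(x + dx, 1), width - 2)          # clamp to keep the outer wall intact
        y = min(max(y + dy, 1), height - 2)
    return ["".join(row) for row in grid]

# Example: the same seed always reproduces the same level layout.
for row in generate_level(seed=42):
    print(row)
```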

Virtual Reality Enhanced By AI

Virtual reality is also getting a major makeover. AI-powered VR experiences offer unprecedented levels of engagement and immersion; for example, NPCs (non-playable characters) can be given complex behaviors and traits that adapt dynamically to their interactions with the environment and with users. One of the most promising results of integrating AI into VR is making our interactions with the virtual world and with other players more intuitive and natural through gesture recognition, voice commands and other interfaces. That makes the experience more immersive and engaging, which in turn broadens a game's appeal to more players.
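
As a toy example of the adaptive NPC behavior mentioned above (the actions, thresholds and dialogue lines are invented for illustration), a minimal Python sketch might look like this:

```python
class AdaptiveNPC:
    """Toy NPC whose disposition drifts with how the player treats it."""

    def __init__(self, name):
        self.name = name
        self.trust = 0.5  # 0.0 = hostile, 1.0 = friendly

    def observe(self, player_action):
        """Nudge trust up or down based on what the NPC just saw the player do."""
        deltas = {"helped": +0.2, "traded": +0.1, "ignored": -0.05, "attacked": -0.4}
        self.trust = min(1.0, max(0.0, self.trust + deltas.get(player_action, 0.0)))

    def react(self):
        if self.trust > 0.7:
            return f"{self.name} greets you warmly and offers a discount."
        if self.trust < 0.3:
            return f"{self.name} backs away and calls the guards."
        return f"{self.name} nods cautiously and keeps their distance."

# Example: the same NPC responds differently as the player's behavior changes.
npc = AdaptiveNPC("Merchant")
for action in ["traded", "helped", "attacked", "attacked"]:
    npc.observe(action)
    print(npc.react())
```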

Impact Beyond Gaming

The combined effect of AI and VR will have far-reaching consequences well beyond gaming. Virtual reality can serve as a training aid that simulates real-life scenarios, and it can transform education by making subjects such as history and science more immersive to learn. AI can build personalized learning experiences that help students and professionals master complex skills across industries ranging from gaming to manufacturing. Together, AI and VR will change how people work by creating more immersive and efficient simulation environments that help workers get better at their jobs and become more productive.

Challenges And The Road Ahead

The question of how AI and VR technologies will be used in gaming and technology must be addressed by gaming communities, technology leaders and educational institutions at every level. That includes managing fair play and maintaining a level playing field for all users, as well as exploring whether and how gamers are willing to give up certain identity information in exchange for immersive VR experiences. Education and gaming communities should also ensure the technology is user-friendly and developed with accessibility in mind, so no one is left behind for lack of resources or of the accommodations a person with a physical or mental disability requires. It is also imperative to collaborate on reducing the algorithmic bias that can creep into sophisticated AI-driven VR games, and to take further steps to diversify the development and design fields in the process.

The emerging field of immersive gaming and online interaction through VR must cultivate integration and interoperability, and ultimately promote not only human-machine interaction but also human-to-human communication that takes place primarily through AI-enhanced VR.

Good luck with your work throughout the new year

Posted by Technology Co., Ltd Shenzhen Mshilor

"May the upcoming year be filled with success and fulfillment in all your work endeavors. Best wishes to you and your future achievements."


What are the best practices for designing AR experiences that work with 4G networks?

Posted by Technology Co., Ltd Shenzhen Mshilor

When designing AR experiences for 4G networks, the key practices are to optimize content for quick loading, use cloud-based solutions to offload processing, and design with network latency in mind so the experience holds up even when the connection is slow or unreliable. The three practices break down as follows:

First, optimize your content for quick loading times. This means keeping file sizes small and minimizing the number of assets that need to be loaded.

Second, consider using cloud-based solutions to offload some of the processing power required to render AR experiences. This can help reduce the load on the user's device and make the experience smoother and faster.

Third, design your experiences with network latency in mind. Make sure that the experience can still be enjoyed even if the network connection is slow or unreliable.

Overall, by focusing on quick loading times, offloading processing power, and designing for network latency, you can create AR experiences that work well on 4G networks.
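
One way to put these three practices together in code: the hedged Python sketch below caches assets, uses a short timeout with retries so a slow 4G link never stalls the experience, and falls back to a placeholder when the network is unreliable (the URL and asset names are hypothetical):

```python
import time
import urllib.request

CACHE = {}  # simple in-memory asset cache keyed by URL

def fetch_asset(url, timeout_s=2.0, retries=2):
    """Fetch a small, pre-compressed asset without blocking the AR session.

    Returns the asset bytes, or None so the caller can keep rendering a
    low-detail placeholder instead of stalling on a slow 4G connection."""
    if url in CACHE:
        return CACHE[url]                      # instant on repeat views
    for attempt in range(retries + 1):
        try:
            with urllib.request.urlopen(url, timeout=timeout_s) as resp:
                data = resp.read()
                CACHE[url] = data
                return data
        except OSError:
            time.sleep(0.5 * (attempt + 1))    # brief backoff before retrying
    return None                                # signal "use the placeholder"

# Usage: request a high-detail model, but never let the frame loop wait on it.
asset = fetch_asset("https://cdn.example.com/ar/chair_lowpoly.glb")
if asset is None:
    print("Network slow or unreachable: showing the placeholder model for now.")
```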


AR Glasses Push Limits At CES 2024

Posted by Technology Co., Ltd Shenzhen Mshilor

Charlie Fink
A former tech executive covering AI, XR and The Metaverse for Forbes.
XReal booth at CES 2024
Demos took place primarily, but not exclusively, at the Las Vegas Convention Center (LVCC). Startups were in Eureka Park at the Sands Convention Center, a long mile away, and many other demos took place in hotel suites and invitation-only conference rooms, so it's impossible to see it all. I found XR primarily through intention, invitation, tips and good ole CES serendipity.
RayNeo, a spin-off from TCL that created the low-cost ($340) RayNeo Air smart glasses we favorably reviewed in November, presented its next-generation Air 2. These handsome XR glasses provide a big screen on the go that mirrors the tethered smartphone, and they feature an audio "whisper mode" that keeps sound private to the user. The RayNeo X2 Lite has a 30-degree field of view and weighs in at 60 grams, making it the lightweight leader.
CEO Howie Li gave me a demo of the new RayNeo X2 Lite, powered by Qualcomm's latest Snapdragon XR2+ chipset, which enables on-device AI for object recognition, depth scanning, 3D navigation and avatar chat. Best of all, the RayNeo X2 Lite is untethered and has a three-hour battery life. In what the company characterizes as a display breakthrough, the new glasses reach 1,500 nits of brightness for a brighter screen and much better outdoor use.
Zapbox Open Brush Simulation
Zappar shared its $80 Zapbox at Pepcom, a press-only show within the show. It's a 6DOF take on the once-popular Samsung Gear VR, the 2015-era smartphone holder that was limited to 2D games and 360 video. Zapbox looks far sleeker than that and comes with two controllers. Pass-through AR lets smartphones, and devices like the Meta Quest 3 and the Apple Vision Pro, mix the physical and the digital. Zapbox was demonstrating the popular spatial drawing app Open Brush as well as 3D chess, and Zapbox's own SpatialTV app records and plays back spatial video, something otherwise only the Vision Pro can do.
AR [Glasses] by NRMYW. I had never heard of the California company before, but its splashy booth catches the eye in the XR area of the LVCC. This standalone headset does not need to be tethered to a smartphone and has a 90-minute battery life (similar to a Quest). On board are the expected Qualcomm XR2 chips, enabling standalone media consumption and gaming along with real-time translation, a teleprompter, and familiar social media, audio and video streaming services. Although the field of view is only 32 degrees, the screen appears the way a 90-inch screen would from five feet away. At 700 nits it was bright enough during my indoor demo, but it probably needs additional masking for outdoor use.
Solos AirGo 3
Solos, whose AirGo 3 smart glasses have no display but offer onboard AI, announced the expansion of SolosTranslate, a full-service translation platform, to the AirGo 3. New operating modes, including Listen, Group, Text and Present, enable group communication by allowing multiple languages to be used at once. SolosTranslate is built on Solos' proprietary software platform and OpenAI's ChatGPT.
Nimo Glasses and the Nimo 1 Core mini computer are a pair of spatial computing devices for productivity on the go. Nimo Glasses are display glasses with HD displays and a 49-degree field of view, available for pre-order at $599. The $399 Nimo 1 Core, which the company bills as the world's first spatial computer designed for productivity, runs an operating system called Nimo OS and works with Nimo Glasses, Rokid Max and XREAL Air Glasses. It is powered by a Qualcomm XR2 and has a very respectable 8-core CPU, 8GB of RAM, 128GB of storage, an Adreno 650 GPU, Wi-Fi 6, and an air mouse and trackpad.

Curve Reality uses Tilt Five to fill a room with holograms. Tilt Five is a game platform that brings holograms to board games: its ultralight glasses are tethered to a PC, and four people can play at once on its special reflective mat. They are selling them as fast as they can make them. Curve Reality is a small, powerful PC with an Nvidia processor that fits in a pouch you carry with you. Here's the trick: partnering with Tilt Five, they placed reflective mats all over the suite, so users can walk through a HoloGallery.

The Pimax Crystal, with 12K screens and 60GHz Airlink (wireless PCVR), was named a CES 2024 Innovation Award Honoree in the "XR Technologies & Accessories" category, the only VR headset so honored this year.


The Emdor looks like the Vision Pro, but that's about it. Apple's trendsetting design attracts imitators, in this case the EmdorVR EM-AX162, debuting just two months after Apple's Vision Pro. While it bears a striking resemblance to the Vision Pro, its low-budget specs speak to how superficial the resemblance is: a Snapdragon XR1 chip, 6GB of RAM and a 5.5-inch LCD for each eye. Apple's Vision Pro, by contrast, boasts dual Apple silicon chips, micro-OLED displays with exceptional resolution, advanced features like eye tracking and augmented reality, and a robust 256GB of storage.

Vuzix Ultralite CES 2024

Vuzix (NASDAQ: VUZI) introduced a new wireless headset for enterprise. The company secured its twentieth consecutive CES Innovation Award at CES 2024 for its new Vuzix Ultralite S AR smart glasses, an all-day design that delivers on-demand digital information in a monocular display. Aimed in part at sports and fitness users, the Ultralite S delivers hands-free, wireless access to information from the wearer's smartphone or smartwatch. Weighing a mere 38 grams, it lasts 48 hours on a single charge. The Ultralite S also employs Vuzix Incognito technology, virtually eliminating the eyeglow, or forward light, found with other waveguide-based solutions. The Vuzix Ultralite OEM Platform is a go-to-market-ready, turnkey offering designed to fast-track client AR solutions into production. Paul Travers, the founder and CEO, spends CES in his booth, demoing his new headset to irrelevant strangers who stop to gawk at the AR hardware surrounded by CES awards. Paul is the CEO of a public company, and he's still repping it like an intern.

Charlie in RealWear at CES 2024

RealWear is Vuzix's main competitor in enterprise AR, and it was doing suite demos again this year. The problem with a booth is that it's physically grueling, expensive, and you spend 99% of your time on irrelevant people. The problem with a suite demo is that it's hard to get people there. In this case we met halfway, in the ciggy-choked Westway hotel bar next to the LVCC. The RealWear is a ruggedized monocular display that attaches to a hard hat or ball cap and sits at the edge of your peripheral vision; you swing it in and out of your eyesight. It's controlled by a special kind of voice recognition that works even in extremely noisy industrial environments, and importantly it's full of sensors, cameras and flashlights. RealWear has had some real successes and great use cases: it has sold 15,000 units in the oil and gas industry, and hundreds of BMW mechanics use its see-what-I-see remote expert technology to troubleshoot maintenance issues with engineers in Germany. Honestly, if the US Army had chosen RealWear instead of Microsoft for its IVAS program, it might have worked.

 

Everysight CES 2024

Everysight AR Glasses for BMW Motorcycles are sexy as hell. In yet another case of CES serendipity, my podcast co-host Ted made the intro to founder and CEO Asaf Ashkenazi, who met me an hour before my flight home. Ashkenazi is a former Israeli Air Force pilot who saw the need for a practical heads-up display ten years ago, when a bright microdisplay in something that looked like regular glasses seemed impossible. Everysight makes the BMW ConnectedRide Smartglasses ($680), which look like regular sunglasses. The glasses, called Mavericks, use a monocular Sony OLED microdisplay and still carry a charge for eight hours. Mavericks will be available this summer.



CES 2024: AI everything, what we expect in Las Vegas and all the announcements so far

Posted by Technology Co., Ltd Shenzhen Mshilor


Wireless TV, plug-and-play solar and next-gen headphones look to make a big splash.

I know we say this every year, but it feels like just yesterday we were all crammed in a single room in Las Vegas eating mediocre takeout and voting for our best in show. But CES 2024 is actually just around the corner. The show officially runs from January 9 to January 12, though we'll be on the ground well before that, with the first CES-related events expected to kick off on January 7.

Last year we saw a focus on accessibility and a rather disturbing amount of stuff that you were supposed to pee on or into. While we'll probably see a good number of devices designed to help those with hearing impairments and mobility restrictions again this year, we anticipate some new trends will steal headlines. Here are a few predictions from our staff about what to expect from CES 2024 in Las Vegas.

User-friendly solar

Jackery solar panels and power station

Jackery

I suspect CES 2024 will be full of clean energy technology, packaged in the form of consumer hardware. Solar panels have traditionally been the purview of professional contractors but standalone setups are gaining in popularity. Two or three years ago, this gear would have been targeted at RV users but now it’s cresting into the mainstream. Pop-up panels, coupled with inverters and batteries that look like air conditioning units, sitting unobtrusively in the corner, are all the rage. It’s a plus that most of these setups are plug and play, removing the need for a professional to get involved.

There are a couple of drivers for this beyond the niche audience of folks looking to get off of the electricity grid. In many places outside the US, the cost of energy has spiked dramatically and it’s folly to think the same won’t happen here. Not to mention that, in places like Texas, people have seen the power grid fail with devastating consequences. It’s going to be a big market in the next few years and I’d expect to see more and more consumer brands follow Anker and Jackery into the home battery world. — Dan Cooper, Senior Reporter UK

MEMS earbuds

Exploded view of an xMEMS headphone.

xMEMS

If Engadget’s audience stats are any indication, audio nerds are extremely excited about MEMS earbud drivers. As my colleague James Trew has detailed in his reporting, micro-electromechanical systems (MEMS) may very well be the next big thing in headphones. A California-based company called xMEMS is the first to bring the solid state components to market, and the first true wireless earbuds that use them have recently gone on sale.

Some of the benefits of MEMS drivers are said to be improved response, better durability and more consistent fidelity. They also don’t require the calibration or matching that balanced-armature or dynamic drivers need on a production line. The only downside is that in their current state, they still need a hybrid setup with a secondary driver for bass. In its next-gen MEMS speaker, though, xMEMS is promising 40 times louder bass response.

The new model is called Cypress and the company will be demoing it for attendees at CES. xMEMS says its performance is consistent with the bass performance of "the best" 10-12mm coil speakers currently used in earbuds. What's more, Cypress can improve ANC performance, which xMEMS says will extend coverage to higher frequencies, including crying babies. The company has already said the components won't go into mass production until the end of 2024, so consumer products are over a year away. But the promise is too good not to be excited about a very early preview in Las Vegas. — Billy Steele, Senior Reporter

Wi-Fi 7 in everything

Wi-Fi 7

Netgear

While it may not be the most exciting development, I’m expecting to see a number of new devices with support for Wi-Fi 7 at CES 2024 — from laptops to TVs and everything in between. Currently, it’s still a work in progress, but with the official Wi-Fi 7 spec expected to be finalized sometime in early 2024, gadget makers are looking to get an early jump. Some benefits of Wi-Fi 7 include maximum speeds of up to 46 Gbps — more than twice as fast as what’s available using Wi-Fi 6/6E — along with a 320MHz channel width that offers double the capacity compared to previous generations.

Another important feature is MLO (multi-link operation), which allows Wi-Fi 7 devices to use two bands at the same time, essentially turning a single wireless connection into a two-lane highway. For people with larger homes, this should improve the performance of mesh networks by allowing devices to switch bands without losing speed or connection. QAM (quadrature amplitude modulation) is also getting a significant boost, from 1024-QAM on Wi-Fi 6/6E to 4096-QAM with Wi-Fi 7, which allows devices to pack more information into the same carrier signal.
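
To put rough numbers on that modulation jump (a quick back-of-the-envelope check, not a full throughput model, since real-world gains depend on signal quality):

```python
import math

def bits_per_symbol(qam_order):
    """Each QAM symbol encodes log2(order) bits."""
    return int(math.log2(qam_order))

wifi6 = bits_per_symbol(1024)    # 10 bits per symbol with 1024-QAM (Wi-Fi 6/6E)
wifi7 = bits_per_symbol(4096)    # 12 bits per symbol with 4096-QAM (Wi-Fi 7)
gain = (wifi7 - wifi6) / wifi6   # ~20% more data in the same carrier signal
print(f"Wi-Fi 6/6E: {wifi6} bits/symbol, Wi-Fi 7: {wifi7} bits/symbol ({gain:.0%} gain)")
```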

The downside is that while there are some gadgets on sale today like the Samsung Galaxy S23 that already support Wi-Fi 7, you’ll need both a compatible device and router (not to mention a sufficiently fast internet connection) to take advantage of the spec’s full capabilities. In short, you should keep an eye out for new devices that work with Wi-Fi 7, but don’t rush out and upgrade everything in your home until prices stabilize and they become more widespread. — Sam Rutherford, Senior Reporter

 

The year of the AI PC

Intel Core Ultra

Intel

If there’s one buzzy term you’re guaranteed to hear a ton throughout 2024, it’s “AI PC.” It’s a phrase both Intel and AMD are using to describe computers equipped with chips featuring NPUs, or neural processing units. Similar to the way GPUs speed up graphics processing for gaming, an NPU offloads AI tasks to handle them more efficiently. For Windows 11, that’s mainly limited to Microsoft’s Studio Effects, which can blur your video chat backgrounds or punch up your lighting. But more Windows AI features are rumored to be on the way (Microsoft’s push to bring its Copilot AI everywhere is a big sign), and companies like Adobe and Audacity are also developing NPU-powered features for their apps.

For years chipmakers have been chasing higher clock rates, smaller process designs and a wealth of other architectural upgrades like 3D transistors to make their hardware faster and more efficient. The move towards mobile chip designs, like Apple’s Silicon, is yet another way to reduce power consumption while also speeding up computational possibilities. Intel, AMD and other companies are also focusing more on GPUs to beef up basic gaming performance, while also offloading some creative tasks like media encoding. NPUs are the latest tool chip designers can rely on, and they also have the potential to change the way we use our computers entirely (or at least, deliver a bit more power and battery life for ultraportables).

While it’s easy to be skeptical of marketing terms, the phrase “AI PC” is at least functional. There are still plenty of laptops on the market without NPUs — Intel only got into the AI game with its new Core Ultra chips — so consumers will need an easy way to differentiate between different types of systems. After all, if you’re upgrading your laptop to take advantage of Windows Studio effects and AI powered software, you don’t want to be stuck with a non-NPU system for several years. — Devindra Hardawar, Senior Reporter

Truly wireless TV

A Displace TV unit attached to large windows.

Cherlynn Low / Engadget

Displace made a splash at CES 2023 with its truly wireless TV that could be mounted anywhere, even suction-cupped to a window. The company’s demo left us with a lot of questions as it wasn’t yet ready to discuss key details of the product since what it showed off were CES-specific prototypes. The company is returning to Vegas this year and it’s already announced what it plans to have on display.

First, Displace says two sizes of TVs will be demoed: the 27-inch Displace Mini and the 55-inch Displace Flex. The display we saw earlier this year was also 55 inches, but a key difference between it and the Flex is that this new version attaches to an optional magnetic wireless charging stand. Both the Flex and the Mini pack enough battery life to last a month if you watch six hours of content per day, according to the company. There’s no pricing available for these yet, but they go up for pre-order on January 9, so we’re bound to find out soon. Displace said it will also show off a 110-inch model at CES, although details are scarce.

The original version has gone up in price since last CES: it’s now $4,499 and orders won’t ship until mid-2024. The new Mini and Flex aren’t expected to ship until late next year either. The main thing we’ll be looking for at CES is a status update. Are the units any more polished? Have there been any notable upgrades since that first prototype? How much will the extra swappable batteries cost? Does it look like the company will actually be able to ship in the next 6-12 months?

Displace has also announced an AI-based shopping platform for its TVs. Using the same gestures that control TV viewing, the tech can analyze a paused scene for products that might be available for sale. The system also allows you to quickly make a purchase by either bringing a phone or watch near the NFC-enabled TVs or by using a mobile app. Displace says the goal for its products has always been ambient computing, and the first step towards that is this shopping platform. It’s also a way for the company to make money off the TVs after the initial sale. — Billy Steele

