NVIDIA DRIVE Partners Showcase Cutting-Edge Innovations in Automated and Autonomous Driving

The automotive industry is being transformed by the integration of cutting-edge technologies into software-defined cars.

At CES, NVIDIA invited industry leaders to share their perspectives on how technology, especially AI and computing power, is shaping the future of transportation.

Watch the video to learn more from NVIDIA’s auto partners.

Redefining Possibilities Through Partnership

Magnus Ostberg, chief software officer at Mercedes-Benz, underscores how the company’s partnership with NVIDIA helps push technological boundaries. “[NVIDIA] enables us to go further to bring automated driving to the next level and into areas that we couldn’t go before,” he says.

Computing Power: The Driving Force Behind Autonomy

Shawn Kerrigan, chief operating officer and cofounder at Plus, emphasizes the role of computing power, saying, “Autonomous technology requires immense computing power in order to really understand the world around it and make safe driving decisions.”

“What was impossible to do previously because computing wasn’t strong enough is now doable,” says Eran Ofri, CEO of Imagry. “This is an enabler for the progress of the autonomous driving industry.”

“We wanted a platform that has a track record of being deployed in the automotive industry,” adds Stefan Solyom, chief technology officer at Pebble. “This is what NVIDIA can give us.”

And Martin Kristensson, head of product strategy at Volvo Cars, says, “We partner with NVIDIA to get the best compute that we can. More compute in the car means that we can be more aware of the environment around us and reacting earlier and being even safer.”

The Critical Role of AI

Don Burnette, CEO and founder of Kodiak Robotics, states, “NVIDIA makes best-in-class hardware accelerators, and I think it’s going to play a large role in the AI developments for self-driving going forward.”

“Driving as a routine task is tedious,” adds Tony Han, CEO and cofounder of WeRide. “We want to alleviate people from the burden of driving to give back the time. NVIDIA is the backbone of our AI engine.”

And Thomas Ingenlath, CEO of Polestar, says, “Our Polestar 3 sits on the NVIDIA DRIVE platform. This is, of course, very much based on AI technology — and it’s really fascinating and a completely new era for the car.”

Simulation Is Key

Ziv Binyamini, CEO of Foretellix, highlights the role of simulation in development and verification. “Simulation is crucial for the development of autonomous systems,” he says.

Bruce Baumgartner, vice president of supply chain at Zoox, adds, “We have been leveraging NVIDIA’s technology first and foremost on-vehicle to power the Zoox driver. We also leverage NVIDIA technologies in our cloud infrastructure. In particular, we do a lot of work in our simulator.”

Saving Lives With Autonomy

Austin Russell, CEO and founder of Luminar, highlights the opportunity to save lives by using new technology, saying, “The DRIVE platform has been incredibly helpful to be able to actually enable autonomous driving capabilities as well as enhance safety capabilities on vehicles. To be able to have an opportunity to save as many as 100 million lives and 100 trillion hours of people’s time over the next 100 years — everything that we do at the company rolls up to that.”

“Knowing that [this technology] is in vehicles worldwide and saves lives on the road each and every day — the impact that you deliver as you keep people and family safe is amazingly rewarding,” adds Tal Krzypow, vice president of product and strategy at Cipia.

Technology Helps Solve Major Challenges

Shiv Tasker, global industry vice president at Capgemini, reflects on the role of technology in addressing global challenges, saying, “Our modern world is driven by technology, and yet we face tremendous challenges. Technology is the answer. We have to solve the major issues so that we leave a better place for our children and our grandchildren.”

Learn more about the NVIDIA DRIVE platform and how it’s helping industry leaders redefine transportation.

Read More

How Amazon and NVIDIA Help Sellers Create Better Product Listings With AI

It’s hard to imagine an industry more competitive — or fast-paced — than online retail.

Sellers need product listings that strike a tricky balance: attractive and informative, engaging enough to capture attention and build trust.

Amazon uses optimized containers on Amazon Elastic Compute Cloud (Amazon EC2) with NVIDIA Tensor Core GPUs to power a generative AI tool that finds this balance at the speed of modern retail.

Amazon’s new generative AI capabilities help sellers seamlessly create compelling titles, bullet points, descriptions, and product attributes.

To get started, Amazon identifies listings where content could be improved and uses generative AI to produce high-quality content for them automatically. Sellers review the generated content and can either provide feedback or accept the changes to the Amazon catalog.

Previously, creating detailed product listings required significant time and effort for sellers, but this simplified process gives them more time to focus on other tasks.

The NVIDIA TensorRT-LLM software is available today on GitHub and can be accessed through NVIDIA AI Enterprise, which offers enterprise-grade security, support, and reliability for production AI.

TensorRT-LLM open-source software makes AI inference faster and smarter. It works with large language models, such as Amazon’s models for the above capabilities, which are trained on vast amounts of text.

On NVIDIA H100 Tensor Core GPUs, TensorRT-LLM enables up to an 8x speedup on foundation LLMs such as Llama 1 and 2, Falcon, Mistral, MPT, ChatGLM, Starcoder and more.

It also supports multi-GPU and multi-node inference, in-flight batching, paged attention and the Hopper Transformer Engine with FP8 precision, all of which improve latency and efficiency for the seller experience.
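
For developers curious what that looks like in practice, here is a minimal sketch of generating listing copy through TensorRT-LLM’s high-level Python API. The model name, prompt and sampling settings are placeholders, the exact API surface varies across TensorRT-LLM releases, and Amazon’s production pipeline is not public.

```python
# Minimal sketch: generating text with TensorRT-LLM's high-level Python API
# (recent releases). Model, prompt and parameters are placeholders, not
# Amazon's actual setup.
from tensorrt_llm import LLM, SamplingParams

# Compiles (or loads) a TensorRT engine for the model; in-flight batching and
# paged attention are handled by the runtime.
llm = LLM(model="meta-llama/Llama-2-7b-chat-hf")

sampling = SamplingParams(temperature=0.7, max_tokens=128)

prompts = [
    "Write three bullet points for a wireless ergonomic mouse product listing.",
]

for output in llm.generate(prompts, sampling):
    print(output.outputs[0].text)
```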

By using TensorRT-LLM and NVIDIA GPUs, Amazon doubled its generative AI tool’s inference efficiency, in terms of cost and GPUs needed, and reduced inference latency by 3x compared with an earlier implementation without TensorRT-LLM.

The efficiency gains make the tool more environmentally friendly, and the 3x latency improvement makes Amazon Catalog’s generative capabilities more responsive.

The generative AI capabilities can save sellers time and provide richer information with less effort. For example, they can enrich a listing for a wireless mouse with details about its ergonomic design, long battery life, adjustable cursor settings and compatibility with various devices. They can also generate product attributes such as color, size, weight and material. These details can help customers make informed decisions and reduce returns.

With generative AI, Amazon’s sellers can quickly and easily create more engaging listings while using less energy, making it possible to reach more customers and grow their businesses faster.

Developers can start with TensorRT-LLM today, with enterprise support available through NVIDIA AI Enterprise.

Read More

Buried Treasure: Startup Mines Clean Energy’s Prospects With Digital Twins

Mark Swinnerton aims to fight climate change by transforming abandoned mines into storage tanks of renewable energy.

The CEO of startup Green Gravity is prototyping his ambitious vision in a warehouse 60 miles south of Sydney, Australia, and simulating it in NVIDIA Omniverse, a platform for building 3D workflows and applications.

The concept requires some heavy lifting. Solar and wind energy will pull steel blocks weighing as much as 30 cars each up shafts taller than a New York skyscraper, storing potential energy that can turn turbines whenever needed.
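
As a rough sense of scale, here is a back-of-the-envelope calculation of the gravitational potential energy stored per block, using assumed figures (about 1,500 kg per car and a 500-meter shaft) rather than Green Gravity’s actual specifications:

```python
# Back-of-the-envelope estimate of the energy stored by lifting one block.
# Assumptions (not Green Gravity's published figures): ~1,500 kg per car,
# so a 30-car block is ~45 tonnes, raised up a ~500 m shaft.
G = 9.81                      # gravitational acceleration, m/s^2
mass_kg = 30 * 1_500          # one steel block, roughly 30 cars' worth
lift_height_m = 500           # roughly a tall New York skyscraper

energy_joules = mass_kg * G * lift_height_m   # E = m * g * h
energy_kwh = energy_joules / 3.6e6            # 1 kWh = 3.6 MJ

print(f"~{energy_kwh:.0f} kWh stored per block")   # prints "~61 kWh stored per block"
```

Under these assumptions, a single block stores on the order of tens of kilowatt-hours, which is why commercial-scale sites depend on many heavy blocks and very deep shafts.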

A Distributed Energy Network

Swinnerton believes it’s the optimal way to store renewable energy because nearly a million abandoned mine shafts are scattered around the globe, many of them already connected to the grid. And his mechanical system is cheaper and greener than alternatives like massive lithium batteries, which are better suited for electric vehicles.

Mark Swinnerton, CEO of Green Gravity

Officials in Australia, India and the U.S. are interested in the concept, and a state-owned mine operator in Romania is conducting a joint study with Green Gravity.

“We have a tremendous opportunity for repurposing a million mines,” said Swinnerton, who, determined to combat climate change, switched gears after a 20-year career at BHP Group, one of the world’s largest mining companies.

A Digital-First Design

A longtime acquaintance saw an opportunity to accelerate Swinnerton’s efforts with a digital twin.

“I was fascinated by the Green Gravity idea and suggested taking a digital-first approach, using data as a differentiator,” said Daniel Keys, an IT expert and executive at xAmplify, a provider of accelerated computing services.

AI-powered simulations could speed the design and deployment of the novel concept, said Keys, who met Swinnerton 25 years earlier at one of their first jobs, flipping burgers at a fast-food stand.

Today, they’ve got a digital prototype cooking on xAmplify’s Scaile computer, based on NVIDIA DGX systems. It’s already accelerating Green Gravity’s proof of concept.

“Thanks to what we inferred with a digital twin, we’ve been able to save 40% of the costs of our physical prototype by shifting from three weights to two and moving them 10 instead of 15 meters vertically,” said Swinnerton.

Use Cases Enabled by Omniverse

It’s the first of many use cases Green Gravity is developing in Omniverse.

Once the prototype is done, the simulation will help scale the design to mines as deep as 7,000 feet, or about six Empire State Buildings stacked on top of each other. Ultimately, the team will build in Omniverse a dashboard to control and monitor sensor-studded facilities without the safety hazards of sending a person into the mine.

Green Gravity’s physical prototype and test lab.

“We expect to cut tens of millions of dollars off the estimated $100 million for the first site because we can use simulations to lower our risks with banks and insurers,” said Swinnerton. “That’s a real tantalizing opportunity.”

Virtual Visualization Tools

Operators will track facilities remotely using visualization systems equipped with NVIDIA A40 GPUs and can stream their visuals to tablets thanks to the TabletAR extension in the Omniverse Spatial Framework.

xAmplify’s workflow uses a number of software components such as NVIDIA Modulus, a framework for physics-informed machine learning models.

“We also use Omniverse as a core integration fabric that lets us connect a half-dozen third-party tools operators and developers need, like Siemens PLM for sensor management and Autodesk for design,” Keys said.

Omniverse eases the job of integrating third-party applications into one 3D workflow because it’s based on the OpenUSD standard.
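
As a simple illustration of that idea, the sketch below shows two hypothetical tools reading and writing the same OpenUSD stage through the open-source pxr Python API; the file, prim and attribute names are purely illustrative.

```python
# Illustrative only: two tools sharing one OpenUSD scene description,
# the integration pattern Omniverse builds on. Names are made up.
from pxr import Usd, UsdGeom, Sdf

# Tool A (e.g., a design package) authors the base scene.
stage = Usd.Stage.CreateNew("mine_shaft.usda")
UsdGeom.Xform.Define(stage, "/MineShaft")
weight = UsdGeom.Cube.Define(stage, "/MineShaft/Weight")
weight.GetPrim().CreateAttribute("height_m", Sdf.ValueTypeNames.Double).Set(0.0)
stage.GetRootLayer().Save()

# Tool B (e.g., a sensor-monitoring app) opens the same stage and updates an
# attribute, with no file-format conversion needed.
stage = Usd.Stage.Open("mine_shaft.usda")
stage.GetPrimAtPath("/MineShaft/Weight").GetAttribute("height_m").Set(120.0)
stage.GetRootLayer().Save()
```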

Along the way, AI sifts reams of data about the thousands of available mines to select optimal sites, predicting their potential for energy storage. Machine learning will also help optimize designs for each site.

Taken together, it’s a digital pathway Swinnerton believes will lead to commercial operations for Green Gravity within the next couple of years.

Green Gravity is the latest customer for xAmplify’s Canberra data center, which serves Australian government agencies, national defense contractors and an expanding set of enterprise users with a full stack of NVIDIA accelerated software.

Learn more about how AI is transforming renewables, including wind farm optimization, solar energy generation and fusion energy.

Read More

Dino-Mite: Capcom’s ‘Exoprimal’ Joins GeForce NOW

Hold on to your seats — this GFN Thursday is unleashing dinosaurs, crowns and more in the cloud.

Catch it all with Capcom’s Exoprimal and Ubisoft’s Prince of Persia: The Lost Crown, leading 10 new games joining the GeForce NOW library this week.

Suit Up, Adapt, Survive

Exoprimal on GeForce NOW
Life finds a way.

Don cutting-edge exosuit technology and battle ferocious dinosaurs on an Earth overrun with waves of prehistoric predators. Capcom’s online team-based action game Exoprimal is now supported in the cloud.

Face velociraptors, T. rex and mutated variants called Neosaurs using the exosuit’s unique weapons and abilities. Join other players in the game’s main mode, Dino Survival, to unlock snippets and special missions from the original story, piecing together the origins of the dinosaur outbreak. Change exosuits on the fly, switching between Assault, Tank and Support roles to suit the situation.

Catch the game in the cloud this week alongside the release of Title Update 3, which brings a new mission and special Monster Hunter collaboration content, a new map, new rigs, plus the start of the third season. Ultimate members can enjoy it all at up to 4K resolution and 120 frames per second, and new players can purchase the game on Steam at 50% off for a limited time.

Return to the Sands of Time

Prince of Persia on GeForce NOW
So stylish.

Defy time and destiny to reclaim the crown and save a cursed world in Prince of Persia: The Lost Crown. It’s the newest adventure in the critically acclaimed action-adventure platformer series, available to stream in the cloud this week at the game’s PC launch.

Step into the shoes of Sargon, a legendary prince with extraordinary acrobatic skills and the power to manipulate time. Travel to Mount Qaf to rescue the kidnapped Prince Ghassan. Wield blades and various time-related powers to fight enemies and solve puzzles in a Persia-inspired world filled with larger-than-life landmarks.

Members can unleash their inner warrior with an Ultimate membership for the highest-quality streaming. Dash into the thrilling game with support for up to 4K resolution at 120 fps on PCs and Macs, streaming from GeForce RTX 4080-powered servers in the cloud.

Time for New Games

Turnip Boy’s back, alright!

In addition, members can look for the following:

  • Those Who Remain (New release on Xbox, available on PC Game Pass, Jan. 16)
  • Prince of Persia: The Lost Crown (New release on Ubisoft and Ubisoft+, Jan. 18)
  • Turnip Boy Robs a Bank (New release on Steam and Xbox, available for PC Game Pass, Jan. 18)
  • New Cycle (New release on Steam, Jan. 18)
  • Beacon Pines (Xbox, available on the Microsoft Store)
  • Exoprimal (Steam)
  • FAR: Changing Tides (Xbox, available on the Microsoft Store)
  • Going Under (Xbox, available on the Microsoft Store)
  • The Legend of Nayuta: Boundless Trails (Steam)
  • Turnip Boy Commits Tax Evasion (Xbox, available on the Microsoft Store)

What are you planning to play this weekend? Let us know on X or in the comments below.

Read More

From Embers to Algorithms: How DigitalPath’s AI is Revolutionizing Wildfire Detection

DigitalPath is igniting change in the Golden State — using computer vision, generative adversarial networks and a network of thousands of cameras to detect signs of fire in real time.

In the latest episode of NVIDIA’s AI Podcast, host Noah Kravitz spoke with DigitalPath System Architect Ethan Higgins about the company’s role in the ALERTCalifornia initiative, a collaboration between California’s wildfire fighting agency CAL FIRE and the University of California, San Diego.

DigitalPath built computer vision models to process images collected from network cameras — anywhere from 8 million to 16 million a day — intelligently identifying signs of fire like smoke.

“One of the things we realized early on, though, is that it’s not necessarily a problem about just detecting a fire in a picture,” Higgins said. “It’s a process of making a manageable amount of data to handle.”

That’s because, he explained, it’s unlikely that humans will be entirely out of the loop in the detection process for the foreseeable future.

The company uses various AI algorithms to classify images based on whether they should be reviewed or acted upon — if so, an alert is sent out to a CAL FIRE command center.
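
The episode doesn’t detail DigitalPath’s code, but the triage pattern Higgins describes roughly resembles the hypothetical sketch below: score every frame, alert on high-confidence detections and queue borderline cases for human review. All names and thresholds here are invented for illustration.

```python
# Hypothetical triage sketch, not DigitalPath's implementation.
# A classifier's smoke score decides whether a frame is ignored, queued for
# human review, or escalated as an alert.
REVIEW_THRESHOLD = 0.5
ALERT_THRESHOLD = 0.9

def send_alert(camera_id: str, score: float) -> None:
    print(f"ALERT {camera_id}: smoke score {score:.2f}")     # e.g., notify a command center

def queue_for_review(camera_id: str, score: float) -> None:
    print(f"review {camera_id}: smoke score {score:.2f}")    # human stays in the loop

def triage(frame_scores: dict[str, float]) -> None:
    for camera_id, smoke_score in frame_scores.items():
        if smoke_score >= ALERT_THRESHOLD:
            send_alert(camera_id, smoke_score)
        elif smoke_score >= REVIEW_THRESHOLD:
            queue_for_review(camera_id, smoke_score)
        # anything below the review threshold is dropped, keeping the data manageable

triage({"cam_0412": 0.93, "cam_0077": 0.61, "cam_1310": 0.08})
```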

One of the downsides to using computer vision to detect wildfires is that extinguishing more fires means a greater buildup of natural fuel and the potential for larger wildfires in the long term. DigitalPath and UCSD are exploring the use of high-resolution lidar data to identify where those fuels can be reduced through prescribed burns.

Looking ahead, Higgins foresees the field tapping generative AI to accelerate new simulation tools and using AI models to analyze the output of other models, further improving wildfire prediction and detection.

“AI is not perfect, but when you couple multiple models together, it can get really close,” he said.

You Might Also Like

Driver’s Ed: How Waabi Uses AI Simulation to Teach Autonomous Vehicles to Drive

Teaching the AI brains of autonomous vehicles to understand the world as humans do requires billions of miles of driving experience—the road to achieving this astronomical level of driving leads to the virtual world. Learn how Waabi uses powerful high-fidelity simulations to train and develop production-level autonomous vehicles.

Polestar’s Dennis Nobelius on the Sustainable Performance Brand’s Plans

Driving enjoyment and autonomous driving capabilities can complement one another in intelligent, sustainable vehicles. Learn about the automaker’s plans to unveil its third vehicle, the Polestar 3, the tech inside it, and what the company’s racing heritage brings to the intersection of smarts and sustainability.

GANTheftAuto: Harrison Kinsley on AI-Generated Gaming Environments

Humans playing games against machines is nothing new, but now computers can develop games for people to play. Programming enthusiast and social media influencer Harrison Kinsley created GANTheftAuto, an AI-based neural network that generates a playable chunk of the classic video game Grand Theft Auto V.

Subscribe to the AI Podcast, Now Available on Amazon Music

The AI Podcast is now available through Amazon Music.

In addition, get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Make the AI Podcast better: Have a few minutes to spare? Fill out this listener survey.

Read More

Māori Speech AI Model Helps Preserve and Promote New Zealand Indigenous Language

Indigenous languages are under threat. Some 3,000 — three-quarters of the total — could disappear before the end of the century, or one every two weeks, according to UNESCO.

As part of a movement to protect such languages, New Zealand’s Te Hiku Media, a broadcaster focused on the Māori people’s indigenous language known as te reo, is using trustworthy AI to help preserve and revitalize the tongue.

Using ethical, transparent methods of speech data collection and analysis to maintain data sovereignty for the Māori people, Te Hiku Media is developing automatic speech recognition (ASR) models for te reo, which is a Polynesian language.

Built using the open-source NVIDIA NeMo toolkit for ASR and NVIDIA A100 Tensor Core GPUs, the speech-to-text models transcribe te reo with 92% accuracy. They can also transcribe bilingual speech mixing English and te reo with 82% accuracy. They’re pivotal tools, made by and for the Māori people, that are helping preserve and amplify their stories.

“There’s immense value in using NVIDIA’s open-source technologies to build the tools we need to ultimately achieve our mission, which is the preservation, promotion and revitalization of te reo Māori,” said Keoni Mahelona, chief technology officer at Te Hiku Media, who leads a team of data scientists and developers, as well as Māori language experts and data curators, working on the project.

“We’re also helping guide the industry on ethical ways of using data and technologies to ensure they’re used for the empowerment of marginalized communities,” added Mahelona, a Native Hawaiian now living in New Zealand.

Building a ‘House of Speech’

Te Hiku Media began more than three decades ago as a radio station aiming to ensure te reo had space on the airwaves. Over the years, the organization incorporated television broadcasting and, with the rise of the internet, it convened a meeting in 2013 with the community’s elders to form a strategy for sharing content in the digital era.

“The elders agreed that we should make the stories accessible online for our community members — rather than just keeping our archives on cassettes in boxes — but once we had that objective, the challenge was how to do this correctly, in alignment with our strong roots in valuing sovereignty,” Mahelona said.

Instead of uploading its video and audio sources to popular, global platforms — which, in their terms and conditions of use, require signing over certain rights related to the content — Te Hiku Media decided to build its own content distribution platform.

Called Whare Kōrero — meaning “house of speech” — the platform now holds more than 30 years’ worth of digitized, archival material featuring about 1,000 hours of te reo native speakers, some of whom were born in the late 19th century, as well as more recent content from second-language learners and bilingual Māori people.

Now, around 20 Māori radio stations use and upload their content to Whare Kōrero. Community members can access the content through an app.

“It’s an invaluable repository of acoustic data,” Mahelona said.

Turning to Trustworthy AI

Such a trove held incredible value for those working to revitalize the language, the Te Hiku Media team quickly realized, but manual transcription demanded time and effort its limited resources couldn’t spare. So, in 2016, the organization began its trustworthy AI efforts to accelerate the work with ASR.

“No one would have a clue that there are eight NVIDIA A100 GPUs in our derelict, rundown, musty-smelling building in the far north of New Zealand — training and building Māori language models,” Mahelona said. “But the work has been game-changing for us.”

To collect speech data in a transparent, ethically compliant, community-oriented way, Te Hiku Media began by explaining its cause to elders, garnering their support and asking them to come to the station to read phrases aloud.

“It was really important that we had the support of the elders and that we recorded their voices, because that’s the sort of content we want to transcribe,” Mahelona said. “But eventually these efforts didn’t scale — we needed second-language learners, kids, middle-aged people and a lot more speech data in general.”

So, the organization ran a crowdsourcing campaign, Kōrero Māori, to collect high-quality labeled speech samples under the Kaitiakitanga license, which ensures Te Hiku Media uses the data only for the benefit of the Māori people.

In just 10 days, more than 2,500 people signed up to read 200,000+ phrases, providing over 300 hours of labeled speech data, which was used to build and train the te reo Māori ASR models.

In addition to other open-source trustworthy AI tools, Te Hiku Media now uses the NVIDIA NeMo toolkit’s ASR module for speech AI throughout its entire pipeline. The NeMo toolkit comprises building blocks called neural modules and includes pretrained models for language model development.
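
As a flavor of what the NeMo ASR collection provides, here is a minimal transcription sketch using a publicly available pretrained English checkpoint as a stand-in; Te Hiku Media’s own te reo Māori models and pipeline are, of course, their own.

```python
# Minimal NeMo ASR sketch using a public pretrained English model as a
# stand-in; audio file paths are placeholders.
import nemo.collections.asr as nemo_asr

# Download a pretrained speech-to-text checkpoint from NGC.
asr_model = nemo_asr.models.ASRModel.from_pretrained(
    model_name="stt_en_conformer_ctc_small"
)

# Transcribe a batch of audio files.
transcripts = asr_model.transcribe(["archive_clip_001.wav"])
print(transcripts[0])
```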

“It’s been absolutely amazing — NVIDIA’s open-source NeMo enabled our ASR models to be bilingual and added automatic punctuation to our transcriptions,” Mahelona said.

Te Hiku Media’s ASR models are the engines running behind Kaituhi, a te reo Māori transcription service now available online.

The efforts have spurred similar ASR projects now underway by Native Hawaiians and the Mohawk people in southeastern Canada.

“It’s indigenous-led work in trustworthy AI that’s inspiring other indigenous groups to think: ‘If they can do it, we can do it, too,’” Mahelona said.

Learn more about NVIDIA-powered trustworthy AI, the NVIDIA NeMo toolkit and how it enabled a Telugu language speech AI breakthrough.

Read More

Starstruck: 3D Artist Brellias Brings Curiosity to Light This Week ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

Curiosity leads the way for this week’s featured In the NVIDIA Studio 3D artist, Brellias.

It’s what inspired the native Chilean’s latest artwork Estrellitas, which in English translates to “little stars.” The scene expresses the mixture of emotions that comes with curiosity, depicting a young girl holding little stars in her hand with a conflicted expression.

“She’s excited to learn about them, but she’s also a little scared,” Brellias explained.

The striking visual piece, rich with vibrant colors and expertly executed textures, underscores that while curiosity can evoke various emotions — both joyful and painful — it is always a source of change and growth.

A Sky Full of Stars

To start, Brellias visualized and reworked an existing 3D scene of a woman in Blender. He used Blender’s built-in Multiresolution modifier for sculpting and added shape keys to achieve the desired modifications.

He also created a custom shader for the character’s skin — a stylistic choice to lend its appearance a galactic hue.

Brellias is an especially big fan of purple, blue and maroon hues.

Next, Brellias tapped Blender’s OptiX GPU-accelerated viewport denoising, powered by his GeForce RTX GPU.

“The technology helps reduce noise and improve the quality of the viewport image more quickly, allowing me to make decisions and iterate on the render faster,” he said.
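
For readers who want to try the same setup, the snippet below (run inside Blender’s Python console) enables Cycles on the GPU with OptiX denoising for both the viewport and final renders; property names follow recent Blender releases and may differ slightly by version.

```python
# Enable GPU-rendered Cycles with OptiX denoising in Blender.
# Property names reflect recent Blender versions; adjust if yours differs.
import bpy

scene = bpy.context.scene
scene.render.engine = 'CYCLES'
scene.cycles.device = 'GPU'

# OptiX-accelerated denoising in the interactive viewport...
scene.cycles.use_preview_denoising = True
scene.cycles.preview_denoiser = 'OPTIX'

# ...and for final-frame renders.
scene.cycles.use_denoising = True
scene.cycles.denoiser = 'OPTIX'
```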

Out-of-this-world levels of detail.

Next, Brellias animated the scene using a base model from Daz Studio, free media design software developed by Daz 3D. Daz features an AI denoiser for high-performance interactive rendering that can also be accelerated by RTX GPUs.

In addition, rig tools in Blender made the animation process easy, eliminating the need to convert file formats.


To animate the character’s face, Brellias tied drivers to shape keys using empties, enabling greater fluidity and control over facial expressions.
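
A bare-bones version of that technique might look like the snippet below, where an empty’s X location drives a facial shape key; the object and shape-key names are placeholders, not Brellias’ actual rig.

```python
# Hypothetical example: drive a shape key's value from an empty's X location.
import bpy

face = bpy.data.objects["Face"]              # mesh with facial shape keys
controller = bpy.data.objects["Empty_Smile"] # empty used as a control handle

shape_key = face.data.shape_keys.key_blocks["Smile"]
fcurve = shape_key.driver_add("value")       # attach a driver to the key's value
driver = fcurve.driver
driver.type = 'AVERAGE'                      # driver value = average of its variables

var = driver.variables.new()
var.name = "ctrl_x"
var.type = 'TRANSFORMS'
var.targets[0].id = controller
var.targets[0].transform_type = 'LOC_X'      # sliding the empty animates the expression
```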

Geometry nodes bring “Estrellitas” to life.

Brellias then used geometry nodes in Blender to animate the character’s hair, giving it a magical floating effect. To light the scene, Brellias added some ambient light behind the character’s face and between its hands. His RTX GPU accelerated OptiX ray tracing in Blender’s Cycles for the fastest final-frame renders.


Finally, he moved to Blackmagic Design’s DaVinci Resolve to denoise and deflicker the scene for the smoothest-looking animation.

Here, Brellias’ RTX GPU accelerated the color grading, video editing and color scoping processes, dramatically speeding up his creative workflow. Other RTX-accelerated AI features, including facial recognition for automatically tagging clips and effects tracking, were available for his use.


Estrellitas was partially inspired by Brellias’ own curiosity in exploring NVIDIA and GeForce RTX GPU technologies to power content creation workflows — a venture that provided rewarding results.

“Every step of my creative process involves GPU acceleration or AI in some way or another,” said Brellias. “I can’t imagine creating without a powerful GPU at my disposal.”

His curiosity in AI extends to productivity. He recently installed the NVIDIA Broadcast app, which can transform any room into a home studio.

The app has enhanced Brellias’ microphone performance by canceling external noise and echo — especially useful given his urban surroundings.

Download the Broadcast beta and explore the rest of the Studio suite of apps, including Canvas, which uses AI to turn simple brushstrokes into realistic landscape images, and RTX Remix, which allows modders to create AI-powered RTX remasters of classic games. The apps are all free for RTX GPU owners.

Digital 3D artist Brellias.

Check out Brellias’ portfolio on Instagram.

Follow NVIDIA Studio on Instagram, X and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the Studio newsletter. 

Read More

NVIDIA CEO: ‘This Year, Every Industry Will Become a Technology Industry’

“This year, every industry will become a technology industry,” NVIDIA founder and CEO Jensen Huang told attendees Wednesday during the annual J.P. Morgan Healthcare Conference.

“You can now recognize and learn the language of almost anything with structure, and you can translate it to anything with structure — so text-protein, protein-text,” Huang said in a fireside chat with Martin Chavez, partner and vice chairman of global investment firm Sixth Street Partners and board chair of Recursion, a biopharmaceutical company. “This is the generative AI revolution.”

The conversation, which took place at the historic San Francisco Mint, followed a presentation at the J.P. Morgan conference Monday by Kimberly Powell, NVIDIA’s VP of healthcare. In her talk, Powell announced that Recursion is the first hosting partner to offer a foundation model through the NVIDIA BioNeMo cloud service, which is advancing into beta this month.

She also said that Amgen, one of the first companies to employ BioNeMo, plans to advance drug discovery with generative AI and NVIDIA DGX SuperPOD — and that BioNeMo is used by a growing number of techbio companies, pharmas, AI software vendors and systems integrators. Among them are Deloitte, Innophore, Insilico Medicine, OneAngstrom, Recursion and Terray Therapeutics.

From Computer-Aided Chip Design to Drug Design

Healthcare customers and partners now consume well over a billion dollars in NVIDIA GPU computing each year — directly and indirectly through cloud partners.

Huang traced NVIDIA’s involvement in accelerated healthcare back to two research projects that caught his attention around 15 years ago: one at Mass General tapped NVIDIA GPUs to reconstruct CT images, and another at the University of Illinois Urbana-Champaign applied GPU acceleration to molecular dynamics.

“It opened my mind that we could apply the same methodology that we use in computer-aided chip design to help the world of drug discovery go from computer-aided drug discovery to computer-aided drug design,” he said, realizing that, “if we scale this up by a billion times, we could simulate biology.”

After 40 years of advancements in computer-aided chip design, engineers can now build complex computing systems entirely in simulation, Huang explained. Over the next decade, the same could be true for AI-accelerated drug design.

“Almost everything will largely start in silico, largely end in silico,” he said, using a term that refers to an experiment run on a computer.

Collaborating on the Future of Drug Discovery and Medical Instruments

With the progress made to date, computer-aided drug discovery is “genuinely miraculous,” Huang said.

NVIDIA is propelling the field forward by building state-of-the-art AI models and powerful computing platforms, and by collaborating with domain experts and investing in techbio companies.

“We are determined to work with you to advance this field,” Huang said, inviting healthcare innovators to reach out to NVIDIA. “We deeply believe that this is going to be the future of the way that drugs will be discovered and designed.”

The company’s pipelines for accelerated healthcare include algorithms for cryo-electron microscopy, X-ray crystallography, gene sequencing, amino acid structure prediction and virtual drug molecule screening. And as AI advances, these computing tools are becoming much easier to access, Huang said.

“Because of artificial intelligence and the groundbreaking work that our industry has done, we have closed the technology divide in a dramatic way,” he said. “Everybody is a programmer, and the programming language of the future is called ‘human.’”

Beyond drug development, this transformation to a software-defined, AI-driven industry will also advance medical instruments.

“A medical instrument is never going to be the same again. Ultrasound systems, CT scan systems, all kinds of instruments — they’re always going to be a device plus a whole bunch of AIs,” Huang said. “The value that will create, the opportunities you create, are going to be incredible.”

For more from NVIDIA at the J.P. Morgan Healthcare Conference, listen to the audio recording and view the presentation deck of Powell’s session.

Learn about NVIDIA’s AI platform for healthcare and life sciences and subscribe to NVIDIA healthcare news.

Read More

AI Takes Center Stage: Survey Reveals Financial Industry’s Top Trends for 2024

The financial services industry is undergoing a significant transformation with the adoption of AI technologies. NVIDIA’s fourth annual State of AI in Financial Services Report provides insights into the current landscape and emerging trends for 2024.

The report reveals that an overwhelming 91% of financial services companies are either assessing AI or already using it in production. These firms are using AI to drive innovation, improve operational efficiency and enhance customer experiences.

Portfolio optimization, fraud detection and risk management remain top AI use cases, while generative AI is quickly gaining popularity with organizations keen to uncover new efficiencies.

Below are the report’s key findings, which show how the financial services industry is evolving as advanced AI becomes more accessible.

Generative AI and Large Language Models Are on the Rise

Reflecting a macro-trend seen across industries, large language models (LLMs) and generative AI have emerged as significant areas of interest for financial services companies. Fifty-five percent of survey respondents reported that they were actively seeking generative AI workflows for their companies.

Organizations are exploring generative AI and LLMs for an array of applications ranging from marketing and sales — ad copy, email copy and content production — to synthetic data generation. Of these, report generation, synthesis and investment research drew interest from 37% of respondents looking to cut down on repetitive manual work.

Customer experience and engagement was another sought-after use case, cited by 34% of respondents. This suggests that financial services institutions are exploring chatbots, virtual assistants and recommendation systems to enhance the customer experience.

AI Is Having an Impact Across Departments and Disciplines

With 75% of survey respondents considering their organization’s AI capabilities to be industry leading or middle of the pack, financial services organizations are becoming more confident in their ability to build, deploy and extract value from AI implementations.

The most popular uses for AI were in operations, risk and compliance, and marketing. To improve operational efficiency, financial organizations are using AI to automate manual processes, enhance data analysis and inform investment decisions.

To enhance risk and compliance, they’re deploying AI to analyze vast amounts of data to identify suspicious activities and anomalous transaction patterns. They’re also using AI to analyze customer data to predict preferences and deliver personalized marketing campaigns, educational content and targeted promotions.

Companies are already seeing results. Forty-three percent of financial services professionals indicated that AI had improved their operational efficiency, while 42% felt it had helped their business build a competitive advantage.

A Shift in the Headwinds

In previous years, the number one challenge respondents reported was recruiting AI experts and data scientists. This year, 30% more survey participants pointed to data-related challenges as the primary concern, including data privacy, data sovereignty and data scattered around the globe under different oversight regulations.

The growing attention to these issues reflects the advancing power and complexity of AI models, which require huge, diverse datasets to train, as well as increasing regulatory scrutiny and emphasis on responsible AI.

Recruiting and retaining AI experts remains a challenge, as do budget concerns. But more than 60% of respondents still plan to increase investment in computing infrastructure or optimize AI workflows, underscoring the importance of these tools in quickly building and deploying trustworthy AI to overcome these barriers.

Paving the Way for Future Investments

By and large, the survey results paint a positive picture of AI bringing greater efficiency to operations, personalization to customer engagements, and precision to investment decisions.

Finance professionals agree. Eighty-six percent of respondents reported a positive impact on revenue, while 82% noted a reduction in costs. Fifty-one percent strongly agreed that AI would be important to their company’s future success, a 76% increase from last year.

With this positive outlook, 97% of companies plan to invest more in AI technologies in the near future. Focus areas for future investments include identifying additional AI use cases, optimizing AI workflows and increasing infrastructure spending.

To build and scale impactful AI across the enterprise, financial services organizations need a comprehensive AI platform that empowers data scientists, quants and developers to seamlessly collaborate while minimizing obstacles. To that end, executives are investing more in AI infrastructure and prioritizing high-yield AI use cases to improve employee productivity while delivering superior customer experiences and investment results.

Download the “State of AI in Financial Services: 2024 Trends” report for in-depth results and insights.

Explore NVIDIA’s AI solutions and enterprise-level AI platforms for delivering smarter, more secure financial services and the AI-powered bank.

Read More

To the Cloud and Beyond: New Activision and Blizzard Games, Day Passes and G-SYNC Technology Coming to GeForce NOW

GFN Thursday recaps the latest cloud announcements from CES 2024 — Day Pass memberships, Cloud G-SYNC technology, expanded NVIDIA Reflex support and more.

The new year brings new adventures to the cloud for members, including Diablo IV and Overwatch 2 from Blizzard, Exoprimal from Capcom, Honkai: Star Rail from HoYoverse and Pax Dei from Mainframe Industries.

Plus, no GFN Thursday is complete without new games. Get ready for 10 new titles joining the cloud this week.

Cloud’s-Eye View of CES

CES 2024 has come to a close, and GeForce NOW members have a lot to look forward to.

Coming in February, day passes for Ultimate and Priority memberships will offer a new way for members to play at up to GeForce RTX 4080 quality for up to 24 hours. The Ultimate Day Pass will be available for $7.99 and the Priority Day Pass for $3.99, giving gamers all the benefits of the respective tiers before they decide to commit to the better-value one-month or six-month memberships.

G-SYNC comes to GeForce NOW
Nothing but a G-SYNC, baby.

Cloud G-SYNC support will match the refresh rate of variable refresh rate and G-SYNC-compatible monitors to the streaming rate. Paired with new 60 and 120 frames per second streaming options for GeForce NOW Reflex mode, this makes cloud gaming nearly indistinguishable from playing on a local PC.

Ultimate members will be able to turn their phones into portable gaming rigs with support for 1440p resolution on compatible Android phones, as well as for keyboards and mice connected through a USB hub. Thanks to the cloud, these smartphones are now capable of PC gaming at Ultimate quality.

Worldwide Expansion for GeForce NOW
The cloud’s drifting into Japan.

GeForce NOW will also soon expand to Japan, operating alongside GeForce NOW Alliance partner KDDI. This will enable gamers across the country to play their favorite PC games in the cloud with Ultimate performance. Learn more and sign up for notifications.

Here Comes the Blizzard

That’s not all: GeForce NOW is bringing even more top titles to the cloud from celebrated publishers.

Following the recent release of Call of Duty, the latest games from top developer Blizzard Entertainment are coming soon to GeForce NOW. Members will be able to play the Steam versions of Diablo IV and Overwatch 2 on nearly any device with the power of a GeForce RTX 4080 rig in the cloud, with support for Battle.net coming soon.

Diablo IV on GeForce NOW
Someone check the weather in hell — seems pretty cold.

Join the fight for Sanctuary in Diablo IV. Fight the forces of hell while discovering countless abilities to master, legendary loot to gather and nightmarish dungeons full of evil enemies to vanquish. Explore a shared open world where players can form their own armies to take down World Bosses, or join the fray in player vs. player zones to test their skills against others.

Team up and answer the call of heroes in Overwatch 2, a free-to-play shooter featuring 30+ epic heroes, each with game-changing abilities. Lead the charge, ambush enemies or aid allies as one of Overwatch’s distinct heroes. Join the battle across dozens of futuristic maps inspired by real-world locations and master unique game modes in the always-on, ever-evolving live game.

Members can look forward to playing the Steam version of both games from the cloud, with support for the Battle.net launcher coming soon.

Honkai Star Rail coming soon to GeForce NOW
The Astral Express is coming to GeForce NOW.

Expanding the library of hit free-to-play titles for members, Honkai: Star Rail from miHoYo will soon join Genshin Impact in the cloud. The space-fantasy role-playing game is set in a diverse universe filled with wonder, adventure and thrills. Plus, members can experience all the latest updates without worrying about download times.

Mainframe Industries’ Pax Dei is a highly anticipated social sandbox massively multiplayer online game inspired by legends of the medieval era. It’s planned to release on GeForce NOW when it launches for PC.

Exoprimal on GeForce NOW
Dinosaurs? Oh my.

Capcom is working with NVIDIA to bring more of its hit titles to the cloud, including Exoprimal, an online, team-based action game that pits humanity’s cutting-edge exosuit technology against history’s most ferocious beasts: dinosaurs. Look forward to streaming it from the cloud starting Thursday, Jan. 18.

Get ready to play these titles and more with high performance in the cloud soon. Ultimate members will be able to stream at up to 4K resolution and 120 fps with support for NVIDIA DLSS and Reflex technology, and experience the action even on low-powered devices. Keep an eye out on GFN Thursdays for the latest on game release dates in the cloud.

New to Play Today

War Hospital on GeForce NOW
Patch the wounded, mend the broken, survive the storm.

What’s a GFN Thursday without more games? Here’s what’s coming to the GeForce NOW library this week:

  • War Hospital (New release Jan. 11, available on Steam)
  • Assassin’s Creed: Valhalla (Xbox, available for PC Game Pass)
  • Jected – Rivals (Steam)
  • RAILGRADE (Steam)
  • Survivalist: Invisible Strain (Steam)
  • The Talos Principle 2 (Epic Games Store)
  • Turbo Golf Racing (Xbox, available for PC Game Pass)
  • TUNIC (Xbox, available for PC Game Pass)
  • Witch It (Steam)
  • Zombie Army 4: Dead War (Xbox, available for PC Game Pass)

Learn more about activating and playing Ubisoft games from PC Game Pass on GeForce NOW.

What are you playing this weekend? Let us know on X or in the comments below.

Read More