Beyond ‘Data-Driven’: How Energy-Efficient Computing for AI Is Propelling Innovation and Savings Across Industries

With advances in computing, sophisticated AI models and machine learning are having a profound impact on business and society. Industries can use AI to quickly analyze vast bodies of data, allowing them to derive meaningful insights, make predictions and automate processes for greater efficiency.

In the public sector, government agencies are achieving superior disaster preparedness. Biomedical researchers are bringing novel drugs to market faster. Telecommunications providers are building more energy-efficient networks. Manufacturers are trimming emissions from product design, development and manufacturing processes. Hollywood studios are creating impressive visual effects at a fraction of the cost and time. Robots are being deployed on important missions to help preserve the Earth. And investment advisors are running more trade scenarios to optimize portfolios.

Eighty-two percent of surveyed companies are already using or exploring AI, and 84% report that they’re increasing investments in data and AI initiatives. Organizations that delay AI adoption risk missing out on efficiency gains and falling behind their competitors.

However, AI workloads are computationally demanding, and legacy computing systems are ill-equipped for developing and deploying AI. CPU-based computing requires roughly linear growth in power input to meet the increased processing needs of AI and data-heavy workloads. If data centers run on carbon-intensive energy, it becomes extremely difficult for enterprises to innovate with AI while controlling greenhouse gas emissions and meeting sustainability commitments. Plus, many countries are introducing tougher regulations that mandate data center carbon reporting.

Accelerated computing — the use of GPUs together with specialized hardware, software and parallel computing techniques — has dramatically improved the performance and energy efficiency of data centers.

Below, read more on how industries are using energy-efficient computing to scale AI, improve products and services, and reduce emissions and operational costs.

The Public Sector Drives Research, Delivers Improved Citizen Services 

Data is playing an increasingly important role in government services, including for public health and disease surveillance, scientific research, social security administration, and extreme-weather monitoring and management. These operations require platforms and systems that can handle large volumes of data, provide real-time data access, and ensure data quality and accuracy.

But many government agencies rely on legacy systems that are difficult to maintain, don’t efficiently integrate with modern technologies and consume excessive energy. To handle increasingly demanding workloads while sticking to sustainability goals, government agencies and public organizations must adopt more efficient computing solutions.

The U.S. Department of Energy is making inroads in this endeavor. The department runs the National Energy Research Scientific Computing Center for open science. NERSC develops simulations, data analytics and machine learning solutions to accelerate scientific discovery through computation. Seeking new computing efficiencies, the center measured results across four of its key high performance computing and AI applications. It clocked how fast the applications ran, as well as how much energy they consumed using CPU-only versus GPU-accelerated nodes on Perlmutter, one of the world’s largest supercomputers.

At performance parity, a GPU-accelerated cluster consumes 588 fewer megawatt-hours per month, representing a 5x improvement in energy efficiency. By running the same workload on GPUs rather than CPU-only instances, researchers could save millions of dollars per month. These gains mean that the 8,000+ researchers using NERSC computing infrastructure can perform more experiments on important use cases, like studying subatomic interactions to uncover new green energy sources, developing 3D maps of the universe and bolstering a broad range of innovations in materials science and quantum physics.

Governments help protect citizens from adverse weather events, such as hurricanes, floods, blizzards and heat waves. With GPU deployments, climate models, like the IFS model from the European Centre for Medium-Range Weather Forecasts, can run up to 24x faster while reducing annual energy usage by up to 127 gigawatt hours compared to CPU-only systems. As extreme-weather events occur with greater frequency and, often, with little warning, meteorology centers can use accelerated computing to generate more accurate, timely forecasts that improve readiness and response.

By adopting more efficient computing systems, governments can save costs while equipping researchers with the tools they need for scientific discoveries to improve climate modeling and forecasting, as well as deliver superior services in public health, disaster relief and more.

Drug Discovery Researchers Conduct Virtual Screenings, Generate New Proteins at Light Speed

Drug development has always been a time-consuming process that involves innumerable calculations and thousands of experiments to screen new compounds. To develop novel medications, the binding properties of small molecules must be tested against protein targets, a cumbersome task required for up to billions of compounds — which translates to billions of CPU hours and hundreds of millions of dollars each year.

Highly accurate AI models can now predict protein structures, generate small molecules, predict protein-ligand binding and perform virtual screening.

Researchers at Oak Ridge National Laboratory (ORNL) and Scripps Research have shown that screening a dataset of billions of compounds against a protein, which has traditionally taken years, can now be completed in just hours with accelerated computing. By running AutoDock, a molecular-modeling simulation software, on a supercomputer with more than 27,000 NVIDIA GPUs, ORNL screened more than 25,000 molecules per second and evaluated the docking of 1 billion compounds in less than 12 hours. This is a speedup of more than 50x compared with running AutoDock on CPUs.
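A quick back-of-the-envelope check, using only the figures reported above, shows the throughput and runtime claims are consistent:

```latex
\frac{10^{9}\ \text{compounds}}{25{,}000\ \text{compounds/s}} = 4\times10^{4}\ \text{s} \approx 11.1\ \text{hours} \;(<12\ \text{hours})
```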

Iambic, an AI platform for drug discovery, has developed an approach combining quantum chemistry and AI that calculates quantum-accurate molecular-binding energies and forces at a fraction of the computational expense of traditional methods. These energies and forces can power molecular-dynamics simulations at unprecedented speed and accuracy. With its OrbNet model, Iambic uses a graph transformer to power quantum-mechanical operators that represent chemical structures. The company is using the technology to identify drug molecules that could deactivate proteins linked to certain cancer types.

As the number of new drug approvals declines and the costs of research, development and computing rise, optimizing drug discovery with accelerated computing can help control energy expenditures while creating a far-reaching impact on medical research, treatments and patient outcomes.

Telcos Scale Network Capacity

To connect their subscribers, telecommunications companies send data across sprawling networks of cell towers, fiber-optic cables and wireless signals. In the U.S., AT&T’s network connects more than 100 million users from the Aleutian Islands in Alaska to the Florida Keys, processing 500 petabytes of data per day. As telcos add compute-intensive workloads like AI and user plane function (UPF) to process and route data over 5G networks, power consumption costs are skyrocketing.

AT&T processes trillions of data rows to support field technician dispatch operations, generate performance reports and power mobile connectivity. To process data faster, AT&T tested the NVIDIA RAPIDS Accelerator for Apache Spark. By spreading work across nodes in a cluster, the software processed 2.8 trillion rows of information — a month’s worth of mobile data — in just five hours. That’s 3.3x faster at 60% lower cost than any prior test.

Other telcos are saving energy by offloading networking and security tasks to SmartNICs and data processing units (DPUs) to reduce server power consumption. Ericsson, a leading telecommunications equipment manufacturer, tested a 5G UPF on servers with and without network offload to an NVIDIA ConnectX-6 Dx NIC. At maximum network traffic, the network offloading provided 23% power savings. The study also found that CPU micro-sleeps and frequency scaling — allowing CPUs to sleep and slow their clock frequencies during low workload levels — saved more than 10% of power per CPU.

Hardware-accelerated networking offloads like these allow telco operators to increase network capacity without a proportional increase in energy consumption, ensuring that networks can scale to handle increased demand and conserve energy during times of low use. By adopting energy-efficient accelerated computing, telco operators can reduce their carbon footprint, improve scalability and lower operational costs.

Manufacturing and Product Design Teams Achieve Faster, Cleaner Simulations

Many industries rely on computational fluid dynamics (CFD) during design and engineering processes to model fluid flows, combustion, heat transfer and aeroacoustics. The aerospace and automotive industries use CFD to model vehicle aerodynamics, and the energy and environmental industries use it to optimize fluid-particle refining systems and model reactions, wind-farm air flow and hydro-plant water flow.

Traditional CFD methods are compute-intensive, using nearly 25 billion CPU core hours annually, and consume massive amounts of energy. This is a major obstacle for industrial companies looking to reduce carbon emissions and achieve net zero. Parallel computing with GPUs is making a difference.

Ansys, an engineering simulation company, is speeding up CFD physics models with GPUs to help customers drastically reduce emissions while improving the aerodynamics of vehicles. To measure computing efficiency, the company ran the benchmark DrivAer model, used for optimizing vehicle geometry, on different CPU and GPU configurations using its Fluent fluid-simulation software. Results showed that a single GPU achieved more than 5x greater performance than a cluster with 80 CPU cores. With eight GPUs, the simulation experienced more than a 30x speedup. And a server with six GPUs reduced power consumption 4x compared with a high performance computing CPU cluster delivering the same performance.

CPFD offers GPU parallelization for Barracuda Virtual Reactor, a physics-based engineering software package capable of predicting fluid, particulate-solid, thermal and chemically reacting behavior in fluidized bed reactors and other fluid-particle systems.

Using CPFD’s Barracuda software, green energy supplier ThermoChem Recovery International (TRI) developed technology that converts municipal solid waste and woody biomass into jet fuel. Since its partnership with CPFD began 14 years ago, TRI has benefitted from 1,500x model speedups as CPFD moved its code from CPU hardware to full GPU parallelization. With these exponential speedups, models that would’ve previously taken years to run can now be completed in a day or less, saving millions of dollars in data center infrastructure and energy costs.

With GPU parallelization and energy-efficient architectures, industrial design processes that rely on CFD can benefit from dramatically faster simulations while achieving significant energy savings.

Media and Entertainment Boost Rendering

Rendering visual effects (VFX) and stylized animations consumes nearly 10 billion CPU core hours per year in the media and entertainment industry. A single animated film can require over 50,000 CPU cores working for more than 300 million hours. Enabling this necessitates a large space for data centers, climate control and computing — all of which result in substantial expenditures and a sizable carbon footprint.

Accelerated computing offers a more energy-efficient way to produce VFX and animation, enabling studios to iterate faster and compress production times.

Studios like Wylie Co., known for visuals in the Oscar-winning film Dune and in HBO and Netflix features, are adopting GPU-powered rendering to improve performance and save energy. After migrating to GPU rendering, Wylie Co. realized a 24x performance boost over CPUs.

Image Engine, a VFX company involved in creating Marvel Entertainment movies and Star Wars-based television shows, observed a 25x performance improvement by using GPUs for rendering.

GPUs can increase performance up to 46x while reducing energy consumption by 10x and capital expenses by 6x. With accelerated computing, the media and entertainment industry has the potential to save a staggering $900 million in hardware acquisition costs worldwide and conserve 215 gigawatt hours of energy that would have been consumed by CPU-based render farms. Such a shift would lead to substantial cost savings and significant reductions in the industry’s environmental impact.

Robotics Developers Extend Battery Life for Important Missions 

With edge AI and supercomputing now available using compact modules, demand for robots is surging for use in factory logistics, sales showrooms, urban delivery services and even ocean exploration. Mobile robot shipments are expected to climb from 549,000 units last year to 3 million by 2030, with revenue forecast to jump from more than $24 billion to $111 billion in the same period, according to ABI Research.

Most robots are battery-operated and rely on an array of lidar sensors and cameras for navigation. Robots communicate with edge servers or clouds for mission dispatch and require high throughput due to diverse sets of camera sensors as well as low latency for real-time decision-making. These factors necessitate energy-efficient onboard computing.

Accelerated edge computing can be optimized to decode images, process video and analyze lidar data to enable robot navigation of unstructured environments. This allows developers to build and deploy more energy-efficient machines that can remain in service for longer without needing to charge.

The Woods Hole Oceanographic Institution Autonomous Robotics and Perception Laboratory (WARPLab) and MIT are using the NVIDIA Jetson Orin platform for energy-efficient edge AI and robotics to power an autonomous underwater vehicle to study coral reefs.

The AUV, named CUREE, for Curious Underwater Robot for Ecosystem Exploration, gathers visual, audio and other environmental data to help understand the human impact on reefs and sea life. With 25% of the vehicle’s power needed for data collection, energy efficiency is a must. With Jetson Orin, CUREE constructs 3D models of reefs, tracks marine organisms and plant life, and autonomously navigates and gathers data. The AUV’s onboard energy-efficient computing also powers convolutional neural networks that enhance underwater vision by reducing backscatter and correcting colors. This enables CUREE to transmit clear images to scientists, facilitating fish detection and reef analysis.
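As an illustrative stand-in (not WARPLab’s actual network), the sketch below shows the general shape of such an onboard enhancement step: a small convolutional model applied to each camera frame, running on the GPU when one is available, as it would be on a Jetson module.

```python
# Illustrative stand-in for an onboard color-correction CNN (not CUREE's model).
# A tiny residual network predicts a per-pixel RGB adjustment for each frame.
import torch
import torch.nn as nn

device = "cuda" if torch.cuda.is_available() else "cpu"

model = nn.Sequential(
    nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 16, kernel_size=3, padding=1), nn.ReLU(),
    nn.Conv2d(16, 3, kernel_size=3, padding=1),
).to(device).eval()

frame = torch.rand(1, 3, 480, 640, device=device)  # dummy camera frame in [0, 1]
with torch.no_grad():
    corrected = (frame + model(frame)).clamp(0.0, 1.0)  # residual color correction
print(corrected.shape)  # torch.Size([1, 3, 480, 640])
```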

Driverless smart tractors with energy-efficient edge computing are now available to help farmers with automation and data analysis. The Founder Series MK-V tractors, designed by NVIDIA Inception member Monarch Tractor, combine electrification, automation and data analysis to help farmers reduce their carbon footprint, improve field safety and streamline farming operations. Using onboard AI video analytics, the tractor can traverse rows of crops, enabling it to navigate even in remote areas without connectivity or GPS.

The MK-V tractor produces zero emissions and is estimated to save farmers $2,600 annually compared to diesel tractors. The tractor’s AI data analysis advises farmers on how to reduce the use of expensive, harmful herbicides that deplete the soil. Decreasing the volume of chemicals is a win all around, empowering farmers to protect the quality of soil, reduce herbicide expenditures and deliver more naturally cultivated produce to consumers.

As energy-efficient edge computing becomes more accessible to enable AI, expect to see growing use cases for mobile robots that can navigate complex environments, make split-second decisions, interact with humans and safely perform difficult tasks with precision.

Financial Services Use Data to Inform Investment Decisions 

Financial services is an incredibly data-intensive industry. Bankers and asset managers pursuing the best results for investors rely on AI algorithms to churn through terabytes of unstructured data from economic indicators, earnings reports, news articles, and disparate environmental, social and governance metrics to generate market insights that inform investments. Plus, financial services companies must comb through network data and transactions to prevent fraud and protect accounts.

NVIDIA and Dell Technologies are optimizing computing for financial workloads to achieve higher throughput, speed and capacity with greater energy efficiency. The Strategic Technology Analysis Center, an organization dedicated to technology discovery and assessment in the finance industry, recently ran the STAC-A2 benchmark on several computing stacks, comparing CPU-only infrastructure with GPU-based infrastructure. The STAC-A2 benchmark is designed by quants and technologists to measure the performance, scalability, quality and resource efficiency of technology stacks running market-risk analysis for derivatives.

When testing the STAC-A2 options pricing benchmark, the Dell PowerEdge server with NVIDIA GPUs performed 16x faster and 3x more energy efficiently than a CPU-only system for the same workload. This lets investment advisors integrate larger bodies of data into derivatives risk-analysis calculations, enabling more data-driven decisions without increasing computing time or energy requirements.
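To give a sense of the kind of workload such benchmarks represent (this is an illustrative simplification, not the STAC-A2 methodology), below is a small Monte Carlo pricer for a European call option written with NumPy. On a GPU system, CuPy’s NumPy-compatible API can run the same array math on the accelerator.

```python
# Simplified Monte Carlo pricing of a European call option (illustrative only;
# not the STAC-A2 methodology). Swapping "import numpy as np" for
# "import cupy as np" runs the same array math on an NVIDIA GPU.
import numpy as np

def monte_carlo_call(s0, strike, rate, sigma, maturity, n_paths=1_000_000):
    """Estimate a European call price under geometric Brownian motion."""
    z = np.random.standard_normal(n_paths)
    # Terminal asset prices under the risk-neutral measure.
    s_t = s0 * np.exp((rate - 0.5 * sigma**2) * maturity
                      + sigma * np.sqrt(maturity) * z)
    payoff = np.maximum(s_t - strike, 0.0)
    return float(np.exp(-rate * maturity) * payoff.mean())

np.random.seed(0)  # reproducible paths
price = monte_carlo_call(s0=100.0, strike=105.0, rate=0.03, sigma=0.2, maturity=1.0)
print(f"Estimated call price: {price:.2f}")
```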

PayPal, which was looking to deploy a new fraud-detection system to operate 24/7, worldwide and in real time to protect customer transactions, realized CPU-only servers couldn’t meet such computing requirements. Using NVIDIA GPUs for inference, PayPal improved real-time fraud detection by 10% and lowered server energy consumption by nearly 8x.

With accelerated computing, financial services organizations can run more iterations of investment scenarios, improve risk assessments and make more informed decisions for better investment results. Accelerated computing is the foundation for improving data throughput, reducing latency and optimizing energy usage to lower operating costs and achieve emissions goals.

An AI Future With Energy-Efficient Computing

With energy-efficient computing, enterprises can reduce data center costs and their carbon footprint while scaling AI initiatives and data workloads to stay competitive.

The NVIDIA accelerated computing platform offers a comprehensive suite of energy-efficient hardware and software to help enterprises use AI to drive innovation and efficiency without the need for equivalent growth in energy consumption.

With more than 100 frameworks, pretrained models and development tools optimized for GPUs, NVIDIA AI Enterprise accelerates the entire AI journey, from data preparation and model training to inference and scalable deployment. By getting their AI into production faster, businesses can significantly reduce overall power consumption.

With the NVIDIA RAPIDS Accelerator for Apache Spark, which is included with NVIDIA AI Enterprise, data analytics workloads can be completed 6x faster, translating to 5x savings on infrastructure and 6x less power used for the same amount of work. For a typical enterprise, this means 10 gigawatt hours less energy consumed compared with running jobs without GPU acceleration.

NVIDIA BlueField DPUs bring greater energy efficiency to data centers by offloading and accelerating data processing, networking and security tasks from the main CPU infrastructure. By maximizing performance per watt, they can help enterprises slash server power consumption by up to 30%, saving millions in data center costs.

As businesses shift to a new paradigm of AI-driven results, energy-efficient accelerated computing is helping organizations deliver on the promise of AI while controlling costs, maintaining sustainable practices and ensuring they can keep up with the pace of innovation.

Learn how accelerated computing can help organizations achieve both AI goals and carbon-footprint objectives.

Twitch Streamer Mr_Vudoo Supercharges Gaming, Entertaining and Video Editing With RTX This Week ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

Mr_Vudoo is a digital renaissance man — a livestreamer, video editor, gamer and entertainer skilled in producing an array of content for his audience.

This week’s featured In the NVIDIA Studio artist recently acquired a new GeForce RTX 4080 SUPER graphics card, which helps creators like him take their content to the next level. (Read more about the 4080 SUPER below.)

There’s no better place for creative types to connect with others and explore what’s next in AI and accelerated computing than GTC, which is back in person March 18-21 in San Jose.

From the keynote by NVIDIA founder and CEO Jensen Huang to hundreds of sessions, exhibits and networking events, GTC delivers something for every technical level and interest area, including sessions on how to power content creation using OpenUSD and generative AI. GTC registration is open for virtual or in-person attendance.

Join sessions like “What’s Next in Generative AI” in person or virtually.

In other NVIDIA Studio news, Topaz Labs, a company that delivers AI-powered photo and video enhancement software, recently adopted NVIDIA TensorRT acceleration for its new Remove Tool. It uses AI to replace unwanted objects in an image with a context-aware background. The tool expedites photo editing workflows, delivering 2.4x faster processing on a GeForce RTX 4090 GPU compared with an Apple MacBook M3 Max.

Topaz Labs’ TensorRT-powered Remove Tool removes unwanted objects with just a click.

Mr_Vudoo Taps RTX for a Livestreaming Upgrade

Mr_Vudoo’s Twitch channel is known for its variety of unique content. In his Trading Card Games series, Mr_Vudoo opens trading card packs and competes at one of the highest levels live on stream. In his Gameplay series, he goes back to his original love for gaming, streaming both multiplayer online games and third-person shooters. And in his In Real Life series, he breaks the norm of traditional streaming, bringing his viewers outside to share his everyday life experiences.

It takes significant computing power to bring these series to life. Mr_Vudoo’s GeForce RTX 4080 SUPER features the eighth-generation NVIDIA NVENC — an independent component for encoding video that frees up the system to run games or tackle other compute-intensive tasks. Using it, Mr_Vudoo can achieve a more seamless streaming experience.

“It’s a revelation to stream with high-quality settings and get almost no performance loss in games,” he said.
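As a rough illustration of what offloading an encode to NVENC looks like in practice, the sketch below shells out to FFmpeg and selects the hardware H.264 encoder. It assumes an FFmpeg build with NVENC support; preset names vary across FFmpeg versions, and the file names are hypothetical.

```python
# Minimal sketch: offloading an export to the GPU's NVENC encoder via FFmpeg.
# Assumes an FFmpeg build with NVENC support; preset names vary by version.
import subprocess

def encode_with_nvenc(src: str, dst: str, bitrate: str = "6M") -> None:
    """Re-encode a clip with h264_nvenc, leaving CPU cores free for other work."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,
            "-c:v", "h264_nvenc",  # NVIDIA hardware H.264 encoder
            "-preset", "p5",       # quality/speed trade-off (p1 fastest .. p7 best)
            "-b:v", bitrate,
            "-c:a", "copy",        # pass audio through untouched
            dst,
        ],
        check=True,
    )

encode_with_nvenc("stream_recording.mp4", "stream_recording_nvenc.mp4")  # hypothetical files
```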

Mr_Vudoo can also join in the new Twitch Enhanced Broadcasting beta, powered by GeForce RTX GPUs and NVENC, to broadcast up to three resolutions simultaneously at up to 1080p. This eliminates the need to trade off resolution for stream reliability.

In the coming months, Enhanced Broadcasting beta testers will be able to experiment with higher input bit rates, up to 4K resolutions, support for up to five concurrent streams and new codecs.

Mr_Vudoo uses a number of AI-enabled features in the video and photo editing part of his creative workflow. With RTX acceleration, he can add a video camera effect with a timed zoom and a film strip transition in real time without having to render the entire project.

 

“Adding multiple effects on a clip without affecting the preview of the video is a massive time-saver,” he said.

DaVinci Resolve has several RTX-accelerated AI features that can boost content creation, offering tools to smooth slow-motion effects or provide seamless video super resolution. These features are available to all RTX GPU owners.

“GeForce RTX graphics cards are the best GPUs for video editors to use, as they can render tasks much faster, allowing us to become more efficient with our work.” — Mr_Vudoo

Mr_Vudoo can quickly export files with the RTX 4080 SUPER’s dual encoders, which work in tandem to slash export times nearly in half.

In post-production, Mr_Vudoo uses Adobe Photoshop’s AI-powered subject selection tool to quickly isolate objects in an image, instead of having to manually crop them out, speeding his work.

 

Mr_Vudoo also taps the free NVIDIA Broadcast app to boost his productivity.

“I’ve utilized the video noise removal and background replacement the most,” he said. “The eye contact feature was very interesting and quite honestly took me by surprise at how well it worked.”

AI has become an irreplaceable part of Mr_Vudoo’s content creation process, helping him quickly and effortlessly produce his best work. Catch him on Twitch.

 

RTX 4080 SUPER Brings Super Performance

GeForce RTX 4080 SUPER graphics cards are changing the content creation game.

Generative AI apps like Adobe Photoshop can take advantage of the GPU’s Tensor Cores to speed productivity and creative workflows. With the 4080 SUPER, 3D apps like Blender can run up to 70% faster than on previous-generation graphics cards. And video editing apps like Blackmagic Design’s DaVinci Resolve can accelerate AI effects over 30% faster than with the GeForce RTX 3080 Ti.

For gamers, the RTX 4080 SUPER enables greater immersion, with fully ray-traced visuals and the ability to run all settings at max. It delivers twice the speed of the RTX 3080 Ti, up to 836 trillion operations per second, in the most graphically intensive games with DLSS Frame Generation.

Get creative and AI superpowers with the GeForce RTX 4080 SUPER.

Since its release, GeForce RTX 4080 SUPER Series GPUs have been put to the test in creating, gaming and other AI-powered tasks. Here’s what some reviewers had to say:

  • “Jumping over to creative professionals and content creators, the 4080 Super also provides nice performance gains over the standard GeForce RTX 4080 in applications like Blender, Maya, After Effects, DaVinci Resolve and more. This means users can take full advantage of what the NVIDIA 4080 SUPER offers in much more than just gaming and can push the software they use for streaming, video creation, audio and 3D creation to get the most out of their PC.” – CG Magazine
  • “Features like NVIDIA Broadcast and Reflex hold deep practical appeal; RTX Video Super Resolution uses AI to make ugly videos beautiful. And NVIDIA maintains a strong lead in most creative and machine learning/AI workloads if you like to put your GPU to work when you’re not playing — witness the dual AV1 encoders in the 4080 SUPER” — PC World  
  • “Blender can make use of the RT cores on NVIDIA’s GPUs through the OptiX ray tracing rendering engine, and as a result, performance is much higher than any competing GPU in a similar class. The GeForce RTX 4080 SUPER notches another victory over its namesake, and dang the RTX 4090 is a beast.” – Hot Hardware
  • “In terms of creative performance, the RTX 4080 SUPER walks away the winner against the RX 7900 XTX, even if you don’t factor in the fact that Blender Benchmark 4.0.0 workloads wouldn’t even run on the RX 7900 XTX (though the RX 7900 XT was able to run them, just not nearly as well).” – Tech Radar
  • “And when you throw in RTX technologies like DLSS into the mix, NVIDIA’s superior AV1 encoding quality, content-creator-friendly features, and performance, plus generative AI capabilities – and there’s a lot more to the story here than pure 4K gaming EXPERT-ise.” — TweakTown

Discover what RTX 4080 SUPER Series graphics cards and systems are available.

Canada Partners with NVIDIA to Supercharge Computing Power

AI is reshaping industries, society and the “very fabric of innovation” — and Canada is poised to play a key role in this global transformation, said NVIDIA founder and CEO Jensen Huang during a fireside chat with leaders from across Canada’s thriving AI ecosystem.

“Canada, as you know, even though you’re so humble, you might not acknowledge it, is the epicenter of the invention of modern AI,” Huang told an audience of more than 400 from academia, industry and government gathered Thursday in Toronto.

In a pivotal development, Canada’s Industry Minister François-Philippe Champagne shared Friday on X, formerly known as Twitter, that Canada has signed a letter of intent with NVIDIA.

Nations including Canada, France, India and Japan are discussing the importance of investing in “sovereign AI capabilities,” Huang said in an interview with Bloomberg Television in Canada.

Such efforts promise to enhance domestic computing capabilities, turbocharging local economies and unlocking local talent.

“Their natural resource, data, should be refined and produced for their country. The recognition of sovereign AI capabilities is global,” Huang told Bloomberg.

Huang’s conversation with the group of Canadian AI leaders, or “four heroes of mine,” as he described them — Raquel Urtasun of Waabi, Brendan Frey of Deep Genomics, University of Toronto Professor Alan Aspuru-Guzik and Aiden Gomez of Cohere — highlighted both Canada’s enormous contributions and its growing capability as a leader in AI.

“Each one of you,” Huang remarked, addressing the panelists and noting that every one of them is affiliated with the University of Toronto, “are doing foundational work in some of the largest industries in the world.”

“Let’s seize the moment,” Champagne said as he kicked off the panel discussion. “We need to move from fear to opportunity to build trust so that people understand what AI can do for them.”

“It’s about inspiring young researchers to continue to do research here in Canada and about creating opportunities for them after they graduate to be able to start companies here,” Huang said.

The panelists echoed Huang’s optimism, providing insights into how AI is reshaping industries, society and technology.

Gomez, reflecting on the democratization of AI, shared his optimism for the future, stating that it’s “an exciting time to explore product space,” highlighting the vast opportunities for innovation and disruption within the AI landscape.

Cohere’s world-class large language models help enterprises build powerful, secure applications that search, understand meaning and converse in text.

Gomez said the future lies in communities of researchers and entrepreneurs able to leverage AI to bridge technological divides and foster inclusivity.

“I owe everything to this community, the people on the stage and in this room,” he said, acknowledging the collaborative spirit that fuels innovation in AI technologies.

Urtasun highlighted the imperative of safety in autonomous technology as a non-negotiable standard, emphasizing its role in saving lives and shaping the future of transportation.

“Safety should not be proprietary. This technology is a driver, and it’s going to save so many lives,” she said.

Frey underscored the transformative impact of AI in RNA biology. He said that, while “it’s taken quite a bit of time,” at Deep Genomics, he and his colleagues have built the world’s first foundation model for RNA biology, envisioning a future where drug discovery is accelerated, bringing life-saving therapies to market more efficiently.

“If we’re highly successful, best-case scenario, it means that we will be able to design molecules that are safe, and highly efficacious, without doing any cell model studies without doing any animal studies … and getting drugs that save our loved ones rapidly and safely,” Frey said.

Aspuru-Guzik pointed to the fusion of AI with quantum computing and materials science as a frontier for sustainable solutions, emphasizing the importance of creating a conducive environment for innovation in Canada.

“We want to build it here in Canada,” he said.

His work exemplifies the potential of AI to accelerate the development of new materials, driving forward a sustainable future.

Together, these visions articulate a future where AI enhances societal well-being, drives scientific advancement and fosters an inclusive, global community of innovation.

“AI is about the greatest opportunity to close the technology divide and be inclusive, for everybody to enjoy the technology revolution,” Huang said.

For more on the fast-growing impact of AI across the globe, visit NVIDIA’s AI nations hub.

Featured image credit: Christian Raul Hernandez, Creative Commons Attribution-Share Alike 4.0 International license.

New Study Cites AI as Strategic Tool to Combat Climate Change

A new study underscores the potential of AI and accelerated computing to deliver energy efficiency and combat climate change, efforts in which NVIDIA has long been deeply engaged.

The study, called “Rethinking Concerns About AI’s Energy Use,” provides a well-researched examination into how AI can — and in many cases already does — play a large role in addressing these critical needs.

Citing dozens of sources, the study from the Information Technology and Innovation Foundation (ITIF), a Washington-based think tank focused on science and technology policy, calls for governments to accelerate adoption of AI as a significant new tool to drive energy efficiency across many industries.

AI can help “reduce carbon emissions, support clean energy technologies, and address climate change,” it said.

How AI Drives Energy Efficiency

The report documents ways machine learning is already helping many sectors reduce their impact on the environment.

For example, it noted:

  • Farmers are using AI to lessen their use of fertilizer and water.
  • Utilities are adopting it to make the electric grid more efficient.
  • Logistics operations use it to optimize delivery routes, reducing the fuel consumption of their fleets.
  • Factories are deploying it to reduce waste and increase energy efficiency.

In these and many other ways, the study argues that AI advances energy efficiency. So, it calls on policymakers “to ensure AI is part of the solution, not part of the problem, when it comes to the environment.”

It also recommends adopting AI broadly across government agencies to “help the public sector reduce carbon emissions through more efficient digital services, smart cities and buildings, intelligent transportation systems, and other AI-enabled efficiencies.”

Reviewing the Data on AI

The study’s author, Daniel Castro, saw in current predictions about AI a repeat of exaggerated forecasts that emerged during the rise of the internet more than two decades ago.

Daniel Castro, ITIF

“People extrapolate from early studies, but don’t consider all the important variables including improvements you see over time in digitalization like energy efficiency,” said Castro, who leads ITIF’s Center for Data Innovation.

“The danger is policymakers could miss the big picture and hold back beneficial uses of AI that are having positive impacts, especially in regulated areas like healthcare,” he said.

“For example, we’ve had electronic health records since the 1980s, but it took focused government investments to get them deployed,” he added. “Now AI brings big opportunities for decarbonization across the government and the economy.”

Optimizing Efficiency Across Data Centers

Data centers of every size have a part to play in maximizing their energy efficiency with AI and accelerated computing.

For instance, NVIDIA’s AI-based weather-prediction model, FourCastNet, is about 45,000x faster and consumes 12,000x less energy to produce a forecast than current techniques. That promises efficiency boosts for supercomputers around the world that run continuously to provide regional forecasts, Bjorn Stevens, director of the Max Planck Institute for Meteorology, said in a blog.

Overall, data centers could save a whopping 19 terawatt-hours of electricity a year if all AI, high performance computing and networking offloads were run on GPU and DPU accelerators instead of CPUs, according to NVIDIA’s calculations. That’s the equivalent of the energy consumption of 2.9 million passenger cars driven for a year.

Last year, the U.S. Department of Energy’s lead facility for open science documented its advances with accelerated computing.

Using NVIDIA A100 Tensor Core GPUs, energy efficiency improved 5x on average across four key scientific applications in tests at the National Energy Research Scientific Computing Center. An application for weather forecasting logged gains of nearly 10x.

The energy efficiency of GPUs has increased dramatically over time.

AI, Accelerated Computing Advance Climate Science

The combination of accelerated computing and AI is creating new scientific instruments to help understand and combat climate change.

In 2021, NVIDIA announced Earth-2, an initiative to build a digital twin of Earth on a supercomputer capable of simulating climate on a global scale. It’s among a handful of similarly ambitious efforts around the world.

One example is Destination Earth, a pan-European project to create digital twins of the planet using accelerated computing, AI and “collaboration on an unprecedented scale,” said the project’s leader, Peter Bauer, a veteran of more than 20 years at Europe’s top weather-forecasting center.

Experts in the utility sector agree AI is key to advancing sustainability.

“AI will play a crucial role maintaining stability for an electric grid that’s becoming exponentially more complex with large numbers of low-capacity, variable generation sources like wind and solar coming online, and two-way power flowing into and out of houses,” said Jeremy Renshaw, a senior program manager at the Electric Power Research Institute, an independent nonprofit that collaborates with more than 450 companies in 45 countries, in a blog.

Learn more about sustainable computing as well as NVIDIA’s commitment to use 100% renewable energy starting in fiscal year 2025. And watch the video below for more on how AI is accelerating efforts to combat climate change.

GeForce NOW Leaps Into Its Fourth Year With 27 New Games and More Celebrations All Month Long

GeForce NOW is celebrating its fourth anniversary all month — plus an extra day for leap year — during February’s GFN Thursdays, with two new games joining the cloud. Keep an eye out for more new games and other member announcements to come.

Diablo IV and Overwatch 2 on GeForce NOW: the cloud is getting hotter.

Diablo IV and Overwatch 2 heat up the cloud this GFN Thursday to kick off the anniversary celebrations. They’re the next Activision and Blizzard games to join GeForce NOW as part of the NVIDIA and Microsoft partnership, following Call of Duty.

The titles arrive alongside the RAGE franchise as part of the five new games joining the ever-expanding GeForce NOW library this week.

The new GeForce NOW Instagram channel: did it for the ‘Gram.

Follow the #4YearsofGFN celebration and get the latest cloud gaming news on the newly launched GeForce NOW Instagram and Threads channels. Use #GFNShare and tag these accounts for a chance to be featured.

 

Oh, Heck

Diablo IV on GeForce NOW: don’t be a chicken.

Join the fight to protect the world of Sanctuary in the highly acclaimed online action role-playing game Diablo IV with the power of a GeForce RTX 4080 gaming rig in the cloud. Members can look forward to playing the Steam version, with support for the Battle.net launcher coming soon.

The battle between the High Heavens and Burning Hells rages on as chaos threatens to consume Sanctuary. Create customized characters and choose from five classes: Barbarian, Druid, Necromancer, Rogue and Sorcerer. Each features unique skills and talents to help in battle.

Jump into the dark, thrilling campaign solo or with friends. Explore a shared open world where players can form parties to take down World Bosses, or join the fray in player vs. player zones to test skills against others. Explore and loot over 120 monster-filled dungeons, and clear out Strongholds to use as safe havens for Sanctuary citizens.

Ultimate members can stand triumphant at up to 4K resolution and 120 frames per second with support for NVIDIA DLSS technology and experience the action even on low-powered devices.

Be a Hero

Overwatch 2 on GeForce NOW: time to save the world again.

Team up and answer the call of heroes in Overwatch 2, streaming today on GeForce NOW. Members can look forward to playing the Steam version, with support for the Battle.net launcher coming soon.

Overwatch 2 features 30+ epic heroes, each with game-changing abilities. Battle across all-new maps and modes in this ultimate team-based game. Lead the charge, ambush enemies or aid allies as one of Overwatch’s distinct heroes. Join the battle across dozens of futuristic maps inspired by real-world locations, and master unique game modes in the always-on, ever-evolving, live game.

Ultimate members can join the action with support for 4K and ultra-low latency. It’s the perfect upgrade for teammates on low-powered devices, ensuring everyone’s playing at heroic quality.

Rage Out

RAGE 2 on GeForce NOW: no rules, no limits, no mercy!

Get ready for action with this week’s five new additions, including the RAGE series.

And check out what’s coming throughout the rest of the month:

  • Stormgate (Demo on Steam, available Feb. 5-12 during Steam Next Fest)
  • The Inquisitor (New release on Steam, Feb. 8)
  • Banishers: Ghosts of New Eden (New release on Steam, Feb. 13)
  • Solium Infernum (New release on Steam, Feb. 14)
  • Skull and Bones (New release on Ubisoft, Feb. 16)
  • The Thaumaturge (New release on Steam, Feb. 20)
  • Myth of Empires (New re-release on Steam, Feb. 21)
  • Terminator: Dark Fate – Defiance (New release on Steam, Feb. 21)
  • Nightingale (New release on Steam, Feb. 22)
  • Garden Life: A Cozy Simulator (New release on Steam, Feb. 22)
  • Pacific Drive (New release on Steam, Feb. 22)
  • STAR WARS: Dark Forces Remaster (New release on Steam, Feb. 28)
  • Welcome to ParadiZe (New release on Steam, Feb. 29)
  • Aragami 2 (Xbox, available on Microsoft Store)
  • dotAGE (Steam)
  • Fort Solis (Steam)
  • Katamari Damacy REROLL (Steam)
  • Klonoa Phantasy Reverie Series (Steam)
  • PAC-MAN MUSEUM+ (Steam)
  • PAC-MAN WORLD Re-PAC (Steam)
  • Tales of Arise (Steam)
  • Tram Simulator Urban Transit (Steam)
  • The Walking Dead: The Telltale Definitive Series (Steam)

Joyful January

In addition to the 20 games announced last month, 14 more joined the GeForce NOW library:

  • Those Who Remain (New release on Xbox, available on PC Game Pass, Jan. 16)
  • New Cycle (New release on Steam, Jan. 18)
  • Assassin’s Creed: Valhalla (Xbox, available for PC Game Pass)
  • Beacon Pines (Xbox, available on the Microsoft Store)
  • Exoprimal (Steam)
  • FAR: Changing Tides (Xbox, available on the Microsoft Store)
  • Going Under (Xbox, available on the Microsoft Store)
  • Metal: Hellsinger (Xbox, available on the Microsoft Store)
  • Road 96: Mile 0 (Xbox, available on the Microsoft Store)
  • The Talos Principle 2 (Epic Games Store)
  • TUNIC (Xbox, available for PC Game Pass)
  • Turbo Golf Racing (Xbox, available for PC Game Pass)
  • Turnip Boy Commits Tax Evasion (Xbox, available on the Microsoft Store)
  • Zombie Army 4: Dead War (Xbox, available for PC Game Pass)

Discussions with Spike Chunsoft have confirmed that the publisher’s games will remain on GeForce NOW. Members can continue to stream these titles from the cloud:

  • 428 Shibuya Scramble
  • AI: The Somnium Files
  • Conception PLUS: Maidens of the Twelve Stars
  • Danganronpa: Trigger Happy Havoc
  • Danganronpa 2: Goodbye Despair
  • Danganronpa V3: Killing Harmony
  • Danganronpa Another Episode: Ultra Despair Girls
  • Fire Pro Wrestling World
  • Re: ZERO – Starting Life in Another World – The Prophecy of the Throne
  • RESEARCH and DESTROY
  • Shiren the Wanderer: The Tower of Fortune and the Dice of Fate
  • STEINS;GATE
  • Zanki Zero: Last Beginning
  • Zero Escape: The Nonary Games

What are you looking forward to streaming? Let us know on X or in the comments below.

Cardiac Clarity: Dr. Keith Channon Talks Revolutionizing Heart Health With AI

Here’s some news to still-beating hearts: AI is helping bring some clarity to cardiology. Caristo Diagnostics has developed an AI-powered solution for detecting coronary inflammation in cardiac CT scans. In this episode of NVIDIA’s AI Podcast, Dr. Keith Channon, the Field Marshal Earl Alexander Professor at the University of Oxford and cofounder and chief medical officer at the startup, speaks with host Noah Kravitz about the technology. Called Caristo, it analyzes radiometric features in CT scan data to identify inflammation in the fat tissue surrounding coronary arteries, a key indicator of heart disease. Tune in to learn more about how Caristo uses AI to improve treatment plans and risk predictions by providing physicians with a patient-specific readout of inflammation levels.

Show Notes

1:56: What is Caristo and how does it work?
7:11: The key signal of a heart attack
10:34: How did Channon come up with the idea of using AI to drive breakthroughs?
22:40: How much has the CT scan changed over the years?
26:01: What’s ahead for Caristo?
30:14: How to take care of your own heart health

You Might Also Like

Immunai Uses Deep Learning to Develop New Drugs – Ep. 176
What if we could map our immune system to create drugs that can help our bodies win the fight against cancer and other diseases? That’s the big idea behind immunotherapy. The problem: the immune system is incredibly complex. Enter Immunai, a biotechnology company using AI technology to map the human immune system and speed the development of new immunotherapies against cancer and autoimmune diseases.

Overjet on Bringing AI to Dentistry – Ep. 179
Dentists get a bad rap. Dentists also get more people out of more aggravating pain than just about anyone, which is why the more technology dentists have, the better. Overjet, a member of the NVIDIA Inception program for startups, is moving fast to bring AI to dentists’ offices.

Democratizing Drug Discovery With Deep Learning – Ep. 172
It may seem intuitive that AI and deep learning can speed up workflows — including novel drug discovery, a typically years-long and several-billion-dollar endeavor. But, professors Artem Cherkasov and Olexandr Isayev were surprised that no recent academic papers provided a comprehensive, global research review of how deep learning and GPU-accelerated computing impact drug discovery.

Subscribe to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Amazon Music, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn.

Make the AI Podcast better: Have a few minutes to spare? Fill out this listener survey.

Singtel, NVIDIA to Bring Sovereign AI to Southeast Asia

Asia’s lion city is roaring ahead in AI.

Singtel, a leading communications services provider based in Singapore, will bring the NVIDIA AI platform to businesses in the island nation and beyond.

The mobile and broadband company is building energy-efficient data centers across Southeast Asia accelerated with NVIDIA Hopper architecture GPUs and using NVIDIA AI reference architectures proven to deliver optimal performance.

The data centers will serve as sovereign national resources — AI factories that process the private datasets of companies, startups, universities and governments safely on shore to produce valuable insights.

Singtel’s first AI services will spin up in Singapore, with future data centers under construction in Indonesia and Thailand. From its hub in Singapore, the company has operations that stretch from Australia to India.

Trusted Engines of AI

The new data centers will act as trusted engines of generative AI. Widely considered the most transformative technology of our time, generative AI is attracting users worldwide with its ability to amplify human intelligence and productivity.

Nations are creating large language models tuned to their local dialects, cultures and practices. Singtel sits at the center of such opportunities among Southeast Asia’s vibrant Chinese, Indian, Malay and other communities.

Singtel’s initiative supports Singapore’s national AI strategy to empower its citizens with the latest technology. The plan calls for significantly expanding the country’s compute infrastructure as well as its talent pool of machine learning specialists.

For businesses in the region, having a known, local provider of these computationally intensive services provides a safe, easy on-ramp to generative AI. They can enhance and personalize their products and services while protecting sensitive corporate data.

Taking the Green Path

Singtel is committed to democratizing AI and decarbonizing its operations.

Its latest data centers are being built with an eye to sustainability, including in the selection of materials and the use of liquid cooling. They adopt best practices to deliver a power usage effectiveness (PUE) of less than 1.3, a key metric of data center efficiency.
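For context, power usage effectiveness compares a facility’s total energy draw with the energy delivered to its IT equipment, so a PUE of 1.3 implies that roughly 23% of incoming power goes to overhead such as cooling and power distribution:

```latex
\mathrm{PUE} = \frac{E_{\text{total facility}}}{E_{\text{IT equipment}}},
\qquad
\mathrm{PUE} = 1.3 \;\Rightarrow\; \frac{1.3 - 1.0}{1.3} \approx 23\%\ \text{overhead}
```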

Singtel will use its Paragon software platform to orchestrate how the new AI applications work in concert with its mobile and broadband services. The combination will enable edge computing services like powering robots and other autonomous systems from AI models running in the cloud.

A Full-Stack Foundation

The company will offer its customers NVIDIA AI Enterprise, a software platform for building and deploying AI applications, including generative AI. Singtel will also be an NVIDIA Cloud Partner, delivering optimized AI services on the NVIDIA platform.

Because Singtel’s data centers use NVIDIA’s proven reference architectures for AI computing, users can employ its services, knowing they’re optimized for leading AI performance.

Singtel already has hands-on experience delivering edge services with NVIDIA AI.

Last May, it demonstrated a digital avatar created with the NVIDIA Omniverse and NVIDIA NeMo platforms that users could interact with over its 5G network. And in 2021, Singtel delivered GPU services as part of a testbed for local government agencies.

New AI Role for Telcos

Singapore’s service provider joins pioneers in France, India, Italy and Switzerland deploying AI factories that deliver generative AI services with data sovereignty.

To learn more about how Singtel and other telcos are embracing generative AI, register for a session on the topic at NVIDIA GTC. The global AI conference runs March 18-21, starting with a keynote by NVIDIA founder and CEO Jensen Huang.

Behold the ‘Magic Valley’: Brandon Tieh’s Stunning Scene Showcases Peak Creativity, Powered by RTX and AI

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

This week’s featured In the NVIDIA Studio 3D artist Brandon Tieh puts his artistic talents on full display with his whimsical scene Magic Valley.

An array of colors — from bright crimson to hushed blues and lush greens — helps set the mood of the vivid scene, which took inspiration from Tieh’s love for video games, anime and manga.

His diverse pieces — along with the works of fellow Studio artists like Christian Dimitrov, Vera Dementchouk and Eddie Mendoza — take audiences to fantastical getaways in the latest Studio Standouts video.

To fuel creative work, the new GeForce RTX 4080 SUPER is available starting tomorrow in a limited Founders Edition design and as custom boards from partners, starting at $999. It’s equipped with more cores than the GeForce RTX 4080 and includes the world’s fastest GDDR6X video memory at 23 Gbps. In 3D apps like Blender, it can run up to 70% faster than previous generations. The video editing app Blackmagic Design’s DaVinci Resolve accelerates AI effects over 30% faster than the GeForce RTX 3080 Ti.

Get creative and AI superpowers with the GeForce RTX 4080 SUPER.

The RTX 4080 SUPER also brings great frame rates and stunning 4K resolution in fully ray-traced games, including Alan Wake 2, Cyberpunk 2077: Phantom Liberty and Portal with RTX. Discover what RTX 40 SUPER Series graphics cards and systems are available.

Stepping Into Magical Worlds

Tieh’s scene began as a sketch of what he envisioned to be a giant door in a grassy field.

“A very broad and abstract thought — but that’s sort of the point of fantasy,” he explained.

Tieh specializes in building impressive worlds.

To bring it to life, he began by gathering assets such as rocks and grass from Quixel Megascans, the Unreal Engine Marketplace and ArtStation Marketplace.

The door required extra customization, so he modeled one from scratch, first sculpting it in ZBrush before importing it into Adobe Substance 3D Painter for a quick texture pass. Tieh’s GeForce RTX graphics card used RTX-accelerated light and ambient occlusion to bake assets in mere seconds.

Lighting will heavily influence the final look, so only a basic texture pass was needed in Substance 3D Painter.

Next, Tieh tackled modeling the obelisks and pillars in Blender, where RTX-accelerated OptiX ray tracing in the viewport ensured highly interactive, photorealistic rendering.
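For Blender users curious how OptiX rendering is switched on for Cycles, the snippet below is a minimal sketch run from Blender’s scripting workspace. Property paths can shift between Blender releases, so treat it as illustrative rather than a drop-in script.

```python
# Minimal sketch: selecting OptiX GPU rendering for Cycles from Blender's
# Python console. Property paths can differ slightly between Blender versions.
import bpy

scene = bpy.context.scene
scene.render.engine = "CYCLES"   # use the Cycles path tracer
scene.cycles.device = "GPU"      # render on the GPU instead of the CPU

prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "OPTIX"  # RTX-accelerated ray-tracing backend
prefs.get_devices()                  # refresh the detected device list
for device in prefs.devices:
    device.use = device.type == "OPTIX"  # enable RTX GPUs, skip CPU fallback
```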

Modeling and UV layouts of pillars and obelisks.

He then unwrapped his 3D assets onto a 2D plane, a key process called UV unwrapping, which let him apply textures to the models’ surfaces to enhance realism.

Tieh’s customized door in Unreal Engine.

With the textured assets in place, Tieh next built the scene in Unreal Engine. His technique involves focusing on the big shapes by looking at the scene in a smaller thumbnail view, then flipping the canvas to refresh his perspective — very similar to the approach concept artists use. He adjusted lighting by deploying the same technique.

Adjusting atmospheric light in the “Magic Valley” scene.

“I’ve used NVIDIA GPUs all my life — they’re super reliable and high performing without any issues.”  — Brandon Tieh

Unreal Engine users can tap NVIDIA DLSS Super Resolution to increase the interactivity of the viewport by using AI to upscale frames rendered at lower resolution, and enhance image quality using DLSS Ray Reconstruction.

Fog is another major component of the scene. “Fog is generally darker and more opaque in the background and becomes lighter and more translucent as it approaches the foreground,” said Tieh. He primarily used fog cards in Unreal Engine’s free Blueprints Visual Scripting system to add a paint-like effect.

The majority of lighting was artificial, meaning Tieh had to use a significant number of individually placed light sources, but “it looks very believable if executed well,” he explained.

The menu on the right lists the roughly 100 light actor sources Tieh used.

From there, Tieh exported final renders with ease and speed thanks to his RTX GPU.

 

“There’s no secret tricks or fancy engine features,” said Tieh. “It’s all about sticking to the basics and fundamentals of art, as well as trusting your own artistic eye.”

3D artist Brandon Tieh.

Check out Tieh’s impressive portfolio on ArtStation.

Follow NVIDIA Studio on Instagram, X and Facebook. Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the Studio newsletter.

Read More

Boston Children’s Researchers, in Joint Effort, Deploy AI Across Their Hip Clinic to Support Patients, Doctors


Hip disorders, among the world’s most common joint diseases, are especially prevalent in adolescents and young adults, causing stiffness, pain or a limp. But they can be hard to diagnose using 2D medical imaging alone.

Helping to treat these disorders, Boston Children’s Hospital’s (BCH’s) Adolescent and Young Adult Hip Preservation Program is the first in the U.S. to deploy a fully automated AI tool across its clinic.

Called VirtualHip, the tool can create a 3D model of a hip from routine medical images, assess anatomy and movement-related issues, and provide clinicians with diagnostic and treatment guidance. It was built at an orthopedic research lab at BCH, Harvard Medical School’s primary pediatric hospital, using the NVIDIA DGX platform.

A team of 10 researchers, including engineers, computer scientists, orthopedic surgeons, radiologists and software developers, is working on the project, which is funded in part by an NVIDIA Academic Hardware Grant.

“Using AI, clinicians can get more value out of the clinical data they routinely collect,” said Dr. Ata Kiapour, the lab’s principal investigator, director of the musculoskeletal informatics group at BCH and assistant professor of orthopedic surgery at Harvard Medical School. “This tool can augment their performance to make better choices in diagnosis and treatment, and free their time to focus on giving patients the best care.”

Getting Straight to the Joint

Often, clinicians must determine a treatment plan — with levels of intervention ranging from physical therapy to total hip replacement — based on just a 2D image, such as an X-ray, CT scan or MRI. Automatically creating 3D models from these images, and using them to conduct comprehensive joint assessments, can help improve the accuracy of diagnosis to inform treatment planning.
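
As a rough illustration of that idea (not the BCH team’s actual pipeline), the sketch below converts a segmented CT volume into a 3D surface mesh with the classic marching cubes algorithm; the file name and the binary hip segmentation are hypothetical.

  import numpy as np
  from skimage import measure

  # Hypothetical input: a binary segmentation of the hip from a CT series,
  # stored as a 3D NumPy array (1 = bone voxel, 0 = background).
  segmentation = np.load("hip_segmentation.npy")

  # Marching cubes extracts a triangle mesh at the bone/background boundary.
  verts, faces, normals, values = measure.marching_cubes(segmentation, level=0.5)

  print(f"Reconstructed surface: {len(verts)} vertices, {len(faces)} triangles")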

“I started a postdoc with an orthopedic surgeon at BCH in 2013, when I saw how an engineer or scientist could help with patient treatment,” said Dr. Kiapour, who’s also trained as a biomedical engineer. “Over the years, I saw that hospitals have a ton of data, but efficient data processing for clinical use was a huge, unmet need.”

VirtualHip, fully integrated with BCH’s hip clinic and radiology database, helps to fill this gap.

Clinicians can log in to the software tool through a web-based portal, view and interact with 3D models generated from 2D medical images, submit analysis requests and see results within an hour, on average 4x faster than receiving a radiology report after imaging.

The tool, which produces 3D models with a margin of error of less than a millimeter, can assess morphological abnormalities and identify issues related to hip motion, such as pain caused by hip bones rubbing against each other.

VirtualHip was developed using a database of tens of millions of clinical notes and imaging data from patients seen at BCH over the past two decades. Using natural language processing models and computer vision algorithms, Dr. Kiapour’s team processed this data, training the tool to distinguish normal from pathologic hip development and to identify factors that influence injury risk and treatment outcomes.
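
The article doesn’t detail the team’s models, but a zero-shot text classifier gives a flavor of how structured labels might be pulled from free-text clinical notes before being paired with imaging. The note, label set and model choice below are purely illustrative, using the Hugging Face Transformers library.

  from transformers import pipeline

  # Zero-shot classification assigns a clinical note to candidate diagnoses
  # without task-specific training; the labels are invented for this example.
  classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

  note = ("17-year-old athlete reports groin pain and limited internal "
          "rotation of the right hip.")
  labels = ["femoroacetabular impingement", "hip dysplasia", "normal hip development"]

  result = classifier(note, candidate_labels=labels)
  print(result["labels"][0], round(result["scores"][0], 2))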

This will enable VirtualHip to offer patient-specific risk assessment and treatment planning by comparing current patients to previously treated ones with similar demographic backgrounds.

“The hardware support that we got from NVIDIA made all of this possible,” Dr. Kiapour said. “DGX enabled advanced computations on more than 20 years’ worth of historical data for our fine-tuned clinical AI model.”

Clearer Explanations, Better Outcomes for Patients

In addition to assisting doctors in diagnosis and treatment planning, VirtualHip helps patients better understand their condition.

“When patients look at an X-ray, it doesn’t look like a real hip — but with a 3D model that can be rotated, the doctor can show exactly where the joints are impinging or unstable,” Dr. Kiapour said. “This lets the patient better understand their disorder, which usually makes them more compliant with whatever the doctor’s prescribing.”

This type of visualization is especially helpful for children and younger adults, Kiapour added.

The VirtualHip project is under continuous development, including work toward a patient-facing platform — using large language models and generative AI — for personalized analyses and treatment recommendations.

The BCH researchers plan to commercialize the product for use in other hospitals.

Subscribe to NVIDIA healthcare news.

Read More

Sharper Image: GeForce NOW Update Delivers Stunning Visuals to Android Devices


This GFN Thursday levels up PC gaming on mobile with higher-resolution support on Android devices.

This week also brings 10 new games to the GeForce NOW library, including Enshrouded. 

Pixel Perfect

Android 1440p on GeForce NOW
New Year’s resolutions.

GeForce NOW transforms nearly any device into a high-powered PC gaming rig, and members streaming on Android can now access that power from the palms of their hands. The GeForce NOW Android app, rolling out now to members, unlocks a new level of visual quality for Ultimate members gaming on mobile, with improved support for streaming up to 1440p resolution at 120 frames per second.

Explore the vibrant neon landscapes of Cyberpunk 2077, stream triple-A titles like Baldur’s Gate 3 and Monster Hunter: World, and play the latest releases in the cloud, including Prince of Persia: The Lost Crown and Exoprimal — all on the go with higher resolutions for more immersive gameplay.

Ultimate members can stream these and over 1,800 titles from the GeForce NOW library on select 120Hz Android phones and tablets at pixel-perfect quality. Plus, they can take gameplay even further with eight-hour sessions and tap GeForce RTX 4080-powered servers for faster access to their gaming libraries.

Sign up for an Ultimate membership today and check out this article for more details on how to set up Android devices for PC gaming on the go.

Got Games?

Stargate: Timekeepers on GeForce NOW
“We fight because we choose to.”

Lead a team of specialists through a story-driven campaign set in the Stargate SG-1 universe in Stargate: Timekeepers from Slitherine Ltd. Sneak characters behind enemy lines, use their unique skills, craft the perfect plan to unravel a time-loop mystery and defeat the Goa’uld threat. The game is available to stream from the cloud this week.

More titles joining the cloud this week include:

  • Stargate: Timekeepers (New release on Steam, Jan. 23)
  • Enshrouded (New release on Steam, Jan. 24)
  • Firefighting Simulator – The Squad (Steam)
  • Metal: Hellsinger (Xbox, available on the Microsoft Store)
  • Road 96: Mile 0 (Xbox, available on the Microsoft Store)
  • Shadow Tactics: Blades of the Shogun (Steam)
  • Shadow Tactics: Blades of the Shogun – Aiko’s Choice (Steam)
  • Solasta: Crown of the Magister (Steam)
  • Tails Noir (Xbox, available on the Microsoft Store)
  • Wobbly Life (Steam)

Games from Spike Chunsoft will be removed from the GeForce NOW library at the request of the publisher. Fourteen titles are leaving on Friday, Feb. 2, so be sure to catch them before they go:

  • 428 Shibuya Scramble
  • AI: The Somnium Files
  • Conception PLUS: Maidens of the Twelve Stars
  • Danganronpa: Trigger Happy Havoc
  • Danganronpa 2: Goodbye Despair
  • Danganronpa V3: Killing Harmony
  • Danganronpa Another Episode: Ultra Despair Girls
  • Fire Pro Wrestling World
  • Re: ZERO – Starting Life in Another World – The Prophecy of the Throne
  • RESEARCH and DESTROY
  • Shiren the Wanderer: The Tower of Fortune and the Dice of Fate
  • STEINS;GATE
  • Zanki Zero: Last Beginning
  • Zero Escape: The Nonary Games

What are you planning to play this weekend? Let us know on X or in the comments below.

Read More