Take the Green Train: NVIDIA BlueField DPUs Drive Data Center Efficiency

The numbers are in, and they paint a picture of data centers going a deeper shade of green, thanks to energy-efficient networks accelerated with data processing units (DPUs).

A suite of tests run with help from Ericsson, Red Hat and VMware shows power reductions of up to 24% on servers using NVIDIA BlueField-2 DPUs. In one case, the DPUs delivered 54x the performance of CPUs.

The work, described in a recent whitepaper, offloaded core networking jobs from power-hungry host processors to DPUs designed to run them more efficiently.

Accelerated computing with DPUs for networking, security and storage jobs is one of the next big steps for making data centers more power efficient. It’s the latest of a handful of optimizations, described in the whitepaper, for data centers moving into the era of green computing.

DPUs Tested on VMware vSphere

Seeing the trend toward energy-efficient networks, VMware enabled DPUs to run its virtualization software, used by thousands of companies worldwide. NVIDIA has run several tests with VMware since its vSphere 8 software release this fall.

For example, on VMware vSphere Distributed Services Engine — software that offloads and accelerates networking and security functions using DPUs — BlueField-2 delivered higher performance while freeing up 20% of the CPU resources that would otherwise be required.

That means users can deploy fewer servers to run the same workload, or run more applications on the same servers.

Power Costs Cut Nearly $2 Million

Few data centers face a more demanding job than those run by telecoms providers. Their networks shuttle every bit of data that smartphone users generate or request between their cellular networks and the internet.

Researchers at Ericsson tested whether operators could reduce their power consumption on this massive workload using SmartNICs, the network interface cards that handle DPU functions. Their test let CPUs slow down or sleep while an NVIDIA ConnectX SmartNIC handled the networking tasks.

The results, detailed in a recent article, were stunning.

Energy consumption of server CPUs fell 24%, from 190 to 145 watts on a fully loaded network. This single DPU application could cut power costs by nearly $2 million over three years for a large data center.
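
A quick back-of-the-envelope calculation shows how a 45-watt-per-server reduction can add up to a figure of that size. The server count and electricity price below are illustrative assumptions, not numbers from the whitepaper or the Ericsson article:

```python
# Rough estimate of fleet-wide savings from the 190 W -> 145 W per-server drop.
# Server count and electricity price are illustrative assumptions only.
watts_saved_per_server = 190 - 145          # 45 W saved per server
servers = 10_000                            # assumed size of a large data center
price_per_kwh = 0.15                        # assumed average electricity cost, $/kWh
hours = 24 * 365 * 3                        # three years of continuous operation

energy_saved_kwh = watts_saved_per_server * servers * hours / 1000
savings_usd = energy_saved_kwh * price_per_kwh
print(f"{energy_saved_kwh:,.0f} kWh saved, roughly ${savings_usd:,.0f}")  # ~$1.8 million
```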

Ericsson ran the user-plane function for 5G networks on DPUs in three scenarios.

In the article, Ericsson’s CTO, Erik Ekudden, underscored the importance of the work.

“There’s a growing sense of urgency among communication service providers to find and implement innovative solutions that reduce network energy consumption,” he wrote. And the DPU techniques “save energy across a wide range of traffic conditions.”

70% Less Overhead, 54x More Performance

Results were even more dramatic for tests on Red Hat OpenShift, used by half of all Fortune 500 banks, airlines and telcos to manage software containers.

In the tests, BlueField-2 DPUs handled virtualization, encryption and networking jobs needed to manage these portable packages of applications and code.

The DPUs slashed networking demands on CPUs by 70%, freeing them up to run other applications. What’s more, they accelerated networking jobs by a whopping 54x.

A technical blog provides more detail on the tests.

Speeding the Way to Zero Trust

Across every industry, businesses are embracing a philosophy of zero trust to improve network security. So, NVIDIA tested IPsec, one of the most popular data center encryption protocols, on BlueField DPUs.

The test showed data centers could improve performance and cut power consumption 21% for servers and 34% for clients on networks running IPsec on DPUs. For large data centers, that could translate to nearly $9 million in savings on electric bills over three years.

NVIDIA and its partners continue to put DPUs to the test in an expanding portfolio of use cases, but the big picture is clear.

“In a world facing rising energy costs and rising demand for green IT infrastructure, the use of DPUs will become increasingly popular,” the whitepaper concludes.

It’s good to know the numbers, but seeing is believing. So apply to run your own test of DPUs on VMware’s vSphere.

The post Take the Green Train: NVIDIA BlueField DPUs Drive Data Center Efficiency appeared first on NVIDIA Blog.


Unearthing Data: Vision AI Startup Digs Into Digital Twins for Mining and Construction

Skycatch, a San Francisco-based startup, has been helping companies mine both data and minerals for nearly a decade.

The software-maker is now digging into the creation of digital twins, with an initial focus on the mining and construction industry, using the NVIDIA Omniverse platform for connecting and building custom 3D pipelines.

SkyVerse, which is a part of Skycatch’s vision AI platform, is a combination of computer vision software and custom Omniverse extensions that enables users to enrich and animate virtual worlds of mines and other sites with near-real-time geospatial data.

“With Omniverse, we can turn massive amounts of non-visual data into dynamic visual information that’s easy to contextualize and consume,” said Christian Sanz, founder and CEO of Skycatch. “We can truly recreate the physical world.”

SkyVerse can help industrial sites simulate variables such as weather, broken machines and more up to five years into the future — while learning from happenings up to five years in the past, Sanz said.

The platform automates the entire visualization pipeline for mining and construction environments.

First, it processes data from drones, lidar and other sensors across the environment, whether at the edge using the NVIDIA Jetson platform or in the cloud.

It then creates 3D meshes from 2D images, using neural networks built from NVIDIA’s pretrained models to remove unneeded objects like dump trucks and other equipment from the visualizations.

Next, SkyVerse stitches this into a single 3D model that’s converted to the Universal Scene Description (USD) framework. The master model is then brought into Omniverse Enterprise for the creation of a digital twin that’s live-synced with real-world telemetry data.
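
For a sense of what that conversion step involves, here is a minimal sketch using Pixar's USD Python bindings. The file name, prim paths and geometry are hypothetical placeholders, not Skycatch's actual pipeline code:

```python
# Minimal USD sketch (requires the usd-core package: pip install usd-core).
# File name, prim paths and geometry are hypothetical placeholders.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("mine_site.usda")
UsdGeom.SetStageUpAxis(stage, UsdGeom.Tokens.z)

UsdGeom.Xform.Define(stage, "/World")
terrain = UsdGeom.Mesh.Define(stage, "/World/Terrain")

# In a real pipeline these arrays would come from the stitched photogrammetry
# mesh; a single triangle stands in here.
terrain.CreatePointsAttr([(0, 0, 0), (10, 0, 0), (0, 10, 0)])
terrain.CreateFaceVertexCountsAttr([3])
terrain.CreateFaceVertexIndicesAttr([0, 1, 2])

stage.GetRootLayer().Save()
```

A stage saved this way can be referenced, layered or live-synced by any tool that speaks USD, which is what makes the format a convenient interchange layer for digital twins.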

“The simulation of machines in the environment, different weather conditions, traffic jams — no other platform has enabled this, but all of it is possible in Omniverse with hyperreal physics and object mass, which is really groundbreaking,” Sanz said.

Skycatch is a Premier partner in NVIDIA Inception, a free, global program that nurtures startups revolutionizing industries with cutting-edge technologies. Premier partners receive additional go-to-market support, exposure to venture capital firms and technical expertise to help them scale faster.

Processing and Visualizing Data

Companies have deployed Skycatch’s fully automated technologies to gather insights from aerial data across tens of thousands of sites at several top mining companies.

The Skycatch team first determines optimal positioning of the data-collection sensors across mine vehicles using the NVIDIA Isaac Sim platform, a robotics simulation and synthetic data generation (SDG) tool for developing, testing and training AI-based robots.

“Isaac Sim has saved us a year’s worth of testing time — going into the field, placing a sensor, testing how it functions and repeating the process,” Sanz said.

The team also plans to integrate the Omniverse Replicator software development kit into SkyVerse to generate physically accurate 3D synthetic data and build SDG tools to accelerate the training of perception networks beyond the robotics domain.

Once data from a site is collected, SkyVerse uses edge devices powered by the NVIDIA Jetson Nano and Jetson AGX Xavier modules to automatically process up to terabytes of it per day and turn it into kilobyte-size analytics that can be easily transferred to frontline users.

This data processing was sped up 3x by the NVIDIA CUDA parallel computing platform, according to Sanz. The team is also looking to deploy the new Jetson Orin modules for next-level performance.

“It’s not humanly possible to go through tens of thousands of images a day and extract critical analytics from them,” Sanz said. “So we’re helping to expand human eyesight with neural networks.”

Using pretrained models from the NVIDIA TAO Toolkit, Skycatch also built neural networks that can remove extraneous objects and vehicles from the visualizations, and texturize over these spots in the 3D mesh.

The digital terrain model, which has sub-five-centimeter precision, can then be brought into Omniverse for the creation of a digital twin using the easily extensible USD framework, custom SkyVerse Omniverse extensions and NVIDIA RTX GPUs.

“It took just around three months to build the Omniverse extensions, despite the complexity of our extensions’ capabilities, thanks to access to technical experts through NVIDIA Inception,” Sanz said.

Skycatch is working with one of Canada’s leading mining companies, Teck Resources, to implement the use of Omniverse-based digital twins for its project sites.

“Teck Resources has been using Skycatch’s compute engine across all of our mine sites globally and is now expanding visualization and simulation capabilities with SkyVerse and our own digital twin strategy,” said Preston Miller, lead of technology and innovation at Teck Resources. “Delivering near-real-time visual data will allow Teck teams to quickly contextualize mine sites and make faster operational decisions on mission-critical, time-sensitive projects.”

The Omniverse extensions built by Skycatch will be available soon — learn more.

Safety and Sustainability

AI-powered data analysis and digital twins can make operational processes for mining and construction companies safer, more sustainable and more efficient.

For example, according to Sanz, mining companies need the ability to quickly locate the toe and crest (or bottom and top) of “benches,” narrow strips of land beside an open-pit mine. When a machine is automated to go in and out of a mine, it must be programmed to stay 10 meters away from the crest at all times to avoid the risk of sliding, Sanz said.

Previously, surveying and analyzing landforms to determine precise toes and crests typically took up to five days. With the help of NVIDIA AI, SkyVerse can now generate this information within minutes, Sanz said.

In addition, SkyVerse eliminates 10,000 open-pit interactions for customers per year, per site, Sanz said. These are situations in which humans and vehicles can intersect within a mine, posing a safety threat.

“At its core, Skycatch’s goal is to provide context and full awareness for what’s going on at a mining or construction site in near-real time — and better environmental context leads to enhanced safety for workers,” Sanz said.

Skycatch aims to boost sustainability efforts for the mining industry, too.

“In addition to mining companies, governmental organizations want visibility into how mines are operating — whether their surrounding environments are properly taken care of — and our platform offers these insights,” Sanz said.

Plus, minerals like cobalt, nickel and lithium are required for electrification and the energy transition. These all come from mine sites, Sanz said, which can become safer and more efficient with the help of SkyVerse’s digital twins and vision AI.

Dive deeper into technology for a sustainable future with Skycatch and other Inception partners in the on-demand webinar, Powering Energy Startup Success With NVIDIA Inception.

Creators and developers across the world can download NVIDIA Omniverse for free, and enterprise teams can use the platform for their 3D projects.

Learn more about and apply to join NVIDIA Inception.

The post Unearthing Data: Vision AI Startup Digs Into Digital Twins for Mining and Construction appeared first on NVIDIA Blog.


Check Out 26 New Games Streaming on GeForce NOW in November

It’s a brand new month, which means this GFN Thursday is all about the new games streaming from the cloud.

In November, 26 titles will join the GeForce NOW library. Kick off with 11 additions this week, like Total War: THREE KINGDOMS and new content updates for Genshin Impact and Apex Legends.

Plus, leading 5G provider Rain has announced it will be introducing “GeForce NOW powered by Rain” to South Africa early next year. Look forward to more updates to come.

And don’t miss out on the 40% discount for GeForce NOW 6-month Priority memberships. This offer is only available for a limited time.

Build Your Empire

Lead the charge this week with Creative Assembly and Sega’s Total War: THREE KINGDOMS, a turn-based, empire-building strategy game and the 13th entry in the award-winning Total War franchise. Become one of many great leaders from history and conquer enemies to build a formidable empire.

Resolve an epic conflict in ancient China to unify the country and rebuild the empire.

The game is set in ancient China, and gamers must save the country from the oppressive rule of a warlord. Choose from a cast of a dozen legendary heroic characters to unify the nation and dominate enemies. Each has their own agenda, and there are plenty of different tactics for players to employ.

Extend your campaign with up to six-hour gaming sessions at 1080p 60 frames per second for Priority members. With an RTX 3080 membership, gain support for 1440p 120 FPS streaming and up to eight-hour sessions, with performance that will bring foes to their knees.

Members can find ‘Total War: THREE KINGDOMS’ and other Sega games in a dedicated row in the GeForce NOW app.

More to Explore

Alongside the 11 new games streaming this week, members can jump into updates for the hottest free-to-play titles on GeForce NOW.

Genshin Impact Version 3.2, “Akasha Pulses, the Kalpa Flame Rises,” is available to stream on the cloud. This latest update introduces the last chapter of the Sumeru Archon Quest, two new playable characters — Nahida and Layla — as well as new events and gameplay. Stream it now across devices, whether PC, Mac, Chromebook or mobile with enhanced touch controls.

Catch the conclusion of the main storyline for Sumeru, the newest region added to ‘Genshin Impact.’

Or squad up in Apex Legends: Eclipse, available to stream now on the cloud. Season 15 brings with it the new Broken Moon map, the newest defensive Legend — Catalyst — and much more.

Don’t mess-a with Tressa.

Also, after working closely with Square Enix, we’re happy to share that members can stream STAR OCEAN THE DIVINE FORCE on GeForce NOW beginning this week.

Here’s the full list of games joining this week:

  • Against the Storm (Epic Games and new release on Steam)
  • Horse Tales: Emerald Valley Ranch (New release on Steam, Nov. 3)
  • Space Tail: Every Journey Leads Home (New release on Steam, Nov. 3)
  • The Chant (New release on Steam, Nov. 3)
  • The Entropy Centre (New release on Steam, Nov. 3)
  • WRC Generations — The FIA WRC Official Game (New release on Steam, Nov. 3)
  • Filament (Free on Epic Games, Nov. 3-10)
  • STAR OCEAN THE DIVINE FORCE (Steam)
  • PAGUI (Steam)
  • RISK: Global Domination (Steam)
  • Total War: THREE KINGDOMS (Steam)

Arriving in November

But wait, there’s more! Among the total 26 games joining GeForce NOW in November is the highly anticipated Warhammer 40,000: Darktide, with support for NVIDIA RTX and DLSS.

Here’s a sneak peek:

  • The Unliving (New release on Steam, Nov. 7)
  • TERRACOTTA (New release on Steam and Epic Games, Nov. 7)
  • A Little to the Left (New release on Steam, Nov. 8)
  • Yum Yum Cookstar (New release on Steam, Nov. 11)
  • Nobody — The Turnaround (New release on Steam, Nov. 17)
  • Goat Simulator 3 (New release on Epic Games, Nov. 17)
  • Evil West (New release on Steam, Nov. 22)
  • Colortone: Remixed (New release on Steam, Nov. 30)
  • Warhammer 40,000: Darktide (New release on Steam, Nov. 30)
  • Heads Will Roll: Downfall (Steam)
  • Guns Gore and Cannoli 2 (Steam)
  • Hidden Through Time (Steam)
  • Caveblazers (Steam)
  • Railgrade (Epic Games)
  • The Legend of Tianding (Steam)

While The Unliving was originally announced in October, the release date of the game shifted to Monday, Nov. 7.

Howlin’ for More

October brought more treats for members. Don’t miss the 14 extra titles added last month. 

With all of these sweet new titles coming to the cloud, getting your game on is as easy as pie. Speaking of pie, we’ve got a question for you. Let us know your answer on Twitter or in the comments below.

The post Check Out 26 New Games Streaming on GeForce NOW in November appeared first on NVIDIA Blog.


Stormy Weather? Scientist Sharpens Forecasts With AI

Editor’s note: This is the first in a series of blogs on researchers advancing science in the expanding universe of high performance computing.

A perpetual shower of random raindrops falls inside a three-foot metal ring Dale Durran erected outside his front door. It’s a symbol of his passion for finding order in the seeming chaos of the planet’s weather.

A part-time sculptor and full-time professor of atmospheric science at the University of Washington, Durran has co-authored dozens of papers describing patterns in Earth’s ever-changing skies. It’s a field for those who crave the confounding challenge of expressing with math the endless dance of air and water.

Dale Durran

In 2019, Durran acquired a new tool, AI. He teamed up with a grad student and a Microsoft researcher to build the first model to demonstrate deep learning’s potential to predict the weather.

Though crude, the model outperformed the complex equations used for the first computer-based forecasts. The descendants of those equations now run on the world’s biggest supercomputers. In contrast, AI slashes the traditional load of required calculations and works faster on much smaller systems.

“It was a dramatic revelation that said we better jump into this with both feet,” Durran recalled.

Sunny Outlook for AI

Last year, the team took their work to the next level. Their latest neural network can process 320 six-week forecasts in less than a minute on the four NVIDIA A100 Tensor Core GPUs in an NVIDIA DGX Station. That’s more than 6x the 51 forecasts today’s supercomputers synthesize to make weather predictions.
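
To make the comparison concrete, the sketch below shows the general shape of such an ensemble forecast: a learned step function advances the gridded atmospheric state a few hours at a time, and all ensemble members are batched on the GPU so hundreds of forecasts roll forward together. The tiny network, grid size and step length are arbitrary placeholders, not Durran’s actual model:

```python
# Illustrative sketch only -- not Durran's model. It shows the general idea of a
# deep-learning forecast: a learned step function advances the atmospheric state
# a few hours at a time, with an ensemble of initial states batched on the GPU.
import torch
import torch.nn as nn

class StepModel(nn.Module):
    """Toy network mapping the gridded state at time t to the state 6 hours later."""
    def __init__(self, channels: int = 8):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(channels, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.Conv2d(64, channels, kernel_size=3, padding=1),
        )

    def forward(self, state):
        return state + self.net(state)  # residual update of the state

device = "cuda" if torch.cuda.is_available() else "cpu"
model = StepModel().to(device).eval()

# 320 ensemble members, each an 8-channel field on a coarse lat/lon grid (placeholder sizes).
ensemble = torch.randn(320, 8, 91, 180, device=device)
steps = 6 * 7 * 4  # six weeks of 6-hour steps

with torch.no_grad():
    for _ in range(steps):
        ensemble = model(ensemble)  # every member advances in one batched GPU call
```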

In a show of how rapidly the technology is evolving, the model forecast what became the path of Hurricane Irma through the Caribbean in 2017 almost as well as traditional methods. The same model could also crank out a week’s forecast in a tenth of a second on a single NVIDIA V100 Tensor Core GPU.

Durran’s latest work used AI to forecast Hurricane Irma’s path in Florida more efficiently and nearly as accurately as traditional methods.

Durran foresees AI crunching thousands of forecasts simultaneously to deliver a clearer statistical picture with radically fewer resources than conventional equations. Some suggest the performance advances will reach as many as five orders of magnitude while using a fraction of the power.

AI Ingests Satellite Data

The next big step could radically widen the lens for weather watchers.

The complex equations today’s predictions use can’t readily handle the growing wealth of satellite data on details like cloud patterns, soil moisture and drought stress in plants. Durran believes AI models can.

One of his graduate students hopes to demonstrate this winter an AI model that directly incorporates satellite data on global cloud cover. If successful, it could point the way for AI to improve forecasts using the deluge of data types now being collected from space.

In a separate effort, researchers at the University of Washington are using deep learning to apply a grid that astronomers use to track stars to their study of the atmosphere. The novel mesh could help map out a whole new style of weather forecasting, Durran said.

Harvest of a Good Season

In nearly 40 years as an educator, Durran has mentored dozens of students and written two highly rated textbooks on fluid dynamics, the math used to understand the weather and climate.

One of his students, Gretchen Mullendore, now heads a lab at the U.S. National Center for Atmospheric Research, working with top researchers to improve weather forecasting models.

“I was lucky to work with Dale in the late 1990s and early 2000s on adapting numerical weather prediction to the latest hardware at the time,” said Mullendore. “I am so thankful to have had an advisor that showed me it’s cool to be excited by science and computers.”

Carrying on a Legacy

In January, Durran is slated to receive the American Meteorological Society’s most prestigious honor, the Jule G. Charney Medal. It’s named after the scientist who worked with John von Neumann in the 1950s to develop the algorithms weather forecasters still use today.

In 1979, Charney also authored one of the earliest scientific papers on global warming. Following in his footsteps, Durran wrote two editorials last year for The Washington Post to help a broad audience understand the impacts of climate change and rising CO2 emissions.

The editorials articulate a passion he discovered at his first job in 1976, creating computer models of air pollution trends. “I decided I’d rather work on the front end of that problem,” he said of his career shift to meteorology.

It’s a field notoriously bedeviled by effects as subtle as a butterfly’s wings, and that challenge motivates his passion to advance the science.

The post Stormy Weather? Scientist Sharpens Forecasts With AI appeared first on NVIDIA Blog.


GeForce RTX 40 Series Receives Massive Creator App Benefits This Week ‘In the NVIDIA Studio’

Editor’s note: This post is part of our weekly In the NVIDIA Studio series, which celebrates featured artists, offers creative tips and tricks, and demonstrates how NVIDIA Studio technology improves creative workflows. We’re also deep diving on new GeForce RTX 40 Series GPU features, technologies and resources, and how they dramatically accelerate content creation.

Artists deploying the critically acclaimed GeForce RTX 4090 GPUs are primed to receive significant performance boosts in key creative apps. OBS Studio and Google Chrome enabled AV1 encoding; Topaz AI-powered apps and ON1 software added Tensor Core acceleration; and VTube Studio integrated NVIDIA Broadcast augmented-reality features that enable high-quality, seamless control of avatars.

Plus, a special spook-tober edition of In the NVIDIA Studio features two talented 3D artists and their Halloween-themed creations this week.

3D and special effects artist Eric Tualle, better known as ATOM, shares his short film sequence, Mr. Pumpkin, which was created to test motion-capture techniques and the render speeds of his new GeForce RTX 4090 GPU.

NVIDIA 3D artist Sabour Amirazodi might be the biggest Halloween fan ever. Look no further than the extraordinary light show he creates for his neighborhood every year. His workflow is powered by a mobile workstation equipped with an NVIDIA RTX A5000 GPU.

Finally, check out the #From2Dto3D challenge highlights, including NVIDIA Studio’s favorite inspirational artwork brought to life in beautiful 3D.

Tricks and Treats With RTX

The new GeForce RTX 40 Series GPUs feature incredible upgrades in dedicated hardware, including third-generation RT Cores, fourth-generation Tensor Cores and an eighth-generation NVIDIA dual AV1 encoder. These advancements deliver turbocharged creative workflows and accelerate creative apps in 3D modeling, video editing and more with AI-powered features.

OBS Studio software version 28.1 added AV1 encoding support with the new NVIDIA Encoder, known as NVENC, delivering 40% better livestreaming efficiency. These livestreams will appear as if bandwidth had been increased by 40% — a big boost in image quality. Plus, AV1 added high dynamic range support.

Google Chrome released an update to enable AV1 encoding on all its browser apps, also offering a 40% gain in livestreaming efficiency.

 

VTube Studio recently integrated the NVIDIA Broadcast app, adding AI-powered face tracking. Using the app requires only a simple webcam, which eliminates the need for expensive, specialized hardware. This gives many more artists the tools to become VTubers, and gives existing VTubers better avatars that match their expressions.

Topaz Labs’ Video AI v3.0 release introduced an AI stabilization model that reduces shaky camera motion by estimating camera movement and transforming frames for smoother video footage. The update also introduced an AI slow-motion model, called Apollo, which builds on past motion models by handling nonlinear motion and motion blur.

Furthermore, v3.0 added functionality that enables multiple AI models to tackle a single project simultaneously. For example, an AI model can upscale footage while enabling stabilization. These features and models run up to 1.25x faster on NVIDIA GPUs with the adoption of the NVIDIA TensorRT software development kit. The app also now supports the popular dual NVIDIA AV1 encoder, enabling users to run previews from multiple video input files and export several projects simultaneously.

NVIDIA also collaborated with photo-editing software company ON1 to bring a massive performance boost to the ON1 Resize app. Advanced effects can now be applied more than 2x faster, and additional enhancements are in the works.

Artists Give ’Em Pumpkin to Talk About

ATOM has been creating content for more than a decade. His work is influenced by his love of older TV shows, moody artwork and the darker emotions found in human nature.

“Human emotions have always inspired me in art, especially negative ones, because that’s what makes us stronger in life,” he said. “That’s why I like dark things.”

His short film Mr. Pumpkin playfully experiments with motion capture, bringing the title character and his technical tribulations to life. ATOM wanted to ensure the atmosphere was right for this film. He created the tone of a mysterious forest at night, full of volumetric light and mist. Mr. Pumpkin himself can be instantly recognized as the hero of Halloween.

 

Photogrammetry — a method of generating 3D models using a series of photographs — continues to be adopted as a bona fide method for creating quality 3D assets quickly. It’s where ATOM’s journey began, with a real-life pumpkin.

ATOM captured video of the pumpkin within a homemade motion-square setup that rotated his prop for a complete scan. The artist then uploaded the footage to Adobe After Effects and exported the frames into an image sequence within Adobe Substance 3D Sampler before uploading them to Maxon’s Cinema 4D.

 

“It’s a real revolution to be able to do this kind of motion capture at home, when previously it would have required hiring a full motion-capture studio,” noted ATOM.

 

With a full-fidelity model, ATOM refined the pumpkin — sculpting until the shape was perfect. He then adjusted textures and colors to reach his ideal look. Even lighting the scene was quick and easy, he said, thanks to the GPU-accelerated viewport of his GeForce RTX 4090 GPU, which ensures smooth interactivity with complex 3D models.

 

ATOM applied volumetric effects such as clouds, fog and fire with virtually no slowdown, underlining the importance of GPUs in 3D content creation.

 

After animating and locking out the remaining scene elements, ATOM exported files to Topaz Labs Video AI. RTX-accelerated AI enlargement of footage retained high-fidelity details and high temporal stability while up-resing to 4K resolution.

ATOM adores sharing techniques with the creative community and helping others learn. “I’m trying to transmit as much as I can about the world of 3D, cinema and everything that is visually beautiful,” he said.

For his workflow, NVIDIA Studio and RTX GPUs remain critical or, as he says, “a central element in digital creation … its place is paramount in all critical creative apps the community uses today.”

3D and special effects artist ATOM.

Check out ATOM’s tutorials and content on his YouTube channel.

The ‘Haunted Sanctuary’ Awaits

As a creative director and visual effects producer, NVIDIA artist Sabour Amirazodi brought his 16+ years of multi-platform experience in location-based entertainment and media production to his own home, creating an incredible Halloween installation. Make sure to have the volume on when watching this video showcasing his Haunted Sanctuary:

The project required projection mapping, so the artist used GPU-accelerated MadMapper software and its structured light-scan feature to map custom visuals onto the wide surface of his house.

Amirazodi accomplished this by connecting a DSLR camera to his mobile workstation powered by an NVIDIA RTX A5000 GPU. The camera captured a series of lines, took pictures and translated them to the projector’s point of view, producing an image on which to base a 3D model. Basic camera matching tools found in Cinema 4D helped recreate the scene.

 

Amirazodi used the lidar camera on his mobile device to scan his house while walking around it. He then created a complete 3D model for more refined mapping and exported it as an FBX file.

Amirazodi worked within Cinema 4D and OTOY OctaneRender to texture, rig, animate, light and render scenes. The GPU-accelerated viewport ensured smooth interactivity with the complex 3D models.

 

Amirazodi then moved to the composite stage, importing his cache of models into Adobe After Effects. With the software’s more than 45 GPU-accelerated effects, his RTX A5000 GPU helped touch up scenes faster, especially when correcting color and reducing unwanted noise.

To make this installation possible, Amirazodi had to render a staggering 225GB worth of video files, consisting of approximately 18,000 frames in 4K resolution, using Cinema 4D with OctaneRender.

OTOY’s OctaneRender is RTX accelerated, and ray tracing delivers lightning-quick exports. “There’s no way I would have been able to render all of those frames without my RTX A5000 GPU,” the artist said.

When asked why he went through all this effort, Amirazodi gave a touching answer: “My kids,” he said. “With the pandemic, we couldn’t fulfill our tradition of attending the Disneyland haunted house, so I had to bring the haunted house back to my home.”

NVIDIA artist Sabour Amirazodi.

Amirazodi’s advice to prospective artists is simple — pick a theme and stick with it. “Gather inspiration from your favorite creative community, like TurboSquid, ArtStation or Sketchfab, then just start creating and getting things going,” he said. “Let instincts take over to more quickly discover your flow state.”

Amirazodi specializes in video editing, 3D modeling and interactive experiences. Check out the creative savant’s work on IMDb.

2D to 3D, Easy Peasy

NVIDIA Studio extends a warm thank you to all the #From2Dto3D challenge participants, including:

@AnaCarolina_Art — The alien model that helped land your first full-time industry job is simply stunning.

@yetismack3d — The union of a minion and a xenomorph may be unholy, but it’s beautiful nonetheless.

@eyedesyn — From one of our editors, “Oh my gosh, that’s adorable!” Evoking emotion through art is an artist’s dream, well done.

Follow NVIDIA Studio on Instagram, Twitter and Facebook for regular artistic inspiration, and be the first to learn more about the upcoming winter challenge.

Access tutorials on the Studio YouTube channel and get updates directly in your inbox by subscribing to the Studio newsletter.

The post GeForce RTX 40 Series Receives Massive Creator App Benefits This Week ‘In the NVIDIA Studio’ appeared first on NVIDIA Blog.


Think Fast: Lotus Eletre Tops Charts in Driving and AI Compute Speeds, Powered by NVIDIA DRIVE Orin

One of the biggest names in racing is going even bigger.

Performance automaker Lotus launched its first SUV, the Eletre, earlier this week. The fully electric vehicle sacrifices little in terms of speed and outperforms when it comes to technology.

It features an immersive digital cockpit, a lengthy battery range of up to 370 miles and autonomous-driving capabilities powered by the NVIDIA DRIVE Orin system-on-a-chip.

The Eletre’s autonomous-driving system is designed for more than easier commutes. Lotus plans to train the vehicle to complete the world-famous Nürburgring racetrack in Germany entirely on its own. Working with Lotus Group autonomous driving subsidiary ROBO Galaxy, Lotus can quickly iterate on deep neural network development to get the most out of the high-performance hardware system.

With a top speed of 165 miles per hour and acceleration from 0 to 62 mph in 4.5 seconds for the base trim — as fast as 2.95 seconds for performance versions — this isn’t an average SUV.

Intelligent Performance

The Lotus Eletre thinks as fast as it drives.

It comes equipped with lidar to comprehensively perceive the surrounding environment. That driving data is processed by two DRIVE Orin systems-on-a-chip, for a total of 508 trillion operations per second of performance.

With this level of AI compute, the Eletre can run the deep neural networks and applications necessary for autonomous driving in real time, with additional headroom for new capabilities that can be added over the air.

Drivers of the performance Eletre S can sit back and enjoy the 23-speaker KEF premium audio system while the SUV’s intelligent-driving capabilities take over.

Eventually, they can fire all the proverbial cylinders of the 905 horsepower dual motor — and dual DRIVE Orin — and take the autonomous-driving system to the track.

Ahead of the Curve

Lotus is bringing its racing heritage into the software-defined era with the Eletre. This future is arriving in just months.

Customer deliveries will begin in China and Europe in the first half of next year, with expansion to North America and other global markets in 2024.

The post Think Fast: Lotus Eletre Tops Charts in Driving and AI Compute Speeds, Powered by NVIDIA DRIVE Orin appeared first on NVIDIA Blog.


Neural NETA: Automaker Selects NVIDIA DRIVE Orin for AI-Powered Vehicles

One of China’s popular battery-electric startups now has the brains to boot.

NETA Auto, a Zhejiang-based electric automaker, this week announced it will build its future electric vehicles on the NVIDIA DRIVE Orin platform. These EVs will be software defined, with automated driving and intelligent features that will be continuously upgraded via over-the-air updates.

This extends next-generation vehicle technology to thousands of new customers. NETA has been leading battery-EV sales among new market entrants for the past three months, delivering a total of 200,000 EVs since it began production in late 2018.

NETA aims to make travel more comfortable with innovative technologies that break the norm. Hallmarks of NETA vehicles include 5G connectivity and digital assistants.

Last year, NETA Auto released the Shanhai Platform, an independently developed smart automotive architecture. The first model based on this platform, the NETA S, launched in July.

With the addition of DRIVE Orin, these vehicles will have centralized, high-performance compute to enable even greater capabilities.

Street Smarts

Traditionally, implementing the latest technology in new vehicles requires lengthy product cycles and the updating of distributed computers throughout the car.

With centralized, software-defined compute, this process has been reimagined. The vehicle’s intelligent functions run on a single, high-performance AI compute platform. When new software is developed and validated, it can be installed via over-the-air updates, even after the car leaves the dealership.

The DRIVE Orin system-on-a-chip delivers 254 trillion operations per second — ample compute headroom for a software-defined architecture. It’s designed to handle the large number of applications and deep neural networks that run simultaneously in autonomous vehicles, while achieving systematic safety standards such as ISO 26262 ASIL-D.

Equipped with the performance of DRIVE Orin, NETA vehicles will have limitless possibilities.

Aiming Higher

In addition to designing its vehicles with DRIVE Orin, NETA is working with NVIDIA technologies to develop advanced autonomous-driving capabilities.

The companies are collaborating on the design and development of a centralized cross-domain fusion computing platform for level 4 autonomy.

Zhang Yong, NETA Auto co-founder and CEO, with Liu Tong, NVIDIA automotive general manager in China, at the October signing ceremony.

“NETA Auto is at a new stage of development and sees technological innovation as one of the biggest enablers moving this industry forward,” said Zhang Yong, co-founder and CEO of NETA. “The close cooperation with NVIDIA will give NETA Auto a strong boost in bringing intelligent, technology-rich vehicles to market worldwide.”

The post Neural NETA: Automaker Selects NVIDIA DRIVE Orin for AI-Powered Vehicles appeared first on NVIDIA Blog.


Microsoft Experience Centers Display Scalable, Real-Time Graphics With NVIDIA RTX and Mosaic Technology

When customers walk into a Microsoft Experience Center in New York City, Sydney or London, they’re instantly met with stunning graphics displayed on multiple screens and high-definition video walls inside a multi-story building.

Built to showcase the latest technologies, Microsoft Experience Centers surround customers with vibrant, immersive graphics as they explore new products, watch technical demos, get hands-on experience with the latest solutions and learn more about Microsoft.

To create these engaging visual experiences in real time and on a scalable level, Microsoft sought a solution that would allow it to power high-quality graphics spanning large multi-display walls — without any gaps, artifacts or misalignment in the visuals.

It was also important that the software allowed for simplicity when managing and monitoring the display environments. Microsoft chose NVIDIA RTX A6000 GPUs, along with NVIDIA Mosaic and Quadro Sync technology, which provided support for the demanding visualizations across displays and enabled viewers to see everything as one unified visual.

All images courtesy of Microsoft.

Putting High-Quality Graphics on Full Display

The display walls in Microsoft Experience Centers feature many detailed visuals and scenes that require powerful graphics-computing performance. The HD walls display changing, detailed renders of various Microsoft products. These graphics are created with custom camera angles and fly-throughs.

Once the visuals were created, the team had to synchronize the graphics and ensure the systems were appearing in unison. In each Microsoft Experience Center, the team uses a visualization cluster of up to six systems, with a pair of RTX A6000 GPUs in each. Unreal Engine with nDisplay technology was used to make the Microsoft Video Player work in a cluster setting.

“NVIDIA RTX A6000 GPUs provide the smooth and powerful performance that is required to run high-quality visuals across a large number of displays,” said Chris Haklitch, principal PM lead at Microsoft. “The enterprise reliability and support NVIDIA provides, along with the software and hardware only available with professional RTX GPUs, helped make our vision possible.”

With NVIDIA Mosaic multi-display technology, Microsoft can treat multiple displays as a single desktop, without application software changes or visual artifacts. This enabled the walls of HD displays to be shown as a single unified visual.

NVIDIA Quadro Sync II is a key technology that enables all displays to appear as a single continuous image. Designed for flexibility and scalability, Quadro Sync helps connect and synchronize the NVIDIA RTX GPUs with their attached displays.

Microsoft also used an NVIDIA Enterprise Management Toolkit called NVWMI, which lets IT administrators create scripts and programs for many administrative tasks and functions. With NVWMI, Microsoft can remotely monitor the GPU-powered display environments, ensuring simple access to adjust display settings. Microsoft also used NVWMI to monitor the GPU thermals and performance to meet the demands of ongoing store operation.
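
For a sense of what that scripting can look like, here is a small sketch that polls GPU properties through WMI from Python on Windows. The root\CIMV2\NV namespace and the Gpu class name are assumptions recalled from NVWMI’s documentation and should be verified against the toolkit’s own docs; this is not Microsoft’s monitoring code:

```python
# Sketch of polling NVIDIA GPU state via WMI on Windows (requires: pip install wmi).
# The namespace and class names are assumptions to check against NVWMI documentation.
import time
import wmi

NAMESPACE = r"root\CIMV2\NV"   # namespace NVWMI registers its classes under (assumed)

def poll(interval_s: float = 30.0) -> None:
    conn = wmi.WMI(namespace=NAMESPACE)
    while True:
        for gpu in conn.Gpu():  # one instance per physical GPU (assumed class name)
            # Print every property so the script works regardless of exact schema.
            print({p: getattr(gpu, p) for p in gpu.properties})
        time.sleep(interval_s)

if __name__ == "__main__":
    poll()
```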

NVIDIA Mosaic, Quadro Sync and NVWMI are professional software features that are only available with NVIDIA RTX professional GPUs.

Learn more about NVIDIA RTX professional solutions for deploying scalable visualizations.

The post Microsoft Experience Centers Display Scalable, Real-Time Graphics With NVIDIA RTX and Mosaic Technology appeared first on NVIDIA Blog.


Make Gaming a Priority: Special Membership Discount Hits GeForce NOW for Limited Time

This spook-tacular Halloween edition of GFN Thursday features a special treat: 40% off a six-month GeForce NOW Priority Membership — get it for just $29.99 for a limited time.

Several sweet new games are also joining the GeForce NOW library.

Creatures of the night can now stream vampire survival game V Rising from the cloud. The fang-tastic fun arrives just in time to get started with the game’s “Bloodfeast” Halloween event and free weekend.

It leads the pack for 12 total games streaming this week, including new releases like Victoria 3.

Elevate Your Gaming to Priority

Through Sunday, Nov. 20, upgrade to a six-month Priority Membership for just $29.99, 40% off the standard price of $49.99.

Don’t miss out on this special offer.

Power up devices compatible with GeForce NOW for the boost of a full gaming rig in the cloud. Get faster access to games with priority access to gaming servers. Enjoy extended play times with six-hour gaming sessions. And take supported games to the next level with beautifully ray-traced graphics, thanks to RTX ON.

This limited-time offer is valid for new users and existing ones upgrading from a free or one-month Priority Membership, as well as for those who are on an active promotion or gift card.

Check out the GeForce NOW membership page for more information on Priority benefits.

Sink Your Teeth Into ‘V Rising’

Awaken as a vampire after centuries of slumber and survive in a vast world teeming with mythical horrors and danger by streaming V Rising on GeForce NOW.

Raise a castle, gather valuable resources and weapons, develop dark powers and convert humans into loyal servants in the quest to raise a vampire empire. Make allies or enemies online, or play solo in the game of blood, power and betrayal.

The game arrives just in time for members to join in on the Bloodfeast, where all creatures of the night are invited to play for free from Oct. 28-Nov. 1. V Rising players will be able to claim the free “Halloween Haunted Nights Castle DLC Pack” through Monday, Nov. 7.

Rule the night playing V Rising across your devices, even on a mobile phone. RTX 3080 members can even stream at 4K resolution on the PC and Mac apps.

Something Wicked Awesome This Way Comes

Gamers can get right into the frightful fun by checking out the horror and thriller titles included in the Halloween games row in the GeForce NOW app.

If games with a bit of a bite aren’t your thing, that’s okay. There’s something for everyone on the cloud.

This one’s for the history books. Balance competing interests to build an ideal society in the transformative 19th century.

Look out for the 12 total games available to stream today, including new releases like Victoria 3.

  • Victoria 3 (New release on Steam)
  • Star Ocean: The Divine Force (New release on Steam, Oct. 27)
  • Paper Cut Mansion (New release on Steam and Epic Games, Oct. 27)
  • Saturnalia (New release on Epic Games, Oct. 27)
  • Asterigos: Curse of the Stars (Epic Games)
  • Draw Slasher (Steam)
  • Five Nights at Freddy’s: Security Breach (Steam and Epic Games)
  • Guild Wars: Game of the Year (Steam)
  • Labyrinthine (Steam)
  • Sniper Elite 5 (Steam)
  • Volcanoids (Steam)
  • V Rising (Steam)

Also, try the new LEGO Bricktales demo before buying the full game on Steam.

Additionally, Guild Wars Trilogy, announced previously, will not be coming to the cloud.

Ready for the scary season? There’s only one question left. Let us know your choice on Twitter or in the comments below.

The post Make Gaming a Priority: Special Membership Discount Hits GeForce NOW for Limited Time appeared first on NVIDIA Blog.


Jetson-Driven Grub Getter: Cartken Rolls Out Robots-as-a-Service for Deliveries

There’s a new sidewalk-savvy robot, and it’s delivering coffee, grub and a taste of fun.

The bot is garnering interest for Oakland, Calif., startup Cartken. The company, founded in 2019, has rapidly deployed robots for a handful of customer applications, including for Starbucks and Grubhub deliveries.

Cartken CEO Chris Bersch said that he and co-founders Jonas Witt, Jake Stelman and Anjali Jindal Naik got excited about the prospect for robots because of the technology’s readiness and affordability. The four Google alumni decided the timing was right to take the leap to start a company together.

“What we saw was a technological inflection point where we could make small self-driving vehicles work on the street,” said Bersch. “Because it doesn’t make sense to build a $20,000 robot that can deliver burritos.”

Cartken is among a surge of NVIDIA Jetson-enabled autonomous mobile robot (AMR) startups making advances across agtech, manufacturing, retail and last-mile delivery.

New and established companies are seeking business efficiencies as well as labor support amid ongoing shortages in the post-COVID era, driving market demand.

Revenue from robotic last-mile deliveries is expected to grow more than 9x to $670 million in 2030, up from $70 million in 2022, according to ABI Research.

Jetson Drives Robots as a Service 

Cartken offers robots as a service (RaaS) to customers in a pay-for-usage model. This way, as a white-label technology provider, Cartken enables companies to customize the robots for their particular brand appearance and specific application features.

It’s among a growing cohort of companies riding the RaaS wave, with ambitions ranging from on-demand remote museum visits to autonomous industrial lawn mowers.

Much of this is made possible with the powerful NVIDIA Jetson embedded computing modules, which can handle a multitude of sensors and cameras.

“Cartken chose the Jetson edge AI platform because it offers superior embedded computational performance, which is needed to run Cartken’s advanced AI algorithms. In addition, the low energy consumption allows Cartken’s robots to run a whole day on a single battery charge,” said Bersch.

The company relies on the NVIDIA Jetson AGX Orin to run six cameras that aid in mapping and navigation, as well as wheel odometry to measure the distance each robot travels.

Harnessing Jetson, Cartken’s robots run simultaneous localization and mapping, or SLAM, to automatically build maps of their surroundings for navigation. “They are basically level-4 autonomy — it’s based on visual processing, so we can map out a whole area,” Bersch said.

“The nice thing about our navigation is that it works both indoors and outdoors, so GPS is optional — we can localize based on purely visual features,” he said.

Cartken is a member of NVIDIA Inception, a program that helps startups with GPU technologies, software and business development support.

Serving Grubhub and Starbucks

Cartken’s robots are serving Grubhub deliveries at the University of Arizona and Ohio State. Grubhub users can order on the app as they normally would, and get a tracking link to follow their order’s progress. They’re informed that their delivery will be by a robot, and can use the app to unlock the robot’s lid to grab grub and go.

Some might wonder if the delivery fee for such entertaining delivery technology is the same. “I believe it’s the same, but you don’t have to tip,” Bersch said with a grin.

Mitsubishi Electric is a distributor for Cartken in Japan. It relies on Cartken’s robots for deployments in AEON Malls in Tokoname and Toki for deliveries of Starbucks coffee and food.

The companies are also testing a “smart city” concept for outdoor deliveries of Starbucks goods within the neighboring parks, apartments and homes. In addition, Mitsubishi, Cartken and others are working on deliveries inside a multilevel office building.

Looking ahead, Cartken’s CEO says the next big challenge is scaling up robot manufacturing to keep pace with orders. It has strong demand from partners, including Grubhub, Mitsubishi and U.K. delivery company DPD.

Cartken in September announced a partnership with Magna International, a global leader in automotive supplies, to help scale up manufacturing of its robots. The agreement offers production of thousands of AMRs as well as development of additional robot models for different use cases.

The post Jetson-Driven Grub Getter: Cartken Rolls Out Robots-as-a-Service for Deliveries appeared first on NVIDIA Blog.
