NVIDIA Expands vGPU Software to Accelerate Workstations, AI Compute Workloads

Designers, engineers, researchers and creative professionals all need the flexibility to run complex workflows – no matter where they’re working from.

With the newest release of NVIDIA virtual GPU (vGPU) technology, enterprises can provide their employees with more power and flexibility through GPU-accelerated virtual machines from the data center or cloud.

Available now, the latest version of our vGPU software brings GPU virtualization to a broad range of workloads — such as virtual desktop infrastructure, high-performance graphics, data analytics and AI — thanks to its support for the new NVIDIA A40 and NVIDIA A100 80GB GPUs. The new release also supports the NVIDIA GPU Operator, a software framework that simplifies GPU deployment and management.

Powerful Performance for Power Users

NVIDIA RTX Virtual Workstation (vWS) software is a major component of the vGPU portfolio, designed to help users run graphics-intensive applications on virtual workstations. With NVIDIA A40 powering NVIDIA RTX vWS, professionals can achieve up to 60 percent(1) faster virtual workstation performance per user and 2x(2) faster rendering than the previous generation RTX 6000 GPUs.

NVIDIA A40 includes second-generation RT Cores and third-generation Tensor Cores to help users accelerate workloads like photorealistic rendering of movie content, architectural design evaluations, and virtual prototyping of product designs. With 48GB of GPU memory, professionals can easily work with massive datasets and run workloads like data science or simulation with even larger model sizes.

NVIDIA A40 support with the latest vGPU software enables complex graphics workloads to be run in a virtualized environment with performance that is on par with bare metal.

“With support for NVIDIA’s latest vGPU software, and the new NVIDIA A40 with Citrix Hypervisor 8.2 and Citrix Virtual Desktops, we can continue providing the performance customers need to run graphics-intensive visualization applications as their data and workloads grow,” said Calvin Hsu, vice president of product management at Citrix. “The combination of Citrix and NVIDIA virtualization technologies provides access to these applications from anywhere, with an experience that is indistinguishable from a physical workstation.”

The NVIDIA vGPU January 2021 software release supports the NVIDIA A100 80GB to deliver increased memory bandwidth, unlocking more power for large models. This builds on the September release, which introduced compute features that included support for the NVIDIA A100 Tensor Core GPU, the most advanced GPU for AI and high performance computing.

Additional new features include simplified GPU management in Kubernetes through NVIDIA GPU Operator, which is now supported with NVIDIA Virtual Compute Server and NVIDIA RTX vWS software. Containers, including the GPU-optimized software available in the NGC catalog, can be easily deployed and managed in VMs.

With this new release, customers and IT professionals can continue managing their multi-tenant workflows running in virtual machines using popular hypervisors, like Red Hat Enterprise Linux, while the certified GPU Operator brings a similar experience to containerized deployments on top of Red Hat virtualization platforms using Red Hat OpenShift.  
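For teams exploring that containerized path, here is a minimal sketch using the Kubernetes Python client to request a GPU for a container once the GPU Operator has exposed the nvidia.com/gpu resource on a node. The pod name, namespace and container image below are illustrative assumptions, not part of this release.

```python
from kubernetes import client, config

# Assumes kubectl access to a cluster where the NVIDIA GPU Operator is installed.
config.load_kube_config()

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="gpu-smoke-test"),      # hypothetical pod name
    spec=client.V1PodSpec(
        restart_policy="Never",
        containers=[
            client.V1Container(
                name="cuda-check",
                image="nvcr.io/nvidia/cuda:11.2.0-base",       # example NGC CUDA image
                command=["nvidia-smi"],
                resources=client.V1ResourceRequirements(
                    # The GPU Operator's device plugin advertises this resource,
                    # so the scheduler places the pod on a GPU-equipped node.
                    limits={"nvidia.com/gpu": "1"}
                ),
            )
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
print("Submitted pod gpu-smoke-test; check its logs for nvidia-smi output.")
```

If the pod schedules and nvidia-smi reports a GPU, the operator-managed driver and container runtime stack is working inside the container.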

“The combination of NVIDIA’s latest generation A40 GPU and NVIDIA vGPU software, supported with Red Hat Enterprise Linux and Red Hat Virtualization, offers a powerful platform capable of serving some of the most demanding workloads ranging from AI/ML to visualization in the oil and gas as well as media and entertainment industries,” said Steve Gordon, director of product management at Red Hat. “As organizations transform and increasingly use containers orchestrated by Kubernetes as key building blocks for their applications, we see Red Hat OpenShift as a likely destination for containerized and virtualized workloads alike.”

To find a certified server, see the NVIDIA vGPU Certified Server page.

Learn more about the NVIDIA vGPU software portfolio, which includes:

  • NVIDIA RTX Virtual Workstation (RTX vWS) (formerly known as Quadro Virtual Data Center Workstation or Quadro vDWS)
  • NVIDIA Virtual Compute Server (vCS)
  • NVIDIA Virtual PC (vPC) (formerly known as GRID vPC)
  • NVIDIA Virtual Applications (vApps) (formerly known as GRID vApps)


1. Tested on a server with Intel Xeon Gold 6154 (3.0GHz, 3.7GHz Turbo), RHEL 8.2, vGPU 12.0 software, running four concurrent users per GPU, RTX6000P-6Q versus A40-12Q, running the SPECviewperf 2020 3dsmax-07 composite subtest at 4K.

2. Iray 2020.1. Render time (seconds) of NVIDIA Endeavor scene.


A Sense of Responsibility: Lidar Sensor Makers Build on NVIDIA DRIVE

When it comes to autonomous vehicle sensor innovation, it’s best to keep an open mind — and an open development platform.

That’s why NVIDIA DRIVE is the chosen platform on which the majority of these sensors run.

NVIDIA has long recognized that, in addition to cameras, lidar is a crucial component of an autonomous vehicle’s perception stack. By emitting rapid pulses of invisible laser light, lidar sensors can paint a detailed 3D picture from the signals that bounce back almost instantaneously.

These signals create “point clouds” that represent a three-dimensional view of the environment, allowing lidar sensors to provide the visibility, redundancy and diversity that contribute to safe automated and autonomous driving.
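To make the point-cloud idea concrete, here is a small sketch, not tied to any particular vendor’s SDK, that converts per-beam range and angle returns into Cartesian points with NumPy. The sweep geometry and range values are made up for illustration.

```python
import numpy as np

def returns_to_point_cloud(ranges_m, azimuth_rad, elevation_rad):
    """Convert per-beam lidar returns (range, azimuth, elevation) into an Nx3 point cloud."""
    x = ranges_m * np.cos(elevation_rad) * np.cos(azimuth_rad)
    y = ranges_m * np.cos(elevation_rad) * np.sin(azimuth_rad)
    z = ranges_m * np.sin(elevation_rad)
    return np.stack([x, y, z], axis=-1)

# Example: one sweep of 128 beams across a 360-degree horizontal field of view
az = np.linspace(0, 2 * np.pi, 128, endpoint=False)
el = np.deg2rad(np.full(128, -2.0))   # beams angled slightly toward the road
rng = np.full(128, 35.0)              # simulated 35 m returns
points = returns_to_point_cloud(rng, az, el)
print(points.shape)                   # (128, 3)
```

Perception software then consumes these points to find obstacles, lane boundaries and free space, independent of which sensor produced them.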

Most recently, lidar makers Baraja, Hesai, Innoviz, Magna and Ouster have developed their offerings to run on the NVIDIA DRIVE platform to deliver robust performance and flexibility to customers.

These sensors offer differentiated capabilities for AV sensing, from Innoviz’s lightweight, affordable and long-range solid-state lidar to Baraja’s long wavelength, long-range sensors.

“The open and flexible NVIDIA DRIVE platform is a game changer in allowing seamless integration of Innoviz lidar sensors in endless new and exciting opportunities,” said Innoviz CEO Omer Keilaf.

Ouster’s OS series of sensors offers high resolution as well as programmable fields of view to address autonomous driving use cases. It also provides a camera-like image with its digital lidar system-on-a-chip for greater perception capabilities.

Hesai’s latest Pandar128 sensor offers a 360-degree horizontal field of view with a detection range from 0.3 to 200 meters. In the vertical field of view, it uses denser beams to allow for high resolution in a focused region of interest. The low minimum range reduces the blind spot area close to and in front of the lidar sensor.

“The Hesai Pandar128’s resolution and detection range enable object detection at greater distances, making it an ideal solution for highly automated and autonomous driving systems,” said David Li, co-founder and CEO of Hesai. “Integrating our sensor with NVIDIA’s industry-leading DRIVE platform will provide an efficient pathway for AV developers.”

The Hesai Pandar128 clearly detects objects and street signs at varying distances.

With the addition of these companies, the NVIDIA DRIVE ecosystem addresses every autonomous vehicle development need with verified hardware.

Plug and Drive

Typically, AV developers experiment with different variations of a sensor suite, modifying the number, type and placement of sensors. These configurations are necessary to continuously improve a vehicle’s capabilities and test new features.

An open, flexible compute platform can facilitate these iterations for effective autonomous vehicle development. And the industry agrees, with more than 60 sensor makers — from camera suppliers such as Sony, to radar makers like Continental, to thermal sensing companies such as FLIR — choosing to develop their products with the NVIDIA DRIVE AGX platform.

Along with the compute platform, NVIDIA provides the infrastructure to experience chosen sensor configurations with NVIDIA DRIVE Sim — an open simulation platform with plug-ins for third-party sensor models. As an end-to-end autonomous vehicle solutions provider, NVIDIA has long been a close partner to the leading sensor manufacturers.

“Ouster’s flexible digital lidar platform, including the new OS0-128 and OS2-128 lidar sensors, gives customers a wide variety of choices in range, resolution and field of view to fit in nearly any application that needs high-performance, low-cost 3D imaging,” said Angus Pacala, CEO of Ouster. “With Ouster as a member of the NVIDIA DRIVE ecosystem, our customers can plug and play our sensors easier than ever.”

A street level view from an Ouster lidar sensor.

Laser Focused

The NVIDIA DRIVE ecosystem includes a diverse set of lidar manufacturers — including Velodyne and Luminar as well as tier-1 suppliers — specializing in different features such as wavelength, signaling technique, field of view, range and resolution. This variety gives users flexibility as well as room for customization for their specific autonomous driving application.

The NVIDIA DriveWorks software development kit includes a sensor abstraction layer that provides a simple and unified interface that streamlines the bring-up process for new sensors on the platform. The interface saves valuable time and effort for developers as they test and validate different sensor configurations.
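DriveWorks exposes this abstraction as a C/C++ API; the toy Python sketch below only illustrates the idea of a unified sensor interface, and every class and function name in it is invented for this example rather than taken from the SDK.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import Iterator, List, Tuple

@dataclass
class LidarFrame:
    timestamp_us: int
    points_xyz: List[Tuple[float, float, float]]   # points in the sensor frame

class LidarSensor(ABC):
    """One interface every lidar driver implements, so perception code never changes."""

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def frames(self) -> Iterator[LidarFrame]: ...

    @abstractmethod
    def stop(self) -> None: ...

def run_perception(sensor: LidarSensor, num_frames: int = 100) -> None:
    """Downstream code only sees LidarFrame objects, regardless of the vendor behind them."""
    sensor.start()
    try:
        for i, frame in enumerate(sensor.frames()):
            if i >= num_frames:
                break
            print(f"frame at {frame.timestamp_us} us with {len(frame.points_xyz)} points")
    finally:
        sensor.stop()
```

Swapping one vendor’s lidar for another then means supplying a new driver behind the same interface rather than rewriting perception code, which is where the time savings come from.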

A 3D point cloud created by a Velodyne lidar sensor.

“To achieve maximum autonomous vehicle safety, a combination of sensor technologies including radar and lidar is required to see in all conditions. NVIDIA’s open ecosystem approach allows OEMs and tier 1s to select the safest and most cost-effective sensor configurations for each application,” said Jim McGregor, principal analyst at Tirias Research.

The NVIDIA DRIVE ecosystem gives autonomous vehicle developers the flexibility to select the right types of sensors for their vehicles, as well as iterate on configurations for different levels of autonomy. As an open AV platform with premier choices, NVIDIA DRIVE puts the automaker in the driver’s seat.


Certifiably Fast: Top OEMs Debut World’s First NVIDIA-Certified Systems Built to Crush AI Workloads

AI, the most powerful technology of our time, demands a new generation of computers tuned and tested to drive it forward.

Starting today, data centers can boot up a new class of accelerated servers from our partners to power their journey into AI and data analytics. Top system makers are delivering the first wave of NVIDIA-Certified Systems, the industry’s only servers tested for modern workloads.

These systems speed AI thanks to NVIDIA’s latest GPUs riding NVIDIA Mellanox networks. They spin up machine learning techniques that unearth insights from growing mounds of corporate data, gems that traditional systems miss.

Dell Technologies, GIGABYTE, Hewlett Packard Enterprise, Inspur and Supermicro are all shipping certified servers today. NVIDIA is collaborating with top OEMs around the world to drive AI forward across every industry.

The first systems off the line using NVIDIA A100 Tensor Core GPUs include:

  • Dell EMC PowerEdge R7525 and R740 rack servers
  • GIGABYTE R281-G30, R282-Z96, G242-Z11, G482-Z54, G492-Z51 systems
  • HPE Apollo 6500 Gen10 System and HPE ProLiant DL380 Gen10 Server
  • Inspur NF5488A5
  • Supermicro A+ Server AS-4124GS-TNR and AS-2124GQ-NART

They all carry the NVIDIA-Certified Systems badge that gives customers confidence they’re buying systems that meet NVIDIA’s best design practices. That means they can tackle the toughest tasks in machine learning, data analytics and more.

Leading makers of accelerated servers have NVIDIA-Certified Systems ready today.

A Tipping Point for Enterprise AI

The systems arrive as leading corporations are getting traction in AI.

American Express is using the latest AI models for real-time fraud detection. Ford taps generative adversarial networks to generate data it needs to test self-driving cars. And Domino’s applies AI to improve predictions of when orders will be ready for the 3 billion pizzas it delivers every year.

They are among many companies plugging into a powerful new form of computing, born on the web and now spreading into sectors from retail and logistics to banking and healthcare.

Gartner estimates 37 percent of all organizations have AI in production today and predicts that will double to 75 percent by 2024.

Scaling a Big Data Mountain

Companies seek strategic insights hidden in a rising mountain of data. Walmart, for example, processes more than 2.5 petabytes of data every hour.

AI models to sift through that data have grown in size by nearly 30,000x in just five years, driving the need for accelerated computing. And the diversity of models and workloads using them continues to expand, so businesses need the flexibility of GPUs.

The rising tide of data and the expanding AI models to sift through it are spawning an exponential increase in network traffic both in the data center and at the network’s edge. To cope, companies need a secure, reliable and high-speed infrastructure that scales efficiently.

Acing the Test for AI

NVIDIA-Certified Systems deliver the performance, programmability and secure throughput enterprise AI needs. They combine the computing power of GPUs based on the NVIDIA Ampere architecture with secure, high-speed NVIDIA Mellanox networking.

To pass the certification, the systems are tested across a broad range of workloads, from jobs that require multiple compute nodes to tasks that only need part of the power of a single GPU.

The systems are optimized to run AI applications from the NGC catalog, NVIDIA’s hub for GPU-optimized applications.

NGC is also the home for an expanding set of software development kits that bring AI to vertical markets such as healthcare (Clara) and robotics (Isaac). In addition, it holds frameworks that help companies get started in emerging use cases like recommendation systems (Merlin) and intelligent video analytics (Metropolis).

Specifically, NVIDIA-Certified Systems must pass tests on:

  • Deep learning training and inference
  • Machine learning algorithms
  • Intelligent video analytics
  • Network and storage offload

The tests focus on real-world use cases. They use popular AI frameworks and containers, all available in the NGC catalog.

As a result, NVIDIA-Certified Systems let every company access the same hardware and software behind some of the most powerful AI computers on the planet.

All of the world’s largest cloud service providers and eight of the world’s top 10 supercomputers are powered by NVIDIA technology. And NVIDIA-based systems lead in AI benchmarks such as MLPerf.

A Peek Under the Hood

Some NVIDIA-Certified Systems are powerful data center servers with as many as eight A100 GPUs and high-speed InfiniBand or Ethernet network adapters. Others are mainstream AI systems tailored to run AI at the edge of the corporate network.

OEMs certify the systems using NVIDIA Mellanox cables, switches and network cards such as ConnectX-6 InfiniBand or Ethernet adapters and BlueField-2 DPUs. In addition to high throughput at low latency, these adapters support multiple layers of security from a hardware root of trust at boot time to connection tracking for applications.

Every system was certified using either an NVIDIA Mellanox 8700 HDR 200G InfiniBand switch or the Mellanox SN3700 Ethernet switch.

All NVIDIA-Certified Systems are available with enterprise support across the full software stack, including support for open source code. That’s because we want to ensure enterprises across all vertical markets can quickly enjoy the benefits of AI.

With the latest systems from Dell Technologies, GIGABYTE, HPE, Inspur, and Supermicro, every company can start its own journey to enterprise AI.

To date, 14 servers from six system makers are certified and ready to provide accelerated computing. They are among 70 systems from at least 11 system makers engaged in the program.

Stay tuned for news of more NVIDIA-Certified Systems from more partners.



Take Note: Otter.ai CEO Sam Liang on Bringing Live Captions to a Meeting Near You

Sam Liang is making things easier for the creators of the NVIDIA AI Podcast — and just about every remote worker.

He’s the CEO and co-founder of Otter.ai, which uses AI to produce speech-to-text transcriptions in real time or from uploaded recordings. The platform has a range of capabilities, from differentiating between multiple speakers, to understanding accents, to filtering out various kinds of background noise.

And now, Otter.ai is making live captioning possible on a variety of platforms, including Zoom, Skype and Microsoft Teams. Even Liang’s conversation with AI Podcast host Noah Kravitz was captioned in real time over Skype.

This new capability has been enthusiastically received by remote workers — Liang says that Otter.ai has already transcribed tens of millions of meetings.

Liang envisions even more practical uses for Otter.ai’s live captions. The platform can already identify keywords. Soon, he thinks, it will recognize action items, help manage agendas and provide notifications.

Key Points From This Episode:

  • Otter.ai was founded in 2016 and is Liang’s second startup, after Alohar, a company focused on mobile behavior services. Once Alohar was acquired, Liang reflected that he needed better tools to help transcribe and share meetings, inspiring him to found Otter.ai.
  • The company’s AI model was built from scratch. Although Siri and Alexa predate it, Otter.ai needed to comprehend multiple voices that could overlap and vary in accents — a different, more complex task than understanding and responding to just one voice.

Tweetables:

“Though it’s been growing steadily before COVID, people have been using Otter on their laptop or on iOS or Android devices … you can use it anywhere.” — Sam Liang [7:32]

“Otter is your new meeting assistant. People will have the peace of mind that they don’t have to write down everything themselves.” — Sam Liang [22:07]

You Might Also Like:

Hugging Face’s Sam Shleifer Talks Natural Language Processing

Research engineer Sam Shleifer talks about Hugging Face’s natural language processing technology, which is in use at over 1,000 companies, including Apple, Bing and Grammarly, across fields ranging from finance to medical technology.

Pod Squad: Descript Uses AI to Make Managing Podcasts Quicker, Easier

Serial entrepreneur Andrew Mason talks about his company, Descript Podcast Studio, which is using AI, NLP and automatic speech synthesis to make podcast editing easier and more collaborative.

How SoundHound Uses AI to Bring Voice and Music Recognition to Any Platform

SoundHound made its name as a music identification service. Since then, it’s leveraged its 10+ years in data analytics to create a voice recognition tool that companies can bake into any product. SoundHound VP of Product Marketing Mike Zagorsek speaks about how the company has grown into a significant player in voice-driven AI.

Tune in to the AI Podcast

Get the AI Podcast through iTunes, Google Podcasts, Google Play, Castbox, DoggCatcher, Overcast, PlayerFM, Pocket Casts, Podbay, PodBean, PodCruncher, PodKicker, Soundcloud, Spotify, Stitcher and TuneIn. If your favorite isn’t listed here, drop us a note.


Make the AI Podcast Better

Have a few minutes to spare? Fill out this listener survey. Your answers will help us make a better podcast.


On the Road Again: GeForce NOW Alliance Expanding to Turkey, Saudi Arabia and Australia

Bringing more games to more gamers, our GeForce NOW game-streaming service is coming soon to Turkey, Saudi Arabia and Australia.

Turkcell, Zain KSA and Pentanet are the latest telcos to join the GeForce NOW Alliance.

By placing NVIDIA RTX Servers on the edge, GeForce NOW Alliance partners deliver even lower latency gaming experiences. And this gives partners an opportunity to show the value of their broadband and 5G infrastructure to customers.

Gamers get more games, in more places, faster.

Ericsson recently published its 2020 Mobility Report, noting, “5G population coverage is estimated to reach 15 percent, equivalent to over 1 billion people.”

Meanwhile, DFC Intelligence reports, “there are now over 3 billion video game consumers around the world.”

Cloud gaming is at the intersection of these two trends.

GeForce NOW is the world’s leading cloud-gaming platform for PC gamers, offering hundreds of games from NVIDIA’s own data centers in North America and Europe.

The service brings real-time ray tracing in today’s biggest blockbusters to underpowered PCs, Macs, Chromebooks, and Android and iOS devices.

It also offers an opportunity for the world’s leading telecommunications firms to deliver high-quality, low-latency PC gaming to nearly any device from the cloud. These partners form the GeForce NOW Alliance, a partnership of operators using RTX Servers and NVIDIA cloud-gaming software to expand and improve cloud gaming globally.

It’s our commitment to deliver the best cloud-gaming experience around the world.

GeForce NOW Alliance Grows, Again

The newest GeForce NOW Alliance members join a growing group of telecommunications experts that includes Softbank, KDDI, LG Uplus, Taiwan Mobile and GFN.RU.

Bringing cloud gaming to Turkey through its new gaming platform, Turkcell recently announced GeForce NOW Powered by GAMEPLUS. Gamers who want to get a leg up can visit the GAME+ page to pre-register.

Zain KSA, the leading 5G telecom operator in Saudi Arabia, is nearing a formal launch of its GeForce NOW service. The launch will bring GeForce NOW into the country for the first time.

Founded by a group of avid gamers, Pentanet has built Perth, Australia’s largest and fastest-growing fixed wireless network, delivering high-bandwidth internet services to the Perth metro area as well as fiber throughout Western Australia.

Australian gamers can look forward to PC gaming on nearly any device later this year.

Stay tuned as we work with additional partners to bring cloud gaming to new regions throughout 2021 and beyond.


A Trusted Companion: AI Software Keeps Drivers Safe and Focused on the Road Ahead

Editor’s note: This is the latest post in our NVIDIA DRIVE Labs series, which takes an engineering-focused look at individual autonomous vehicle challenges and how NVIDIA DRIVE addresses them. Catch up on all of our automotive posts, here.

Even with advanced driver assistance systems automating more driving functions, human drivers must maintain their attention at the wheel and build trust in the AI system.

Traditional driver monitoring systems typically don’t understand subtle cues such as a driver’s cognitive state, behavior or other activity that indicates whether they’re ready to take over the driving controls.

NVIDIA DRIVE IX is an open, scalable cockpit software platform that provides AI functions to enable a full range of in-cabin experiences, including intelligent visualization with augmented reality and virtual reality, conversational AI and interior sensing.

Driver perception is a key aspect of the platform that enables the AV system to ensure a driver is alert and paying attention to the road. It also enables the AI system to perform cockpit functions that are more intuitive and intelligent.

In this DRIVE Labs episode, NVIDIA experts demonstrate how DRIVE IX perceives driver attention, activity, emotion, behavior, posture, speech, gesture and mood with a variety of detection capabilities.

A Multi-DNN Approach

Facial expressions are complex signals to interpret. A simple wrinkle of the brow or shift of the gaze can have a variety of meanings.

DRIVE IX uses multiple DNNs to recognize faces and decipher the expressions of vehicle occupants. The first DNN detects the face itself, while a second identifies fiducial points, or reference markings — such as eye location, nose, etc.

On top of these base networks, a variety of DNNs operate to determine whether a driver is paying attention or requires other actions from the AI system.

The GazeNet DNN tracks gazes by detecting the vector of the driver’s eyes and mapping it to the road to check if they’re able to see obstacles ahead. SleepNet monitors drowsiness, classifying whether eyes are open or closed, running through a state machine to determine levels of exhaustion. Finally, ActivityNet tracks driver activity such as phone usage, hands on/off the wheel and driver attention to road events. DRIVE IX can also detect whether the driver is properly sitting in their seat to focus on road events.

In addition to driver focus, a separate DNN can determine a driver’s emotions — a key indicator of their ability to safely operate the vehicle. Taking in data from the base face-detect and fiducial-point networks, DRIVE IX can classify a driver’s state as happy, surprised, neutral, disgusted or angry.

It can also tell if the driver is squinting or screaming, indicating their level of visibility or alertness and state of mind.
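A rough sketch of how such a staged pipeline can be wired together is shown below. The model wrappers, thresholds and return values are invented stand-ins for the DNNs described above, not the DRIVE IX implementation.

```python
from dataclasses import dataclass

@dataclass
class DriverState:
    eyes_closed_frames: int = 0      # consecutive frames with eyes closed

def assess_driver(frame, face_net, gaze_net, sleep_net, activity_net,
                  state: DriverState, fps: int = 30) -> str:
    """Chain face detection, gaze, drowsiness and activity models for one camera frame."""
    face = face_net(frame)                         # stage 1: find the driver's face
    if face is None:
        return "no_driver_detected"

    gaze_on_road = gaze_net(frame, face)           # stage 2: is the gaze vector on the road?
    eyes_closed = sleep_net(frame, face)           # stage 3: open/closed eye classification
    activity = activity_net(frame, face)           # stage 4: phone use, hands on/off wheel

    # Simple state machine over time: sustained eye closure is treated as drowsiness.
    state.eyes_closed_frames = state.eyes_closed_frames + 1 if eyes_closed else 0
    if state.eyes_closed_frames > fps // 2:        # roughly half a second of closed eyes
        return "drowsiness_alert"
    if not gaze_on_road or activity in ("phone_use", "hands_off_wheel"):
        return "attention_alert"
    return "attentive"
```

The point of layering the specialized networks on top of the shared face and fiducial-point detections is that each stage stays small and focused while the combined output describes the driver’s readiness to take over.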


A Customizable Solution

Vehicle manufacturers can leverage the driver monitoring capabilities in DRIVE IX to develop advanced AI-based driver understanding features for personalizing the car cockpit.

The car can be programmed to alert a driver if their attention drifts from the road, or the cabin can adjust settings to soothe occupants if tensions are high.

And these capabilities extend well beyond driver monitoring. The aforementioned DNNs, together with gesture DNN and speech capabilities, enable multi-modal conversational AI offerings such as automatic speech recognition, natural language processing and speech synthesis.

These networks can be used for in-cabin personalization and virtual assistant applications. Additionally, the base facial recognition and facial key point models can be used for AI-based video conferencing platforms.

The driver monitoring capabilities of DRIVE IX help build trust between occupants and the AI system as automated driving technology develops, creating a safer, more enjoyable intelligent vehicle experience.


Electric Avenue: NVIDIA Engineer Revs Up Classic Car to Sport AI

Arman Toorians isn’t your average classic car restoration hobbyist.

The NVIDIA engineer recently transformed a 1974 Triumph TR6 roadster at his home workshop into an EV featuring AI.

Toorians built the vehicle to show a classic car can be recycled into an electric ride that taps NVIDIA Jetson AI for safety, security and vehicle management features.

He hopes the car blazes a path for others to explore electric vehicle conversions that pack AI.

“My objective is to encourage others to take on this task — it’s a lot of fun,” he said.

Where It Began

The story begins when Toorians purchased the Triumph in “junkyard condition” for $4,000 from a retired cop who gave up on ambitions of restoring the car.

The conversion cost $20,000 in parts. He acquired a motor from NetGain Motors, electrical components from EV West, Triumph parts from Moss Motors, and five Tesla batteries capable of 70 miles on a charge for his setup. NVIDIA supplied the Jetson Xavier NX developer kit.

To find spare time, the quadrilingual Persian-Armenian musician took a hiatus from playing flamenco and jazz guitar. Co-workers and friends cheered on his effort, he said, while his wife and son gave a hand and advice as needed.

Three years in the making, the car’s sophisticated build with AI may just set a new bar for the burgeoning classic-to-electric conversion space.

Jetson for Security 

The car is smart. And it may have as much in common with a Tesla as it does with future generations of EVs from automakers everywhere adopting AI.

That’s because he installed the NVIDIA Jetson Xavier NX developer kit in the little sports car’s trunk. The power-efficient Jetson edge AI platform provides compact supercomputing performance capable of handling multi-modal inference, devouring data from the car’s sensors and camera.

Toorians’ Triumph packs the compact Jetson Xavier NX in the trunk for AI.

“Jetson is a great platform that addresses many different markets and can also be very easily used to solve many problems in this DIY electric car market,” Toorians said.

Toorians, a director in the Jetson group, uses the Jetson Xavier NX for a growing list of cool features. For example, he’s working on securing the car using a dash cam feed and the NVIDIA DeepStream SDK for intelligent video analytics. Like the talking car KITT from the hit ‘80s TV series Knight Rider, the Triumph will be able to react with verbal warnings to anyone unauthorized to be in or around it.

If that doesn’t deter a threat to the vehicle, he plans for Jetson to step it up a notch and activate alarms and email alerts.
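The escalation logic he describes could look something like the sketch below. The spoken warning, alarm hook, email address and SMTP relay are all hypothetical placeholders, and the real build relies on DeepStream for the detection side.

```python
import smtplib
from email.message import EmailMessage

OWNER_EMAIL = "owner@example.com"          # hypothetical address for illustration

def speak(text: str) -> None:
    print(f"[speaker] {text}")             # stand-in for a text-to-speech call

def sound_alarm() -> None:
    print("[alarm] siren on")              # stand-in for triggering the car's alarm

def escalate(intruder_seconds: float) -> None:
    """Step up the response the longer an unauthorized person lingers near the car."""
    if intruder_seconds < 5:
        speak("This vehicle is monitored. Please step away.")   # stage 1: verbal warning
        return
    sound_alarm()                                               # stage 2: audible alarm
    msg = EmailMessage()                                        # stage 3: email the owner
    msg["Subject"] = "Triumph security alert"
    msg["From"] = OWNER_EMAIL
    msg["To"] = OWNER_EMAIL
    msg.set_content("Unauthorized person detected near the car.")
    with smtplib.SMTP("localhost") as server:                   # assumes a local mail relay
        server.send_message(msg)
```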

Lighter, Faster, Cleaner

The sleek sports car has impressive specs. It’s now lighter, faster and cleaner on its energy usage than when it rolled out of the factory decades ago. It also doesn’t leave motor oil stains or the stench of gas and exhaust fumes in its wake.

In place of the greasy 350-pound, six-cylinder gas engine, Toorians installed a shiny 90-pound electric motor. The new motor lives in the same spot under the hood and puts out 134 horsepower versus the stock engine’s 100 horses, providing peppy takeoffs.

The Triumph’s engine bay sports a lighter, cleaner and peppier motor.

He gets a lot of thumbs-up reactions to the car. But for him the big payoff is repurposing the older gasoline vehicle to electric to avoid environmental waste and exhaust pollutants.

“Electric cars help keep our air and environment clean and are the way toward a more sustainable future in transportation,” he said.

Jetson for Safety

Among its many tricks, the Triumph uses AI to recognize the driver so that only authorized users can start it — and only when seat belts are fastened by both the driver and passenger.

The dash camera running through Jetson — which can process more than 60 frames per second — is being used to generate lane departure warnings.

Front and rear lidar generate driver alerts if objects are in the Triumph’s path or too close. The front lidar and the dash cam can also be used to run the cruise control.

Jetson processes the front and rear battery temperatures and translates them for the analog temperature gauges. Jetson is also called on to read the battery capacity and display it on the analog fuel gauge, keeping it stock. Another nice touch: the fuel cap now covers the charging port.

Touches like these — as well as keeping a functioning four-speed shifter — allowed the car to keep its original look.
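As a back-of-the-envelope illustration of the fuel-gauge trick described above, a mapping from battery state of charge to the signal driving the stock gauge could look like the sketch below. The duty-cycle endpoints are invented calibration values, not measurements from the actual car.

```python
def soc_to_gauge_duty(soc_percent: float,
                      empty_duty: float = 0.10,
                      full_duty: float = 0.85) -> float:
    """Map battery state of charge (0-100%) to a PWM duty cycle for the analog fuel gauge.

    The endpoints are hypothetical; a real conversion would be calibrated against the
    gauge's empty and full needle positions.
    """
    soc = max(0.0, min(100.0, soc_percent))
    return empty_duty + (full_duty - empty_duty) * soc / 100.0

print(soc_to_gauge_duty(50.0))   # roughly mid-gauge for a half-charged pack
```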

“Rebuilding older generation cars with electric motors and computers helps us recycle good gasoline engine cars that otherwise would have to be destroyed,” he said.

DIY makers and businesses alike turn to NVIDIA Jetson for edge AI.


Amid CES, NVIDIA Packs Flying, Driving, Gaming Tech News into a Single Week

Flying, driving, gaming, racing… amid the first-ever virtual Consumer Electronics Show this week, NVIDIA-powered technologies spilled out in all directions.

In automotive, Chinese automakers SAIC and NIO announced they’ll use NVIDIA DRIVE in future vehicles.

In gaming, NVIDIA on Tuesday led off a slew of gaming announcements by revealing the affordable new GeForce RTX 3060 GPU and detailing the arrival of more than 70 laptops featuring RTX 30 Series GPUs for gamers and creatives.

In robotics, the Skydio X2 drone has received the CES 2021 Best of Innovation Award for Drones and Unmanned Systems.

And in, well, a category all its own, the Indy Autonomous Challenge, unveiled Thursday, will pit college teams against each other in sleek, swift vehicles equipped with the ADLINK DLP-8000 robot controller powered by NVIDIA GPUs, competing for a $1.5 million prize.

This week’s announcements were just the latest examples of how NVIDIA is driving AI and innovation into every aspect of our lives.

Game On

Bringing more gaming capabilities to millions more gamers, NVIDIA on Tuesday announced that more than 70 new laptops will feature GeForce RTX 30 Series Laptop GPUs and unveiled the NVIDIA GeForce RTX 3060 graphics card for desktops, priced at just $329.

All are powered by the award-winning NVIDIA Ampere GPU architecture, the second generation of RTX with enhanced Ray Tracing Cores, Tensor Cores, and new streaming multiprocessors.

NVIDIA also announced that Call of Duty: Warzone and Square Enix’s new IP, Outriders, will be adding NVIDIA DLSS, and that Five Nights at Freddy’s: Security Breach and F.I.S.T.: Forged in Shadow Torch will be adding RTX ray tracing and DLSS.

The games are just the latest to support the real-time ray tracing and AI-based DLSS (deep learning super sampling) technologies, known together as RTX, which NVIDIA introduced two years ago.

The announcements were among the highlights of a streamed presentation from Jeff Fisher, senior vice president of NVIDIA’s GeForce business.

Amid the unprecedented challenges of 2020, “millions of people tuned into gaming — to play, create and connect with one another,” Fisher said. “More than ever, gaming has become an integral part of our lives.”

Hitting the Road

In automotive, two Chinese automakers announced they’ll be relying on NVIDIA DRIVE technologies.

Just as CES was starting, electric car startup NIO announced a supercomputer to power its automated and autonomous driving features, with NVIDIA DRIVE Orin at its core.

The computer, known as Adam, achieves over 1,000 trillion operations per second of performance with the redundancy and diversity necessary for safe autonomous driving.

The Orin-powered supercomputer will debut in NIO’s flagship ET7 sedan, scheduled for production in 2022, and every NIO model to follow.

And on Thursday, SAIC, China’s largest automaker, announced it’s joining forces with online retail giant Alibaba to unveil a new premium EV brand, dubbed IM for “intelligence in motion.”

The long-range electric vehicles will feature AI capabilities powered by the high-performance, energy-efficient NVIDIA DRIVE Orin compute platform.

The news comes as EV startups in China have skyrocketed in popularity, with NVIDIA working with NIO along with Li Auto and Xpeng to bolster the growth of new-energy vehicles.

Taking to the Skies

Meanwhile, Skydio, the leading U.S. drone manufacturer and world leader in autonomous flight, today announced it received the CES 2021 Best of Innovation Award for Drones and Unmanned Systems for the Skydio X2.

Skydio’s new autonomous drone offers enterprise and public sector customers up to 35 minutes of autonomous flight time.

Packing six 4K cameras and powered by the NVIDIA Jetson TX2 mobile supercomputer, it’s built to offer situational awareness, asset inspection and security patrol capabilities.


IM AI: China Automaker SAIC Unveils EV Brand Powered by NVIDIA DRIVE Orin

There’s a new brand of automotive intelligence equipped with the brains — and the battery — to go the distance.

SAIC, the largest automaker in China, joined forces with e-tail giant Alibaba to unveil a new premium EV brand, dubbed IM, or “intelligence in motion.” The long-range electric vehicles will feature AI capabilities powered by the high-performance, energy-efficient NVIDIA DRIVE Orin compute platform.

The first two vehicles in the lineup — a flagship sedan and SUV — will have autonomous parking and other automated driving features, as well as a 93kWh battery that comes standard. SAIC will begin taking orders for the sedan at the Shanghai Auto Show in April, with the SUV following in 2022.

These models will have multiple NVIDIA Orin SoCs (systems-on-a-chip) at the core of a centralized computer system, achieving 500 to 1,000+ TOPS of performance for automated and autonomous capabilities, in addition to in-cabin personalization that is continuously upgradable over the air for a truly software-defined experience.

By centralizing and unifying the compute architecture, IM vehicles will be able to receive advanced software features as they’re developed. Just like a mobile phone, which periodically gets software updates, these software-defined vehicles will do the same.

Premium Vehicles Inside and Out

Developing a top-of-the-line premium electric brand requires best-in-class in-vehicle compute.

Orin is the world’s highest-performance, most-advanced AV and robotics processor. This supercomputer on a chip is capable of delivering up to 254 trillion operations per second (TOPS) to handle the large number of applications and deep neural networks that run simultaneously in autonomous vehicles and robots, while meeting systematic safety standards such as ISO 26262 ASIL-D.

With two Orin SoCs at the center of IM vehicles, these compute platforms will be able to deliver more than 500 TOPS of performance to achieve the redundancy and diversity necessary for autonomous operation.

Like all modern computing devices, these intelligent vehicles will be supported by a large team of AI and software engineers, dedicated to improving the performance and capability of the car as technology advances.

Intelligence in Motion

The new IM vehicle lineup is upping the ante for intelligent and electric mobility.

These electric vehicles will come standard with a 93kWh battery, with a 115kWh option on premium trims.


The software-defined IM vehicles don’t just improve like mobile devices, they also work seamlessly with such technology. The brand includes smartphone features for personalized driver and passenger experiences, creating a smart living space inside the car.

As a new, continuously upgradeable electric vehicle lineup, SAIC’s premium IM brand will drive further innovation in intelligent, personal transportation.


Glassdoor Ranks NVIDIA No. 2 in Latest Best Places to Work List

NVIDIA is the second-best place to work in the U.S. according to a ranking released today by Glassdoor.

The site’s Best Places to Work in 2021 list rates the 100 best U.S. companies with more than 1,000 employees, based on how their own employees rate career opportunities, company culture and senior management.

The survey’s top finisher was Bain & Company. Right behind NVIDIA on the list are In-N-Out Burger, HubSpot and McKinsey & Company.

“This year’s winning employers have proven, according to employees, that even during extraordinary times, they’ll rise to the challenge to support their people,” said Christian Sutherland-Wong, Glassdoor chief executive officer.

Among other recent recognitions of NVIDIA’s efforts to take care of our people and communities amid the pandemic:
