Getting Started with the NVIDIA Jetson Orin Nano Developer Kit
Introduction
The NVIDIA Jetson Orin Nano Developer Kit is a compact yet powerful platform designed for developers building AI-powered robots, drones, smart cameras, and edge devices.
With the December 2024 JetPack update, it delivers up to 70% more performance, making it ideal for generative AI and computer vision applications.
In this tutorial, we’ll walk through:
- What comes in the box
- Extra items you’ll need
- How to flash JetPack onto your microSD card
- First boot setup
- Unlocking maximum performance
- Running your first AI project 🚀
What’s in the Box?
- Jetson Orin Nano module (with microSD slot)
- Carrier board (with Wi-Fi/Bluetooth module)
- 19 V power supply
- Quick start guide
What Else Do You Need?
- microSD card (≥64 GB, UHS-I recommended)
- USB keyboard and mouse
- HDMI/DisplayPort monitor
- Host PC with internet access
- Optional: NVMe SSD for high-speed storage
Step 1: Update the Firmware
JetPack 6.x requires newer device firmware than many kits shipped with, so this one-time step brings it up to date:
- Download the JetPack 5.1.3 SD card image from NVIDIA’s site.
- Flash it to your microSD card.
- Boot the Jetson once with that card → the firmware auto-updates.
Step 2: Flash JetPack 6.x
Use Etcher to write the JetPack image to your microSD card.
On Linux/macOS, you can also use:
# Replace /dev/sdX with your SD card
sudo dd if=jetpack-image.img of=/dev/sdX bs=4M status=progress
sync
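Before flashing, it’s worth checking that the multi-gigabyte download wasn’t corrupted in transit. Here is a minimal sketch in Python; the filename is the same placeholder used above, and the reference checksum to compare against is the one NVIDIA publishes on the download page:

```python
import hashlib

def sha256_of(path: str, chunk_size: int = 1 << 20) -> str:
    """Hash the file in 1 MB chunks so a multi-GB image doesn't fill RAM."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while block := f.read(chunk_size):
            digest.update(block)
    return digest.hexdigest()

if __name__ == "__main__":
    # "jetpack-image.img" is a placeholder; compare the printed digest
    # against the checksum listed on NVIDIA's download page.
    print(sha256_of("jetpack-image.img"))
```

If the digests differ, re-download the image before flashing.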
Insert the microSD into your Jetson when done.
Step 3: First Boot Setup
- Insert flashed card → connect display, keyboard, mouse.
- Plug in power supply → Jetson boots (green LED on).
- Go through the setup wizard (language, Wi-Fi, user account).
Step 4: Unlock MAX Performance
The default power mode is 15 W. For heavy AI tasks, switch to the 25 W MAXN SUPER mode:
- Click the NVIDIA logo in the top bar → Power Mode → MAXN SUPER
- Or check the current mode from a terminal with sudo nvpmodel -q (mode indices vary by JetPack release).
Step 5: Verify Setup
Run these commands:
# Kernel info
uname -a
# GPU and system utilization (Jetson boards do not ship nvidia-smi)
sudo tegrastats
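tegrastats prints one dense status line per second. As a quick sketch of pulling memory usage out of that stream in Python — the sample line below follows the usual "RAM used/totalMB" pattern, but the exact field layout varies between JetPack releases, so treat the regex as an assumption:

```python
import re

# tegrastats emits lines like:
#   RAM 3162/7620MB (lfb 4x1MB) SWAP 0/3810MB CPU [12%@729] ...
# Only the RAM used/total pair is parsed here.
RAM_RE = re.compile(r"RAM (\d+)/(\d+)MB")

def parse_ram(line: str):
    """Return (used_mb, total_mb) from one tegrastats line, or None if absent."""
    m = RAM_RE.search(line)
    return (int(m.group(1)), int(m.group(2))) if m else None

sample = "RAM 3162/7620MB (lfb 4x1MB) SWAP 0/3810MB CPU [12%@729]"
print(parse_ram(sample))  # → (3162, 7620)
```

On the device you would feed this real lines, e.g. from subprocess piping tegrastats.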
Step 6: Run Your First AI Project (Real-Time Object Detection)
Now the fun part — let’s run object detection on your Jetson!
Install Jetson Inference
# Update packages
sudo apt-get update
sudo apt-get install git cmake python3-pip libpython3-dev python3-numpy
# Clone NVIDIA’s repo
git clone --recursive https://github.com/dusty-nv/jetson-inference
cd jetson-inference
# Build and install
mkdir build
cd build
cmake ..
make -j$(nproc)
sudo make install
sudo ldconfig
Run Live Object Detection
If you have a USB camera connected:
# Detect objects from the camera feed (detectnet.py is on PATH after sudo make install)
detectnet.py --network=ssd-mobilenet-v2 /dev/video0
This will open a live video stream with bounding boxes drawn around detected objects (people, bottles, chairs, etc.).
Run on an Image
# Input image and output image are positional arguments
detectnet.py --network=ssd-mobilenet-v2 example.jpg output.jpg
Your Jetson will process the image and save the result with detected objects highlighted.
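The detectnet.py script is a thin wrapper over jetson-inference’s Python API, which you can also call from your own code. The sketch below keeps a small, reusable formatting helper separate from the Jetson-only calls; detectNet, loadImage, and GetClassDesc follow the API shown in the jetson-inference repo’s examples, but double-check argument names against the repo before relying on them:

```python
def summarize(detections, class_name) -> list[str]:
    """Format each detection as 'label confidence%' (pure Python, runs anywhere).

    Each detection is expected to expose ClassID and Confidence attributes,
    as jetson-inference detection objects do."""
    return [f"{class_name(d.ClassID)} {d.Confidence:.0%}" for d in detections]

if __name__ == "__main__":
    # Jetson-only part: mirrors the detectnet.py command above.
    from jetson_inference import detectNet      # available after sudo make install
    from jetson_utils import loadImage

    net = detectNet("ssd-mobilenet-v2", threshold=0.5)
    img = loadImage("example.jpg")              # same placeholder image as above
    for line in summarize(net.Detect(img), net.GetClassDesc):
        print(line)
```

Keeping the formatting logic out of the hardware-dependent section makes it easy to unit-test on any machine.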
Next Steps
- Try NVIDIA’s Jetson AI Lab demos for LLMs and vision tasks.
- Hook up a CSI camera for robotics projects.
- Deploy models trained in PyTorch or TensorFlow.
- Explore ROS (Robot Operating System) for robotics applications.
Final Thoughts
The NVIDIA Jetson Orin Nano Developer Kit is more than just powerful hardware — it’s an ecosystem for learning, prototyping, and deploying AI at the edge.
With your setup complete and your first object detection model running, you’re ready to explore robotics, vision AI, and even generative AI models right on the edge.
👉 The future of AI doesn’t live in the cloud — it lives where you build it.
