So here’s the thing — I stream with portrait NDI sources on a landscape canvas, and zooming in on a specific part of the frame in real time has always been a pain.
You either bake it into the scene or do awkward hotkey gymnastics.
I wanted something more live, more intuitive, more point-and-shoot. So I built it.
What It Actually Does
You get a small floating window on your screen, like a virtual trackpad. You move your mouse around inside it, and your OBS source zooms and pans to follow exactly where you’re pointing. That’s the whole idea. No hotkeys, no keyframes, no fuss.
It supports up to four sources at once for now, each as its own tab in the tracker window. Switch tabs, control a different source. The zoom level, smoothing, and mode are all independent per tab — so one source can be zoomed in tight while another is just panning.


The Controls
There are three modes:
ZOOM — classic zoom-to-mouse. The source magnifies around your cursor position and pans to follow both X and Y. Great for detail shots.
PAN — no zoom, just vertical pan. This is the mode I use most. The source slides up and down as you move the mouse vertically, revealing off-screen content on a portrait source that overflows the canvas. Silky smooth. I'll be adding horizontal pan as well, so you'll be able to choose vertical, horizontal, or both.
OFF — nothing happens. Source sits where it is.
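The core mapping behind the pan mode is tiny: normalize the mouse position inside the tracker canvas and scale it by however much of the source overflows the OBS canvas. Here's a hedged sketch of the idea in Python — the function name and pixel sizes are mine, not the script's:

```python
# Hypothetical sketch of PAN mode's core mapping; names and sizes are
# illustrative, not the script's actual variables.

def pan_offset(mouse_y_norm: float, source_h: int, canvas_h: int) -> float:
    """Map a normalized mouse Y in [0, 1] to the Y offset that slides an
    overflowing portrait source so the pointed-at region is on the canvas."""
    overflow = max(source_h - canvas_h, 0)  # pixels hidden off-canvas
    return -mouse_y_norm * overflow         # 0 at the top, -overflow at the bottom

# A 1920px-tall portrait source on a 1080px canvas overflows by 840px:
top = pan_offset(0.0, 1920, 1080)     # top of the source visible
bottom = pan_offset(1.0, 1920, 1080)  # bottom of the source visible
```

Adding horizontal pan later would just mean applying the same mapping to X.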
You can adjust zoom level with the scroll wheel (or the slider), and the smooth slider controls how snappy or floaty the motion feels. Set it near zero for instant response, crank it up for that slow cinematic drift.
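The smooth slider maps naturally onto exponential smoothing: every tick, close a fixed fraction of the gap between where the view is and where the mouse points. A rough sketch of that idea (assuming this smoothing scheme — the script's actual formula may differ):

```python
def smooth_step(current: float, target: float, smoothing: float) -> float:
    """One tick of exponential smoothing. smoothing=0 snaps straight to the
    target; values near 1 close the gap slowly for a cinematic drift."""
    alpha = 1.0 - smoothing              # fraction of the gap closed per tick
    return current + alpha * (target - current)

# Near zero: instant. Cranked up: floaty.
snappy = smooth_step(0.0, 100.0, 0.0)    # jumps straight to 100
floaty = 0.0
for _ in range(5):                       # five ticks at smoothing = 0.5
    floaty = smooth_step(floaty, 100.0, 0.5)
```

Because each tick closes a fraction of the *remaining* gap, the motion decelerates as it approaches the target, which is what gives the drift its cinematic feel.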
The Drift Feature
This one’s my favourite. When you enable drift and move the mouse off the tracker canvas, the zoom position gently glides back toward a target — top, center, or bottom — on its own. There’s a 150ms delay before it kicks in, so a quick graze of the canvas edge won’t trigger it. Move the mouse back in and it snaps back to tracking immediately.
Super useful for portrait sources where you want to zoom in on a face, then drift back to the top of the frame when you’re done without having to click anything.
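The drift behaviour boils down to a tiny state machine: remember when the mouse left the canvas, wait out the delay, then glide. An illustrative Python version — the class, field names, and glide rate are invented for this post:

```python
DRIFT_DELAY = 0.150  # seconds off-canvas before drifting kicks in

class DriftState:
    """Toy model of the drift logic; names and rates are illustrative."""

    def __init__(self, target: float):
        self.target = target   # drift target, e.g. 0.0 = top of frame
        self.left_at = None    # timestamp when the mouse left the canvas

    def update(self, pos: float, mouse_inside: bool, now: float) -> float:
        if mouse_inside:
            self.left_at = None            # tracking resumes immediately
            return pos
        if self.left_at is None:
            self.left_at = now             # mouse just left the canvas
        if now - self.left_at < DRIFT_DELAY:
            return pos                     # a quick graze doesn't trigger it
        return pos + 0.1 * (self.target - pos)  # glide toward the target
```

The 150ms grace window is the whole trick: without it, brushing the canvas edge mid-move would yank the view toward the drift target.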
Scene Changes Are Handled Too
Switch scenes in OBS and the script notices. Any source that just left the active scene gets a smooth reset back to its original position automatically. No leftover zoom state bleeding into your next scene.
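The bookkeeping here reduces to a set difference between the old and new scene's sources. A toy Python version (the real script does this in Lua, and these names are made up):

```python
# Toy version of the scene-change bookkeeping; the actual script does this
# in Lua, and the source names here are invented.

def sources_leaving(prev_sources: set, new_sources: set) -> set:
    """Sources in the old scene but not the new one: each of these gets a
    smooth reset back to its original position."""
    return prev_sources - new_sources

left = sources_leaving({"cam", "chat", "ndi_phone"}, {"cam", "chat"})
```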
How It’s Built
It’s a Python + Lua combo. The floating tracker window is a Python/Tkinter app that writes mouse position to a temp file 60 times a second. A Lua script in OBS reads that file, pushes the coordinates to a GLSL shader filter (via obs-shaderfilter), and adjusts the source’s Y position for the pan. The two pieces talk through a file on disk — simple and completely reliable.
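The trick to making file-based IPC at 60fps reliable is atomic writes: write to a sibling temp file and rename it over the real one, so the Lua reader never sees a half-written state. A hedged Python sketch of the writer side — the file name and JSON layout here are illustrative, not the script's actual format:

```python
import json
import os
import tempfile

# Illustrative path and JSON layout; the real script's format may differ.
STATE_FILE = os.path.join(tempfile.gettempdir(), "obs_zoom_tracker.json")

def write_state(x: float, y: float, zoom: float) -> None:
    """Tracker (Python) side: write to a sibling file, then rename it into
    place so the reader never sees a half-written state."""
    tmp = STATE_FILE + ".tmp"
    with open(tmp, "w") as f:
        json.dump({"x": x, "y": y, "zoom": zoom}, f)
    os.replace(tmp, STATE_FILE)  # atomic on the same filesystem

def read_state() -> dict:
    """OBS (Lua, sketched here in Python) side: read the latest coordinates."""
    with open(STATE_FILE) as f:
        return json.load(f)
```

`os.replace` is atomic on the same filesystem on both POSIX and Windows, which is what makes "simple and completely reliable" actually hold at 60 writes a second.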
The Lua side was where most of the interesting engineering happened: scene item caching to avoid hammering the OBS API every tick, proper reference counting for all OBS objects, safe shutdown guards to stop OBS crashing when you close it. Boring but important stuff.
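The scene item caching is the standard TTL-cache pattern: resolve the scene item once, reuse it while fresh, re-resolve when stale. Sketched in Python for readability — the real version lives in the Lua script, and all the names here are mine:

```python
import time

CACHE_TTL = 1.0  # seconds; re-resolve scene items at most once per second

class SceneItemCache:
    """TTL cache over an expensive lookup, so the OBS API isn't hit on
    every 60fps tick. Python for readability; the real version is Lua."""

    def __init__(self, lookup):
        self.lookup = lookup   # expensive call: source name -> scene item
        self.items = {}        # name -> (item, fetched_at)

    def get(self, name, now=None):
        now = time.monotonic() if now is None else now
        hit = self.items.get(name)
        if hit is not None and now - hit[1] < CACHE_TTL:
            return hit[0]              # still fresh: skip the API call
        item = self.lookup(name)       # stale or missing: resolve again
        self.items[name] = (item, now)
        return item
```

In the Lua version the cached objects also need their reference counts managed — the "boring but important" part — since OBS crashes if you hold a released object or leak one at shutdown.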
I tried many of the other solutions, but they mostly focus on capture sources and don't offer the flexibility (or even the option) to use an NDI source. Since mine is built on a shader, it can be used on any source :)
- ZoomToMouse, and yes, I need to work on my script name ;)
- OBS Lua Zoom and Follow
What You Need
- OBS Studio
- obs-shaderfilter plugin
- Python 3 (Homebrew on macOS, python.org on Windows)
- A source with a “ZoomToMouse” shader filter applied
That’s it. No native plugin, no compiling, no special hardware.
Might release after some battle testing.
Is It Perfect?
Nah. The tracker window is a separate Python app rather than a proper OBS dock widget, which means it floats outside OBS. But honestly? That’s useful — you can put it on a second monitor or tuck it in a corner of your screen. A full OBS plugin rewrite would take weeks and deliver a worse UI, so the current setup is staying.
The temp file communication runs at 60fps with essentially zero CPU impact. The shader does the heavy lifting on the GPU.
It works, it’s fun to use, and it makes live camera-style tracking possible for NDI sources without any hardware. That’s good enough for me. At least for now!
FAQ
What is a live zoom & pan tracker for OBS and how does it work?
A live zoom & pan tracker is a tool that lets you move your mouse to dynamically zoom in and pan an OBS source in real time. It displays a floating control window where cursor movement drives the zoom and panning behaviour without hotkeys or keyframes. It combines a Python app and an OBS Lua script to adjust the source based on mouse position.
Which OBS sources does the zoom & pan tracker support?
The tracker supports multiple video sources simultaneously. Up to four sources can be controlled independently via tabs in the tracker window. Each source can have its own zoom level, smoothing, and mode settings.
What modes does the zoom & pan tracker provide?
The tracker offers three modes: classic zoom-to-mouse (zoom and pan together), pan-only (slides the source without zooming), and off (the source remains static). A smoothing control adjusts how quickly the motion responds to the mouse.
What technical components are needed to run this tracker?
You need OBS Studio, the obs-shaderfilter plugin, Python 3 installed on your system, and the "ZoomToMouse" shader filter applied to your target source. The Python app writes mouse coordinates that the bundled Lua script uses to drive the shader and update the source transform.
Can this tool replace hotkeys or manual keyframe animation?
Yes. The live pan & zoom tracker eliminates the need for manually setting hotkeys or keyframes for camera movement. Mouse movement directly drives zoom and pan behavior, making live adjustments intuitive and fluid.
Do existing OBS zoom/follow scripts and plugins offer similar features?
Yes. There are zoom & follow scripts and source zoom plugins for OBS that track the mouse or follow a specific region. Examples include dynamic mouse-follow scripts and tools like Zoominator, which provide smooth zoom and pan behaviors for tutorial or streaming use cases.
Will this tracker work with NDI or virtual camera sources?
Yes. Because the tracker works through a shader filter and the source transform rather than any specific capture method, it can control any OBS source that accepts scaling and position changes, including NDI sources, display capture, or virtual camera feeds.
Are there limitations or downsides to this approach?
Since the tracker uses an external Python app for the floating control and reads/writes to a temporary file, it may lack native OBS dock integration. However, this separation can be beneficial by keeping the control window accessible on a second monitor or outside the main OBS interface.
