Hello,
I saw the new JT/T 1078 streaming support added in Traccar 6.13 and I would like to understand the architecture better.
How does video streaming work internally in Traccar?
Is there a REST/WebSocket API for starting streams?
What streaming format is exposed to the frontend (HLS, FLV, WebRTC, etc.)?
Can custom apps access the stream through an API endpoint?
There's a standard command you need to send to start it. Then connect to the HLS stream. It's available through the API, so you can connect from a custom app.
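Roughly like this, as a sketch from a custom app (assuming an authenticated session against the REST API; the command type and stream path here match what the web app uses, but verify against your server version):

```typescript
// Sketch: ask the device to start live video, then build the HLS URL.
// Assumes session-cookie auth against the Traccar REST API; the
// "videoStart" command type and the /api/stream path are taken from
// the web app, so check them against your version.
async function startLiveStream(baseUrl: string, deviceId: number): Promise<string> {
  const response = await fetch(`${baseUrl}/api/commands/send`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'include', // reuse the existing Traccar session cookie
    body: JSON.stringify({ deviceId, type: 'videoStart' }),
  });
  if (!response.ok) {
    throw new Error(`Failed to send command: ${response.status}`);
  }
  // The device then pushes JT/T 1078 video to the server, which
  // exposes it as an HLS playlist per device.
  return `${baseUrl}/api/stream/${deviceId}/live.m3u8`;
}
```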
Thanks, that clarifies a lot.
I have a few more integration questions:
What API endpoint is used to retrieve the HLS stream URL?
I also wanted to ask how live video is accessed from the Traccar web UI.
Is there already a built-in camera/video interface in the web app, or does it require custom integration using the API and HLS stream URL?
There's an interface in the web app. I recommend checking the code for more details.
Thanks, I looked into the web UI code (StreamPage.jsx) and backend implementation, and I think I understand the flow now — please correct me if I’m wrong.
From what I see:
- The web UI triggers "videoStart" / "videoStop" commands to the device
- Then it loads the stream using "/api/stream/{deviceId}/live.m3u8"
- The frontend uses hls.js to play the HLS stream in a normal HTML5 video element
- The backend exposes HLS via "VideoStreamResource" (".m3u8" + ".ts" segments)
- VideoStreamManager handles JT1078 ingestion and converts/buffers it into HLS
- Multiple clients seem to consume the same per-device HLS stream
So effectively the Web UI is just a thin HLS client on top of the backend stream API.
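Based on that, a custom app could do essentially what StreamPage.jsx does. Here's a minimal sketch of the playback side as I understand it (assuming hls.js, with native HLS as the Safari fallback; not exact Traccar code):

```typescript
import Hls from 'hls.js';

// Sketch of a thin HLS client, mirroring what StreamPage.jsx appears
// to do: point hls.js (or native HLS on Safari) at the per-device
// playlist exposed by the backend.
function attachLiveStream(video: HTMLVideoElement, deviceId: number): void {
  const source = `/api/stream/${deviceId}/live.m3u8`;
  if (Hls.isSupported()) {
    const hls = new Hls();
    hls.loadSource(source);
    hls.attachMedia(video);
    hls.on(Hls.Events.MANIFEST_PARSED, () => video.play());
  } else if (video.canPlayType('application/vnd.apple.mpegurl')) {
    // Safari plays HLS natively without hls.js
    video.src = source;
    video.addEventListener('loadedmetadata', () => video.play());
  }
}
```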
Is this understanding correct, especially regarding:
- whether streams are shared per device or per viewer session
- and whether VideoStreamManager does remuxing only or any transcoding?