How can I force all streams (or the pipeline) to end at a specific timestamp


hammerwerfer
Hi,
my pipeline has an audio stream (filesrc with an mp3 file 3 minutes long) and a video stream (appsrc, image/jpeg ! queue ! jpegdec ! videoconvert ... etc.). The appsrc gets its data frame by frame, via the need-data signal callback:
    def gst_need_data(self, src, need_bytes):
        if self.curFrame == 200:
            src.emit("end-of-stream")   # no more frames to push
            return

        buf = Gst.Buffer.new_wrapped(imgData)  # imgData: JPEG bytes of the current frame
        buf.pts = self.curFrame * 40 * Gst.MSECOND  # 40 ms per frame = 25 fps
        buf.duration = 40 * Gst.MSECOND
        ret = src.emit("push-buffer", buf)
        if ret == Gst.FlowReturn.OK:
            self.curFrame += 1
When all images have been fed to the appsrc (after 200 frames) I want to stop everything. For the appsrc I emit the "end-of-stream" signal. But the pipeline still runs until the audio stream has finished, so the resulting video file has the same duration as the input audio stream.
If I also send an EOS event to the audio converter (self.audioConv.send_event(Gst.Event.new_eos()) right after src.emit("end-of-stream")), the audio stream stops. Almost like I wanted. But the audio stream stops too early: the video takes 8 seconds (200 frames @ 25 fps) while the audio stops after 5 seconds, probably due to buffers or queues. I guess I somehow need to stop the pipeline at a specific timestamp. But how? I hope someone can give me a hint.

Thanks,
Jens

Re: How can I force all streams (or the pipeline) to end at a specific timestamp

Nicolas Dufresne-4
That's because this pipeline is not live: the video data will accumulate
in the queue, so when you send EOS you may be ahead of time on the
audio. You can drop the queues (might not be perfect, but it's simple),
or use a pad probe to monitor each source's position. When the time of
the produced buffer is past the range you want, you can send EOS on
that branch.

That being said, you need to track the segment event and use that
segment to convert the timestamp into something meaningful (running
time or stream time, depending on what you really want).

good luck,
Nicolas

_______________________________________________
gstreamer-devel mailing list
[hidden email]
https://lists.freedesktop.org/mailman/listinfo/gstreamer-devel


Re: How can I force all streams (or the pipeline) to end at a specific timestamp

hammerwerfer
Thanks for your quick response. Your advice pointed me in the right direction. First I used a pad probe for segment events, but I was not able to extract the stream time:
    def gst_probe_event(self, srcPad, probeInfo):
        evt = probeInfo.get_event()
        if evt.type == Gst.EventType.SEGMENT:
            evtSegment = evt.parse_segment()
            t1 = evtSegment.to_stream_time(evtSegment.format, evtSegment.position)
            t2 = evtSegment.to_running_time(evtSegment.format, evtSegment.position)
            t3 = evtSegment.offset_running_time(evtSegment.format, evtSegment.offset)
        return Gst.PadProbeReturn.OK

The to_stream_time() function (and the others) returned strange values (max long or something, and always the same values), so I gave up on that.
Now I am using a pad probe of type buffer (Gst.PadProbeType.BUFFER) and added it to my muxer (matroskamux). The value buffer.pts is a realistic timestamp for my audio stream. So when buffer.pts reaches my final timestamp, I send an EOS event to my audio converter element and the pipeline stops streaming. The resulting video file has the correct length now.

Thanks,
Jens

Re: How can I force all streams (or the pipeline) to end at a specific timestamp

Nicolas Dufresne-4
That's fine and may work for your case. Note that in GStreamer, PTS may
start at any point in time. So to be complete, you actually need two
pad probes: one that saves the GstSegment, and then in your BUFFER
probe you can convert the PTS to stream time using that segment
information. This should work; if it returns CLOCK_TIME_NONE (max
uint64), it means that buffer is outside the segment and should be
ignored (like decode-only buffers).

cheers,
Nicolas
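[Editor's note: in the simple case (rate 1, no applied rate or offset), the conversion Nicolas mentions reduces to stream_time = segment.time + (PTS - segment.start). A plain-Python illustration of that math; real code should call Gst.Segment.to_stream_time().]

```python
# Plain-Python illustration of segment-to-stream-time conversion; rates
# and offsets are ignored. Real code should use Gst.Segment.to_stream_time().
CLOCK_TIME_NONE = 2**64 - 1   # GStreamer's "invalid time" sentinel
SECOND = 10**9                # nanoseconds, like Gst.SECOND

def to_stream_time(pts, seg_start, seg_stop, seg_time):
    """Map a buffer PTS into stream time, or CLOCK_TIME_NONE if the
    buffer lies outside the segment (e.g. decode-only buffers)."""
    if pts < seg_start or (seg_stop is not None and pts > seg_stop):
        return CLOCK_TIME_NONE
    return seg_time + (pts - seg_start)

# A segment starting 2 s in: a buffer stamped 5 s is at stream time 3 s.
print(to_stream_time(5 * SECOND, 2 * SECOND, None, 0))   # -> 3000000000
# A buffer stamped before the segment start is out of range.
print(to_stream_time(1 * SECOND, 2 * SECOND, None, 0) == CLOCK_TIME_NONE)  # -> True
```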
