Hi,

Can someone give me an example pipeline for processing captions from decklinkvideosrc into an H.264 or MPEG-2 video? Preferably, I would like to end up with a transport stream.

gst-launch-1.0 version 1.15.2
GStreamer 1.15.2 (GIT)
Unknown package origin

I am starting with a pipeline like:

gst-launch-1.0 decklinkvideosrc device-number=${DEVICE} mode=12 skip-first-time=500 \
        output-cc=true video-format=2 duplex-mode=0 \
    ! ccextractor ! autovideoconvert ! queue ! x264enc interlaced=true \
    ! 'video/x-h264, profile=(string)main' ! h264parse ! queue \
    ! mpegtsmux name=mux ! queue ! filesink location=video.ts \
  decklinkaudiosrc device-number=${DEVICE} channels=2 name=audio \
    ! audioconvert ! avenc_ac3 bitrate=480000 ! ac3parse ! mux.

Am I putting the ccextractor in the wrong place, or are there other elements necessary to create the GstVideoCaptionMeta for x264enc?

TIA

John
On Tue, 2019-03-26 at 16:55 -0600, John P Poet wrote:
> Hi,
>
> Can someone give me an example pipeline for processing captions from
> decklinkvideosrc into a h.264 or mpeg2 video? Preferably, I would
> like to end up with a transport stream.
>
> gst-launch-1.0 version 1.15.2
> GStreamer 1.15.2 (GIT)
> Unknown package origin
>
> I am starting with a pipeline like:
>
> gst-launch-1.0 decklinkvideosrc device-number=${DEVICE} mode=12
> skip-first-time=500 output-cc=true video-format=2 duplex-mode=0 !
> ccextractor ! autovideoconvert ! queue ! x264enc interlaced=true !
> 'video/x-h264, profile=(string)main' ! h264parse ! queue ! mpegtsmux
> name=mux ! queue ! filesink location=video.ts
>
> Am I putting the ccextractor in the wrong place, or is there other
> elements necessary to create the GstVideoCaptionMeta for x264enc?

So (I don't have a decklink at hand right now, so haven't tested)...

decklinkvideosrc will put caption metas on buffers (either TYPE_CEA708_CDP or TYPE_CEA608_S334_1A).

x264enc will pick up caption metas of TYPE_CEA708_RAW (only) from input buffers and inject them into the bitstream.

So in an ideal world you'd have something that would convert those metas on the buffers, but I'm not sure that exists yet.

There is ccconverter, but it doesn't operate on metas but on a packetised closed caption stream, so you'd have to use ccextractor to split the metas into a stand-alone stream, then convert the type to what x264enc needs, then put the result back on buffers as metas with cccombiner.

So much for the theory. In practice some things may not be fully implemented yet. Your mileage may vary. Good luck :)

Cheers
 -Tim

--
Tim Müller, Centricular Ltd - http://www.centricular.com
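For reference, the caption meta types mentioned above correspond to packetised closed-caption caps that ccextractor/ccconverter negotiate roughly as follows (a sketch from memory; double-check the exact caps strings with gst-inspect-1.0 ccconverter on your build):

    TYPE_CEA708_CDP      ->  closedcaption/x-cea-708, format=(string)cdp
    TYPE_CEA608_S334_1A  ->  closedcaption/x-cea-608, format=(string)s334-1a
    TYPE_CEA708_RAW      ->  closedcaption/x-cea-708, format=(string)cc_data

So a capsfilter requesting closedcaption/x-cea-708, format=(string)cc_data between ccconverter and cccombiner would, presumably, give you the TYPE_CEA708_RAW metas that x264enc can inject.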
On Tue, Mar 26, 2019 at 5:55 PM Tim Müller <[hidden email]> wrote:
> On Tue, 2019-03-26 at 16:55 -0600, John P Poet wrote:

Thank you Tim. That is helpful information.

What is the best way to debug the state and type of captions attached to a buffer at any specific place in the pipeline?

> There is ccconverter, but it doesn't operate on metas but on a
> packetised closed caption stream, so you'd have to use ccextractor to
> split the metas into a stand-alone stream, then convert the type to
> what x264enc needs, then put the result back on buffers as metas with
> cccombiner.

Can you provide me with a theoretical pipeline that would be used for this? That would help me get on the right path and figure out what the current state actually is, and what would need to be done.

> So much for the theory. In practice some things may not be fully
> implemented yet. Your mileage may vary. Good luck :)

Can GStreamer output captions to an MCC or SRT file? Being able to take the data from decklinkvideosrc and write the captions out to a file would be a good first step in validating what is happening.

Thanks again,

John
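One way to see what decklinkvideosrc is actually attaching is to split the caption stream off with ccextractor and dump it. A rough, untested sketch (the ccextractor pad names "src"/"caption" and the debug category name are assumptions here; verify with gst-inspect-1.0 ccextractor):

    GST_DEBUG=ccextractor:6 gst-launch-1.0 -v \
        decklinkvideosrc device-number=${DEVICE} mode=12 output-cc=true ! ccextractor name=cc \
        cc.src ! queue ! fakesink \
        cc.caption ! queue ! fakesink dump=true

With -v, the caps negotiated on the caption branch (e.g. closedcaption/x-cea-708, format=(string)cdp) show which caption type was extracted, and dump=true prints the raw caption bytes of each buffer as a hexdump.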
On Wed, 2019-03-27 at 09:10 -0600, John P Poet wrote:
> > There is ccconverter, but it doesn't operate on metas but on a
> > packetised closed caption stream, so you'd have to use ccextractor
> > to split the metas into a stand-alone stream, then convert the type
> > to what x264enc needs, then put the result back on buffers as metas
> > with cccombiner.
>
> Can you provide me with a theoretical pipeline that would be used for
> this? That would help me get on the right path and figure out what
> the current state actually is, and what would need to be done.

You'd put a ccextractor after the decklinkvideosrc, pass the video stream through a queue and then to cccombiner, pass the CC stream through a queue, ccconverter and a capsfilter (to enforce the correct CC format) and then also to cccombiner. And after cccombiner you place x264enc and the remainder of your pipeline.

> > So much for the theory. In practice some things may not be fully
> > implemented yet. Your mileage may vary. Good luck :)
>
> Can GStreamer output captions to an MCC or SRT file? Being able to
> take the data from decklinkvideosrc and write the captions out to a
> file would be a good first step in validating what is happening.

There's an MCC encoder/parser in gst-plugins-rs:
https://gitlab.freedesktop.org/gstreamer/gst-plugins-rs/tree/master/gst-plugin-closedcaption

--
Sebastian Dröge, Centricular Ltd · https://www.centricular.com
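Put into gst-launch form, that description might look roughly like the sketch below. It is untested; the ccextractor/cccombiner pad names ("src", "caption", "sink") and the exact closedcaption caps string are assumptions, so check them against gst-inspect-1.0 on your build:

    gst-launch-1.0 \
        cccombiner name=comb ! queue ! x264enc interlaced=true \
            ! 'video/x-h264, profile=(string)main' ! h264parse ! queue \
            ! mpegtsmux name=mux ! queue ! filesink location=video.ts \
        decklinkvideosrc device-number=${DEVICE} mode=12 skip-first-time=500 \
            output-cc=true video-format=2 duplex-mode=0 ! ccextractor name=cc \
        cc.src ! autovideoconvert ! queue ! comb.sink \
        cc.caption ! queue ! ccconverter \
            ! 'closedcaption/x-cea-708, format=(string)cc_data' ! comb.caption \
        decklinkaudiosrc device-number=${DEVICE} channels=2 \
            ! audioconvert ! avenc_ac3 bitrate=480000 ! ac3parse ! mux.

The capsfilter after ccconverter forces the cc_data (CEA-708 raw) representation, and cccombiner should then attach those buffers back onto the video frames as the TYPE_CEA708_RAW metas that x264enc injects into the bitstream.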