Location of a @GstAncillaryMeta.
An enumeration indicating whether an element implements color balancing operations in software or in dedicated hardware. In general, dedicated hardware implementations (such as those provided by xvimagesink) are preferred.
A set of commands that may be issued to an element providing the #GstNavigation interface. The available commands can be queried via the gstvideo.navigation.Navigation.queryNewCommands query.
Enum values for the various events that an element implementing the GstNavigation interface might send up the pipeline. Touch events have been inspired by the libinput API, and have the same meaning here.
A set of notifications that may be received on the bus when navigation related status changes.
Flags to indicate the state of modifier keys and mouse buttons in events.
Types of navigation interface queries.
Enumeration of the different standards that may apply to AFD data:
Enumeration of the various values for Active Format Description (AFD)
Different alpha modes.
Some known types of Ancillary Data identifiers.
Additional video buffer flags. These flags can potentially be used on any buffers carrying closed caption data, or video data - even encoded data.
The various known types of Closed Caption (CC).
Extra flags that influence the result from gstvideo.video_chroma_resample.VideoChromaResample.new_.
Different subsampling and upsampling methods
Different chroma downsampling and upsampling modes
Various Chroma sitings.
Flags for #GstVideoCodecFrame
The color matrix is used to convert between Y'PbPr and non-linear RGB (R'G'B')
The color primaries define how to transform linear RGB values to and from the CIE XYZ colorspace.
Possible color range values. These constants are defined for 8 bit color values and can be scaled for other bit depths.
Flags to be used in combination with gstvideo.video_decoder.VideoDecoder.requestSyncPoint. See the function documentation for more details.
Extra flags that influence the result from gstvideo.video_dither.VideoDither.new_.
Different dithering methods to use.
Field order of interlaced content. This is only valid for interlace-mode=interleaved and not interlace-mode=mixed. In the case of mixed or GST_VIDEO_FIELD_ORDER_UNKNOWN, the field order is signalled via buffer flags.
Extra video flags
Enum value describing the most common video formats.
The different video flags that a format info can have.
Extra video frame flags
Additional mapping flags for gstvideo.video_frame.VideoFrame.map.
The orientation of the GL texture.
The GL texture type.
The possible values of the #GstVideoInterlaceMode describing the interlace mode of the stream.
Different color matrix conversion modes
GstVideoMultiviewFlags are used to indicate extra properties of a stereo/multiview stream beyond the frame layout and buffer mapping that is conveyed in the #GstVideoMultiviewMode.
#GstVideoMultiviewFramePacking represents the subset of #GstVideoMultiviewMode values that can be applied to any video frame without needing extra metadata. It can be used by elements that provide a property to override the multiview interpretation of a video stream when the video doesn't contain any markers.
All possible stereoscopic 3D and multiview representations. In conjunction with #GstVideoMultiviewFlags, describes how multiview content is being transported in the stream.
The different video orientation methods.
Overlay format flags.
The different flags that can be used when packing and unpacking.
Different primaries conversion modes
Different resampler flags.
Different subsampling and upsampling methods
Different scale flags.
Enum value describing the available tiling modes.
Enum value describing the most common tiling types.
Flags related to the time code information. For drop frame, only 30000/1001 and 60000/1001 frame rates are supported.
The video transfer function defines the formula for converting between non-linear RGB (R'G'B') and linear RGB
Return values for #GstVideoVBIParser
#GstMeta for carrying SMPTE-291M Ancillary data. Note that all the ADF fields (@DID to @checksum) are 10-bit values with parity/non-parity high-bits set.
This interface is implemented by elements which can perform some color balance operation on video frames they process. For example, modifying the brightness, contrast, hue or saturation.
The #GstColorBalanceChannel object represents a parameter for modifying the color balance implemented by an element providing the #GstColorBalance interface. For example, Hue or Saturation.
Color-balance channel class.
Color-balance interface.
The Navigation interface is used for creating and injecting navigation related events such as mouse button presses, cursor motion and key presses. The associated library also provides methods for parsing received events, and for sending and receiving navigation related bus events. One main usecase is DVD menu navigation.
Navigation interface.
Active Format Description (AFD)
Extra buffer metadata for performing an affine transformation using a 4x4 matrix. The transformation matrix can be composed with gstvideo.video_affine_transformation_meta.VideoAffineTransformationMeta.applyMatrix.
VideoAggregator can accept AYUV, ARGB and BGRA video streams. For each of the requested sink pads it will compare the incoming geometry and framerate to determine the output parameters: output video frames will have the geometry of the largest incoming video stream and the framerate of the fastest incoming one.
An implementation of GstPad that can be used with #GstVideoAggregator.
An implementation of GstPad that can be used with #GstVideoAggregator.
Extra alignment parameters for the memory of video buffers. This structure is usually used to configure the bufferpool if it supports the #GST_BUFFER_POOL_OPTION_VIDEO_ALIGNMENT.
Video Ancillary data, according to SMPTE-291M specification.
Bar data should be included in video user data whenever the rectangular picture area containing useful information does not extend to the full height or width of the coded frame and AFD alone is insufficient to describe the extent of the image.
Extra buffer metadata providing Closed Caption.
This meta is primarily for internal use in GStreamer elements to support VP8/VP9 transparent video stored into WebM or Matroska containers, or transparent static AV1 images. Nothing prevents you from using this meta for custom purposes, but it generally can't be used to easily add support for alpha channels to codecs or formats that don't support that out of the box.
A #GstVideoCodecFrame represents a video frame both in raw and encoded form.
Structure representing the state of an incoming or outgoing video stream for encoders and decoders.
Structure describing the chromaticity coordinates of an RGB system. These values can be used to construct a matrix to transform RGB to and from the XYZ colorspace.
Structure describing the color info.
Content light level information specified in CEA-861.3, Appendix A.
Extra buffer metadata describing image cropping.
This base class is for video decoders turning encoded data into raw video frames.
Subclasses can override any of the available virtual methods or not, as needed. At minimum @handle_frame needs to be overridden, and @set_format is likely needed as well. If non-packetized input is supported or expected, @parse needs to be overridden as well.
The interface allows unified access to control flipping and rotation operations of video-sources or operators.
#GstVideoDirectionInterface interface.
GstVideoDither provides implementations of several dithering algorithms that can be applied to lines of video pixels to quantize and dither them.
This base class is for video encoders turning raw video into encoded video data.
Subclasses can override any of the available virtual methods or not, as needed. At minimum @handle_frame needs to be overridden, and @set_format and @get_caps are likely needed as well.
Provides useful functions and a base class for video filters.
The video filter class structure.
Information for a video format.
A video frame obtained from gstvideo.video_frame.VideoFrame.map
Extra buffer metadata for uploading a buffer to an OpenGL texture ID. The caller of gstvideo.video_gltexture_upload_meta.VideoGLTextureUploadMeta.upload must have OpenGL set up and call this from a thread where it is valid to upload something to an OpenGL texture.
Information describing image properties. This information can be filled in from GstCaps with gstvideo.video_info.VideoInfo.fromCaps. The information is also used to store the specific video info when mapping a video frame with gstvideo.video_frame.VideoFrame.map.
Information describing a DMABuf image properties. It wraps #GstVideoInfo and adds DRM information such as drm-fourcc and drm-modifier, required for negotiation and mapping.
Mastering display color volume information defined by SMPTE ST 2086 (a.k.a static HDR metadata).
Used to represent display_primaries and white_point of #GstVideoMasteringDisplayInfo struct. See #GstVideoMasteringDisplayInfo
Extra buffer metadata describing image properties
Extra data passed to a video transform #GstMetaTransformFunction such as: "gst-video-scale".
See #GstVideoMultiviewFlags.
The interface allows unified access to control flipping and autocenter operation of video-sources or operators.
#GstVideoOrientationInterface interface.
The #GstVideoOverlay interface is used for two main purposes:
Functions to create and handle overlay compositions on video buffers.
Extra buffer metadata describing image overlay data.
#GstVideoOverlay interface
An opaque video overlay rectangle object. A rectangle contains a single overlay rectangle which can be added to a composition.
Helper structure representing a rectangular area.
Extra buffer metadata describing an image region of interest
#GstVideoResampler is a structure which holds the information required to perform various kinds of resampling filtering.
H.264/H.265 metadata from SEI User Data Unregistered messages.
#GstVideoScaler is a utility object for rescaling and resampling video frames using various interpolation / sampling methods.
Provides useful functions and a base class for video sinks.
The video sink class structure. Derived classes should override the @show_frame virtual function.
Description of a tile. This structure allows describing arbitrary tile dimensions and sizes.
@field_count must be 0 for progressive video and 1 or 2 for interlaced.
Supported frame rates: 30000/1001, 60000/1001 (both with and without drop frame), and integer frame rates e.g. 25/1, 30/1, 50/1, 60/1.
A representation of a difference between two #GstVideoTimeCode instances. Will not necessarily correspond to a real timecode (e.g. 00:00:10;00)
Extra buffer metadata describing the GstVideoTimeCode of the frame.
An encoder for writing ancillary data to the Vertical Blanking Interval lines of component signals.
A parser for detecting and extracting @GstVideoAncillary data from Vertical Blanking Interval lines of component signals.