[AVStream] Synchronize renderer filter with stream timestamps?

Graph setup:

Test image generator ------> Encoder filter

The test image generator is a user mode filter derived from the push source samples.
It generates an RGB24 test image (moving block that changes color). It has a property page on which the frame rate and resolution can be set.
When the test image is set to xxxxx@5 fps and it is connected to the enhanced video renderer, the property page of this renderer shows 5 fps.
When the test image generator is connected to our kernel mode encoder filter, the DispatchProcess callback of this filter is not activated every 200 ms, but as fast as possible - resulting in a rendering frame rate of over 100 fps.
In the documentation on timestamps:
http://msdn.microsoft.com/en-us/library/windows/desktop/dd407208(v=vs.85).aspx
There is a sentence:
“When a renderer filter receives a sample, it schedules rendering based on the time stamp. If the sample arrives late, or has no time stamp, the filter renders the sample immediately. Otherwise, the filter waits until the sample’s start time before it renders the sample. (It waits for the start time by calling the IReferenceClock::AdviseTime method.)”
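The rule quoted above can be sketched as plain decision logic. This is a hypothetical sketch with our own names, not DirectShow's API; the real renderer waits via IReferenceClock::AdviseTime rather than returning a tick count:

```cpp
#include <cstdint>

struct Sample {
    bool    hasTimestamp;
    int64_t startTime;   // 100 ns units, relative to stream start
};

// Returns the number of 100 ns ticks to wait before rendering,
// or 0 to render immediately (late sample, or no timestamp at all).
int64_t TicksToWait(const Sample& s, int64_t streamTimeNow) {
    if (!s.hasTimestamp || s.startTime <= streamTimeNow)
        return 0;                        // render immediately
    return s.startTime - streamTimeNow;  // wait until the start time
}
```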

When I look at the encoder side, on each process callback the leading-edge frame pointer's timestamp is Leading->StreamHeader->PresentationTime.
Timestamps increment by 2,000,000 starting from 0, which is correct for 5 fps at 100 ns resolution.
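That increment follows directly from the 100 ns clock. As a minimal sketch of the arithmetic a push source performs when stamping frames (REFERENCE_TIME here is just a local typedef mirroring DirectShow's 100 ns unit):

```cpp
#include <cstdint>

typedef int64_t REFERENCE_TIME;         // 100 ns units, as in DirectShow

const REFERENCE_TIME UNITS = 10000000;  // 100 ns ticks per second

// Frame duration for a given rate, e.g. 5 fps -> 2,000,000 ticks (200 ms).
REFERENCE_TIME FrameDuration(int64_t fps) { return UNITS / fps; }

// Start/stop timestamps for frame N at a fixed rate, as a push source
// would compute them before stamping the media sample.
void FrameTimes(int64_t frameNumber, int64_t fps,
                REFERENCE_TIME* start, REFERENCE_TIME* stop) {
    *start = frameNumber * FrameDuration(fps);
    *stop  = *start + FrameDuration(fps);
}
```

At 5 fps this yields exactly the observed sequence 0, 2000000, 4000000, …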

What needs to be implemented at the encoder (renderer) filter side to synchronize with this timestamp?
It was our (obviously too simple) assumption that the process callback would be synchronized accordingly.

Thank you in advance for your time and effort,

  • Bernard Willaert
    Barco - Healthcare Division
    Belgium

xxxxx@barco.com wrote:

> The test image generator is a user mode filter derived from the push source samples.
> It generates an RGB24 test image (moving block that changes color). It has a property page on which the frame rate and resolution can be set.
> When the test image is set to xxxxx@5 fps and it is connected to the enhanced video renderer, the property page of this renderer shows 5 fps.
> When the test image generator is connected to our kernel mode encoder filter, the DispatchProcess callback of this filter is not activated every 200 ms, but as fast as possible - resulting in a rendering frame rate of over 100 fps.

> What needs to be implemented at the encoder (renderer) filter side to synchronize with this timestamp?
> Our (obviously too simple) assumption was that the process callback would be synchronized accordingly.

Are you copying the timestamps to the outgoing stream in your encoder?


Tim Roberts, xxxxx@probo.com
Providenza & Boekelheide, Inc.

>> Are you copying the timestamps to the outgoing stream in your encoder?

thanks for your reply, Tim!

No, we are not copying the timestamps.
In the process callback, we memcpy the contents of the frame into onboard FPGA memory.
The frames are then sent over Ethernet.
That is why I compare it to a renderer: for this stream, our filter is the endpoint.

xxxxx@barco.com wrote:

>> Are you copying the timestamps to the outgoing stream in your encoder?
> Thanks for your reply, Tim!
>
> No, we are not copying the timestamps.
> In the process callback, we memcpy the contents of the frame into onboard FPGA memory.
> The frames are then sent over Ethernet.
> That is why I compare it to a renderer: for this stream, our filter is the endpoint.

Then I’m unclear on where you expected the frame rate management to
happen. If your generator generates frames infinitely fast, and your
renderer accepts them at an infinite rate and never checks the
timestamps, you shouldn’t be surprised that the graph runs quickly.

When you use the EVR, the EVR filter adds delays to make sure a frame is
presented based on the timestamp. If you expect similar behavior, then
you’re going to have to do that in your own renderer. The graph doesn’t
do it.


Tim Roberts, xxxxx@probo.com
Providenza & Boekelheide, Inc.

>>> The graph doesn’t do it.
Thanks for clarifying this.
The user mode test generator - derived from the push source samples - synthesizes a test image. In the CSourceStream::FillBuffer callback it copies this frame into the IMediaSample data buffer and stamps it with pSample->SetTime(&start, &stop).
The kernel mode encoder/renderer copies the frame buffer onto hardware memory on each Process callback.
We were under the false impression that the graph would take care of the renderer process timing based on the timestamps on each frame and a master reference clock.
We will now implement a timer in the encoder filter that restores the frame rate.
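As a minimal sketch of what that pacing timer could compute (the names are ours, not KS APIs; converting the result into an actual kernel wait, e.g. via KeDelayExecutionThread with a negative relative LARGE_INTEGER, is an assumption and not shown):

```cpp
#include <cstdint>

// Before handing frame data to the hardware, compute how long the Process
// path should stall so throughput tracks the presentation timestamps
// instead of running as fast as samples arrive.
int64_t PacingDelay(int64_t presentationTime,  // from the stream header, 100 ns
                    int64_t streamTimeNow) {   // elapsed since the run state
    const int64_t delay = presentationTime - streamTimeNow;
    return delay > 0 ? delay : 0;  // late frames go out immediately
}
```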
Thank you for your invaluable input!
We are still newbies when it comes to AVStream/DShow and are trying to find our way…

  • Bernard Willaert

xxxxx@barco.com wrote:

> Thanks for clarifying this.
> The user mode test generator - derived from the push source samples - synthesizes a test image. In the CSourceStream::FillBuffer callback it copies this frame into the IMediaSample data buffer and stamps it with pSample->SetTime(&start, &stop).
> The kernel mode encoder/renderer copies the frame buffer onto hardware memory on each Process callback.
> We were under the false impression that the graph would take care of the renderer process timing based on the timestamps on each frame and a master reference clock.

Nope, it can’t do that. The graph doesn’t really know whether you are a
source or a transform or a renderer. (OK, technically it could figure
that out by counting your pins, but that’s not the point.)

The graph shoves buffers to you as quickly as it can. It is entirely up
to each filter to decide what to do with the contents, including the
timestamps. For example, if you write a file renderer, writing an AVI
file to disk, you don’t want the graph to be enforcing delays. You want
frames to arrive as fast as possible. The concept of “presentation
time” is something that is meaningful only to the renderer.


Tim Roberts, xxxxx@probo.com
Providenza & Boekelheide, Inc.