DirectShow Filter Development: An MP4 Video File Writer Filter

Download the filter DLL built from the code in this article

This filter encodes an uncompressed RGB32 video stream to H.264 and an uncompressed PCM audio stream to AAC, and writes both into an MP4 video file.
Building this filter DLL requires:
1. Windows 7.
2. The Windows SDK 7.1 software development kit.
3. Visual Studio 2010, 2008, or 2005. The author used Visual Studio 2010.

The reasons: the files installed by Windows SDK 7.1 include the DirectShow base-class definitions and implementations that filter development depends on; Windows SDK 7.1 targets Windows 7; and building the SDK 7.1 base classes requires one of the development environments listed above.

Creating a filter with two input pins

The filter class derives from CBaseFilter, CCritSec, and IFileSinkFilter. It overrides GetPinCount, GetPin, and Run. To implement IFileSinkFilter, it answers queries for the interface in NonDelegatingQueryInterface and overrides the interface's SetFileName and GetCurFile methods.
The video pin class derives from CBaseInputPin and overrides CheckMediaType, SetMediaType, Receive, and EndOfStream.
The audio pin class also derives from CBaseInputPin and overrides the same four functions.
CheckMediaType decides which media types the pin accepts for a connection; SetMediaType stores the pin's current media type in the base class; Receive is called by the upstream output pin and accepts the media samples that pin delivers; EndOfStream reports the end of the stream. The IFileSinkFilter interface is used to set and retrieve the path of the MP4 file to be created.
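From the application side, the output path is set through this interface before the graph runs. Below is a minimal sketch, assuming the DLL has been registered with regsvr32 and pGraph is an existing IGraphBuilder; the variable names and error handling are illustrative, not part of the filter's code:

// Sketch: an application sets the target MP4 path through IFileSinkFilter.
IBaseFilter* pWriter = NULL;
HRESULT hr = CoCreateInstance(CLSID_MP4Writer, NULL, CLSCTX_INPROC_SERVER,
                              IID_IBaseFilter, (void**)&pWriter);     // create the writer filter
if (SUCCEEDED(hr))
    hr = pGraph->AddFilter(pWriter, L"写MP4");                        // add it to the graph
IFileSinkFilter* pSink = NULL;
if (SUCCEEDED(hr))
    hr = pWriter->QueryInterface(IID_IFileSinkFilter, (void**)&pSink); // get IFileSinkFilter
if (SUCCEEDED(hr))
    hr = pSink->SetFileName(L"C:\\output.mp4", NULL);                 // path must end in ".mp4"
if (pSink)   pSink->Release();
if (pWriter) pWriter->Release();

Note that SetFileName fails with VFW_E_INVALID_FILE_FORMAT unless the path ends in ".mp4", as the filter's implementation below shows.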

Creating the sink-writer worker thread

Inside the thread, initialize the COM library and Media Foundation, then create the sink writer with the MFCreateSinkWriterFromURL function; a sink writer can be created on Windows 7, 8, and 10. On success you obtain an IMFSinkWriter interface.
Create and set the video stream's output media type, i.e. the type that is written to the MP4 file, and add the video stream with the IMFSinkWriter AddStream method. When the stream has been added, the encoder is instantiated: the container format comes from the extension of the output file name passed as parameter 1 of MFCreateSinkWriterFromURL, and the encoder is chosen to match the output subtype — here the H.264 video encoder.
Create and set the video stream's input media type with the IMFSinkWriter SetInputMediaType method. The input media type is the type the video encoder receives; it should match the media type accepted by the video input pin.
Create and set the audio stream's output media type, i.e. the type that is written to the MP4 file, and add the audio stream with AddStream. When the stream has been added, the matching audio encoder is instantiated — here the AAC audio encoder.
Create and set the audio stream's input media type with SetInputMediaType. The input media type is the type the audio encoder receives; it should match the media type accepted by the audio input pin.
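Taken together, the stream setup above boils down to the following condensed sketch. It assumes CoInitializeEx and MFStartup have already succeeded as described; the file name, the 1280×720 frame size, and 25 fps are placeholder values, and only the video stream is shown — the audio stream follows the same AddStream/SetInputMediaType pattern (the full listing appears later in this article):

// Condensed sketch of the sink-writer stream setup described above.
IMFSinkWriter* pWriter = NULL;
HRESULT hr = MFCreateSinkWriterFromURL(L"output.mp4", NULL, NULL, &pWriter); // extension selects the MP4 container

IMFMediaType* pOut = NULL;                       // output type = what is written to the MP4 file
if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pOut);
if (SUCCEEDED(hr)) hr = pOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pOut->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264);
if (SUCCEEDED(hr)) hr = pOut->SetUINT32(MF_MT_AVG_BITRATE, 8000000);
if (SUCCEEDED(hr)) hr = pOut->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pOut, MF_MT_FRAME_SIZE, 1280, 720);
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pOut, MF_MT_FRAME_RATE, 25, 1);
DWORD videoStream = 0;
if (SUCCEEDED(hr)) hr = pWriter->AddStream(pOut, &videoStream);      // instantiates the H.264 encoder

IMFMediaType* pIn = NULL;                        // input type = what the encoder receives (RGB32)
if (SUCCEEDED(hr)) hr = MFCreateMediaType(&pIn);
if (SUCCEEDED(hr)) hr = pIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video);
if (SUCCEEDED(hr)) hr = pIn->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32);
if (SUCCEEDED(hr)) hr = pIn->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive);
if (SUCCEEDED(hr)) hr = MFSetAttributeSize(pIn, MF_MT_FRAME_SIZE, 1280, 720);
if (SUCCEEDED(hr)) hr = MFSetAttributeRatio(pIn, MF_MT_FRAME_RATE, 25, 1);
if (SUCCEEDED(hr)) hr = pWriter->SetInputMediaType(videoStream, pIn, NULL);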
Create and fill a Media Foundation audio sample: copy the data received by the audio pin into the shared buffer, copy it from the shared buffer into the audio sample, and set the sample's presentation time, duration, and valid data length.
Write the audio sample to the sink writer.
Create and fill a Media Foundation video sample in the same way: copy the data received by the video pin into the shared buffer, copy it from the shared buffer into the video sample, and set the sample's presentation time, duration, and valid data length.
Write the video sample to the sink writer.
When EndOfStream is called on either the audio pin or the video pin, stop writing samples and exit the thread.
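The per-sample steps above can be summarized in one helper. WriteOneSample is a hypothetical name for this sketch; the source pointer, length, and time stamps stand in for the shared-buffer data and the DirectShow sample times (in 100-ns units) described above:

// Sketch: write one uncompressed sample to an existing sink writer.
HRESULT WriteOneSample(IMFSinkWriter* pSinkWriter, DWORD dwStream,
                       const BYTE* pSrc, DWORD cbData,
                       LONGLONG rtStart, LONGLONG rtDuration)
{
    IMFMediaBuffer* pBuffer = NULL;
    IMFSample*      pSample = NULL;
    BYTE*           pDst    = NULL;
    HRESULT hr = MFCreateMemoryBuffer(cbData, &pBuffer);                 // buffer sized to the payload
    if (SUCCEEDED(hr)) hr = pBuffer->Lock(&pDst, NULL, NULL);
    if (SUCCEEDED(hr)) { CopyMemory(pDst, pSrc, cbData); hr = pBuffer->Unlock(); }
    if (SUCCEEDED(hr)) hr = pBuffer->SetCurrentLength(cbData);           // valid data length
    if (SUCCEEDED(hr)) hr = MFCreateSample(&pSample);
    if (SUCCEEDED(hr)) hr = pSample->AddBuffer(pBuffer);
    if (SUCCEEDED(hr)) hr = pSample->SetSampleTime(rtStart);             // presentation time
    if (SUCCEEDED(hr)) hr = pSample->SetSampleDuration(rtDuration);      // duration
    if (SUCCEEDED(hr)) hr = pSinkWriter->WriteSample(dwStream, pSample); // 0 = video stream, 1 = audio stream
    if (pBuffer) pBuffer->Release();
    if (pSample) pSample->Release();
    return hr;
}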

How the filter works

The video pin's Receive function and the audio pin's Receive function run on different threads. Although the two are not necessarily called at the same moments, video and audio data arrive concurrently: while one thread is receiving video, the other is receiving audio. The sink writer, however, works on a single thread, so it can only accept audio for a while, then video for a while. The video receive thread, the audio receive thread, and the sink-writer worker thread therefore have to be synchronized. The process is as follows:

After the filter starts running, the video and audio Receive functions each execute once, initializing the data lengths and copying data into the shared buffers. When done, they raise the "video pin: first receive completed" and "audio pin: first receive completed" signals. On the second call to Receive, the audio thread spins between the Agan label and WaitForSingleObject(hBeginAudio, 0) in CAudioPin.cpp, which effectively blocks it; the video thread behaves the same way.

After the sink-writer worker thread starts, it blocks at WaitForSingleObject(hABegin, INFINITE) and WaitForSingleObject(hVBegin, INFINITE) in CMp4Writer.cpp; only when both the "video pin: first receive completed" and "audio pin: first receive completed" signals have arrived can it continue.

The worker thread then creates an audio sample, copies data from the shared buffer into it, and writes it to the sink writer, completing the first sample. It then raises the "audio pin: begin receiving data" signal (SetEvent(hBeginAudio) in CMp4Writer.cpp) and spins between the Agan1 label and WaitForSingleObject(AudioEndReceive, 0), which effectively blocks it there.

When the audio receive thread sees the "audio pin: begin receiving data" signal, it unblocks, copies data into the audio shared buffer, and raises the "audio pin: end receiving data" signal. On the next call to the audio Receive function it is again blocked in the loop between Agan and WaitForSingleObject(hBeginAudio, 0).

When the worker thread sees the "audio pin: end receiving data" signal, it unblocks, creates another sample, copies data from the shared buffer into it, writes it to the sink writer, and raises the "audio pin: begin receiving data" signal again.

The video thread is synchronized in exactly the same way.
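Stripped of the filter-specific details, this handshake between a receive thread and the writer thread is a plain Win32 event pattern. Here is a minimal, self-contained sketch with illustrative names; the real code also checks the filter state and the "MP4 writing complete" event inside these loops:

#include <windows.h>

// hBegin / hEndReceive correspond to the "begin receiving" / "end receiving" events above.
HANDLE hBegin, hEndReceive;           // auto-reset events, created non-signaled
volatile bool g_stopped = false;      // stands in for the filter-state / end-of-stream checks

DWORD WINAPI ReceiveSide(LPVOID)      // stands in for a pin's Receive() call
{
    while (WaitForSingleObject(hBegin, 0) != WAIT_OBJECT_0)       // poll: "begin receiving data"?
        if (g_stopped) return 0;
    // ... copy the sample into the shared buffer here ...
    SetEvent(hEndReceive);                                        // "end receiving data"
    return 0;
}

DWORD WINAPI WriterSide(LPVOID)       // stands in for the sink-writer worker thread
{
    SetEvent(hBegin);                                             // ask the pin for the next sample
    while (WaitForSingleObject(hEndReceive, 0) != WAIT_OBJECT_0)  // poll until it has arrived
        if (g_stopped) return 0;
    // ... build an IMFSample from the shared buffer and write it here ...
    return 0;
}

int main()
{
    hBegin      = CreateEvent(NULL, FALSE, FALSE, NULL);
    hEndReceive = CreateEvent(NULL, FALSE, FALSE, NULL);
    HANDLE t[2] = { CreateThread(NULL, 0, WriterSide,  NULL, 0, NULL),
                    CreateThread(NULL, 0, ReceiveSide, NULL, 0, NULL) };
    WaitForMultipleObjects(2, t, TRUE, INFINITE);
    CloseHandle(t[0]); CloseHandle(t[1]);
    CloseHandle(hBegin); CloseHandle(hEndReceive);
    return 0;
}

Polling with a zero timeout instead of a blocking wait is what lets each loop also watch the filter state and the hOver event, at the cost of spinning.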

Setting the average video encoding bit rate

If you use this filter from an application, you can adjust the average video encoding bit rate as follows:

HMODULE h = LoadLibrary(L"写MP4.dll");
FARPROC Proc1 = GetProcAddress(h, "GetParamAddress"); // get the address of the encoding-bit-rate variable
if (Proc1 != NULL)  // the address is valid
{
    UINT32* pUINT32 = (UINT32*)Proc1();
    *pUINT32 = 8000000;  // set a new encoding bit rate, in bits per second
}

Because the worker thread reads this value only when it configures the H.264 output media type, set it before the filter graph starts running; otherwise the default bit rate computed from the frame size and frame rate is used.

The complete code of the filter DLL follows.

DLL header file: WriteMP4.h


#ifndef STAR_FILE
#define STAR_FILE

#include <streams.h>    // DirectShow base classes (CBaseFilter, CBaseInputPin, ...)
#include <initguid.h>   // so that DEFINE_GUID below actually defines the GUID

#if _DEBUG
#pragma comment(lib, "Strmbasd")  // C:\Program Files\Microsoft SDKs\Windows\v7.1\Samples\multimedia\directshow\baseclasses\Debug\strmbasd.lib
#else
#pragma comment(lib, "Strmbase")  // C:\Program Files\Microsoft SDKs\Windows\v7.1\Samples\multimedia\directshow\baseclasses\Release\strmbase.lib
#endif
#pragma comment(lib, "Winmm")     // C:\Program Files\Microsoft SDKs\Windows\v7.1\Lib\Winmm.lib
#pragma comment(lib, "msvcrt")    // C:\Program Files (x86)\Microsoft Visual Studio 10.0\VC\lib\msvcrt.lib

// {5255147D-ADB5-438D-A938-CB94B8C01B17}
DEFINE_GUID(CLSID_MP4Writer,
0x5255147d, 0xadb5, 0x438d, 0xa9, 0x38, 0xcb, 0x94, 0xb8, 0xc0, 0x1b, 0x17);

#endif // STAR_FILE

DLL source file: WriteMP4.cpp

#include  "WriteMP4.h"
#include  "CMp4Writer.h"const AMOVIESETUP_MEDIATYPE sudPin1Types =   // 引脚1媒体类型
{&MEDIATYPE_Video,           // 主要类型&MEDIASUBTYPE_RGB32         // 子类型
};const AMOVIESETUP_MEDIATYPE sudPin2Types =   // 引脚2媒体类型
{&MEDIATYPE_Audio,           // 主要类型&MEDIASUBTYPE_PCM           // 子类型
};const AMOVIESETUP_PIN sudPins[]  =  // 引脚信息
{{L"Video",                   //引脚名称FALSE,                      // 必须渲染输入引脚FALSE,                      // 输出引脚FALSE,                      // 具有该引脚的零个实例FALSE,                      // 可以创建一个以上引脚的实例&CLSID_NULL,                //该引脚连接的过滤器的类标识NULL,                       //该引脚连接的引脚名称1,                          //引脚支持的媒体类型数&sudPin1Types               //媒体类型信息},{L"Audio",                   //引脚名称FALSE,                      // 必须渲染输入引脚FALSE,                      // 输出引脚FALSE,                      // 具有该引脚的零个实例FALSE,                      // 可以创建一个以上引脚的实例&CLSID_NULL,                //该引脚连接的过滤器的类标识NULL,                       //该引脚连接的引脚名称1,                          //引脚支持的媒体类型数&sudPin2Types               //媒体类型信息}
} ;            const AMOVIESETUP_FILTER sudAudioEndpoint =  //过滤器的注册信息
{&CLSID_MP4Writer,               //过滤器的类标识L"写MP4",                       //过滤器的名称MERIT_DO_NOT_USE,               //过滤器优先值2,                              // 引脚数量sudPins                         // 引脚信息
};CFactoryTemplate g_Templates []  = {{L"写MP4", &CLSID_MP4Writer, CMp4Writer::CreateInstance, NULL, &sudAudioEndpoint }
};int g_cTemplates = 1;STDAPI DllRegisterServer()//注册DLL
{return AMovieDllRegisterServer2(TRUE);
} STDAPI DllUnregisterServer()//删除DLL注册
{return AMovieDllRegisterServer2(FALSE);
}extern "C" BOOL WINAPI DllEntryPoint(HINSTANCE, ULONG, LPVOID);BOOL APIENTRY DllMain(HANDLE hModule, DWORD  dwReason, LPVOID lpReserved)
{return DllEntryPoint((HINSTANCE)(hModule), dwReason, lpReserved);
}extern UINT32 VIDEO_BIT_RATE[2];
STDAPI GetParamAddress()//获取编码比特率变量地址
{return (HRESULT)VIDEO_BIT_RATE;
}

Filter header file: CMp4Writer.h

#ifndef FILTER_FILE
#define FILTER_FILE

#include "WriteMP4.h"
#include "CVideoPin.h"
#include "CAudioPin.h"

class CMp4Writer : public CBaseFilter, public CCritSec, public IFileSinkFilter
{
    friend class CVideoPin;
    friend class CAudioPin;
public:
    CMp4Writer(LPUNKNOWN pUnk, HRESULT *phr);
    ~CMp4Writer();
    static CUnknown * WINAPI CreateInstance(LPUNKNOWN pUnk, HRESULT *phr);
    int GetPinCount();
    CBasePin *GetPin(int n);
    STDMETHODIMP Run(REFERENCE_TIME tStart);
    CVideoPin* pCVideoPin;   // video input pin pointer
    CAudioPin* pCAudioPin;   // audio input pin pointer
    DECLARE_IUNKNOWN
private:
    STDMETHODIMP NonDelegatingQueryInterface(REFIID riid, void ** ppv);
    virtual HRESULT STDMETHODCALLTYPE SetFileName(LPCOLESTR pszFileName, const AM_MEDIA_TYPE *pmt);
    virtual HRESULT STDMETHODCALLTYPE GetCurFile(LPOLESTR * ppszFileName, AM_MEDIA_TYPE *pmt);
};

#endif // FILTER_FILE

Filter source file: CMp4Writer.cpp

#include "CMp4Writer.h"#include #include 
#include 
#include 
#include 
#include 
#include #pragma comment(lib, "mfreadwrite")
#pragma comment(lib, "mfplat")
#pragma comment(lib, "mfuuid")template <class T> void SafeRelease(T** ppT)
{if (*ppT){(*ppT)->Release();*ppT = NULL;}
}#define EC_MP4_COMPLETE EC_USER+1111//自定义”写MP4完成“事件通知
#define EC_MP4_ERROR EC_USER+1112//自定义”发生错误“事件通知HANDLE hThread=NULL;//接收写入器工作线程句柄HANDLE hVBegin;//“视频Receive第1次调用完成”事件句柄
HANDLE hABegin;//“音频Receive第1次调用完成”事件句柄
HANDLE hOver;//"写MP4完成”事件句柄
HANDLE hBeginVideo;//”视频引脚开始接收数据“事件句柄
HANDLE hBeginAudio;//”音频引脚开始接收数据“事件句柄
HANDLE VideoEndReceive;//”视频引脚结束接收数据“事件句柄
HANDLE AudioEndReceive;//”音频引脚结束接收数据“事件句柄//下面是视频流参数
UINT32 VIDEO_BIT_RATE[2]={(UINT32)0,(UINT32)0};//第一个元素为图像输出编码比特率
UINT32 VIDEO_FPS=0;//每秒帧数
UINT32 VIDEO_WIDTH;//视频画面宽度,以像素为单位
UINT32 VIDEO_HEIGHT;//视频画面高度,以像素为单位
//下面是视频流样本参数
BYTE pBy[20000000];//视频数据共享内存20M
REFERENCE_TIME star, end;//视频样本时间戳
LONG Len;//视频样本大小,以字节为单位//下面是音频流参数
UINT32 AUDIO_CHANNELS;//声道数
UINT32 BYTES_PER_SECOND=12000;//音频输出编码比特率
UINT32 SAMPLES_PER_SECOND;//音频采样率
//下面是音频流样本参数
BYTE pABy[2000000];//音频数据共享内存2M
REFERENCE_TIME Astar, Aend;//音频样本时间戳
LONG ALen;//音频样本大小,以字节为单位LPOLESTR m_pFileName=NULL;//要创作的MP4文件路径
CMp4Writer* pCMp4Writer=NULL;//过滤器指针
#pragma warning(disable:4355 4127)//禁用警告C4355
CMp4Writer::CMp4Writer(LPUNKNOWN pUnk, HRESULT *phr)
    : CBaseFilter(NAME("写MP4"), pUnk, (CCritSec *) this, CLSID_MP4Writer)
{
    pCVideoPin = new CVideoPin(NAME("Video"), this, (CCritSec*)this, phr, L"Video"); // create the video pin
    if(*phr!=S_OK) return;
    pCAudioPin = new CAudioPin(NAME("Audio"), this, (CCritSec*)this, phr, L"Audio"); // create the audio pin
    if(*phr!=S_OK) return;
    // Create events; the following are auto-reset, initially non-signaled
    hBeginVideo     = CreateEvent(NULL, FALSE, FALSE, NULL);
    hBeginAudio     = CreateEvent(NULL, FALSE, FALSE, NULL);
    VideoEndReceive = CreateEvent(NULL, FALSE, FALSE, NULL);
    AudioEndReceive = CreateEvent(NULL, FALSE, FALSE, NULL);
    // Create events; the following are manual-reset, initially non-signaled
    hVBegin = CreateEvent(NULL, TRUE, FALSE, NULL);
    hABegin = CreateEvent(NULL, TRUE, FALSE, NULL);
    hOver   = CreateEvent(NULL, TRUE, FALSE, NULL);
}

CMp4Writer::~CMp4Writer()
{
    // Close all event handles
    CloseHandle(hBeginVideo);
    CloseHandle(hBeginAudio);
    CloseHandle(VideoEndReceive);
    CloseHandle(AudioEndReceive);
    CloseHandle(hVBegin);
    CloseHandle(hABegin);
    CloseHandle(hOver);
    if(m_pFileName!=NULL) delete m_pFileName;
}

CUnknown * WINAPI CMp4Writer::CreateInstance(LPUNKNOWN pUnk, HRESULT *phr)
{
    pCMp4Writer = new CMp4Writer(pUnk, phr); // create the filter
    return pCMp4Writer;
}

int CMp4Writer::GetPinCount() // get the number of pins
{
    return 2;
}

CBasePin *CMp4Writer::GetPin(int n) // get a pin
{
    if(n==0) return (CBasePin*)pCVideoPin;
    else if(n==1) return (CBasePin*)pCAudioPin;
    else return NULL;
}

STDMETHODIMP CMp4Writer::NonDelegatingQueryInterface(REFIID iid, void ** ppv)
{
    if(iid==IID_IFileSinkFilter)
    {
        return GetInterface(static_cast<IFileSinkFilter*>(this), ppv);
    }
    else if(iid == IID_IBaseFilter)
    {
        return GetInterface(static_cast<IBaseFilter*>(this), ppv);
    }
    else
        return CBaseFilter::NonDelegatingQueryInterface(iid, ppv);
}

HRESULT CMp4Writer::GetCurFile(LPOLESTR *ppszFileName, AM_MEDIA_TYPE *pmt)
{
    CheckPointer(ppszFileName, E_POINTER);
    *ppszFileName = NULL;
    if (m_pFileName != NULL)
    {
        size_t len = 1+lstrlenW(m_pFileName);
        *ppszFileName = (LPOLESTR)QzTaskMemAlloc(sizeof(WCHAR) * (len));
        if (*ppszFileName != NULL)
        {
            HRESULT hr = StringCchCopyW(*ppszFileName, len, m_pFileName);
        }
    }
    if(pmt)
    {
        ZeroMemory(pmt, sizeof(*pmt));
        pmt->majortype = MEDIATYPE_NULL;
        pmt->subtype = MEDIASUBTYPE_NULL;
    }
    return S_OK;
}

HRESULT CMp4Writer::SetFileName(LPCOLESTR pszFileName, const AM_MEDIA_TYPE *pmt)
{
    CheckPointer(pszFileName, E_POINTER);
    if(wcslen(pszFileName) > MAX_PATH || wcslen(pszFileName) < 4)
        return ERROR_FILENAME_EXCED_RANGE;
    size_t len = 1+lstrlenW(pszFileName);
    m_pFileName = new WCHAR[len];
    if (m_pFileName == 0)
        return E_OUTOFMEMORY;
    HRESULT hr = StringCchCopyW(m_pFileName, len, pszFileName);
    if(m_pFileName[len-2]!='4' || m_pFileName[len-3]!='p' || m_pFileName[len-4]!='m' || m_pFileName[len-5]!='.') // not an .mp4 file
    {
        delete m_pFileName;
        m_pFileName=NULL;
        return VFW_E_INVALID_FILE_FORMAT; // setting the file name failed
    }
    return hr;
}

DWORD WINAPI ThreadProc(LPVOID pParam); // sink-writer worker thread function declaration

STDMETHODIMP CMp4Writer::Run(REFERENCE_TIME tStart)
{
    if(m_pFileName==NULL)
    {
        MessageBox(NULL, L"未指定输出文件", NULL, MB_OK); // "no output file specified"
        return VFW_E_INVALID_FILE_FORMAT;
    }
    if(!pCVideoPin->IsConnected() || !pCAudioPin->IsConnected())
        return S_FALSE; // do not run the filter if either the video or the audio pin is unconnected
    if(hThread==NULL) // no sink-writer worker thread is running
    {
        // Set the following before the filter runs
        ResetEvent(hVBegin); // set "video pin: first receive completed" to non-signaled
        ResetEvent(hABegin); // set "audio pin: first receive completed" to non-signaled
        star=0; end=0; Astar=0; Aend=0; // reset all time stamps to 0
        Len=0; ALen=0;                  // reset the data lengths to 0
    }
    CBaseFilter::Run(tStart); // run the filter
    if(hThread != NULL) // the worker thread is already running (resuming from pause); do not create another one
        return S_OK;
    else // running from the stopped state
    {
        ResetEvent(hOver); // set "MP4 writing complete" to non-signaled
        hThread = CreateThread(NULL, sizeof(pBy)+sizeof(pABy)+1000000,
                               (LPTHREAD_START_ROUTINE)ThreadProc, NULL, 0, NULL); // create the sink-writer worker thread
        if(hThread==NULL) // creating the thread failed
        {
            Stop();       // stop the filter
            return S_FALSE;
        }
    }
    return S_OK;
}

DWORD WINAPI ThreadProc(LPVOID pParam) // sink-writer worker thread
{
    HRESULT hr = CoInitializeEx(NULL, COINIT_APARTMENTTHREADED);
    if (hr != S_OK)
    {
        MessageBox(NULL, L"初始化COM库失败", NULL, MB_OK); // "failed to initialize the COM library"
        CloseHandle(hThread); hThread=NULL;
        return 0;
    }
    hr = MFStartup(MF_VERSION);
    if (hr != S_OK)
    {
        MessageBox(NULL, L"初始化媒体基础失败", NULL, MB_OK); // "failed to initialize Media Foundation"
        CoUninitialize();                   // shut down the COM library
        CloseHandle(hThread); hThread=NULL;
        return 0;
    }
    IMFSinkWriter* pSinkWriter;             // sink writer interface
    hr = MFCreateSinkWriterFromURL(m_pFileName, NULL, NULL, &pSinkWriter); // create the sink writer
    if (hr != S_OK)
    {
        MessageBox(NULL, L"创建接收写入器失败", NULL, MB_OK); // "failed to create the sink writer"
        MFShutdown();                       // shut down Media Foundation
        CoUninitialize();                   // shut down the COM library
        CloseHandle(hThread); hThread=NULL;
        return 0;
    }
    WaitForSingleObject(hABegin, INFINITE); // block until the "audio pin: first receive completed" signal arrives
    WaitForSingleObject(hVBegin, INFINITE); // block until the "video pin: first receive completed" signal arrives
    if(Len==0 || ALen==0 || end-star==0 || Aend-Astar==0 || VIDEO_FPS==0) // check that these variables were initialized; if not, exit the thread
    {
        MessageBox(NULL, L"首次获取样本失败", NULL, MB_OK); // "failed to get the first samples"
        pSinkWriter->Release();             // release the sink writer
        MFShutdown();
        CoUninitialize();
        CloseHandle(hThread);               // close the thread handle
        hThread=NULL;
        return 0;
    }

    // Set up the video stream output media type
    IMFMediaType* pMTVideoOut = NULL;
    if (SUCCEEDED(hr)) { hr = MFCreateMediaType(&pMTVideoOut); }                           // create the media type
    if (SUCCEEDED(hr)) { hr = pMTVideoOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video); } // major type: video
    if (SUCCEEDED(hr)) { hr = pMTVideoOut->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_H264); }   // subtype: H.264
    if (SUCCEEDED(hr))
    {
        if(VIDEO_BIT_RATE[0] != 0) // the user supplied an encoding bit rate; use it
        {
            hr = pMTVideoOut->SetUINT32(MF_MT_AVG_BITRATE, VIDEO_BIT_RATE[0]);    // set the average encoding bit rate
        }
        else // otherwise use a default value
        {
            UINT32 DEFAULT_VIDEO_BIT_RATE = VIDEO_WIDTH*VIDEO_HEIGHT*VIDEO_FPS/6; // compute a default output encoding bit rate
            hr = pMTVideoOut->SetUINT32(MF_MT_AVG_BITRATE, DEFAULT_VIDEO_BIT_RATE);
        }
    }
    if (SUCCEEDED(hr)) { hr = MFSetAttributeRatio(pMTVideoOut, MF_MT_FRAME_RATE, VIDEO_FPS, 1); }             // frame rate
    if (SUCCEEDED(hr)) { hr = MFSetAttributeSize(pMTVideoOut, MF_MT_FRAME_SIZE, VIDEO_WIDTH, VIDEO_HEIGHT); } // frame width and height
    if (SUCCEEDED(hr)) { hr = pMTVideoOut->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); }   // interlace mode
    if (SUCCEEDED(hr)) { hr = pMTVideoOut->SetUINT32(MF_MT_MPEG2_PROFILE, eAVEncH264VProfile_Main); }         // H.264 encoding profile
    if (SUCCEEDED(hr)) { hr = MFSetAttributeRatio(pMTVideoOut, MF_MT_PIXEL_ASPECT_RATIO, 1, 1); }             // pixel aspect ratio
    if (SUCCEEDED(hr))
    {
        DWORD streamIndex;
        hr = pSinkWriter->AddStream(pMTVideoOut, &streamIndex); // add the video stream; the first stream added gets index 0
    }

    // Set up the video stream input media type
    IMFMediaType* pMTVideoIn = NULL;
    if (SUCCEEDED(hr)) { hr = MFCreateMediaType(&pMTVideoIn); }                                              // create the media type
    if (SUCCEEDED(hr)) { hr = pMTVideoIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Video); }                    // major type: video
    if (SUCCEEDED(hr)) { hr = pMTVideoIn->SetGUID(MF_MT_SUBTYPE, MFVideoFormat_RGB32); }                     // subtype: RGB32
    if (SUCCEEDED(hr)) { hr = pMTVideoIn->SetUINT32(MF_MT_INTERLACE_MODE, MFVideoInterlace_Progressive); }   // progressive frames
    if (SUCCEEDED(hr)) { hr = MFSetAttributeSize(pMTVideoIn, MF_MT_FRAME_SIZE, VIDEO_WIDTH, VIDEO_HEIGHT); } // frame size
    if (SUCCEEDED(hr)) { hr = MFSetAttributeRatio(pMTVideoIn, MF_MT_FRAME_RATE, VIDEO_FPS, 1); }             // frame rate
    if (SUCCEEDED(hr)) { hr = MFSetAttributeRatio(pMTVideoIn, MF_MT_PIXEL_ASPECT_RATIO, 1, 1); }             // pixel aspect ratio
    if (SUCCEEDED(hr)) { hr = pSinkWriter->SetInputMediaType(0, pMTVideoIn, NULL); } // set the video input type; parameter 1 is the video stream index

    // Set up the audio stream output media type
    IMFMediaType* pMediaTypeAudioOut = NULL;
    if (SUCCEEDED(hr)) { hr = MFCreateMediaType(&pMediaTypeAudioOut); }                                            // create the media type
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio); }                  // major type: audio
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_AAC); }                     // subtype: MFAudioFormat_AAC
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetUINT32(MF_MT_AUDIO_BITS_PER_SAMPLE, 16); }                    // 16 bits per sample
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetUINT32(MF_MT_AUDIO_SAMPLES_PER_SECOND, SAMPLES_PER_SECOND); } // samples per second
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetUINT32(MF_MT_AUDIO_NUM_CHANNELS, AUDIO_CHANNELS); }           // number of channels
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioOut->SetUINT32(MF_MT_AUDIO_AVG_BYTES_PER_SECOND, BYTES_PER_SECOND); } // bit rate of the encoded AAC stream
    if (SUCCEEDED(hr))
    {
        DWORD streamIndex;
        hr = pSinkWriter->AddStream(pMediaTypeAudioOut, &streamIndex); // add the audio stream; the second stream added gets index 1
    }

    // Set up the audio stream input media type
    IMFMediaType* pMediaTypeAudioIn = NULL;
    if (SUCCEEDED(hr)) { hr = MFCreateMediaType(&pMediaTypeAudioIn); }                                            // create the media type
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioIn->SetGUID(MF_MT_MAJOR_TYPE, MFMediaType_Audio); }                  // major type: audio
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioIn->SetGUID(MF_MT_SUBTYPE, MFAudioFormat_PCM); }                     // subtype: MFAudioFormat_PCM
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioIn->SetUINT32(MF_MT_AUDIO_BITS_PER_SAMPLE, 16); }                    // 16 bits per sample
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioIn->SetUINT32(MF_MT_AUDIO_SAMPLES_PER_SECOND, SAMPLES_PER_SECOND); } // samples per second
    if (SUCCEEDED(hr)) { hr = pMediaTypeAudioIn->SetUINT32(MF_MT_AUDIO_NUM_CHANNELS, AUDIO_CHANNELS); }           // number of channels
    if (SUCCEEDED(hr)) { hr = pSinkWriter->SetInputMediaType(1, pMediaTypeAudioIn, NULL); } // set the audio input type; parameter 1 is the audio stream index

    if (SUCCEEDED(hr)) { hr = pSinkWriter->BeginWriting(); } // the sink writer starts accepting data

    FILTER_STATE fs;
    if (SUCCEEDED(hr))
    {
jump1:  if(Astar<=star) // the audio time stamp is not ahead of the video time stamp: write audio
        {
CreateAudioBuffer:
            IMFMediaBuffer* pBuffer = NULL;
            hr = MFCreateMemoryBuffer(ALen, &pBuffer); // create the buffer
            if(hr!=S_OK || pBuffer==NULL) // creation failed
            {
                Sleep(1);
                goto CreateAudioBuffer;   // try again
            }
            BYTE* pData = NULL;
            if (SUCCEEDED(hr)) { hr = pBuffer->Lock(&pData, NULL, NULL); } // lock the buffer
            if (SUCCEEDED(hr)) { CopyMemory(pData, pABy, ALen); }          // copy data from the shared buffer into the sample buffer
            if (SUCCEEDED(hr) && pBuffer) { hr = pBuffer->Unlock(); }      // unlock the buffer
            if (SUCCEEDED(hr)) { hr = pBuffer->SetCurrentLength(ALen); }   // set the length of the valid data in the buffer
CreateAudioSample:
            IMFSample* pSample = NULL;
            if (SUCCEEDED(hr))
            {
                hr = MFCreateSample(&pSample); // create the sample
                if(hr!=S_OK || pSample==NULL)  // creation failed
                {
                    Sleep(1);
                    goto CreateAudioSample;    // try again
                }
            }
            if (SUCCEEDED(hr)) { hr = pSample->AddBuffer(pBuffer); }            // attach the buffer to the sample
            if (SUCCEEDED(hr)) { hr = pSample->SetSampleTime(Astar); }          // set the sample presentation time
            if (SUCCEEDED(hr)) { hr = pSample->SetSampleDuration(Aend-Astar); } // set the sample duration
            if (SUCCEEDED(hr) && pSample!=NULL) { hr = pSinkWriter->WriteSample(1, pSample); } // write the audio sample to the sink writer
            SafeRelease(&pBuffer); SafeRelease(&pSample); // release the buffer and the sample
            if (SUCCEEDED(hr))
            {
                SetEvent(hBeginAudio); // send the "audio pin: begin receiving data" signal
Agan1:          pCMp4Writer->GetState(0, &fs); // get the filter state
                if(fs!=State_Running) goto jump2;
                DWORD dw = WaitForSingleObject(hOver, 0);            // check for the "MP4 writing complete" signal
                if(dw==WAIT_OBJECT_0) goto complete;                 // signaled: go to complete
                DWORD dw2 = WaitForSingleObject(AudioEndReceive, 0); // check for the "audio pin: end receiving data" signal
                if(dw2!=WAIT_OBJECT_0) goto Agan1;                   // not signaled: check again
            }
        }
        else // write video data
        {
CreateVideoBuffer:
            IMFMediaBuffer* pBuffer = NULL;
            hr = MFCreateMemoryBuffer(Len, &pBuffer); // create the sample buffer
            if(hr!=S_OK || pBuffer==NULL) // creation failed
            {
                Sleep(1);
                goto CreateVideoBuffer;   // try again
            }
            BYTE* pData = NULL;
            if (SUCCEEDED(hr)) { hr = pBuffer->Lock(&pData, NULL, NULL); } // lock the sample buffer
            if (SUCCEEDED(hr)) { CopyMemory(pData, pBy, Len); }            // copy video data from the shared buffer into the sample buffer
            if (SUCCEEDED(hr) && pBuffer) { hr = pBuffer->Unlock(); }      // unlock the sample buffer
            if (SUCCEEDED(hr)) { hr = pBuffer->SetCurrentLength(Len); }    // set the length of the valid data in the buffer
CreateVideoSample:
            IMFSample* pSample = NULL;
            if (SUCCEEDED(hr))
            {
                hr = MFCreateSample(&pSample); // create the media sample
                if(hr!=S_OK || pSample==NULL)  // creation failed
                {
                    Sleep(1);
                    goto CreateVideoSample;    // try again
                }
            }
            if (SUCCEEDED(hr)) { hr = pSample->AddBuffer(pBuffer); }           // attach the buffer to the sample
            if (SUCCEEDED(hr)) { hr = pSample->SetSampleTime(star); }          // set the sample presentation time
            if (SUCCEEDED(hr)) { hr = pSample->SetSampleDuration(end-star); }  // set the sample duration
            if (SUCCEEDED(hr)) { hr = pSinkWriter->WriteSample(0, pSample); }  // write the video sample to the sink writer
            SafeRelease(&pBuffer); SafeRelease(&pSample); // release the buffer and the sample
            if (SUCCEEDED(hr))
            {
                SetEvent(hBeginVideo); // send the "video pin: begin receiving data" signal
Agan2:          pCMp4Writer->GetState(0, &fs); // get the filter state
                if(fs!=State_Running) goto jump2;
                DWORD dw = WaitForSingleObject(hOver, 0);            // check for the "MP4 writing complete" signal
                if(dw==WAIT_OBJECT_0) goto complete;                 // signaled: go to complete
                DWORD dw2 = WaitForSingleObject(VideoEndReceive, 0); // check for the "video pin: end receiving data" signal
                if(dw2!=WAIT_OBJECT_0) goto Agan2;                   // not signaled: check again
            }
        }
jump2:  pCMp4Writer->GetState(0, &fs); // get the filter state
        switch(fs)
        {
        case State_Stopped: goto Stop;  // stopped: finish writing samples and exit the thread
            break;
        case State_Paused:  goto jump2; // paused: keep polling the filter state (the thread loops between jump2 and here)
            break;
        case State_Running: goto jump1; // running: write the next sample to the sink writer
            break;
        }
    }
complete: // the streams reached their end
    if (SUCCEEDED(hr))
    {
        hr = pCMp4Writer->NotifyEvent(EC_MP4_COMPLETE, NULL, NULL); // send the custom EC_MP4_COMPLETE "MP4 writing complete" event to the filter graph
    }
    else
    {
        hr = pCMp4Writer->NotifyEvent(EC_MP4_ERROR, NULL, NULL);    // send the custom EC_MP4_ERROR "an error occurred" event to the filter graph
    }
Stop: // the user stopped the filter
    hr = pSinkWriter->NotifyEndOfSegment(MF_SINK_WRITER_ALL_STREAMS); // mark the end of all streams
    hr = pSinkWriter->Finalize();                                     // finish writing samples to the sink writer
    // Release all interfaces
    SafeRelease(&pMediaTypeAudioOut);
    SafeRelease(&pMediaTypeAudioIn);
    SafeRelease(&pMTVideoOut);
    SafeRelease(&pMTVideoIn);
    SafeRelease(&pSinkWriter);
    MFShutdown();         // shut down Media Foundation
    CoUninitialize();     // shut down the COM library
    CloseHandle(hThread); // close the thread handle
    hThread=NULL;
    return 1;
}

Video pin header file: CVideoPin.h

#ifndef PIN_FILE1
#define PIN_FILE1

#include "WriteMP4.h"

class CMp4Writer;

class CVideoPin : public CBaseInputPin
{
    friend class CMp4Writer;
public:
    CVideoPin(LPCTSTR pObjectName, CBaseFilter* pFilter, CCritSec *pLock, HRESULT *phr, LPCWSTR pPinName);
    ~CVideoPin();
    HRESULT CheckMediaType(const CMediaType* pmt);
    HRESULT SetMediaType(const CMediaType* pmt);
    STDMETHODIMP Receive(IMediaSample *pSample);
    STDMETHODIMP EndOfStream();
};

#endif // PIN_FILE1

Video pin source file: CVideoPin.cpp

#include "CVideoPin.h"extern HANDLE hVBegin;//“视频Receive第1次调用完成”事件句柄
extern HANDLE hOver;//"写MP4完成”事件句柄extern UINT32 VIDEO_FPS;//每秒帧数
extern UINT32 VIDEO_WIDTH;//视频画面宽度,以像素为单位
extern UINT32 VIDEO_HEIGHT;//视频画面高度,以像素为单位extern BYTE pBy[20000000];//视频共享缓冲区20M
extern REFERENCE_TIME star, end;//视频样本时间戳
extern LONG Len;//视频样本大小,以字节为单位extern HANDLE hBeginVideo;//”视频引脚开始接收数据“事件句柄
extern HANDLE VideoEndReceive;//”视频引脚结束接收数据“事件句柄CVideoPin::CVideoPin(LPCTSTR pObjectName, CBaseFilter* pFilter, CCritSec *pLock, HRESULT *phr, LPCWSTR pPinName) : CBaseInputPin(pObjectName, pFilter, pLock, phr, pPinName)
{}CVideoPin::~CVideoPin()
{}HRESULT CVideoPin::CheckMediaType(const CMediaType* pmt)
{if(*(pmt->Type())==MEDIATYPE_Video && *(pmt->Subtype())==MEDIASUBTYPE_RGB32 && *(pmt->FormatType())==FORMAT_VideoInfo&& pmt->IsFixedSize() && pmt->IsTemporalCompressed()==FALSE && pmt->cbFormat==sizeof(VIDEOINFOHEADER)){VIDEOINFOHEADER* pVH=(VIDEOINFOHEADER*)pmt->Format();if(pVH->AvgTimePerFrame==0)return E_INVALIDARG;return S_OK;}elsereturn E_INVALIDARG;
}HRESULT CVideoPin::SetMediaType(const CMediaType* pmt)
{VIDEOINFOHEADER* pVH=(VIDEOINFOHEADER*)pmt->Format();VIDEO_FPS=(UINT32)(10 * 1000 * 1000 /pVH->AvgTimePerFrame);//计算每秒帧数BITMAPINFOHEADER* pBH=&pVH->bmiHeader;VIDEO_WIDTH=pBH->biWidth;//视频画面宽度,以像素为单位VIDEO_HEIGHT=pBH->biHeight;//视频画面高度,以像素为单位return CBaseInputPin::SetMediaType(pmt);//为基类设置媒体类型
}STDMETHODIMP CVideoPin::Receive(IMediaSample *pSample)//接收样本函数
{HRESULT hr;BYTE* pBuffer=NULL;if(Len==0)//如果Len为0,执行视频引脚第1次接收样本{hr=pSample->GetPointer(&pBuffer);hr=pSample->GetTime(&star, &end);Len=pSample->GetActualDataLength();if(Len<=sizeof(pBy))//只有样本缓冲区小于或等于共享缓冲区时,才向共享缓冲区输出数据;防止操作数组越界CopyMemory(pBy,pBuffer,Len);SetEvent(hVBegin);//发出“视频引脚第1次接收完成”信号return CBaseInputPin::Receive(pSample);}
Agan:DWORD dOver = WaitForSingleObject(hOver,0);//获取“写MP4完成”信号if(dOver==WAIT_OBJECT_0)return E_FAIL;//有“写MP4完成”信号,返回E_FAILFILTER_STATE fs;hr = m_pFilter->GetState(0,&fs);//获取过滤器状态switch(fs){case State_Stopped:return E_FAIL;break;case State_Paused:goto Agan;break;}DWORD dw=WaitForSingleObject(hBeginVideo,0);//获取“视频引脚开始接收数据”信号if(dw != WAIT_OBJECT_0)goto Agan;//如果没有“视频引脚开始接收数据”信号,再次获取hr=pSample->GetPointer(&pBuffer);hr=pSample->GetTime(&star, &end);Len=pSample->GetActualDataLength();if(Len<=sizeof(pBy))//只有样本缓冲区小于或等于共享缓冲区时,才向共享缓冲区输出数据;防止操作数组越界CopyMemory(pBy,pBuffer,Len);SetEvent(VideoEndReceive);//发出“视频引脚结束接收数据”信号return CBaseInputPin::Receive(pSample);
}STDMETHODIMP CVideoPin::EndOfStream()//流结束时EndOfStream被调用
{SetEvent(hOver);//发出“写MP4完成”信号return CBaseInputPin::EndOfStream();
} 

Audio pin header file: CAudioPin.h

#ifndef PIN_FILE2
#define PIN_FILE2

#include "WriteMP4.h"

class CMp4Writer;

class CAudioPin : public CBaseInputPin
{
    friend class CMp4Writer;
public:
    CAudioPin(LPCTSTR pObjectName, CBaseFilter* pFilter, CCritSec *pLock, HRESULT *phr, LPCWSTR pPinName);
    ~CAudioPin();
    HRESULT CheckMediaType(const CMediaType* pmt);
    HRESULT SetMediaType(const CMediaType* pmt);
    STDMETHODIMP Receive(IMediaSample *pSample);
    STDMETHODIMP EndOfStream();
};

#endif // PIN_FILE2

Audio pin source file: CAudioPin.cpp

#include "CAudioPin.h"extern HANDLE hABegin;//“音频引脚第1次接收完成”事件句柄
extern HANDLE hOver;//“写MP4完成”事件句柄extern UINT32 AUDIO_CHANNELS;//声道数
extern UINT32 SAMPLES_PER_SECOND;//音频采样率extern BYTE pABy[2000000];//音频共享缓冲区2M
extern LONG ALen;//音频样本大小,以字节为单位
extern REFERENCE_TIME Astar, Aend;//音频样本时间戳extern HANDLE hBeginAudio;//“音频引脚开始接收数据”事件句柄
extern HANDLE AudioEndReceive;//“音频引脚结束接收数据”事件句柄CAudioPin::CAudioPin(LPCTSTR pObjectName, CBaseFilter* pFilter, CCritSec *pLock, HRESULT *phr, LPCWSTR pPinName) : CBaseInputPin(pObjectName, pFilter, pLock, phr, pPinName)
{}CAudioPin::~CAudioPin()
{}HRESULT CAudioPin::CheckMediaType(const CMediaType* pmt)
{WAVEFORMATEX* p = (WAVEFORMATEX*)pmt->Format();if(*(pmt->Type())==MEDIATYPE_Audio && *(pmt->Subtype())==MEDIASUBTYPE_PCM && *(pmt->FormatType())==FORMAT_WaveFormatEx &&  pmt->IsTemporalCompressed()==FALSE && p->wBitsPerSample==16 && (p->nSamplesPerSec==48000 || p->nSamplesPerSec==44100) ){return S_OK;}elsereturn E_INVALIDARG;
}HRESULT CAudioPin::SetMediaType(const CMediaType* pmt)
{WAVEFORMATEX* p = (WAVEFORMATEX*)pmt->Format();AUDIO_CHANNELS=p->nChannels;//声道数SAMPLES_PER_SECOND=p->nSamplesPerSec;//采样率return CBaseInputPin::SetMediaType(pmt);//为基类设置媒体类型
}STDMETHODIMP CAudioPin::Receive(IMediaSample *pSample)//音频引脚接收函数
{HRESULT hr;BYTE* pBuffer=NULL;if(ALen==0)//如果ALen为0,执行音频引脚第1次接收样本{hr=pSample->GetPointer(&pBuffer);hr=pSample->GetTime(&Astar, &Aend);ALen=pSample->GetActualDataLength();if(ALen<=sizeof(pABy))//只有样本缓冲区小于或等于共享缓冲区时,才向共享缓冲区输出数据;防止操作数组越界CopyMemory(pABy,pBuffer,ALen);SetEvent(hABegin);//发出“音频引脚第1次接收完成”信号return CBaseInputPin::Receive(pSample);}
Agan:DWORD dOver = WaitForSingleObject(hOver,0);//获取“写MP4完成”信号if(dOver==WAIT_OBJECT_0)return E_FAIL;//有“写MP4完成”信号,返回E_FAILFILTER_STATE fs;m_pFilter->GetState(0,&fs);//获取过滤器状态switch(fs){case State_Stopped:return E_FAIL;break;case State_Paused:goto Agan;break;}DWORD dw=WaitForSingleObject(hBeginAudio,0);//获取“音频引脚开始接收数据”信号if(dw!=WAIT_OBJECT_0)goto Agan;//如果没有“音频引脚开始接收数据”信号,再次获取hr=pSample->GetPointer(&pBuffer);hr=pSample->GetTime(&Astar, &Aend);ALen=pSample->GetActualDataLength();if(ALen<=sizeof(pABy))//只有样本缓冲区小于或等于共享缓冲区时,才向共享缓冲区输出数据;防止操作数组越界CopyMemory(pABy,pBuffer,ALen);SetEvent(AudioEndReceive);//发出“音频引脚结束接收数据”信号return CBaseInputPin::Receive(pSample);
}STDMETHODIMP CAudioPin::EndOfStream()//当音频流结束时被调用
{SetEvent(hOver);//发出“写MP4完成”信号return CBaseInputPin::EndOfStream();
} 


