Monday, April 6, 2015

[Android] How to use MediaExtractor and MediaCodec? - Video part

MediaExtractor is used to separate the video data and audio data from a media source.
It supports both HTTP streaming and local files.
After the video and audio are separated, MediaCodec takes care of the decoding jobs.
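For instance, setDataSource() accepts either a local file path or an HTTP URL. A quick sketch (the path, URL, and the useLocalFile flag are placeholders for illustration; network access must not happen on the UI thread):

```java
MediaExtractor extractor = new MediaExtractor();
if (useLocalFile) {  // hypothetical flag for this sketch
    extractor.setDataSource("/sdcard/myTest.mp4");       // local file
} else {
    extractor.setDataSource("http://example.com/a.mp4"); // HTTP streaming
}
```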

Let's look at the video part first.
For video, we need to render the frames after decoding.
Therefore we have to include a SurfaceView in our layout.
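A minimal layout sketch for this (the file name and id below are our own placeholders):

```xml
<!-- res/layout/activity_player.xml -->
<SurfaceView xmlns:android="http://schemas.android.com/apk/res/android"
    android:id="@+id/surfaceView"
    android:layout_width="match_parent"
    android:layout_height="match_parent" />
```

The Surface passed to MediaCodec.configure() later typically comes from surfaceView.getHolder().getSurface(), once the SurfaceHolder.Callback's surfaceCreated() has fired.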

private MediaExtractor extractorVideo;
private MediaCodec decoderVideo;
private int videoTrack = -1;

extractorVideo = new MediaExtractor();
extractorVideo.setDataSource("myTest.mp4"); // throws IOException if the source can't be opened

for (int i = 0; i < extractorVideo.getTrackCount(); i++) {
 MediaFormat format = extractorVideo.getTrackFormat(i);
 String mime = format.getString(MediaFormat.KEY_MIME);
 Log.d(TAG, "mime=>"+mime);
 if (mime.startsWith("video/")) {
  videoTrack = i;  
  extractorVideo.selectTrack(videoTrack);
  decoderVideo = MediaCodec.createDecoderByType(mime);
  decoderVideo.configure(format, surface, null, 0);
  break;
 }
}

if (videoTrack >= 0) {
 if (decoderVideo == null) {
  Log.e(TAG, "Can't find video info!");
  return;
 }
 decoderVideo.start();
}
extractorVideo finds the video track according to its MIME type.
After that, we pass the video format and the Surface to render on to decoderVideo via decoderVideo.configure().
decoderVideo then takes care of the decoding and rendering jobs almost automatically.
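The selection logic boils down to scanning the tracks' MIME strings for the first one starting with "video/". As a plain-Java sketch (the class and method names here are ours, not part of the Android API):

```java
public class TrackPicker {
    // Returns the index of the first MIME string that starts with the
    // given prefix (e.g. "video/"), or -1 if no track matches --
    // the same decision the selectTrack loop above makes.
    static int firstTrackWithPrefix(String[] mimes, String prefix) {
        for (int i = 0; i < mimes.length; i++) {
            if (mimes[i].startsWith(prefix)) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        String[] mimes = {"audio/mp4a-latm", "video/avc"};
        System.out.println(TrackPicker.firstTrackWithPrefix(mimes, "video/")); // prints 1
    }
}
```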

Below is the decoding part:
ByteBuffer[] inputBuffersVideo=null;
ByteBuffer[] outputBuffersVideo=null;
BufferInfo infoVideo=null;

if (videoTrack >=0)
{
 inputBuffersVideo = decoderVideo.getInputBuffers();
 outputBuffersVideo = decoderVideo.getOutputBuffers();
 infoVideo = new BufferInfo();
}

boolean isEOS = false;
long startMs = System.currentTimeMillis();

while (!Thread.interrupted()) {
 if (videoTrack >=0)
 {     
  if (!isEOS) {      
   int inIndex=-1;
   try {
    inIndex = decoderVideo.dequeueInputBuffer(10000);
   } catch (Exception e) {
    e.printStackTrace();    
   }

    if (inIndex >= 0) {
     ByteBuffer buffer = inputBuffersVideo[inIndex];
     buffer.clear();
     int sampleSize = extractorVideo.readSampleData(buffer, 0);
     if (sampleSize < 0) {
      // We shouldn't stop the playback at this point; just pass the EOS
      // flag to the decoder. We will get it back from dequeueOutputBuffer.
      decoderVideo.queueInputBuffer(inIndex, 0, 0, 0, MediaCodec.BUFFER_FLAG_END_OF_STREAM);
      isEOS = true;
     } else {
      decoderVideo.queueInputBuffer(inIndex, 0, sampleSize, extractorVideo.getSampleTime(), 0);
      extractorVideo.advance();
     }
    }
  }
  int outIndex=-1;
  try {
   outIndex = decoderVideo.dequeueOutputBuffer(infoVideo,10000);
  } catch (Exception e) {
   e.printStackTrace();   
  }
  switch (outIndex) {
  case MediaCodec.INFO_OUTPUT_BUFFERS_CHANGED:
   Log.d(TAG, "INFO_OUTPUT_BUFFERS_CHANGED");
   outputBuffersVideo = decoderVideo.getOutputBuffers();
   break;
  case MediaCodec.INFO_OUTPUT_FORMAT_CHANGED:
   Log.d(TAG, "New format " + decoderVideo.getOutputFormat());
   break;
  case MediaCodec.INFO_TRY_AGAIN_LATER:
   Log.d(TAG, "dequeueOutputBuffer timed out!");
   break;
  default:
   if (outIndex >= 0)
   {
    // We use a very simple clock to keep the video FPS, or the video
    // playback will be too fast
    while (infoVideo.presentationTimeUs / 1000 > (System.currentTimeMillis() - startMs)) {
     try {
      Thread.sleep(10);
     } catch (InterruptedException e) {
      e.printStackTrace();
      Thread.currentThread().interrupt();
      break;
     }
    }
    // true means "render this buffer to the configured Surface"
    decoderVideo.releaseOutputBuffer(outIndex, true);
   }
   break;
  }

  // All decoded frames have been rendered; we can stop playing now
  if ((infoVideo.flags & MediaCodec.BUFFER_FLAG_END_OF_STREAM) != 0) {
   Log.d(TAG, "OutputBuffer BUFFER_FLAG_END_OF_STREAM");
   break;
  }
 }
}
if (videoTrack >=0)
{
 decoderVideo.stop();
 decoderVideo.release();
}

extractorVideo.release();

We first call decoderVideo.dequeueInputBuffer(10000) to get an input buffer.
Then we use extractorVideo.readSampleData(buffer, 0) to copy the next chunk of video data from the media source into that buffer.
We hand the data to the decoder with decoderVideo.queueInputBuffer().
decoderVideo will start to decode after that.

We use decoderVideo.dequeueOutputBuffer(infoVideo, 10000) to get the result.
If the return value is zero or positive, it is the index of an output buffer that decoderVideo has filled with successfully decoded data; negative values are status codes such as INFO_TRY_AGAIN_LATER.

To render the decoded data to the Surface, we call decoderVideo.releaseOutputBuffer(outIndex, true). The second parameter, true, means we want the buffer rendered.

That finishes the decode and render jobs for one sample.
We then move forward to the next sample with extractorVideo.advance().
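The "simple clock" used above to pace playback can be factored into a small pure-Java helper (our own naming, for illustration): it converts the sample's presentation time from microseconds to milliseconds and compares it with the wall-clock time elapsed since playback started.

```java
public class FrameClock {
    // How many milliseconds we still need to wait before a frame with
    // presentation time ptsUs (microseconds) is due, given elapsedMs of
    // wall-clock time since playback started. Zero means render now.
    static long delayMs(long ptsUs, long elapsedMs) {
        return Math.max(0, ptsUs / 1000 - elapsedMs);
    }

    public static void main(String[] args) {
        // Frame due 40 ms into playback, 25 ms already elapsed -> wait 15 ms.
        System.out.println(FrameClock.delayMs(40000, 25)); // prints 15
    }
}
```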
