I have a fairly high-end Logitech webcam (C930e) that supports MJPEG encoding built into the camera itself. My goal is to use ffmpeg to grab that stream and feed it to ffserver so it can be served as MJPEG to web browsers. The problem is that even though the webcam already produces MJPEG, ffmpeg or ffserver is still transcoding it, and that pointless work eats all of my CPU.
Here is my ffserver configuration:
HTTPPort 8090
HTTPBindAddress 0.0.0.0
MaxClients 20
MaxBandwidth 500000
NoDaemon
<Feed webcam.ffm>
file /tmp/webcam.ffm
FileMaxSize 10M
</Feed>
<Stream webcam.mjpeg>
NoDefaults
Feed webcam.ffm
Format mpjpeg
VideoSize 640x360
VideoFrameRate 30
VideoBitRate 10280
VideoQMin 1
VideoQMax 1
NoAudio
</Stream>
<Stream index.html>
Format status
</Stream>
Here is my ffmpeg output:
sudo ~/bin/ffmpeg -v verbose -r 30 -s 640x360 -f v4l2 -input_format mjpeg -i /dev/video0 -c:v copy http://localhost:8090/webcam.ffm
ffmpeg version N-78619-g778439b Copyright (c) 2000-2016 the FFmpeg developers
built with gcc 4.9.2 (Raspbian 4.9.2-10)
configuration: --prefix=/home/pi/ffmpeg_build --pkg-config-flags=--static --extra-cflags=-I/home/pi/ffmpeg_build/include --extra-ldflags=-L/home/pi/ffmpeg_build/lib --bindir=/home/pi/bin --enable-gpl --enable-libass --enable-libfdk-aac --enable-libfreetype --enable-libmp3lame --enable-libopus --enable-libtheora --enable-libvorbis --enable-libvpx --enable-libx264 --enable-nonfree
libavutil 55. 18.100 / 55. 18.100
libavcodec 57. 24.105 / 57. 24.105
libavformat 57. 26.100 / 57. 26.100
libavdevice 57. 0.101 / 57. 0.101
libavfilter 6. 34.100 / 6. 34.100
libswscale 4. 0.100 / 4. 0.100
libswresample 2. 0.101 / 2. 0.101
libpostproc 54. 0.100 / 54. 0.100
[video4linux2,v4l2 @ 0x231a290] fd:3 capabilities:84200001
[mjpeg @ 0x231ae10] Changing bps to 8
[mjpeg @ 0x231ae10] EOI missing, emulating
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 5997.091177, bitrate: N/A
Stream #0:0: Video: mjpeg, 1 reference frame, yuvj422p(pc, bt470bg/unknown/unknown), 640x360, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 1000k tbc
[tcp @ 0x231f7a0] Connection to tcp://localhost:8090 failed (Connection refused), trying next address
[tcp @ 0x2325f20] Connection to tcp://localhost:8090 failed (Connection refused), trying next address
[graph 0 input from stream 0:0 @ 0x231def0] w:640 h:360 pixfmt:yuvj422p tb:1/30 fr:30/1 sar:0/1 sws_param:flags=2
[scaler for output stream 0:0 @ 0x231e1f0] w:640 h:360 flags:'bicubic' interl:0
[graph 0 input from stream 0:0 @ 0x231def0] TB:0.033333 FRAME_RATE:30.000000 SAMPLE_RATE:nan
[scaler for output stream 0:0 @ 0x231e1f0] w:640 h:360 fmt:yuvj422p sar:0/1 -> w:640 h:360 fmt:yuvj422p sar:0/1 flags:0x4
Output #0, ffm, to 'http://localhost:8090/webcam.ffm':
Metadata:
creation_time : 2016-02-23 01:32:00
encoder : Lavf57.26.100
Stream #0:0: Video: mjpeg, 1 reference frame, yuvj422p(pc), 640x360, q=1-1, 10280 kb/s, 30 fps, 1000k tbn, 30 tbc
Metadata:
encoder : Lavc57.24.105 mjpeg
Side data:
cpb: bitrate max/min/avg: 0/0/10280000 buffer size: 0 vbv_delay: -1
Stream mapping:
Stream #0:0 -> #0:0 (mjpeg (native) -> mjpeg (native))
Press [q] to stop, [?] for help
[mjpeg @ 0x231b750] overread 3
[mjpeg @ 0x231b750] EOI missing, emulating
frame= 4000 fps= 15 q=24.8 size= 228156kB time=00:02:13.33 bitrate=14017.9kbits/s speed=0.501x
As you can see, it says
Stream #0:0 -> #0:0 (mjpeg (native) -> mjpeg (native))
when it should instead say
Stream #0:0 -> #0:0 (copy)
I can't figure out what is making ffmpeg ignore the -c:v copy flag... Is it because ffserver forces transcoding regardless of the input already being in the same format, or is something wrong with my command?
Please help... :) I need to stream my MJPEG webcam to the web without re-encoding, since I bought this camera specifically so my Raspberry Pi 2 would not have to do any transcoding.
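For reference, ffmpeg's v4l2 input can list the formats the camera exposes natively (assuming /dev/video0 is the correct device node, as in the command above), which should show MJPEG among them:

ffmpeg -f v4l2 -list_formats all -i /dev/video0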
Answer 1
I haven't tried this myself, but it looks like what you want.
From https://ffmpeg.org/ffserver.html#Stream-specifiers-1:
An empty stream specifier matches all streams. For example, -codec copy or -codec: copy would copy all the streams without reencoding.
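As a minimal sketch of how that would look applied to your command (same device node and feed URL as in your question; -codec copy with an empty specifier in place of -c:v copy, and I have not verified that ffserver then accepts the stream without re-encoding):

sudo ~/bin/ffmpeg -v verbose -r 30 -s 640x360 -f v4l2 -input_format mjpeg -i /dev/video0 -codec copy http://localhost:8090/webcam.ffm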