[Howto] Odroid RTSP webcam



Unread postby mad_ady » Wed Sep 07, 2016 4:45 pm

There have been some articles in the past year detailing how you can set up your odroid with a webcam to perform various tasks - detecting fires (http://magazine.odroid.com/wp-content/u ... df#page=10), augmented reality (http://magazine.odroid.com/wp-content/u ... df#page=19) or acting as a security camera (http://magazine.odroid.com/wp-content/u ... pdf#page=6) - but I wanted to use my webcam as a general-purpose IP webcam. Off-the-shelf IP webcams usually allow you to view the video stream in real time, with sound, over RTSP. Additionally, they usually expose a web API that lets you grab images from them or control camera settings such as pan/tilt. This way the webcam can be integrated into a bigger monitoring project (like a network DVR), or queried over the network on demand. Android has plenty of apps that cover all of these needs, but we'll focus on Linux instead, because you might want to use your odroid for other tasks as well. By the end of this article you will know how to grab images from your webcam over the web, how to view real-time streams with sound and how to make recordings. Let's begin!

Knowing your camera's capabilities

Most modern cameras are supported under Linux with the generic "uvc" driver. The driver exposes several devices when a webcam is plugged in - like a /dev/video0 Video4Linux interface, a new input device in ALSA and maybe a button that acts like a HID keyboard. By installing the v4l-utils package you can list the supported modes of your camera (an example listing for HardKernel's 720p webcam is here: http://pastebin.com/L1VwZZFs):
Code: Select all
$ sudo apt-get install v4l-utils
$ v4l2-ctl --list-formats-ext


You may notice that most cameras can output YUV (uncompressed) only at lower frame rates, or MJPEG (compressed) at higher ones. High-end cameras can also output H264 video encoded directly inside the camera. This tutorial assumes you have an MJPEG camera available, but that you would like to serve H264 streams from your system.

The v4l2-ctl utility also allows you to list and change some of the camera's parameters, such as brightness, contrast or gamma, which is useful if you don't have optimal lighting conditions. You can list these parameters with:
Code: Select all
$ v4l2-ctl --list-ctrls
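As a hedged example, you could then change one of the listed controls like this ("brightness" and the value 128 are placeholders - use whatever names and ranges --list-ctrls actually reported for your camera):

```shell
#!/bin/bash
# Hypothetical example: set and read back a control on the first camera.
# The control name and value must match what --list-ctrls reported.
DEV=/dev/video0
if [ -c "$DEV" ]; then
    v4l2-ctl -d "$DEV" --set-ctrl brightness=128
    v4l2-ctl -d "$DEV" --get-ctrl brightness
else
    echo "no camera at $DEV"
fi
```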

If your camera doesn't expose a /dev/video0 pseudo-file but you can grab images with a custom API you can use v4l2loopback (https://github.com/umlaeute/v4l2loopback) to feed your data to a virtual /dev/videoX device so that you can read it with standard tools.
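A rough sketch of that approach is below; "my_custom_grabber" and /dev/video6 are hypothetical placeholders for your camera's API and the loopback device you created:

```shell
#!/bin/bash
# Hypothetical sketch: feed MJPEG frames from a custom grabber into a
# v4l2loopback device so standard V4L2 tools can read them.
# "my_custom_grabber" is a placeholder for whatever tool your camera's API provides.
DEV=/dev/video6
if [ -c "$DEV" ]; then
    # Decode the incoming MJPEG and write raw frames to the loopback device
    my_custom_grabber --mjpeg-stdout | ffmpeg -f mjpeg -i - -f v4l2 -pix_fmt yuv420p "$DEV"
else
    echo "loopback device $DEV not present - load v4l2loopback first"
fi
```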

Getting still images
Now that the camera is up and running, the first task is to grab images from it, either to be saved on disk or to be viewed remotely. Even if the task seems simple and there are various tools to help you do it, the devil is in the details. Tools like uvccapture or streamer can do the job, but I found that in practice both suffer from the following problems:
- Grabbing a picture initializes the camera, which can take a variable amount of time (between 1-30s)
- Pictures are usually dark because the camera hasn't had enough time to balance the light level. Streamer can compensate for this by "recording" for a specified time (e.g. 1 second) before snapping the picture.
- Sometimes the camera may return incomplete frames (e.g. only the top part)
Also, if you are using the camera for something else (e.g. live streaming or motion detection) these tools can't connect to /dev/video0 to grab still images, so there's a need to multiplex access to the camera.

The best tool for the job needs to have exclusive access to the video device while still allowing other tools to grab images and video at the same time. It also needs to keep the camera active, so that dark images are compensated for. For me, this miraculous tool was mjpg-streamer: https://github.com/jacksonliam/mjpg-streamer.git

To install it under /usr/local follow these steps:
Code: Select all
$ git clone https://github.com/jacksonliam/mjpg-streamer.git
$ cd mjpg-streamer/mjpg-streamer-experimental
$ sudo apt-get install cmake libjpeg62-dev
$ make
$ sudo make install


It's best to test mjpg-streamer before enabling it at startup. The program has a configurable number of inputs (cameras) and output settings. It can run as an HTTP server, or output to a file or to UDP/RTSP streams. In my tests the RTSP function was not reliable and did not work with any RTSP client (standard RTSP implementations do not support MJPEG payloads). We will use it as an HTTP server and have other processes read from mjpg-streamer.

To start mjpg-streamer as a web server with authentication and read from the first camera run the following command:
Code: Select all
$ sudo /usr/local/bin/mjpg_streamer  -i 'input_uvc.so -r 1280x720 -m 50000 -n -f 25 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass'


The command is complex, so let's explain what all the switches do. "-i" specifies the input plugin, which is input_uvc.so (grabbing from a UVC camera). Next comes the camera's desired resolution, while "-m" is the minimum size of the input. I've set this to 50kB, so mjpg-streamer will drop JPEG frames smaller than that (720p frames are around 120kB in size). This is a good thing because sometimes the camera starts outputting incomplete frames. It also has the side-effect of not being able to capture anything in low-light conditions, when the frames are mostly dark and JPEG compression reduces them below 50kB. You will need to tune this parameter according to your input resolution.

The "-n" parameter disables dynamic controls in the UVC driver, while "-f" specifies the input framerate. "-d" points to the video device (/dev/video0 by default). On the output side of the equation we use the output_http.so module, listening on port 8090 ("-p") and serving HTTP files from the directory pointed to by "-w". You can optionally add password protection with the "-c" parameter by specifying a username:password combination. You can find detailed usage information at these pages: https://github.com/jacksonliam/mjpg-str ... /README.md, https://github.com/jacksonliam/mjpg-str ... /README.md.

Once you successfully start mjpg_streamer you will be able to access it with a browser at http://odroid-ip:8090/. You will be asked for your user/password combination and presented with the demo page in figure 1. You can of course create your own page, but the demo page gives you the necessary information to access the camera.

Image
Figure 1 - MJPEG Streamer web interface featuring some Starcraft action

To grab a single image from the camera you can do:
Code: Select all
$ sudo apt-get install curl
$ curl -s -f -m 5 'http://odroid:odroidpass@odroid-ip:8090/?action=snapshot' > /tmp/snapshot.jpeg

You can use this together with crond to take pictures at a set interval of time. You can use the timestamp as a filename or use a tool like montage to add the time as a watermark on top of the image. Here is a small script that snaps pictures in a specific directory and adds the date and time: https://github.com/mad-ady/odroid-webca ... m-image.sh. You can further use ffmpeg to combine all these pictures in a video for easier review later:
https://github.com/mad-ady/odroid-webca ... ideshow.sh.
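A minimal sketch of such a snapshot script follows (the linked script is the full version; the URL and credentials below assume the mjpg_streamer setup from earlier, and the output directory is an arbitrary choice):

```shell
#!/bin/bash
# Sketch of a timestamped snapshot script; adjust URL, credentials and
# directory to your own installation.
SNAP_DIR=/tmp/snapshots
STAMP=$(date +%Y-%m-%d_%H-%M-%S)
OUTFILE="$SNAP_DIR/$STAMP.jpg"
mkdir -p "$SNAP_DIR"
# -f makes curl fail on HTTP errors, -m 5 limits the wait to 5 seconds;
# remove the partial file if the grab failed
curl -s -f -m 5 'http://odroid:odroidpass@odroid-ip:8090/?action=snapshot' > "$OUTFILE" || rm -f "$OUTFILE"
```

Run it from cron (e.g. once a minute) to build up the picture archive.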

Image
Figure 2 - A snapshot with timestamp superimposed

To get a MJPEG video stream from the camera (a sequence of JPEG images) you can do:
Code: Select all
$ vlc http://odroid:odroidpass@odroid-ip:8090/?action=stream

If all is well and you are getting an image, it's time to add a systemd startup script for mjpg_streamer. Create a file called /etc/systemd/system/mjpg_streamer.service with the contents downloaded from here: https://github.com/mad-ady/odroid-webca ... er.service (adjust for your needs).
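For reference, a minimal unit file might look like the sketch below; the linked file is the authoritative version, and the ExecStart line here simply reuses the command tested earlier:

```ini
[Unit]
Description=MJPG Streamer webcam service
After=network.target

[Service]
ExecStart=/usr/local/bin/mjpg_streamer -i 'input_uvc.so -r 1280x720 -m 50000 -n -f 25 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass'
Restart=on-failure

[Install]
WantedBy=multi-user.target
```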

To activate the service run:
Code: Select all
$ sudo systemctl enable mjpg_streamer.service
$ sudo systemctl start mjpg_streamer.service

To check that the service is running you can query systemd:
Code: Select all
$ sudo systemctl status mjpg_streamer.service


Getting videos - now with 20% more sound!

Motion JPEG is widely supported in browsers, which is a good thing, but it doesn't support sound and its compression is poor. The bitrate of a 25 fps 720p MJPEG stream is around 13Mbps, which may be too high for internet use. In order to get videos with sound we'll need to multiplex the MJPEG stream with a sound stream from the camera's microphone into a supported media format.

Since we know how to get the video stream, let's concentrate on the microphone. You can list the current devices supported by ALSA in your system with arecord -L. You should see several inputs relating to a USB 2.0 Camera with varying capabilities (figure 3). We will need the name in order to configure it in ffmpeg later on (in our case we'll use the last one - plughw:CARD=Camera,DEV=0).

Code: Select all
$ arecord -L


Image
Figure 3 - Listing audio devices

Before we start recording we need to check that the microphone is unmuted and at an acceptable level. I really like the microphone on Hardkernel's 720p webcam - it has adaptive gain, so I can hear whispers in a room followed by kids shouting, without going deaf. To tune the volume we will use alsamixer - press F6 to select the sound card and F4 to go to the Capture tab. Use the arrow keys to adjust the audio level (I keep it at maximum).

Image
Figure 4 - Alsa mixer audio level

We can now build our ffmpeg query that grabs a video stream from MJPEG Streamer, adds audio from ALSA and produces a file on disk:
Code: Select all
$ sudo apt-get install ffmpeg
$ ffmpeg -framerate 5 -f mjpeg -i 'http://odroid:odroidpass@127.0.0.1:8090/?action=stream' -f alsa -i plughw:CARD=Camera,DEV=0 -acodec libmp3lame -c:v libx264 -preset ultrafast -r 5 -pix_fmt yuv420p -b:v 1500k -async 1 myvideo.mp4

The command above uses software encoding. It specifies that the input framerate should be 5 FPS and that the input is the MJPEG stream from the address above; "-f alsa" specifies that audio should come from ALSA, from the device mentioned next. Audio is encoded with libmp3lame and video with H264, using the ultrafast preset and an output framerate of 5 FPS. The video bandwidth is limited to 1500kbps (otherwise ffmpeg starts calculating what's best and you can't really do real-time encoding). The async option tries to synchronize video and audio (but drifts often occur anyway) and the last parameter is the output filename to write to.

If only your script will be using the microphone, then ALSA is good enough. Otherwise, if other processes (e.g. Mycroft) need to use the microphone at the same time, you will need PulseAudio. To set up PulseAudio you can follow these guides: viewtopic.php?f=54&t=27271&p=195770#p195776. To record from ffmpeg with PulseAudio sound, you can run:
Code: Select all
$ ffmpeg -framerate 5 -f mjpeg -i 'http://odroid:odroidpass@127.0.0.1:8090/?action=stream' -f pulse -server 127.0.0.1 -i alsa_input.usb-Sonix_Technology_Co.__Ltd._USB_2.0_Camera-02.analog-mono -acodec libmp3lame -c:v libx264 -preset ultrafast -r 5 -pix_fmt yuv420p -b:v 1500k -async 1 myvideo.mp4


On a C2 you can (almost) do software encoding up to 10 FPS at 720p in real time, but audio gets garbled, so the safest bet was to keep the framerate low. I have compiled an optimized version of ffmpeg for the C2 (using -march=armv8-a+crypto+crc+fp+simd -mtune=cortex-a53), but there was no noticeable change in encoding performance. Depending on your needs this might be acceptable or not. For hardware encoding, read on.

The best results I got were with mjpg_streamer set to 640x480 and ffmpeg recording at 10fps with a video bandwidth of 1Mbps using software encoding. Curiously, going lower than this results in poorer performance - 6fps. If you hear choppy sound in your recordings it means that ffmpeg can't keep up with the imposed framerate; as far as I've seen, if you try to record at a higher framerate than what ffmpeg can do in real time, you will get choppy audio. Worst of all, encoding performance depends on system load, so under higher loads you'll get lower FPS in real time...
To get a few recipes I tried and also to see how to record audio only consult this cheatsheet: https://github.com/mad-ady/odroid-webca ... -tests.txt

I also redid the tests after Hardkernel pushed their overclocked boot.ini settings. With the C2 running at 1.75GHz on 4 cores I was able to get stable sound @720p at 8fps (instead of 5), and 15fps at a resolution of 640x480, which is nice. I wasn't able to test performance at higher frequencies with fewer cores due to too much instability, but I expect things to improve over time. Also, if you increase the RAM frequency to 1104MHz you can gain 1-2 fps.

If you want to bypass mjpg_streamer completely you can read directly from /dev/video0 like this:
Code: Select all
$ ffmpeg -r 5 -f v4l2 -video_size 640x480 -i /dev/video0 -f alsa -i plughw:CARD=Camera,DEV=0 -acodec libmp3lame -c:v libx264 -preset ultrafast -r 5 -pix_fmt yuv420p -b:v 1000k -async 1 myvideo.mp4

In fact, the user @crashoverride just created a library to use the hardware encoder on the C2 and you can use it to stream to ffmpeg, but it's a work in progress, which you can follow on his thread: http://forum.odroid.com/viewtopic.php?f=136&t=23680. Update: crashoverride's work has materialized into hardware encoding support - read on.

Hardware encoding on C0/C1/C2
The good news is that thanks to @crashoverride's work we now have two programs that can do h264 encoding on the C0/C1/C2: c2cap (http://forum.odroid.com/viewtopic.php?f=136&t=23680) and c2enc (http://forum.odroid.com/viewtopic.php?f=136&t=24293). c2cap is designed to work directly with a webcam, while c2enc can encode arbitrary video sources. To use them with your webcam (and with the mjpg_streamer setup detailed before) you have two options, depending on the quality of your webcam. Encoding on the C0/C1 has some quirks (e.g. sometimes the video freezes), but they're being looked into. Note that even with hardware encoding you're not gaining much in terms of delay: because of the various pipelines (two ffmpeg processes and an encoder) there are queues of various sizes in the process pipeline that cause video to come out with a delay of about 5s - so it's still not usable for remote control.

1. If you have a webcam which doesn't corrupt or drop frames, use c2enc

To do this, install c2enc on your system
Code: Select all
$ git clone https://github.com/OtherCrashOverride/c2_vpcodec
$ cd c2_vpcodec
$ make
$ sudo cp libvpcodec.so /usr/lib
$ cd ..
$ git clone https://github.com/OtherCrashOverride/c2enc
$ cd c2enc
$ make
$ sudo cp c2enc /usr/local/bin


Next, install the ffmpeg helper script (ffmpeg-hardware-encoder-c2-c2enc.sh):
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-hardware-encoder-c2-c2enc.sh -O /usr/local/bin/ffmpeg-hardware-encoder-c2-c2enc.sh
$ sudo chmod a+x /usr/local/bin/ffmpeg-hardware-encoder-c2-c2enc.sh


This script has a small configuration header where you can set things like the desired framerate, dimensions of the video, output bitrate and the user/password to access the MJPEG stream. Edit those values to match your installation. Internally, the script starts two threads - a "feeder", which reads data from mjpg_streamer and writes it to a named pipe, and a "consumer", which reads from the pipe, sends the data to c2enc, muxes the encoded video with audio and sends the result to ffserver. The named pipe was used in order to allow the two ffmpeg processes to run on different cores (http://www.linuxquestions.org/questions ... pe-718145/). Sometimes reading from mjpg_streamer fails (ffmpeg crashes on some bad input), so that thread is restarted when needed - you should experience only a minor "frozen image". The whole pipeline is killed when streaming stops.
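The feeder/consumer structure can be sketched as follows. This is only an illustration of the pattern: the exact ffmpeg flags and the way c2enc is invoked are assumptions, the real commands live in the linked helper script, and the functions are defined but deliberately not started here:

```shell
#!/bin/bash
# Sketch of the feeder/consumer pattern; the pipe decouples the two
# ffmpeg processes so they can run on different cores.
PIPE=/tmp/webcam.mjpeg.pipe
[ -p "$PIPE" ] || mkfifo "$PIPE"

feeder() {
    # Read MJPEG from mjpg_streamer and write it to the named pipe;
    # restart on failure since ffmpeg can crash on bad input.
    # URL and credentials are assumptions from the setup above.
    while true; do
        ffmpeg -f mjpeg -i 'http://odroid:odroidpass@127.0.0.1:8090/?action=stream' \
               -c copy -f mjpeg "$PIPE"
        sleep 1
    done
}

consumer() {
    # Read the pipe, push frames through the hardware encoder, mux with
    # audio and send the result to ffserver (invocation is illustrative).
    c2enc < "$PIPE" | ffmpeg -i - -f alsa -i plughw:CARD=Camera,DEV=0 \
        -c:v copy -acodec libmp3lame http://localhost:8099/mjpg-streamer.ffm
}

# In the real script both run in the background:  feeder & consumer & wait
```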

To enable this, replace /etc/systemd/system/ffmpeg.service with:
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-c2enc.service -O /etc/systemd/system/ffmpeg.service


Now, if you turn on the ffmpeg service you should get a steady hardware encoded stream from your webcam.

If your webcam drops frames or creates corrupt frames you might experience intermittent sound loss because the incoming framerate in the encoder is lower than what is output. In this case you need to either lower the framerate in the script, or switch to the next method. With Hardkernel's 720p webcam and this method I can get an output stream with almost 10fps reliably at 720p (9 is more like it).


2. If you have a webcam which sometimes drops frames, use c2cap

Because c2cap expects to read directly from a camera device, and because we want to keep mjpg_streamer, we will need to create a fake webcam: read data from mjpg_streamer, write it to the fake webcam, then read from the fake webcam with c2cap and pass the data to ffmpeg for packaging and streaming. To create the fake webcam we'll need v4l2loopback:
Code: Select all
$ sudo apt-get install linux-headers-c2
$ sudo apt-get install v4l2loopback-dkms v4l2loopback-utils
$ sudo modprobe v4l2loopback video_nr=6 devices=1 card_label="Fake webcam"
$ v4l2loopback-ctl set-fps 25 /dev/video6


The dkms package should recompile the correct v4l2loopback module automatically after each kernel update.

Next, you have to prepare the system to load the v4l2loopback module at startup and set the correct permissions. This can be done from /etc/rc.local, by adding the following lines before exit 0:
Code: Select all
$ sudo vi /etc/rc.local
chmod a+rw /dev/ion /dev/ge2d
modprobe v4l2loopback video_nr=6 devices=1 card_label="Fake webcam"
v4l2loopback-ctl set-fps 25 /dev/video6


And next, either reboot, or execute /etc/rc.local manually:
Code: Select all
$ sudo /etc/rc.local


Next, install c2cap:
Code: Select all
$ git clone https://github.com/OtherCrashOverride/c2_vpcodec
$ cd c2_vpcodec
$ make
$ sudo cp libvpcodec.so /usr/lib
$ cd ..
$ git clone -b beta1  https://github.com/OtherCrashOverride/c2cap.git
$ cd c2cap
$ sudo apt-get install libjpeg-turbo8-dev libasound2-dev
$ make
$ sudo cp c2cap /usr/local/bin
$ sudo chmod a+rw /dev/ion /dev/ge2d
$ cd ..


We'll also be installing a similar helper script (ffmpeg-hardware-encoder-c2-c2cap.sh):
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-hardware-encoder-c2-c2cap.sh -O /usr/local/bin/ffmpeg-hardware-encoder-c2-c2cap.sh
$ sudo chmod a+x /usr/local/bin/ffmpeg-hardware-encoder-c2-c2cap.sh


You can edit the script and set your desired resolution, framerate and credentials for mjpg_streamer. The script is very similar to the one presented above, with the same pipeline, except the named pipe has been replaced by a /dev/video device. The advantage over the named pipe is that the /dev/video device duplicates missing frames, allowing the reader to read at 25fps even if the writer only writes 10fps.

To enable this, replace /etc/systemd/system/ffmpeg.service with:
Code: Select all
sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-c2cap.service -O /etc/systemd/system/ffmpeg.service


Now, if you turn on the ffmpeg service you should get a steady hardware encoded stream from your webcam.

Hardware encoding on XU3/XU4
In order to get hardware encoding support on XU3/4 you need to be running a more recent kernel (4.8 or 4.9). First you'll need to set up your system, install the new kernel and install ffmpeg with hardware encoding support. You can find all the instructions on this thread: http://forum.odroid.com/viewtopic.php?f=95&t=24366. Once you are able to transcode a test video using the hardware encoder you can continue to set up your webcam for streaming.

As before, you will need to replace the /etc/systemd/system/ffmpeg.service with:
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-xu4.service -O /etc/systemd/system/ffmpeg.service


You may also need to edit it to fit your needs.

Streaming RTSP on demand
The main use of an IP camera is being able to view the video stream on demand. Ideally, the stream should be viewable by multiple users at the same time. In order to do this we will use ffserver to create a RTSP stream that plays on demand.
RTSP (Real Time Streaming Protocol) is a protocol similar to SIP that handles signalling and media transport between a client and a server. Usually signalling is done on TCP port 554 and the data streams over UDP, with the client and server negotiating suitable ports. However, NAT and firewall environments sometimes interfere with this negotiation, so there is a way to transport the data over TCP, interleaved with the control traffic. This transport method will be used for our tests.

ffserver provides a way of serving RTSP client requests based on ffmpeg video feeds. It is part of the ffmpeg package, so you already have it installed. To be able to start the server you need a suitable configuration and a systemd startup script. The configuration needs to be saved to /etc/ffserver.conf and you can get one from here: https://github.com/mad-ady/odroid-webca ... erver.conf

If you browse through the configuration, you'll see it sets up a listener on RTSP port 554, defines a feed called mjpg-streamer.ffm and ties it to an output stream called live.h264.sdp. ffserver allows you to set up different output formats, but for this example it will pass through the input stream, which is already H264.
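The relevant parts of such a configuration look roughly like the sketch below. This is from memory of ffserver's directive names and is not a drop-in replacement for the linked ffserver.conf, which is the one you should actually use:

```
HTTPPort 8099
RTSPPort 554
HTTPBindAddress 0.0.0.0

# Feed that the local ffmpeg process pushes into
<Feed mjpg-streamer.ffm>
File /tmp/mjpg-streamer.ffm
FileMaxSize 5M
ACL allow 127.0.0.1
</Feed>

# RTSP stream served to clients
<Stream live.h264.sdp>
Feed mjpg-streamer.ffm
Format rtp
VideoFrameRate 5
VideoSize 1280x720
</Stream>
```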

To start ffserver at startup you will need to add the following systemd service in /etc/systemd/system/ffserver.service: https://github.com/mad-ady/odroid-webca ... er.service

To enable it and see its status do:
Code: Select all
$ sudo systemctl enable ffserver
$ sudo systemctl start ffserver
$ sudo systemctl status ffserver

Right now you have a RTSP server listening for requests, but no video is being processed. To start a video feed you will need to run ffmpeg like this:
Code: Select all
$ /usr/bin/ffmpeg -loglevel 8 -r 5 -f mjpeg -i 'http://odroid:odroidpass@127.0.0.1:8090/?action=stream' -f alsa -i plughw:CARD=Camera,DEV=0 -acodec libmp3lame -c:v libx264 -preset ultrafast -r 5 -pix_fmt yuv420p -b:v 1500k -async 1 -x264-params keyint=30:no-scenecut=1 -vf "drawtext=fontfile=/usr/share/fonts/truetype/dejavu/DejaVuSans-Bold.ttf: text='Webcam feed %{localtime\\:%F %T}': fontcolor=white@0.8: x=7: y=5" -override_ffserver http://localhost:8099/mjpg-streamer.ffm

Before you freak out: the command is similar to the one you've seen before, with the addition of an overlay text in the top-left corner showing the current date and time - just like "professional" IP webcams do. ffmpeg writes the output to ffserver, specifying the feed name.

You should now be able to connect with a RTSP viewer and enjoy your video feed (for Android you can use RTSP Viewer https://play.google.com/store/apps/deta ... .app&hl=en):
Code: Select all
$ vlc rtsp://odroid-ip:554/live.h264.sdp


Image
Figure 5 RTSP streaming with sound

To make things more permanent you can add the following systemd ffmpeg service in /etc/systemd/system/ffmpeg.service: https://github.com/mad-ady/odroid-webca ... eg.service. Note, you can skip downloading this service if you're using hardware encoding as described above.

To enable it and see its status do:
Code: Select all
$ sudo systemctl enable ffmpeg
$ sudo systemctl start ffmpeg
$ sudo systemctl status ffmpeg


Improve idle performance
The video feeds are expected to be up at all times, which means ffmpeg must be transcoding even when no viewer is connected. This may be fine if you expect to have many viewers connected simultaneously, but if you connect rarely (e.g. 5 minutes/day), it's not worth transcoding the stream in the background while it's unused. It would be best to have a system that starts the video stream when a viewer connects and stops it when all viewers disconnect. Enter the ffserver-trigger script!

The script runs in the background and runs tail -f on /var/log/syslog. It will pick up messages from ffserver such as "PLAY live.h264.sdp", check whether the stream is already running and start it if not. It will also look for stop messages such as " RTP/TCP" and stop the stream if necessary. It logs its actions to syslog as well. Note that this trigger is customized for a single stream and follows the naming convention used in the article. It might need tweaking if you want to use it for other setups.
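The core of the trigger amounts to classifying syslog lines into start/stop/ignore actions. A rough shell sketch of that logic (the real implementation is the Perl script; the message patterns are the ones described above, and the service handling is simplified):

```shell
#!/bin/bash
# Sketch of the ffserver-trigger decision logic.
# Classify one syslog line into an action for the ffmpeg service.
classify() {
    case "$1" in
        *"PLAY live.h264.sdp"*) echo start  ;;  # a client pressed Play
        *" RTP/TCP"*)           echo stop   ;;  # a client disconnected
        *)                      echo ignore ;;
    esac
}

# In the real script, a loop like this would drive systemctl
# (not executed here):
#   tail -F /var/log/syslog | while read -r line; do
#       case "$(classify "$line")" in
#           start) systemctl start ffmpeg ;;
#           stop)  systemctl stop ffmpeg  ;;
#       esac
#   done
```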

To install ffserver-trigger do the following:
Code: Select all
$ sudo apt-get install libfile-tail-perl
$ sudo perl -MCPAN -e 'install Linux::Proc::Net::TCP'
$ sudo wget -O /usr/local/bin/ffserver-trigger.pl https://raw.githubusercontent.com/mad-ady/ffserver-trigger/master/ffserver-trigger.pl
$ sudo chmod a+x /usr/local/bin/ffserver-trigger.pl
$ sudo wget -O /etc/systemd/system/ffserver-trigger.service https://raw.githubusercontent.com/mad-ady/ffserver-trigger/master/ffserver-trigger.service
$ sudo systemctl enable ffserver-trigger
$ sudo systemctl start ffserver-trigger
$ sudo systemctl status ffserver-trigger


Since you're using the ffserver-trigger now, you should disable the ffmpeg service so that it doesn't start automatically on boot - it will be started by ffserver-trigger when needed.
Code: Select all
$ sudo systemctl disable ffmpeg


In Figure 6 you can see the whole workflow.

Image
Figure 6. Streaming pipeline

If you also want to record your stream to a file, you can connect to it as a regular RTSP viewer and dump it to disk without transcoding (you can do this even while other clients are connected):
Code: Select all
$ ffmpeg -i rtsp://127.0.0.1:554/live.h264.sdp -acodec copy -vcodec copy rtsp-recording.mp4


In terms of video processing delays, mjpg_streamer adds about 1 second, while ffmpeg + ffserver add 2-3 extra seconds. So your experience will not be real time - not suitable for remote controlling a robot, but good enough for remote viewing.

Troubleshooting

    [Q] Unable to get images from mjpg_streamer / ffmpeg seems stuck
    [A] Check the value of the -m parameter. Lower it to fit your needs.
    [Q] Audio/video out of sync / choppy sound
    [A] Try 640x480 @ 10fps, or reduce the framerate in ffmpeg.service.
    [Q] Stopping a RTSP stream stops all connected clients
    [A] Sometimes ffserver will crash with a segfault when a client stops. It gets restarted automatically by systemd, but this disconnects all clients.
    [Q] Pressing Play as the first connected client does not start the RTSP stream when using ffserver-trigger
    [A] This is a known issue - the RTSP stream will time out in about 10s, before ffserver manages to send data back to the client. Press Play again after the timeout. If a client connects while a stream is active, this issue doesn't happen. To mitigate it, the trigger script has a 20s cooldown period after a stream start in which it ignores stop requests.
    [Q] Sometimes connecting to a stream doesn't work - ffmpeg seems stuck
    [A] The cause is mjpg_streamer - it sometimes gets stuck and needs to be restarted. There are two lines you can uncomment in ffserver-trigger.pl to restart it automatically whenever ffmpeg is restarted, to avoid this.
    [Q] But wait, an off-the-shelf webcam has pan and tilt support. How do I add those?
    [A] With some motors and PWM pins or an Arduino: http://hackaday.com/2016/08/30/pan-and- ... ntrollers/

If you run into other problems or if you find better ways to achieve this let me know on this thread.
Last edited by mad_ady on Fri Jul 14, 2017 10:54 pm, edited 12 times in total.
User avatar
mad_ady
 
Posts: 4035
Joined: Wed Jul 15, 2015 5:00 pm
Location: Bucharest, Romania
languages_spoken: english
ODROIDs: XU4, C1+, C2, N1

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Sep 07, 2016 4:46 pm

[Reserved for XU4 with hardware encoding distributed server instructions]

Re: [Howto] Odroid RTSP webcam

Unread postby odroid » Wed Sep 07, 2016 6:57 pm

Very useful and detailed instructions for many people who want to build an IP (Internet Protocol) camera.
Really appreciate your help. Time to ping Robroy. ;)
User avatar
odroid
Site Admin
 
Posts: 28292
Joined: Fri Feb 22, 2013 11:14 pm
languages_spoken: English
ODROIDs: ODROID

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Sep 07, 2016 7:18 pm

I was ahead of you :P Hope it helps :)

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Thu Sep 08, 2016 8:27 pm

I also redid the tests after Hardkernel pushed their overclocked boot.ini settings. With the C2 running at 1.75GHz on 4 cores I was able to get stable sound @720p at 8fps (instead of 5), and 15fps at 640x480 instead of 10fps, which is nice. I wasn't able to test performance at higher frequencies with fewer cores due to too much instability, but I expect things to improve over time.

I've updated the first post as well

Re: [Howto] Odroid RTSP webcam

Unread postby odroid » Thu Sep 08, 2016 8:40 pm

Good to know the performance improvement. Thank you for the test.

There will be some boot-blob updates early next week to increase the DRAM clock speed from 933Mhz to 1104Mhz.
I hope it works stably on your board.
I expect 9~10FPS @720p, since most image processing needs a lot of memory bandwidth.

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Sep 14, 2016 11:07 pm

I've applied the DDR patch and I can run stably at 1104MHz. When encoding 720p content I could get 9fps (instead of 8) relatively reliably (with very few sound interruptions), but at 10fps I don't get audio because the board can't do 10fps (@1.7GHz). If I could use one core @2GHz I'd expect to be able to encode at 10fps, because ffmpeg mostly uses one core (even if it runs in parallel), but going over 1.7GHz results in boot issues for me.

Re: [Howto] Odroid RTSP webcam

Unread postby crashoverride » Mon Sep 19, 2016 2:59 pm

@mad_ady, this should provide 640x480@30fps real-time compression on Odroid C2:
http://forum.odroid.com/viewtopic.php?f=136&t=23680
crashoverride
 
Posts: 3713
Joined: Tue Dec 30, 2014 8:42 pm
languages_spoken: english
ODROIDs: C1

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Mon Sep 19, 2016 3:41 pm

Wow, thank you! That is a game-changer :) I'll need to try it out :)

Update: 720p realtime seems to work as well. Anyway, for now I will continue the discussion about using c2cap on its thread and post something definitive here when it's more mature.

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Nov 09, 2016 8:35 pm

Ok, I finally managed to update the first post with details regarding how to do hardware encoding with the webcam on C0/C1/C2/XU3/4 platforms. This is what I added:

Hardware encoding on C0/C1/C2
The good news is that thanks to @crashoverride's work we now have two programs that can do h264 encoding on the C0/C1/C2: c2cap (viewtopic.php?f=136&t=23680) and c2enc (viewtopic.php?f=136&t=24293). c2cap is designed to work directly with a webcam, while c2enc can encode arbitrary video sources. To use them with your webcam (and with the mjpg_streamer setup detailed before) you have two options, based on the quality of your webcam. Encoding on the C0/C1 has some quirks (e.g. sometimes the video freezes), but they're being looked into. Note that even if you have hardware encoding you're not gaining much in terms of delay. Because of the various pipelines (2 ffmpeg process and an encoder) there are various sizes of queues in the process pipeline that cause video to come out with a delay of about 5s, so - still not usable for remote control.

1. If you have a webcam which doesn't corrupt or drop frames, use c2enc

To do this, install c2enc on your system
Code: Select all
$ git clone https://github.com/OtherCrashOverride/c2_vpcodec
$ cd c2_vpcodec
$ make
$ sudo cp libvpcodec.so /usr/lib
$ cd ..
$ git clone https://github.com/OtherCrashOverride/c2enc
$ cd c2enc
$ make
$ sudo cp c2enc /usr/local/bin


Next, install the ffmpeg helper script (ffmpeg-hardware-encoder-c2-c2enc.sh):
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-hardware-encoder-c2-c2enc.sh -O /usr/local/bin/ffmpeg-hardware-encoder-c2-c2enc.sh
$ sudo chmod a+x /usr/local/bin/ffmpeg-hardware-encoder-c2-c2enc.sh


This script has a small configuration header where you can set things like the desired framerate, video dimensions, output bitrate and the user/password used to access the MJPEG stream. Edit those values to match your installation. Internally, the script starts two threads: a "feeder", which reads data from mjpg_streamer and writes it to a named pipe, and a "consumer", which reads the pipe, sends the data through c2enc, muxes the encoded video with audio and sends the result to ffserver. The named pipe is used so that the two ffmpeg processes can run on different cores (http://www.linuxquestions.org/questions ... pe-718145/). Sometimes reading from mjpg_streamer fails (ffmpeg crashes on some bad input), so that thread is restarted when needed - you should only experience a brief "frozen image". The whole pipeline is killed when streaming stops.
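To illustrate the idea, here is a minimal sketch of the feeder/consumer pattern, with printf and cat standing in as hypothetical placeholders for the two ffmpeg processes (these are not the real commands from the script):

```shell
# A named pipe decouples the feeder from the consumer, so the scheduler
# can place the two processes on different cores.
PIPE=$(mktemp -u)
mkfifo "$PIPE"
# feeder: in the real script this reads MJPEG data from mjpg_streamer
printf 'frame-data\n' > "$PIPE" &
# consumer: in the real script this feeds c2enc and then ffserver
OUT=$(cat "$PIPE")
wait
rm -f "$PIPE"
echo "$OUT"
```

Because opening a fifo blocks until both ends are connected, the feeder naturally waits for the consumer to start, which is also why killing either end tears down the whole pipeline.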

To enable this, replace /etc/systemd/system/ffmpeg.service with:
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-c2enc.service -O /etc/systemd/system/ffmpeg.service


Now, if you turn on the ffmpeg service you should get a steady hardware encoded stream from your webcam.


If your webcam drops frames or produces corrupt frames, you might experience intermittent sound loss, because the framerate coming into the encoder is lower than the framerate it outputs. In this case you need to either lower the framerate in the script, or switch to the next method. With Hardkernel's 720p webcam and this method I can reliably get an output stream of almost 10fps at 720p (9 is more like it).


2. If you have a webcam which sometimes drops frames, use c2cap

Because c2cap expects to read directly from a camera device, and because we want to keep mjpg_streamer, we will need to create a fake webcam: data is read from mjpg_streamer, written to the fake webcam, read back from it by c2cap, and passed on to ffmpeg for packaging and streaming. To create a fake webcam we'll need v4l2loopback:
Code: Select all
$ sudo apt-get install linux-headers-c2
$ sudo apt-get install v4l2loopback-dkms v4l2loopback-utils
$ sudo modprobe v4l2loopback video_nr=6 devices=1 card_label="Fake webcam"
$ v4l2loopback-ctl set-fps 25 /dev/video6


The dkms package should recompile the v4l2loopback module automatically for each kernel update.
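If you want to verify that the module really was built for the kernel you are running, something like this should work (dkms status is the standard way to list what dkms has built; the fallback message is just for illustration):

```shell
# Print the running kernel and check whether dkms knows about v4l2loopback
KVER=$(uname -r)
echo "running kernel: $KVER"
dkms status 2>/dev/null | grep v4l2loopback || echo "v4l2loopback not registered with dkms"
```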

Next, you have to prepare the system to load the v4l2loopback module at startup and set the correct permissions. This can be done from /etc/rc.local, by adding the following lines before exit 0:
Code: Select all
$ sudo vi /etc/rc.local
chmod a+rw /dev/ion /dev/ge2d
modprobe v4l2loopback video_nr=6 devices=1 card_label="Fake webcam"
v4l2loopback-ctl set-fps 25 /dev/video6


And next, either reboot, or execute /etc/rc.local manually:
Code: Select all
$ sudo /etc/rc.local


Next, install c2cap:
Code: Select all
$ git clone https://github.com/OtherCrashOverride/c2_vpcodec
$ cd c2_vpcodec
$ make
$ sudo cp libvpcodec.so /usr/lib
$ cd ..
$ git clone -b beta1  https://github.com/OtherCrashOverride/c2cap.git
$ cd c2cap
$ make
$ sudo cp c2cap /usr/local/bin
$ sudo chmod a+rw /dev/ion /dev/ge2d
$ cd ..


We'll also be installing a similar helper script (ffmpeg-hardware-encoder-c2-c2cap.sh):
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-hardware-encoder-c2-c2cap.sh -O /usr/local/bin/ffmpeg-hardware-encoder-c2-c2cap.sh
$ sudo chmod a+x /usr/local/bin/ffmpeg-hardware-encoder-c2-c2cap.sh


You can edit the script and set your desired resolution, frame rate and credentials for mjpg_streamer. The script is very similar to the one presented above, with the same pipeline, except that the named pipe has been replaced by a /dev/video device. The advantage over the named pipe is that the /dev/video device duplicates missing frames, allowing the reader to read at 25fps even if the writer only writes 10fps.

To enable this, replace /etc/systemd/system/ffmpeg.service with:
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-c2cap.service -O /etc/systemd/system/ffmpeg.service


Now, if you turn on the ffmpeg service you should get a steady hardware encoded stream from your webcam.

Hardware encoding on XU3/XU4
In order to get hardware encoding support on the XU3/XU4 you need to be running a more recent kernel (4.8 or 4.9). First you'll need to set up your system, install the new kernel and build ffmpeg with hardware encoding support. You can find all the instructions in this thread: viewtopic.php?f=95&t=24366. Once you are able to transcode a test video using the hardware encoder, you can continue setting up your webcam for streaming.

As before, you will need to replace the /etc/systemd/system/ffmpeg.service with:
Code: Select all
$ sudo wget https://raw.githubusercontent.com/mad-ady/odroid-webcam-scripts/master/ffmpeg-xu4.service -O /etc/systemd/system/ffmpeg.service


You may also need to edit it to fit your needs.

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Fri Jul 14, 2017 11:00 pm

I've made some small changes to the hardware streaming scripts, so that they use PulseAudio for microphone access instead of ALSA. This way you can share the microphone between multiple processes.
If only your script will be using the microphone, then ALSA is good enough. Otherwise, if other processes (e.g. Mycroft) need to use the microphone at the same time, you will need PulseAudio. To set up PulseAudio you can follow this guide: viewtopic.php?f=54&t=27271&p=195770#p195776. To record with ffmpeg using PulseAudio sound, you can run:
Code: Select all
$ ffmpeg -framerate 5 -f mjpeg -i 'http://odroid:odroidpass@127.0.0.1:8090/?action=stream' -f pulse -server 127.0.0.1 -i alsa_input.usb-Sonix_Technology_Co.__Ltd._USB_2.0_Camera-02.analog-mono -acodec libmp3lame -c:v libx264 -preset ultrafast -r 5 -pix_fmt yuv420p -b:v 1500k -async 1 myvideo.mp4
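If you are not sure what the PulseAudio source name on your system is (the alsa_input... argument in the command above), pactl can list the available capture sources; the second column is the name to pass to ffmpeg (the fallback message below is just for illustration):

```shell
# List PulseAudio capture sources; the second column holds the source
# name to use with 'ffmpeg -f pulse -i <name>'
SOURCES=$(pactl list short sources 2>/dev/null || true)
MSG=${SOURCES:-"pulseaudio not running or pactl not installed"}
echo "$MSG"
```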



Here are the commits to the recording scripts:
https://github.com/mad-ady/odroid-webca ... 30d9fec7f3

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Sun Mar 25, 2018 12:43 am

I have an Odroid-C2 and a 720p USB camera. When I execute this command: $ sudo /usr/local/bin/mjpg_streamer -i 'input_uvc.so -r 1280x720 -m 50000 -n -f 25 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass' and then try to open the URL http://odroid-ip:8090/, the result is "server not found"! Can you help me please?
rabeb
 
Posts: 11
Joined: Sun Mar 25, 2018 12:22 am
languages_spoken: english
ODROIDs: odroid-c2

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Sun Mar 25, 2018 2:14 am

What output do you get when you run mjpg_streamer?
And odroid-ip should be replaced by the actual IP address.

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Sun Mar 25, 2018 4:30 am

When I run mjpg_streamer I get:
odroid@odroid64:~$ mjpg_streamer
MJPG Streamer Version: git rev: 8cc9d22c1e79905d529a248ccf05bbf0625e0bf3
o: www-folder-path......: disabled
o: HTTP TCP port........: 8080
o: HTTP Listen Address..: (null)
o: username:password....: disabled
o: commands.............: enabled
bind: Address already in use
bind: Address already in use
o: server_thread(): bind(8080) failed
And for ifconfig I get:
odroid@odroid64:~$ ifconfig
eth0 Link encap:Ethernet HWaddr 00:1e:06:34:98:9a
UP BROADCAST MULTICAST MTU:1500 Metric:1
RX packets:0 errors:0 dropped:0 overruns:0 frame:0
TX packets:11 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:0 (0.0 B) TX bytes:1506 (1.5 KB)
Interrupt:40

lo Link encap:Local Loopback
inet addr:127.0.0.1 Mask:255.0.0.0
inet6 addr: ::1/128 Scope:Host
UP LOOPBACK RUNNING MTU:4096 Metric:1
RX packets:2394 errors:0 dropped:0 overruns:0 frame:0
TX packets:2394 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:0
RX bytes:417357 (417.3 KB) TX bytes:417357 (417.3 KB)

wlan0 Link encap:Ethernet HWaddr 00:11:7f:40:51:0d
inet addr:192.168.1.6 Bcast:192.168.1.255 Mask:255.255.255.0
inet6 addr: fe80::211:7fff:fe40:510d/64 Scope:Link
UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1
RX packets:81984 errors:0 dropped:5402 overruns:0 frame:0
TX packets:36547 errors:0 dropped:0 overruns:0 carrier:0
collisions:0 txqueuelen:1000
RX bytes:75997713 (75.9 MB) TX bytes:5559610 (5.5 MB)

And when I open the URL http://192.168.1.6:8090/ I get "unable to connect"?

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Sun Mar 25, 2018 8:03 pm

mjpg_streamer needs to be executed with the full list of parameters, as you first posted. Also, port 8080 seems to be in use by something else (sudo netstat -tpan | grep :8080 will tell you what it is), so try a different port.
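As a sketch, a quick way to see whether the port is already taken (using ss from iproute2; substitute netstat on older systems):

```shell
# Check whether anything is already listening on the mjpg_streamer port
PORT=8080
if ss -ltn 2>/dev/null | grep -q ":$PORT\b"; then
  STATUS="in use"
else
  STATUS="free"
fi
echo "port $PORT is $STATUS"
```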

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Mon Mar 26, 2018 6:13 am

When I tried to execute this command I saw nothing.
Attachments
29527580_1970527853197318_304399956_o (1).jpg

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Mon Mar 26, 2018 4:32 pm

Ok, I got a bit of time and played with it on my development N1.

I started mjpg_streamer with:
Code: Select all
sudo /usr/local/bin/mjpg_streamer  -i 'input_uvc.so -r 1280x720 -n -f 30 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass'

So, the port was 8090, not 8080 as I previously had you check - sorry.

Output should be:
Code: Select all
root@n1-pre:~# sudo /usr/local/bin/mjpg_streamer  -i 'input_uvc.so -r 1280x720 -n -f 30 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass'
MJPG Streamer Version: git rev: 8cc9d22c1e79905d529a248ccf05bbf0625e0bf3
 i: Using V4L2 device.: /dev/video0
 i: Desired Resolution: 1280 x 720
 i: Frames Per Second.: 30
 i: Format............: JPEG
 i: TV-Norm...........: DEFAULT
 o: www-folder-path......: /usr/local/share/mjpg-streamer/www/
 o: HTTP TCP port........: 8090
 o: HTTP Listen Address..: (null)
 o: username:password....: odroid:odroidpass
 o: commands.............: enabled



I am able to access the web interface at http://192.168.228.15:8090/
If it doesn't output something similar when you try it, post the output of the following commands:
Code: Select all
sudo /usr/local/bin/mjpg_streamer  -i 'input_uvc.so -r 1280x720 -n -m 50000 -f 25 -d /dev/video0' -o 'output_http.so -p 8090 -w /usr/local/share/mjpg-streamer/www/ -c odroid:odroidpass'
ls -l /usr/local/share/mjpg-streamer/www/
ls -l /dev/video*
netstat -tpan | grep :8090

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Tue Mar 27, 2018 2:04 am

Thanks for your help, I succeeded in resolving the problem and am now able to access the web interface. I get video, but without sound! (We use only an Odroid-C2 + WiFi + USB camera.) How can I solve it?

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Tue Mar 27, 2018 2:30 am

Keep reading the first post. mjpg_streamer handles only video; you need ffmpeg to stream video and audio together in a single stream.

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Tue Mar 27, 2018 6:56 am

Yes, I did them all, even the hardware encoding, and finally I got a steady hardware-encoded stream, but without sound - the same problem. Can you help me solve it, please? Thank you.

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Tue Mar 27, 2018 12:50 pm

Show me your ffmpeg command and the output of arecord -L, and run mediainfo (install it first) on a file output by ffmpeg and show me its output. Also check the recording volume with alsamixer (make sure it's not muted).

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Wed Mar 28, 2018 1:19 am

Here are the outputs of ffmpeg, arecord -L and mediainfo. As for the recording volume, alsamixer is at maximum, as mentioned in the first post.
Attachments
mdiainfo2.png
mediainfo.png
listeaudio2.png
listeaudio.png
ffmpeg.png

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Mar 28, 2018 1:41 am

Your commands and output look fine. You should get sound, unless you're trying to do software encoding at too high a framerate, which ffmpeg can't sustain in real time. In that case you would see ALSA overflow errors and an effective framerate lower than what you requested. Try this to record audio only and see if you can hear anything:
Code: Select all
ffmpeg -f alsa -i plughw:CARD=Camera,DEV=0 -acodec libmp3lame test.mp3

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Wed Mar 28, 2018 10:36 pm

I did it and I got an audio recording, but I can't hear anything :( I did exactly the same steps mentioned in the first post and got the same outputs, except that I didn't create the file /etc/systemd/system/mjpg_streamer.service. Does that have anything to do with my problem?

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Wed Mar 28, 2018 10:42 pm

No, the service file doesn't make any difference.
Show me some screenshots of alsamixer, especially the recording section of the USB webcam.

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Thu Mar 29, 2018 4:42 am

Thank you so much for your help, the problem is solved. When I watch the recorded videos on my PC I hear sound, so the issue was that I didn't use any sound accessory with my Odroid-C2. Would you advise me to use the Stereo Boom Bonnet? If not, which other accessory can I use?

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Thu Mar 29, 2018 5:16 am

Glad to hear it works.
You can use either a cheap USB soundcard or the Boom Bonnet. The disadvantage of the Boom Bonnet is that you need to put the speakers in an enclosure, otherwise the sound is not that good.
I'm using the Boom Bonnet on one Odroid (I chose it so that I wouldn't need powered speakers) and a USB soundcard plus cheap USB-powered speakers on another. Both perform equally well for my needs, though I would stay away from USB on the C2.

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Fri Mar 30, 2018 5:08 am

I want to thank you so much for taking the time to answer my questions; I greatly appreciate the assistance you have provided me :D :D

Re: [Howto] Odroid RTSP webcam

Unread postby rabeb » Mon Apr 16, 2018 3:22 am

Hello mad_ady, I have another question: do you have any idea how I can use my USB cam from Python code? (I mean, instead of import picamera, what can I import to use my USB cam?)

Re: [Howto] Odroid RTSP webcam

Unread postby mad_ady » Mon Apr 16, 2018 4:07 am

Sorry, I have no idea. Ideally, start a new thread and look for Python modules that deal with images (like PIL), not specifically picamera. In the worst case you can save snapshots to files and load those in your code.

