
Wednesday, May 25, 2016

impressive CPU offload and FPS boost from NVidia "nvenc" codec

As I just purchased an NVidia GeForce GTX 970, I installed the FFmpeg "nvenc" encoder to see if I would gain any benefit from the hardware encoding acceleration on the card. From the results, I think I did!

H264 encode
In the picture below, you can see that the H264 (libx264) encode maxes out my CPU at 92% while I transcode a 2.7K GoPro video. The final FPS of the encode is 23:

Now here is the encode using the Maxwell-class GPU on the GTX 970. CPU utilization is down to 15% and my FPS jumps from the low 20s to 38! Impressive!

Encoding Parameters
The params are a bit different between the two, but I think they are close enough for a comparison.

H264
ffmpeg -i GOPR0099.MP4 -profile:v high -preset medium -vcodec libx264 -acodec copy output2.mp4

NVenc
ffmpeg -i GOPR0099.MP4 -c:v nvenc -acodec copy nvenc.mp4
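For a slightly fairer comparison, the nvenc side could carry roughly equivalent quality settings. A sketch (nvenc preset and profile names vary across ffmpeg versions, so treat these flags as illustrative):

ffmpeg -i GOPR0099.MP4 -c:v nvenc -preset hq -profile:v high -acodec copy nvenc.mp4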

Update 12/14/22
Compare these parameters:
FFMPEG_PRESET="-c:v libx264 -f mp4 -pix_fmt yuv420p -profile:v baseline -preset slow "
FFMPEG_PRESET="-c:v h264_nvenc -f mp4 -pix_fmt yuv420p -preset p2 -tune hq -b:v 5M -bufsize 5M -maxrate 10M -qmin 0 -g 250 -bf 3 -b_ref_mode middle -temporal-aq 1 -rc-lookahead 20 -i_qfactor 0.75 -b_qfactor 1.1"
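For context, here is roughly how a preset variable like this gets used in a wrapper script. A minimal sketch with illustrative file names; the variable is deliberately left unquoted so the shell splits it into separate ffmpeg arguments:

IN=20221105.ts
OUT=out.mp4
time ffmpeg -i "$IN" $FFMPEG_PRESET -acodec copy "$OUT"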

H264 encode is roughly 133 fps and takes twice as long as nvenc. Notice the two tall CPU spikes.

Nvenc encode is roughly 327 fps, in half the time of H264. Note the small CPU spikes on the right, about half the size of the tall ones.
The actual timing boost is impressive: from 44s down to 14s.
Regular libx264

real    0m44.101s
user    9m51.905s
sys    0m5.016s

H264_nvenc
frame= 5898 fps=417 q=22.0 size=  126976kB time=00:03:18.14 bitrate=5249.7kbits/s dup=1 drop=0 speed=  14x No more output streams to write to, finishing.

real    0m14.730s
user    1m27.067s
sys    0m1.804s

Of course, scaling filters need special command-line tweaks that I haven't figured out yet, and interpolation filters are still extremely slow.
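That said, ffmpeg does ship GPU-side scaling filters that are meant to avoid the round trip through system memory. A sketch of the usual starting point, which I haven't verified in this workflow (scale_cuda assumes an ffmpeg build with CUDA filter support; scale_npp is the libnpp alternative):

# decode, scale and encode without leaving the GPU
ffmpeg -hwaccel cuda -hwaccel_output_format cuda -i 20221105.ts -vf scale_cuda=1280:720 -c:v h264_nvenc -acodec copy scaled.mp4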

Custom command used:
reset;/home/sodo/scripts/reviewCut/renderEdit.sh 20221105.ts 1 200 2 5 "Blump" E

Friday, September 10, 2010

making sure opengl is available

This troubleshooting is listed in this post:
https://init.linpro.no/pipermail/skolelinux.no/cinelerra/2010-January/016493.html

But I thought I'd repost here so that I always have this information at hand:

I have a GeForce 8800GT card installed in my box. glxinfo confirms that OpenGL is available:
[sfr...@ogre my_cinelerra]$ glxinfo | head -20
name of display: :0.0
display: :0 screen: 0
direct rendering: Yes
server glx vendor string: NVIDIA Corporation
server glx version string: 1.4


I see glxgears points to the libGL.so.1 in /usr/lib64/nvidia:
[sfr...@ogre my_cinelerra]$ ldd `which glxgears`
linux-vdso.so.1 => (0x00007fff85dff000)
libGL.so.1 => /usr/lib64/nvidia/libGL.so.1 (0x00007f791fa75000)
libm.so.6 => /lib64/libm.so.6 (0x0000003f1bc00000)
libX11.so.6 => /usr/lib64/libX11.so.6 (0x0000003f1e800000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x0000003f1c400000)
libc.so.6 => /lib64/libc.so.6 (0x0000003f1b800000)
libGLcore.so.1 => /usr/lib64/nvidia/libGLcore.so.1 (0x00007f791e5a3000)
libnvidia-tls.so.1 => /usr/lib64/nvidia/tls/libnvidia-tls.so.1 (0x00007f791e4a1000)
libXext.so.6 => /usr/lib64/libXext.so.6 (0x0000003f1ec00000)
libdl.so.2 => /lib64/libdl.so.2 (0x0000003f1c000000)
libxcb.so.1 => /usr/lib64/libxcb.so.1 (0x0000003f1e400000)
/lib64/ld-linux-x86-64.so.2 (0x0000003f1b400000)
libXau.so.6 => /usr/lib64/libXau.so.6 (0x0000003f1e000000)

I see that the libGL.so.1 in that directory has the appropriate OpenGL hooks:
[sfr...@ogre usr]$ strings -a /usr/lib64/nvidia/libGL.so.1 | grep glDeleteShader
glDeleteShader
[sfr...@ogre usr]$ strings -a /usr/lib64/nvidia/libGL.so.1 | grep glUseProgram
glUseProgram
glUseProgramObjectARB


I've tried manually pointing configure.in to the /usr/lib64/nvidia directory:
AC_CHECK_LIB([GL], [glUseProgram],
  [OPENGL_LIBS="-lGL"; libGL=yes],
  [# On SUSE/OpenSUSE, NVidia places the OpenGL 2.0 capable library in
   # /usr/X11R6/lib, but it doesn't place a libGL.so there, so the linker
   # won't pick it up; we have to use the explicit libGL.so.1 path.
   save_LIBS="$LIBS"
   for l in /usr/lib64/nvidia /usr/X11R6/lib /usr/X11R6/lib64; do
     LIBS="$l/libGL.so.1"
     AC_MSG_CHECKING(for glUseProgram in $l/libGL.so.1)
     AC_TRY_LINK([], [extern int glUseProgram();
                      glUseProgram();],
       [OPENGL_LIBS="$l/libGL.so.1"; libGL=yes],
       [libGL=no])
     AC_MSG_RESULT([$libGL])
     test $libGL = yes && break
   done
   LIBS="$save_LIBS"])


In the end, ./configure still did not recognize that I had OpenGL properly installed, so I had to explicitly enable OpenGL on my ./configure line:
./configure --enable-opengl

Who knew?
da mule

Thursday, January 24, 2008

capturing WinAmp visuals

Like a lot of us who are into film and video editing, I am also a part-time musician. For Christmas, I received a very exciting present: a digital model of an old analog synthesizer, the Creamware Prodyssey:
Sound on Sound review of Prodyssey

Oh, how lovely it is to create sounds with this box of sliders and switches! The music part will come later, but right now I am enjoying creating rhythmic bleeps and bloops the old-fashioned way, using oscillators, filters and envelopes. Attack, decay, sustain, release, LFO and all that lot. As I was creating this cacophony, I thought it would be cool to have some visualizations to go along with the audio.

Years ago, using Windows XP, I streamed music visualizations from WinAmp out through S-Video to a second computer that would capture the pretty pictures. The quality wasn't that great. Of course, if WinAmp had an export feature, that would be best. But working with what I had today, I figured that now that I've purchased a mighty dual quad core, I should be able to capture the screens within the machine itself. Of course, my primary OS is now Linux. And WinAmp is not made for Linux. And I didn't know of a Linux-based visualizer that looked quite as good as WinAmp. What to do? I remembered using Wine, the Windows compatibility layer for Linux, back in 2001, but it wasn't very stable. Is it more stable these days?

With thoughts of sugar-plum visualizations in my head, I installed Wine the other day. Given my past experiences, I wasn't very hopeful. I was pleasantly surprised to find that the latest version of Wine was easy to set up on my Fedora 7 x86-64 system. Once Wine was set up, I installed WinAmp through a short series of steps. Excellent! I opened an MP3 file and, again, was glad to hear the MP3 playing. Wow! Two in a row! Can we go for a trifecta? With the audio coming out of WinAmp through Wine, I decided to go for broke and started up the Advanced Visualization Studio visualizer. I could hardly believe my eyes when AVS played in gorgeous colors in a window on one of my monitors!


So now I have WinAmp working under Wine. But how to capture the beautiful screens? Here, Cinelerra came to the rescue. Using my BFG Geforce 8500 GT PCI Express 256MB card and a dual-head configuration, I opened Cinelerra in my right monitor, started WinAmp under Wine and kicked off the visualization in the left monitor. I expanded the window to roughly DVD resolution (720x480) and tweaked my Cinelerra screen-capture recording settings to match the visualization window. Lo and behold, Cinelerra was capturing the visuals at DVD quality! I looked carefully for frame drops, which are listed in the Cinelerra Recording window as "Frames Behind". I did not see one frame drop! Awesome!
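For anyone without Cinelerra handy, ffmpeg's x11grab source can do a similar region capture. A sketch, assuming the visualization window sits at the top-left corner of display :0:

ffmpeg -f x11grab -video_size 720x480 -framerate 30 -i :0.0+0,0 -c:v libx264 capture.mp4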


I synchronized the results of the screen capture with the music on the Cinelerra timeline and output the audio and video to DVD-compatible formats. I then burnt a menuless DVD and was off to the races! I had created some "music", added a visualization track and burned a DVD! Sweet!

Next up:
How cool would it be to capture the visualizations in HD and play them on my HDTV at 720P resolution? Ooooh. Aaaah.

Tuesday, October 23, 2007

NVidia in da house (er, new server)!

So I've been on a quest to get a modern PCI Express card working in my new Dell SC1430. With this goal in mind, I had ordered a PCI Express 8x to 16x Adapter like this one off of eBay:
http://www.orbitmicro.com/global/pciexpressx8tox16adapter-p-755.html

The SC1430 has 8x connectors (running 4x speed PCI Express). In the hopes that it would work in the box, I bought a BFG Geforce 8500 GT PCI Express 256MB card from BestBuy. I checked and it is cheaper on Amazon.

Here's a good article from Tom's Hardware on PCI Express scaling:
http://www.tomshardware.com/2007/03/27/pci_express_scaling_analysis/

I figured that even if the card didn't work, I'd use it later in the next box I build. Now, I understood that the adapter would raise the card in the slot. The Dell has a hinged metal door that holds all the expansion cards in place, but I noticed that the little door can be left open while the case is closed. This would allow me to use the card for a while until I had the chance to machine a new bracket for it.

Last night, I attached the adapter to the new BFG card (with a satisfying "click", no less) and put it in the first PCI Express slot (SLOT1_PCIE) of the Dell. I left the hinged door open, but was able to close the case. I hooked up my FP1901 to the digital output and my FP1907 to the analog output. I can tell you I was quite surprised when I started the server and the FP1901 that was connected to the digital output came to life!

I booted into runlevel 3 (nongraphical, multiuser mode) in Fedora and grabbed the latest NVidia driver installer for my 64-bit OS via lynx. I ran the installer. There was no prebuilt kernel module for my particular kernel, so the installer created one and then asked if I wanted to create a new xorg.conf for X Windows. I said yes and the installer finished. I was elated to see X start up with the NVidia splash screen! I soon had TwinView set up and glxgears gave me 6200 FPS! Unlike the ATI card I had running in the box previously, mplayer and xine ran my HDV videos like champs! Hoohah! Cinelerra runs well, but I've decided not to compile in OpenGL just yet, as it did cause some instability on my previous box.
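For reference, the sequence looks roughly like this; the installer file name is illustrative:

# drop to runlevel 3, run the NVidia installer, return to graphical mode
init 3
sh NVIDIA-Linux-x86_64-<version>.run
init 5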

I tell you, NVidia drivers are an absolute joy to setup and use. As I've reiterated many times on this blog, most recently here:
/2007/10/year-later-ati-linux-drivers-still-suck.html

ATI's Linux drivers are riddled with bugs; hence, I returned the ATI card, a VisionTek x1550, to the store.

In sum, the BFG Geforce 8500 GT PCI Express 256MB card works in the Dell with a PCI Express 8x to 16x adapter card. If you use a PCI Express adapter, be aware that the card will be raised in the slot when it is seated.

the mule

ps - Now I just have to debug and fix a nagging audio noise with Fedora and the Dell and I will be one happy dude.

Wednesday, September 13, 2006

ATI OpenGL 2.0 implementation incomplete... hello NVidia!

Sad to say, but all you ATI lovers (like I USED to be) are in for a rude surprise with ATI's "OpenGL 2.0" Linux driver implementation. After a day and a half of finally getting my ATI All-In-Wonder 9800 Pro to actually use direct rendering and OpenGL 2.0, and seeing it work well (5000 FPS in glxgears), it was very disheartening to find out that ATI had not provided the proper 2.0 hooks for Cinelerra to use. Specifically, programming hooks like glDeleteShader were not implemented. The troubleshooting was quite informative, as I am not an active C programmer.
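The strings check from the "making sure opengl is available" post above is one way to confirm this for any driver; assuming the library lives at /usr/lib64/libGL.so.1:

strings -a /usr/lib64/libGL.so.1 | grep glDeleteShader

An empty result means the driver doesn't carry the hook.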

The solution was typical of today's PC market: if it don't work, buy one that does! So I went to B&H Photo and bought a BFG nVidia GeForce 7600GS, 512MB. Twenty minutes after getting home, the card was in the box, and the drivers were compiled, installed and working. Fifteen minutes after that, Cinelerra was compiled and I was using OpenGL 2.0! Yeehoo!! And at twice the playback speed I was getting previously with the ATI card. I will report back on the performance of this card once I actually USE it.