Monitor progress of cp, mv, gzip, mysql and CLI commands

To monitor the progress of an ongoing copy or MySQL dump import, try the progress CLI tool. It can track commands like cp, mv, dd, tar, cat, rsync, grep, fgrep, egrep, cut, sort, md5sum, sha1sum, sha224sum, sha256sum, sha384sum, sha512sum, adb, gzip, gunzip, bzip2, bunzip2, xz, unxz, lzma, unlzma, 7z, 7za, zcat, bzcat, lzcat, split, gpg, etc.
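
For example, a minimal sketch (the file name is made up; the flags are from progress(1)):

# start a long copy in the background, then watch that one process
cp big.iso /mnt/backup/ &
progress -m -p $!    # -m: keep updating until it exits, -p: watch this PID

# or watch every running mysql process, with throughput and ETA estimates
progress -w -c mysql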

Dual monitor Awesome WM with xrandr

I struggled for a couple of days before figuring out that the dual monitor setup via xrandr has to happen after X has started but before awesome starts, i.e. in between.

The solution was adding the xrandr lines to my .xinitrc:

# ~/.xinitrc

# NVIDIA PRIME: route the NVIDIA GPU's output through the modesetting driver
xrandr --setprovideroutputsource modesetting NVIDIA-0
xrandr --auto

# Succeeds only when the VGA output is plugged in
function VGAConnected {
    ! xrandr | grep "^VGA-1" | grep disconnected
}

if VGAConnected; then
    xrandr --output eDP-1-1 --mode 1920x1080 --primary \
           --output VGA-1-1 --mode 1920x1080 --rotate normal --right-of eDP-1-1
fi

exec awesome -c .config/awesome/rc.lua

To get the output names (eDP-1-1, VGA-1-1, etc.), just run xrandr without arguments.
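
The output looks roughly like this (abbreviated; the modes and positions will differ on other machines):

$ xrandr | grep connected
eDP-1-1 connected primary 1920x1080+0+0 ...
VGA-1-1 connected 1920x1080+1920+0 ...
HDMI-1-1 disconnected ...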

Tinkering with GNU parallel and wget for broken link checking

Finally found a parallel spidering solution. Online link checkers didn't really fit, because I don't want to overload the production site and they can't reach http://localhost. The parallel + wget snippet from https://www.gnu.org/software/parallel/man.html#EXAMPLE:-Breadth-first-parallel-web-crawler-mirrorer looks promising.

#!/bin/bash

URL=$1
# Stay inside the start dir
BASEURL=$(echo $URL | perl -pe 's:#.*::; s:(//.*/)[^/]*:$1:')
URLLIST=$(mktemp urllist.XXXX)
URLLIST2=$(mktemp urllist.XXXX)
SEEN=$(mktemp seen.XXXX)

# Spider to get the URLs
echo $URL >$URLLIST
cp $URLLIST $SEEN

while [ -s $URLLIST ] ; do
  cat $URLLIST |
    parallel lynx -listonly -image_links -dump {} \; \
      wget -qm -l1 -Q1 {} \; echo Spidered: {} \>\&2 |
    perl -ne 's/#.*//; s/\s+\d+.\s(\S+)$/$1/ and
      do { $seen{$1}++ or print }' |
    grep -F $BASEURL |
    grep -v -x -F -f $SEEN | tee -a $SEEN > $URLLIST2
  mv $URLLIST2 $URLLIST
done

rm -f $URLLIST $URLLIST2 $SEEN
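
Saved as e.g. spider.sh (my name for it), and with lynx, wget and GNU parallel installed, it takes the start URL as its only argument:

chmod +x spider.sh
./spider.sh http://localhost/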

Great exercise for the CPUs:

[Screenshot: htop showing all cores busy while GNU parallel runs]

When the command finishes, the next step is parsing the access log for 404s:

grep ' 404 ' /var/log/httpd/access_log | cut -d ' ' -f 7 | sed 's|^/|http://localhost/|'
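
To see which URLs 404 most often, a small variation on the same pipeline (assuming the common log format with the request path in field 7, as above):

grep ' 404 ' /var/log/httpd/access_log | cut -d ' ' -f 7 | sort | uniq -c | sort -rn | head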

Video cutting from start to end time using ffmpeg

Sometimes I need to cut pieces out of movies, to upload to YouTube for example.

It’s possible with the following nice one-liner:

ffmpeg -i IN.mp4 -ss 01:12:55 -t 35 -async 1 OUT.mp4

The solution came from http://stackoverflow.com/questions/18444194/cutting-the-videos-based-on-start-and-end-time-using-ffmpeg
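
Here -ss is the start time, -t the clip duration in seconds, and -async 1 keeps the audio in sync. If an absolute end time is handier than a duration, ffmpeg also accepts -to; an untested variant of the same cut:

ffmpeg -i IN.mp4 -ss 01:12:55 -to 01:13:30 -async 1 OUT.mp4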